Post-deployment usability evaluation of a radiology workstation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

6 Citations (Scopus)


To evaluate the usability of a radiology workstation after deployment in a hospital.

In radiology, it is difficult to perform valid pre-deployment usability evaluations due to the heterogeneity of the user group, the complexity of the radiological workflow, and the complexity of the IT infrastructure in which the workstation has to be embedded. Despite pre-deployment usability engineering efforts, it is therefore likely that usability issues will surface once the workstation is used in clinical practice. There are currently no studies that provide insight into the number and nature of the usability issues radiologists encounter in clinical practice.

The workstation evaluated in this study consists of three integrated components: an image viewer, a workflow manager and a report editor with speech recognition. The workstation vendor states that the workstation was developed according to the principles of usability engineering.
Twelve radiologists with different specializations participated in this study. A semi-structured interview was used to assess their general opinion about the workstation and to identify the main usability issues they encounter. After the interview, participants were asked to perform their daily work and to vocalize the steps they were taking (think-aloud protocol). They were asked to illustrate usability issues as they were encountered. Video recordings of the screen were made during the observation.
Usability issues were identified and divided into five categories: image arrangement/display, image interaction, workflow, report dictation, and miscellaneous. Positive usability findings were also documented. Excerpts were taken from the video recordings to illustrate the more complex findings.
Each issue was given a severity rating according to the following formula: severity = impact on the user experience (1 = low, 2 = medium, 3 = high) + predicted frequency of occurrence (1 = low, 2 = medium, 3 = high). A severity rating of 2 was considered low, 3 to 4 medium and 5 to 6 high.
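The scoring rule above can be sketched in a few lines of code. This is a minimal illustration, not part of the study; the function names are hypothetical, and only the formula and category cut-offs come from the text (severity = impact + predicted frequency, each rated 1 to 3; a total of 2 is low, 3 to 4 medium, 5 to 6 high).

```python
def severity_score(impact: int, frequency: int) -> int:
    """Sum the impact and predicted-frequency ratings, each 1 (low) to 3 (high)."""
    for rating in (impact, frequency):
        if rating not in (1, 2, 3):
            raise ValueError("ratings must be 1, 2, or 3")
    return impact + frequency

def severity_category(score: int) -> str:
    """Map a severity score (2-6) to the study's three categories."""
    if score == 2:
        return "low"
    if score <= 4:
        return "medium"
    return "high"
```

For example, an issue with medium impact (2) and high predicted frequency (3) scores 5 and is therefore rated high severity.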

Overall, radiologists were moderately satisfied with the workstation, giving it a mean rating of 6.7 out of 10 (range: 6–8).
Fifteen positive usability findings were identified, ranging from useful functionalities to interface elements that enable fast workflow patterns.
Ninety-one usability issues were identified (14 high, 44 medium, 33 low severity), ranging from issues that cause minor frustration or delay to issues that cause significant delays or even prevent users from completing tasks. Four issues pose a potential threat to patient safety.
Fifteen issues were related to image arrangement/display (4 high, 8 medium, 3 low severity), 23 to image interaction (4 high, 8 medium, 11 low severity), 18 to report dictation (3 high, 10 medium, 5 low severity), 23 to workflow (3 high, 11 medium, 9 low severity), and 12 were miscellaneous (7 medium, 5 low severity).

Radiologists encounter a wide variety of usability issues when using the workstation in clinical practice. In addition to pre-deployment usability activities, radiology workstation vendors should devote significant resources to post-deployment usability evaluations in order to identify and fix usability problems that were not discovered in the pre-deployment phase.
Original language: English
Title of host publication: 6th International Conference on Applied Human Factors and Ergonomics (AHFE 2015) and the Affiliated Conferences
Publication status: Published - 2015
