In March 2005, Cincinnati Children’s Hospital Medical Center completed installation of a state-of-the-art integrated RIS-PACS-Voice-Recognition system. Cincinnati Children’s, a 423-bed teaching facility with 16 regional outpatient centers, is one of the nation’s leading pediatric hospitals. The Division of Radiology at Cincinnati Children’s currently employs more than 200 full-time staff and, in 2004, performed over 160,000 examinations. Despite the installation of the new system, there were some persistent problems with workflow, communication, documentation, and patient prioritization. Below is a description of these problems, and the solution we designed and launched to address them.

THE PROBLEM

Our division was still tied to paper requisitions that we used as “tokens” on which technologists wrote patient notes, such as difficulty with positioning, sedation issues, and alternate ordering physician phone numbers. Upon completing studies, the technologists would fax or hand carry the paper requisitions to bins next to the radiology workstations, where they accumulated in stacks in no particular order. At any given time, a stack might hold up to 40 requisitions, with STAT, non-STAT, and various imaging modalities mixed together. One or two radiologists read these cases, pulling stacks of requisitions from the bin at irregular intervals and interpreting them in a sequence that was part first-come, first-served; part last-in, first-out; part most-urgent-first; and part random. It was not clear to the radiologists which requisition should be read next, so there was no consistent order of interpretation. Once the studies were interpreted, the paper requisitions were shredded; no permanent record of the technologists’ notes was kept. Paper forms with sedation notes were filed separately.

Figure 1. Using the technologist worklist and acuity entry area, the technologist selects appropriate options for each area (Requested Acuity, Subjective Acuity, Patient Waiting, Patient/Parent Anxiety, Requesting MD Anxiety), selects a Service, then Submits. This prioritizes the examination with others waiting to be read and sends it to the radiologist worklist.

At our institution, ordering physicians have the option of requesting a phone call as soon as a radiology report is available. Upon reading an examination, the radiologist would complete a paper form and fax it to a bank of hospital operators, who would then notify the referring physician of negative results or connect the ordering physician and radiologist if there were any positive findings. The radiologists, therefore, had to deal with each case at least twice: once to interpret the study and fax the results, and a second time to discuss the results with the ordering physician. Frequently, the operator would page the radiologist while the ordering physician was on hold, only to find that the radiologist was tied up with a fluoroscopy study, an ultrasound, or another phone call, or was otherwise unavailable. The ordering physician either would be left on hold or would hang up, requiring the operator to repeat the process and interrupt the radiologist again. On busy days, the radiologists found themselves swamped with phone calls, inbound and outbound faxes, and a pile of paper requisitions and fax forms to fill out. The workflow was stressful and inefficient.

We offer patients whose physicians have requested a call report the option of waiting for results in our waiting room. Once radiology staff or the hospital operators have contacted the ordering physician and conveyed study results, the ordering physician is connected to the parents in the waiting room to advise them of the results and an appropriate disposition. Unfortunately, the status of attempts to reach the ordering physicians was sometimes unclear to radiology staff in the waiting room. When anxious parents waiting to hear from the ordering physicians inquired as to whether the physician had been reached, front desk staff sometimes did not know the answer.

Permanent documentation of patient results was also difficult. It was not always clear to whom results had been conveyed, when, and by whom.

Decentralized workflow was a problem. Our radiologists read at different locations, even when working on the same service, such as neuroradiology. The only way to know whether a radiologist had begun to interpret a study was to locate the paper requisition for that case; possession of the requisition was equivalent to ownership of the case. This made load balancing difficult: if one radiologist was particularly overloaded, ancillary staff had to distribute paper requisitions to other radiologists around the division to help out. This was inefficient and problematic, because if a radiologist who had volunteered to help became especially busy after accepting a stack of requisitions, that stack might wait several hours.

Cross-coverage was challenging during conferences or procedures. There was no good way for radiologists temporarily to sign out their service to other staff, given that our department covers multiple outpatient centers with radiologists scattered at multiple satellite locations around the greater metropolitan area.

Some radiologists were struggling to adapt to voice recognition (VR). It was not always clear that the effort required to generate high-quality reports during periods of very high volume was outweighed by the benefit of having results immediately available.

Optimal patient care is not “first-come, first-served.” Since radiology is central to diagnostic and treatment decisions, and the sickest patients require the most rapid diagnoses and treatments, it only makes sense that radiologists should make every effort to interpret the sickest patients’ studies first. However, most RIS-PACS worklists are designed around a first-come, first-served workflow model. While this may be fine in a low-volume practice, in a busy radiology practice environment, particularly if it is decentralized like ours, first-come, first-served is inappropriate.

Unfortunately, however, most RIS-PACS worklists do not consider medical acuity beyond STAT, lumping all examinations so labeled into a single worklist. This approach does not account for the fact that not all STAT examinations are equally critical. The term has been overused to the point that it is almost meaningless, because it covers the gamut from “being seen in the emergency department for chronic abdominal pain” to “major trauma.” Our radiologists had no good way to decide which STAT examination in the stack of requisitions to read first, and this was a major problem, because that stack was often 30 examinations or more deep. Patients with acute fractures, in significant pain and needing urgent treatment, sometimes waited longer for radiology interpretations than comfortable patients with chronic issues who happened to arrive in radiology first. We felt we could do a better job of guiding our staff in delivering acute radiology care by getting away from stacks of paper requisitions and providing a dynamically sorted electronic reading list based on the acuity of all cases waiting for interpretation.

Figure 2. The radiologist uses this prioritized list of examinations to select the next case to read. Radiologists log in to one or more services for the day and are shown only cases from those services. They can click the Service/Location link at the top of the screen to quickly select and deselect services, allowing radiologists to easily cross-cover for one another.
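A minimal sketch of how such a worklist might be assembled, assuming each pending examination already carries a priority score from the triage algorithm; the Exam structure and build_worklist function are illustrative, not the production implementation.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Exam:
    accession: str          # RIS accession number
    service: str            # e.g., "Neuroradiology"
    priority: float         # score from the triage algorithm (higher = read sooner)
    completed_at: datetime  # when the technologist completed the study

def build_worklist(pending: list[Exam], selected_services: set[str]) -> list[Exam]:
    """Show only the services the radiologist has signed in to (or is covering),
    with the highest-priority case first; ties go to the oldest study."""
    visible = [e for e in pending if e.service in selected_services]
    return sorted(visible, key=lambda e: (-e.priority, e.completed_at))
```

In this sketch, cross-coverage amounts to adding a colleague’s service to the selected set, which immediately merges that service’s cases into the reading list.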

ADDITIONAL ISSUES

There were other communication issues we wanted to address in our decentralized practice environment. We read urgent studies from 10 satellite locations in addition to the main hospital. It was often difficult to reach ordering physicians with important results, and to broker communication between technologists and radiologists without disrupting workflow. Moreover, there was no good overview of division operations, so a technologist at a slow outpatient center might call the main campus to expedite reading of a concerning case without realizing that the radiologist was already swamped by high traffic from other sites. Well-intentioned phone calls from technologists and ordering physicians interrupted radiologists’ workflow most often at the busiest times, precisely when the radiologists needed to work at peak efficiency. As a result, productivity suffered. Most important, we worried that we were not delivering the best patient care possible.

As an academic institution, we train residents and fellows, and sometimes staff interpretations differ from the preliminary interpretations given by trainees. In such cases, we needed an efficient way to convey and document changes in interpretations. We also needed to be able to determine whether and when an ordering physician, their partner, or their office staff were told of an important radiology result, and by whom. Finally, we needed to improve communication among ourselves, so that it was clear to everyone which technologist and which radiologists were involved in any particular case at any given time. For instance, it is frustrating and wasteful to begin dictating a case you have been thinking about for 10 minutes, only to discover that a colleague has already begun dictating it. We wanted to put a stop to those problems.

THE SOLUTION

The Informatics Research Core of the Division of Radiology at Cincinnati Children’s designed and built the new system to address these issues and to automate patient prioritization. In addition, the team sought to conduct outcomes studies to determine the impact of the new solution. We collaborated with researchers at the University of Cincinnati College of Business who focus on process improvement, for example, working with a shipping company to make sure its trucks run the most efficient routes and remain as full as possible. The College of Business helped construct the prioritization algorithm and helped us measure whether the changes we implemented actually had their desired effects.

Although the project began as an effort to triage emergency department radiology examinations, it expanded in scope over time. Since case prioritization must occur throughout radiology, and communication and documentation tools are important for all radiology services, we designed the system so that it could easily be used throughout the entire division. Ultimately, we designed a system that would accurately and automatically triage radiology examinations, broker more efficient communication within and beyond radiology, provide permanent documentation of communications with referring physicians and their staff regarding important radiology findings, and support division-wide paperless workflow.

In designing the prioritization algorithms underlying the triage system, the team considered several major factors, including 1) the patient’s medical condition, such as type of injury or symptoms, 2) psychological factors, such as the anxiety of the patient or referring physician, and 3) operational aspects, such as the department’s goal of a 1-hour maximum turnaround time for all STAT examinations.
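The article does not publish the algorithm itself, so the following is only a rough sketch of how medical, psychological, and operational factors might be combined into a single score. The input names follow the entry screen in Figure 1; the weights, the weighted-sum form, and the turnaround-time term are placeholder assumptions, not the validated model used at Cincinnati Children’s.

```python
from datetime import datetime, timedelta

# Placeholder weights -- illustrative only, not the validated coefficients.
WEIGHTS = {
    "requested_acuity": 3.0,   # medical: STAT vs. routine, as ordered
    "subjective_acuity": 2.0,  # medical: technologist's impression of severity
    "patient_waiting": 1.0,    # operational: family waiting on site for results
    "patient_anxiety": 0.5,    # psychological: patient/parent anxiety
    "md_anxiety": 0.5,         # psychological: requesting physician's anxiety
}
STAT_TARGET = timedelta(hours=1)  # departmental goal: 1-hour STAT turnaround

def priority_score(inputs: dict[str, float], completed_at: datetime,
                   is_stat: bool, now: datetime | None = None) -> float:
    """Combine medical, psychological, and operational factors into one score;
    higher scores are read sooner."""
    now = now or datetime.now()
    score = sum(WEIGHTS[name] * value for name, value in inputs.items())
    # Operational term: escalate a STAT exam as it approaches the 1-hour target.
    if is_stat:
        score += 2.0 * min((now - completed_at) / STAT_TARGET, 2.0)
    return score
```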

Our team designed the system with several goals. First, we wanted it to be highly accurate, ordering cases as an experienced radiologist would. Second, it needed to accurately reflect the mental heuristics radiologists use in making triage decisions. And third, it had to consider medical, psychological, and operational factors in prioritizing patients.

We examined a large number of potentially influential factors, but we were able to limit the factors actually employed by the automated algorithm after analyzing triage decisions made by a group of experienced radiologists at Cincinnati Children’s. Through quantitative analysis of the radiologists’ decisions, the algorithm was statistically developed and then revalidated. The algorithm’s performance was highly correlated with that of the physician decision-makers we used as a model, and it is far more consistent, given the inherent subjectivity of human triage decisions. We were able to employ a relatively parsimonious set of variables while still making high-quality prioritization decisions and preserving the system’s sensitivity to important decision-making factors.
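The statistical method is not described here; as an illustration of the kind of check implied by “highly correlated,” one could compare the algorithm’s ordering of a set of cases against the ordering chosen by experienced radiologists using a rank correlation. The data below are invented for the example.

```python
from scipy.stats import spearmanr

# Invented example: ranks assigned to the same 10 cases by the algorithm and
# by a panel of experienced radiologists (1 = read first).
algorithm_rank   = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
radiologist_rank = [2, 1, 3, 5, 4, 6, 8, 7, 9, 10]

rho, p_value = spearmanr(algorithm_rank, radiologist_rank)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.4f})")
```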

Once we developed and validated the triage algorithm, we incorporated it into the broader workflow solution. Our technologists trigger the algorithm during entry of new cases into the radiology information system. Only five mouse clicks are required to enter the information used by the system to prioritize patients.
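As a rough illustration, the five selections from the Figure 1 screen can be thought of as a small record handed to the triage algorithm at submit time; the class and handler below are assumptions for illustration, not the actual RIS integration.

```python
from dataclasses import dataclass, asdict

@dataclass
class AcuityEntry:
    # One selection per area on the technologist screen (see Figure 1).
    requested_acuity: str   # e.g., "STAT" or "Routine", as ordered
    subjective_acuity: int  # technologist's impression of severity
    patient_waiting: bool   # patient/family waiting on site for results
    patient_anxiety: int    # patient/parent anxiety level
    md_anxiety: int         # requesting physician's anxiety level
    service: str            # service whose radiologists will read the case

def submit(entry: AcuityEntry, accession: str) -> None:
    """Hypothetical submit handler: score the case and place it on the
    radiologist worklist for the selected service."""
    print(f"Queued {accession} for {entry.service}: {asdict(entry)}")
```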

The system, now implemented division-wide, takes radiology workflow at Cincinnati Children’s to the next level. For instance, if a radiologist wants to speak to an ordering physician about a case, she simply clicks a button and moves on to the next case in the prioritized worklist. Meanwhile, the software automatically alerts ancillary staff, who track down the ordering physician, get him or her on the phone, and connect the call to the radiologist.
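A minimal sketch of that hand-off, assuming a simple in-process queue that the ancillary staff’s screens watch; the queue and record shape are illustrative, not the production design.

```python
from dataclasses import dataclass
from datetime import datetime
from queue import Queue

@dataclass
class CallbackRequest:
    accession: str      # the case in question
    radiologist: str    # who wants to speak with the ordering physician
    ordering_md: str    # whom ancillary staff should track down
    requested_at: datetime

# Shared queue watched by ancillary staff workstations (illustrative).
callback_queue: "Queue[CallbackRequest]" = Queue()

def request_callback(accession: str, radiologist: str, ordering_md: str) -> None:
    """Called when the radiologist clicks the contact button; the radiologist
    immediately moves on to the next case on the prioritized worklist."""
    callback_queue.put(CallbackRequest(accession, radiologist, ordering_md,
                                       datetime.now()))
```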

As another example, rather than having to call or fax reports to clinicians who request a “call report,” radiologists at Cincinnati Children’s now merely dictate the case (as per normal routine using voice recognition), and the software automatically routes the final, signed report to a group of hospital operators. On the operators’ workstation screens, the system shows an alert that a new case requires their attention and provides contact information for both the ordering physician and the radiologist, should the ordering physician have a follow-up question. The operator then reads the report to the requesting physician and logs this conveyance electronically. Meanwhile, the radiologist has already moved on to the next highest-priority case on the worklist. Because the system leverages voice recognition to decrease interruptions to radiologists, we believe its implementation has significantly improved radiologists’ acceptance of VR.
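Under the same caveats, the sketch below illustrates how a signed report might be routed to the operator pool and how the conveyance might be logged; the functions and record fields are assumptions, not the actual interfaces.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ConveyanceRecord:
    accession: str      # which report was conveyed
    conveyed_to: str    # ordering physician, partner, or office staff
    conveyed_by: str    # operator (or radiologist) who conveyed it
    conveyed_at: datetime

conveyance_log: list[ConveyanceRecord] = []

def on_report_signed(accession: str, call_report_requested: bool) -> None:
    """When the final, signed report arrives from the VR system, alert the
    operator pool if the ordering physician asked for a call report."""
    if call_report_requested:
        print(f"Operator alert: report {accession} awaits a call to the ordering physician.")

def log_conveyance(accession: str, conveyed_to: str, operator: str) -> None:
    """Operator permanently records who was read the report, when, and by whom."""
    conveyance_log.append(
        ConveyanceRecord(accession, conveyed_to, operator, datetime.now()))
```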

Our system is web-based and runs in parallel with the integrated RIS-PACS-VR, pulling examination and patient data as well as contact information for ordering physicians and radiologists so that our technologists do not need to enter it manually.
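The integration mechanism is not described here; one common pattern for a parallel web application is to read from a reporting database or interface feed. The query below is purely illustrative, with assumed table and column names rather than the actual RIS schema.

```python
import sqlite3  # stand-in for whatever database interface the RIS exposes

def fetch_pending_exams(conn: sqlite3.Connection) -> list[dict]:
    """Pull examination, patient, and contact data so technologists never have
    to key it in by hand. Table and column names are placeholders."""
    cols = ["accession", "patient_name", "exam_description",
            "ordering_md", "ordering_md_phone", "radiologist"]
    rows = conn.execute(f"SELECT {', '.join(cols)} FROM pending_exams").fetchall()
    return [dict(zip(cols, row)) for row in rows]
```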

CONCLUSION

The new system has only recently been launched. We have not had to make any staffing changes to accommodate its use. However, anecdotally, the general perception is that the radiologists can now work much more efficiently and with fewer interruptions, and the technologists like the system because they no longer need to hand carry or fax requisitions, and the system makes workflow much more transparent.

Our team is conducting a series of studies measuring operational metrics before and after implementation of the combined RIS-PACS, and before and after implementation of the combined RIS-PACS-VR. For each phase of this series, we are measuring metrics such as average turnaround times for cases of various acuities; the frequency, duration, and type of interruptions experienced by our staff; radiologist, technologist, and ancillary radiology staff stress and job-satisfaction indices; and patient satisfaction. We are now using the same techniques to measure the impact of our paperless workflow, communication, documentation, and case prioritization software on these metrics.

We will present the results of this study at the upcoming annual meeting of the Radiological Society of North America in Chicago on November 28, 2005.

Mark J. Halsted, MD, is chief, informatics research core, department of radiology, Cincinnati Children’s Hospital Medical Center

Craig M. Froehle, PhD, is operations management faculty, College of Business, University of Cincinnati

Hong Yang, MS, is applications specialist II

Laurie A. Perry, RN, is program development specialist, Cincinnati Children’s Hospital Medical Center

Neil D. Johnson, MD, is associate director, radiology informatics, department of radiology, Cincinnati Children’s Hospital Medical Center, Ohio