Using Artificial Intelligence to Make Use of Electronic Health Records Less Painful—Fighting Fire With Fire
Sometimes it takes a computer to solve problems created by a computer. The travails of using
electronic health records (EHRs) are well known and widely experienced. Literature identifies EHRs
both as a major time sink for clinicians1 and as a major driver of burnout and frustration.2 Electronic
health records have improved many things at the point of care, especially when used within the same
practice or system. Being able to see trended laboratory or vital sign data in tabular or graphic format,
seamless prescription ordering, and consistent access to concurrent data, such as telephone calls,
generated in real time by colleagues in practice are all definite improvements over the old days of
paper records. However, one thing is clearly worse: dealing with scanned copies of old records.
They are unindexed, unlabeled, and, once uploaded as unidentified PDFs, completely inscrutable.
One waits for the PDF to load, hoping against hope that one is not about to see a fax cover sheet or a
copy of an insurance preauthorization. The time and energy demanded for review are high, and
one is tempted to ignore these documents altogether.
The study by Chi et al3 in JAMA Network Open offers a ray of hope that a promising solution
could be on the horizon. Using a clever research design simulating real practice, the authors took
assorted bundles of scanned old records and put them in front of clinicians to review, asking the
clinicians to answer 22 questions about each packet that required information derived from
somewhere in the stack. Once presented with the question, a timer started, so the authors were able
to track how much time the clinicians spent finding the information in the records. The researchers
also built an artificial intelligence (AI) tool that learned how to look at the PDFs and categorize the
documents within them: Is this a pathology report? A procedure? An imaging report? They were also
able to obtain information on the date of the document and combine “pages” into “documents” so
that a multipage report could be recognized as such. They subjected the scanned records to the AI
algorithm and presented the results to the participating clinicians using a web interface that
described the document type and date and hyperlinked to the actual document within the scanned
record. After obliging each clinician in the study to undergo a standardized short web-based
training on how to use the output of the AI tool, they presented to each clinician 1 record that had
been prescreened by the AI algorithm and 1 that had not, distributing their records among 12 willing
clinicians. They compared the time it took to find the information as well as the accuracy of the
information found using the 2 different approaches, one supported by the AI tool and the other the
old-fashioned way. These first-time physician users spent 18% less time finding the information when
supported by the AI tool, with comparable accuracy. Eleven of the 12 volunteering clinicians in the
study said they would use this tool if it were available to them, and that they would recommend it to
colleagues.
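The study does not publish its implementation, but the two steps described above, labeling each scanned page and then merging consecutive pages into multipage documents, can be illustrated with a purely hypothetical sketch. Here a trivial keyword matcher stands in for the learned classifier; all function names, labels, and cue words are invented for illustration, not drawn from the study.

```python
# Hypothetical sketch of the pipeline described above: assign each scanned
# page a document-type label, then merge consecutive same-label pages so a
# multipage report is recognized as one document. The real system used a
# learned classifier; this keyword lookup is only a stand-in.

LABELS = {
    "pathology": ["specimen", "biopsy", "pathologist"],
    "procedure": ["colonoscopy", "endoscopy", "scope"],
    "imaging": ["mri", "radiologist", "impression"],
}

def classify_page(text: str) -> str:
    """Keyword stand-in for the learned page classifier."""
    lowered = text.lower()
    for label, cues in LABELS.items():
        if any(cue in lowered for cue in cues):
            return label
    return "other"  # fax cover sheets, preauthorizations, etc.

def group_pages(pages: list[str]) -> list[dict]:
    """Merge consecutive same-label pages into one multipage document."""
    documents = []
    for number, text in enumerate(pages, start=1):
        label = classify_page(text)
        if documents and documents[-1]["label"] == label:
            documents[-1]["pages"].append(number)
        else:
            documents.append({"label": label, "pages": [number]})
    return documents

pages = [
    "FAX COVER SHEET",
    "Colonoscopy report: advanced to cecum...",
    "...withdrawal time 8 minutes (page 2 of 2, colonoscopy)",
    "Specimen A: colon biopsy, reviewed by pathologist...",
]
index = group_pages(pages)
# e.g. [{'label': 'other', 'pages': [1]},
#       {'label': 'procedure', 'pages': [2, 3]},
#       {'label': 'pathology', 'pages': [4]}]
```

The resulting index, with each entry dated and hyperlinked back to its pages, is the kind of first-pass triage interface the study evaluated.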
Of course, there are many limitations to this study. It included 12 willing physicians at 1
institution; it was performed in the specialty of gastroenterology; and the AI tool miscategorized
some documents, with uncertain clinical or patient safety consequences. However, it can and should be
understood as a proof of concept: AI can be used as a first-pass technology to reduce the workload
of a clinician who must wade through voluminous old records. The users had a variety of suggestions
for improving the process; however, the fact that, after only a short online training, the physicians
saved such a meaningful amount of time and were positively disposed to having the tool available in
real life is truly impressive.
One can imagine a variety of directions from here: the AI could be embedded in the EHR itself,
someone could develop it as a commercial product, or it could be incorporated in the process of
scanning records. Any of these approaches would be a welcome relief to hard-pressed clinicians
drowning in seas of unstructured data.
As much as we can and should celebrate the promise of this breakthrough, it is worth noting
that there are several other, better ways to solve this problem. If we had truly robust standards for
electronic data interchange and less anxiety about privacy, these kinds of data could be moved
around more freely in a structured format. Of course, there are regional exchanges where they already do.
The data could also be created in structured format to begin with. The very gastroenterologists
participating in the study are paid to perform procedures regardless of the format in which the
procedure report is produced; one could imagine a world in which failure to produce a machine-
readable structured procedure report precluded being paid at all. There are glimmers of hope, and
communities in which these solutions are more attainable than in others; however, as long as we live in
the Babel of free-form non-interoperable medical documentation, it is likely that an AI tool
supporting humans who care for patients by wading through volumes of scanned documents can be
a real contribution.
ARTICLE INFORMATION
Published: July 23, 2021. doi:10.1001/jamanetworkopen.2021.18298
Open Access: This is an open access article distributed under the terms of the CC-BY License. © 2021 Baron RJ.
JAMA Network Open.
Corresponding Author: Richard J. Baron, MD, American Board of Internal Medicine, 510 Walnut St, Ste 1700,
Philadelphia, PA 19106 ([email protected]).
Author Affiliation: American Board of Internal Medicine, Philadelphia, Pennsylvania.
Conflict of Interest Disclosures: None reported.
REFERENCES
1. Sinsky C, Colligan L, Li L, et al. Allocation of physician time in ambulatory practice: a time and motion study
in 4 specialties. Ann Intern Med. 2016;165(11):753-760. doi:10.7326/M16-0961
2. Shanafelt TD, Dyrbye LN, Sinsky C, et al. Relationship between clerical burden and characteristics of the
electronic environment with physician burnout and professional satisfaction. Mayo Clin Proc. 2016;91(7):
836-848. doi:10.1016/j.mayocp.2016.05.007
3. Chi EA, Chi G, Tsui CT, et al. Development and validation of an artificial intelligence system to optimize clinician
review of patient records. JAMA Netw Open. 2021;4(7):e2117391. doi:10.1001/jamanetworkopen.2021.17391