Hot topics in clinical informatics

In my first editorial as Journal of the American Medical Informatics Association Editor-in-Chief, entitled “Doing What Matters Most,” I called for a consequentialist approach to biomedical informatics that is tied to improving our outcome of interest—human health.1 I further noted that this requires focusing our informatics research and its translation into practice on important health issues and the challenges facing our healthcare system.

In this editorial, I highlight 4 clinical informatics articles that reflect a consequentialist perspective; 3 address an aspect of healthcare efficiency2–4 and the fourth describes an approach for mitigating the prescription opioid epidemic.5 In my inaugural editorial, I also noted that a consequentialist approach did not mean a lack of attention to methodological rigor. Thus, the fifth article highlighted from this issue concentrates on a methodological concern: predictive model calibration.6

Electronic health record (EHR)–associated clinician burnout remains a hot topic in clinical informatics and beyond7 and will be the focus of a 2021 Special Issue; the call for papers is available at https://academic.oup.com/jamia/pages/call-for-papers-clinician-burnout. Of high relevance to this topic, 2 articles in this issue address the need for standardized EHR metrics related to clinician efficiency.2,3 In a Perspective, Hron and Lourie2 describe the challenges of using vendor metrics to understand clinician burnout. One such challenge in their experience is the use of proprietary algorithms for clinician metrics that cannot be validated by the institutions using them. They also identify the lack of industry-standard metrics for analyzing and reporting clinician time spent in the EHR, which makes it difficult to compare metrics across vendor EHRs.
They argue for partnerships that would increase transparency through industry standards and achieve 3 goals: improved representation of EHR-related clinician burden, increased perception that vendors are part of the solution and not just the problem, and the possibility of cross-platform benchmarks for institutions.

Relevant to the challenges identified by Hron and Lourie, Sinsky and a team of national experts3 propose 7 core measures of EHR use that reflect multiple dimensions of practice efficiency: total EHR time, work outside of work, time on documentation, time on prescriptions, inbox time, teamwork for orders, and an aspirational measure of the amount of undivided attention patients receive from their physicians during an encounter. They note as limitations that these measures are best suited for ambulatory care settings, have an initial focus on physicians, and are time-based. However, Sinsky et al argue that these measures provide an important foundation for further expansion. In addition, they delineate a set of challenges, including those related to measure implementation (eg, definitions, mapping of EHR content to work), validation of proposed measures, normalization (eg, denominators and time frames), and generalizability to other settings.

Richesson et al4 examine efficiency from the institutional perspective in their article about assessing the feasibility of clinical decision support (CDS) implementation.
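In practice, measures such as total EHR time and work outside of work are derived from timestamped EHR audit-log events. The following is a rough sketch only; the event list, the 30-minute session-gap threshold, and the scheduled-hours window are assumptions of this illustration, not part of the measure definitions proposed by Sinsky et al:

```python
from datetime import datetime, time

# Hypothetical audit-log events (one timestamp per logged EHR action).
# Real vendor logs differ in format; this sketch assumes sorted timestamps.
events = [
    datetime(2020, 4, 1, 8, 55), datetime(2020, 4, 1, 9, 10),
    datetime(2020, 4, 1, 9, 20), datetime(2020, 4, 1, 21, 0),
    datetime(2020, 4, 1, 21, 15),
]

SESSION_GAP_MIN = 30                  # assumed: longer gaps end a session
WORKDAY = (time(7, 0), time(19, 0))   # assumed scheduled clinic hours

def sessionize(stamps, gap_min=SESSION_GAP_MIN):
    """Group sorted timestamps into (start, end) sessions, split at long gaps."""
    sessions, start, prev = [], stamps[0], stamps[0]
    for t in stamps[1:]:
        if (t - prev).total_seconds() > gap_min * 60:
            sessions.append((start, prev))
            start = t
        prev = t
    sessions.append((start, prev))
    return sessions

def total_ehr_minutes(sessions):
    """Sum of session durations: a crude 'total EHR time'."""
    return sum((end - start).total_seconds() / 60 for start, end in sessions)

def after_hours_minutes(sessions, workday=WORKDAY):
    """Minutes in sessions that begin outside scheduled hours:
    a crude stand-in for 'work outside of work'."""
    return sum((end - start).total_seconds() / 60
               for start, end in sessions
               if not workday[0] <= start.time() <= workday[1])

sessions = sessionize(events)
print(total_ehr_minutes(sessions))    # morning session (25 min) + evening session (15 min)
print(after_hours_minutes(sessions))  # only the 9 pm session counts
```

Even this toy version surfaces the normalization challenges the authors delineate: the numbers change with the session-gap threshold and the definition of scheduled hours, which is exactly why standardized denominators and time frames matter.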
Their premise is that a measure of CDS feasibility from the perspective of local data availability and readiness will enable organizations to estimate the technical effort required to implement a CDS intervention that will function effectively as intended.4 Using 10 Choosing Wisely recommendations relevant to the emergency department setting,8 they (1) describe key features of the clinical concepts and data required to implement the recommendations as CDS; (2) assess feasibility, data availability, and requirements for additional data collection; and (3) identify features useful for predicting the feasibility of implementing the recommendations as automated CDS in EHR systems. Not surprisingly, a linear mixed model showed that the need for new data collection was predictive of lower implementation feasibility. However, the number of clinical concepts in each recommendation, the need for historical data, and the ambiguity of clinical concepts did not predict implementation feasibility.

A critical application domain for clinical informatics is the opioid epidemic, given its public health significance. In a simulation study, Hussain et al5 examined how the presentation of patient information and CDS advisories influences physician opioid prescribing behavior. In a randomized controlled experiment using 4 simulated patient cases, 24 physicians were randomized to the conventional condition (a tabular presentation of the patient's opioid prescription history and interruptive CDS [ie, pop-ups]) or the alternative condition (a graphical opioid history, a cue to visit that history, and noninterruptive CDS). Demonstrations of the 2 designs are available online (https://www.ics.uci.edu/∼mihussai/demos/2019-simulation-study/). Based on the judgments of 2 attending pain specialists, physicians in the alternative condition wrote more appropriate prescriptions, and most preferred the alternative design to the conventional design.
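The core of the feasibility premise above (can a recommendation be implemented given the structured data the local EHR already captures?) can be caricatured in a few lines. The concept names, availability set, and scoring rule below are invented for illustration and are not Richesson et al's instrument:

```python
# Illustrative only: concepts, availability flags, and the scoring rule are
# invented for this sketch, not taken from Richesson et al's instrument.

# Structured data elements the local EHR is assumed to capture already.
AVAILABLE_CONCEPTS = {"age", "head_ct_ordered", "anticoagulant_use"}

def assess_feasibility(required_concepts, available=AVAILABLE_CONCEPTS):
    """Return (missing concepts, crude feasibility label).

    Mirrors the article's finding in spirit: a recommendation that needs
    NEW data collection (concepts with no local structured source) gets
    lower implementation feasibility.
    """
    missing = set(required_concepts) - available
    label = "high" if not missing else "low"
    return missing, label

# A hypothetical ED recommendation needing one concept the EHR lacks.
missing, label = assess_feasibility({"age", "head_ct_ordered", "injury_mechanism"})
print(missing, label)  # {'injury_mechanism'} low
```

A real instrument would, of course, also weigh factors the authors examined, such as the number of concepts, historical-data requirements, and concept ambiguity, even though those did not predict feasibility in their model.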
Predictive models remain an important research topic while also gaining traction in practice. Model calibration is critical for making individual predictions, yet many studies focus only on model discrimination. Huang et al6 provide an introductory tutorial on calibration measurements and models. Using existing R packages and custom code with real and simulated data, they demonstrate the application of selected calibration measurements and models. The code is available online (https://github.com/easonfg/cali_tutorial). They also describe the pros and cons of the various methods and make practical suggestions on how to use them in practice.

It is vital that biomedical and health informatics researchers and practitioners concentrate efforts on what matters most, and Journal of the American Medical Informatics Association remains committed to disseminating such work.

CONFLICT OF INTEREST STATEMENT

None declared.

REFERENCES

1. Bakken S. Doing what matters most. J Am Med Inform Assoc 2019; 26 (1): 1–2.
2. Hron J, Lourie E. Have you got the time? Challenges using vendor electronic health record metrics of provider efficiency. J Am Med Inform Assoc 2020; 27 (4): 644–46.
3. Sinsky CA, Rule A, Cohen G, et al. Metrics for assessing physician activity using electronic health record log data. J Am Med Inform Assoc 2020; 27 (4): 639–43.
4. Richesson RL, Staes CJ, Douthit B, et al. Measuring implementation feasibility of clinical decision support alerts for clinical practice recommendations. J Am Med Inform Assoc 2020; 27 (4): 514–21.
5. Hussain MI, Nelson AM, Yeung BG, Sukumar L, Zheng K. How the presentation of patient information and decision-support advisories influences opioid prescribing behavior: a simulation study. J Am Med Inform Assoc 2020; 27 (4): 613–20.
6. Huang Y, Li W, Macheret F, Gabriel RA, Ohno-Machado L. A tutorial on calibration measurements and calibration models for clinical prediction models. J Am Med Inform Assoc 2020; 27 (4): 621–33.
7. Bakken S. Can informatics innovation help mitigate clinician burnout? J Am Med Inform Assoc 2019; 26 (2): 93–4.
8. American College of Emergency Physicians. Ten things physicians and patients should question. https://www.choosingwisely.org/societies/american-college-of-emergency-physicians/ Accessed February 26, 2020.

© The Author(s) 2020. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: journals.permissions@oup.com


Publisher
Oxford University Press
Copyright
© The Author(s) 2020. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For permissions, please email: journals.permissions@oup.com
ISSN
1067-5027
eISSN
1527-974X
DOI
10.1093/jamia/ocaa025


Journal

Journal of the American Medical Informatics Association, Oxford University Press

Published: Apr 1, 2020
