
Development and preliminary evidence for the validity of an instrument assessing implementation of human-factors principles in medication-related decision-support systems—I-MeDeSA



Publisher
Oxford University Press
Copyright
© 2011, Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
ISSN
1067-5027
eISSN
1527-974X
DOI
10.1136/amiajnl-2011-000362
PMID
21946241

Abstract

Background Medication-related decision support can reduce the frequency of preventable adverse drug events. However, the design of current medication alerts often results in alert fatigue and high over-ride rates, thus reducing any potential benefits.

Methods The authors previously reviewed human-factors principles for relevance to medication-related decision support alerts. In this study, instrument items were developed for assessing the appropriate implementation of these human-factors principles in drug–drug interaction (DDI) alerts. User feedback regarding nine electronic medical records was considered during the development process. Content validity, construct validity through correlation analysis, and inter-rater reliability were assessed.

Results The final version of the instrument included 26 items associated with nine human-factors principles. Content validation on three systems resulted in the addition of one principle (Corrective Actions) to the instrument and the elimination of eight items. Additionally, the wording of eight items was altered. Correlation analysis suggests a direct relationship between system age and performance of DDI alerts (p=0.0016). Inter-rater reliability indicated substantial agreement between raters (κ=0.764).

Conclusion The authors developed and gathered preliminary evidence for the validity of an instrument that measures the appropriate use of human-factors principles in the design and display of DDI alerts. Designers of DDI alerts may use the instrument to improve usability and increase user acceptance of medication alerts, and organizations selecting an electronic medical record may find the instrument helpful in meeting their clinicians' usability needs.
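The κ value reported in the Results is Cohen's kappa, which corrects raw inter-rater agreement for the agreement expected by chance. The sketch below is purely illustrative; the item scores and rater data are hypothetical and are not taken from the study.

```python
# Illustrative sketch only (not code or data from the paper): Cohen's kappa
# for two raters scoring the same set of instrument items.
from collections import Counter

def cohen_kappa(r1, r2):
    """kappa = (p_o - p_e) / (1 - p_e): p_o is the observed agreement,
    p_e the agreement expected by chance from each rater's marginals."""
    n = len(r1)
    p_o = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in set(r1) | set(r2))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical binary item scores (1 = human-factors principle satisfied).
rater_a = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1]
rater_b = [1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 1, 1, 1]

# Values of 0.61-0.80 are commonly read as substantial agreement.
print(f"kappa = {cohen_kappa(rater_a, rater_b):.3f}")
```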

Journal

Journal of the American Medical Informatics Association
Oxford University Press

Published: Dec 21, 2011
