Published:
2014-11-05
Proceedings:
Proceedings of the AAAI Conference on Human Computation and Crowdsourcing, 2
Issue:
Vol. 2 (2014): Second AAAI Conference on Human Computation and Crowdsourcing
Track:
Workshop: Citizen + X
Abstract:
In this study, a system combining text mining and crowdsourced annotation for the detection of drug-drug interactions (DDIs) from drug package inserts is presented. An annotation study was designed to evaluate expert versus non-expert curation performance and the impact of NLP pre-annotation on precision and recall for both groups. The design and development of the system and annotation study consisted of three stages. First, our existing NLP pipeline for DDI extraction was improved and used to pre-annotate 208 drug product labels with drug mentions and DDIs. Second, a machine-readable DDI representation scheme was created using the Annotation Ontology. This model allowed us to load the NLP pre-annotated drug label sections into our human-curation plugin, built with the annotation tool DOMEO. Finally, the annotation study was performed, along with usability questionnaires to collect qualitative feedback. To our knowledge, this is the first study comparing experts and non-experts for pharmacokinetic DDI annotation. Results showed lower performance for non-experts than for experts when annotating without NLP assistance, and improved non-expert performance when the NER module of the NLP pipeline was used. Simplifying the workflow for NLP-assisted annotation will be necessary to scale our approach.
DOI:
10.1609/hcomp.v2i1.13213
ISBN 978-1-57735-682-0