Infusing Disease Knowledge into BERT for Health Question Answering, Medical Inference and Disease Name Recognition

abstract

  • Knowledge of a disease includes information about various aspects of the disease, such as its signs and symptoms, diagnosis, and treatment. This disease knowledge is critical for many health-related and biomedical tasks, including consumer health question answering, medical language inference, and disease name recognition. While pre-trained language models like BERT have shown success in capturing syntactic, semantic, and world knowledge from text, we find they can be further complemented by specific information such as knowledge of symptoms, diagnoses, treatments, and other disease aspects. Hence, we integrate BERT with disease knowledge to improve these important tasks. Specifically, we propose a new disease knowledge infusion training procedure and evaluate it on a suite of BERT models including BERT, BioBERT, SciBERT, ClinicalBERT, BlueBERT, and ALBERT. Experiments over the three tasks show that these models can be enhanced in nearly all cases, demonstrating the viability of disease knowledge infusion. For example, the accuracy of BioBERT on consumer health question answering improves from 68.29% to 72.09%, and new state-of-the-art results are observed on two datasets. We make our data and code freely available.
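  • To make the idea concrete, below is a minimal sketch of one way such knowledge infusion can be realized: continued masked-language-model training over a passage describing a disease aspect, where the disease and aspect terms are masked and the model learns to recover them. This is an illustrative assumption, not the authors' released implementation; the checkpoint id, example passage, and masking heuristic are placeholders.

```python
# Sketch of disease knowledge infusion as continued masked-language-model
# (MLM) training: mask disease/aspect terms in an aspect-specific passage
# and train a BERT-family model to recover them. The checkpoint, passage,
# and masking heuristic are illustrative, not the paper's released code.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_name = "dmis-lab/biobert-base-cased-v1.1"  # assumed checkpoint id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# Toy (disease, aspect, passage) triple; in practice such passages would be
# drawn from disease articles whose sections correspond to aspects.
examples = [
    ("influenza", "symptoms",
     "Common symptoms of influenza include fever, cough and sore throat."),
]

def mask_terms(text, terms):
    """Mask every subword belonging to one of the target terms."""
    enc = tokenizer(text, return_tensors="pt")
    labels = enc["input_ids"].clone()
    term_ids = {tid for t in terms
                for tid in tokenizer(t, add_special_tokens=False)["input_ids"]}
    to_mask = torch.tensor([[int(i) in term_ids for i in enc["input_ids"][0]]])
    enc["input_ids"][to_mask] = tokenizer.mask_token_id
    labels[~to_mask] = -100  # loss is computed on masked positions only
    return enc, labels

model.train()
for disease, aspect, passage in examples:
    enc, labels = mask_terms(passage, [disease, aspect])
    loss = model(**enc, labels=labels).loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

    Because only the masked positions contribute to the loss, training pushes the model to encode the association between a disease and its aspect terms rather than generic language statistics, which is the general intuition behind infusing disease knowledge into a pre-trained encoder.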

published proceedings

  • Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)

author list (cited authors)

  • He, Y., Zhu, Z., Zhang, Y., Chen, Q., & Caverlee, J.

citation count

  • 16

complete list of authors

  • He, Yun; Zhu, Ziwei; Zhang, Yin; Chen, Qin; Caverlee, James

publication date

  • November 2020