
BertMCN: Mapping Colloquial Phrases to Standard Medical Concepts Using BERT and Highway Network

EasyChair Preprint no. 2268, version 3

29 pages
Date: January 5, 2021

Abstract

In the last few years, people have started to share a lot of health-related information in the form of tweets, reviews, and blog posts. All of this user-generated clinical text can be mined to generate useful insights. However, automatic analysis of clinical text requires the identification of standard medical concepts. Most existing deep learning based medical concept normalization systems are based on CNNs or RNNs. The performance of these models is limited because they have to be trained from scratch (except for the embeddings). In this work, we propose a normalization system based on pre-trained BERT and a highway layer. BERT, a pre-trained context-sensitive language representation model, has advanced the state-of-the-art performance in many NLP tasks, and the gating mechanism in the highway layer helps the model to choose only important information. Experimental results show that our model outperforms all existing methods on two standard datasets.
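The gating mechanism mentioned in the abstract is the standard highway layer: a learned transform gate t blends a transformed representation with the unchanged input, letting the model keep only the useful part of the transformation. A minimal pure-Python sketch follows; the toy dimensions, identity weight matrices, and input vector are illustrative stand-ins (in the actual model, the input would be a learned BERT sentence embedding and the weights would be trained).

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def highway_layer(x, W_h, b_h, W_t, b_t):
    """One highway layer over a plain list-of-floats vector x."""
    # h = tanh(W_h x + b_h): candidate (transformed) representation
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W_h, b_h)]
    # t = sigmoid(W_t x + b_t): transform gate, each component in (0, 1)
    t = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(W_t, b_t)]
    # y = t * h + (1 - t) * x: gated blend; (1 - t) acts as the carry gate
    return [ti * hi + (1.0 - ti) * xi for ti, hi, xi in zip(t, h, x)]

# Toy usage: 3-dimensional input with identity weights (hypothetical values).
x = [0.5, -1.0, 2.0]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
y = highway_layer(x, I, [0.0] * 3, I, [0.0] * 3)
```

With a strongly negative gate bias the layer passes x through almost unchanged (pure carry); with a strongly positive bias it outputs essentially tanh(W_h x + b_h) (pure transform). This is the sense in which the gate "chooses only important information".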

Keyphrases: BERT, Highway layer, medical concept normalization, Natural Language Processing, Social Media Text

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@Booklet{EasyChair:2268,
  author = {Katikapalli Subramanyam Kalyan and S. Sangeetha},
  title = {BertMCN: Mapping Colloquial Phrases to Standard Medical Concepts Using BERT and Highway Network},
  howpublished = {EasyChair Preprint no. 2268},
  year = {EasyChair, 2021}}