TY - JOUR
T1 - Simple question answering over a domain-specific knowledge graph using BERT by transfer learning
AU - Vegupatti, Mani
AU - Nickles, Matthias
AU - Chakravarthi, Bharathi Raja
N1 - Publisher Copyright:
© 2020 for this paper by its authors. Use permitted under Creative Commons License Attribution 4.0 International (CC BY 4.0).
PY - 2020
Y1 - 2020
N2 - We build and evaluate a baseline for simple question answering over a domain-specific knowledge graph by using BERT, a pretrained open-domain language model. Training a neural network from scratch needs a large annotated dataset, whereas transfer learning adapts a pretrained language model and allows task-specific fine-tuning with limited data. However, building a domain-specific language model needs a large amount of domain-specific text, resources, and time for pretraining, while open-domain language models such as BERT are readily available for use. Hence, we evaluate the open-domain pretrained BERT for creating a domain-specific question answering baseline model that requires less training data. In this work, we built a BioMed-domain simple question answering system by fine-tuning the open-domain BERT with a manually curated dataset of ~600 questions from the Drugbank knowledge graph published by Bio2RDF.
AB - We build and evaluate a baseline for simple question answering over a domain-specific knowledge graph by using BERT, a pretrained open-domain language model. Training a neural network from scratch needs a large annotated dataset, whereas transfer learning adapts a pretrained language model and allows task-specific fine-tuning with limited data. However, building a domain-specific language model needs a large amount of domain-specific text, resources, and time for pretraining, while open-domain language models such as BERT are readily available for use. Hence, we evaluate the open-domain pretrained BERT for creating a domain-specific question answering baseline model that requires less training data. In this work, we built a BioMed-domain simple question answering system by fine-tuning the open-domain BERT with a manually curated dataset of ~600 questions from the Drugbank knowledge graph published by Bio2RDF.
KW - BERT
KW - Knowledge Graph
KW - Question Answering
KW - Transfer Learning
UR - https://www.scopus.com/pages/publications/85099350031
M3 - Conference article
AN - SCOPUS:85099350031
SN - 1613-0073
VL - 2771
SP - 289
EP - 300
JO - CEUR Workshop Proceedings
JF - CEUR Workshop Proceedings
T2 - 28th Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2020
Y2 - 7 December 2020 through 8 December 2020
ER -