Simple question answering over a domain-specific knowledge graph using BERT by transfer learning

Research output: Contribution to journal › Conference article › peer-reviewed

1 Citation (Scopus)

Abstract

We build and evaluate a baseline for simple question answering over a domain-specific knowledge graph using BERT, a pretrained open-domain language model. Training a neural network from scratch requires a large annotated dataset, whereas transfer learning adapts a pretrained language model and allows task-specific fine-tuning with limited data. However, pretraining a domain-specific language model demands a large amount of domain-specific text, as well as substantial computing resources and time, while open-domain language models such as BERT are readily available. Hence, we evaluate the open-domain pretrained BERT for building a domain-specific question answering baseline model that requires little training data. In this work, we build a biomedical-domain simple question answering system by fine-tuning the open-domain BERT on a manually curated dataset of ~600 questions over the DrugBank knowledge graph published by Bio2RDF.
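The abstract only names the technique, so as a rough illustration of what such transfer learning can look like in practice, the sketch below fine-tunes bert-base-uncased to map a question to a knowledge-graph relation, a common formulation for simple QA over a knowledge graph. The toy questions, relation labels, model checkpoint, and hyperparameters are all illustrative assumptions, not the paper's actual setup.

    # Hypothetical sketch: fine-tuning open-domain BERT to classify the
    # knowledge-graph relation a question asks about. Not the paper's
    # actual configuration; labels and hyperparameters are assumptions.
    import torch
    from torch.utils.data import DataLoader, Dataset
    from transformers import BertTokenizerFast, BertForSequenceClassification

    class SimpleQADataset(Dataset):
        """Pairs each natural-language question with a relation label
        (simple QA reduces to predicting one predicate per question)."""
        def __init__(self, questions, relation_ids, tokenizer):
            self.enc = tokenizer(questions, truncation=True, padding=True,
                                 max_length=64, return_tensors="pt")
            self.labels = torch.tensor(relation_ids)

        def __len__(self):
            return len(self.labels)

        def __getitem__(self, i):
            item = {k: v[i] for k, v in self.enc.items()}
            item["labels"] = self.labels[i]
            return item

    # Toy examples standing in for the ~600 curated DrugBank questions.
    questions = ["What is the indication of aspirin?",
                 "Which drugs interact with warfarin?"]
    relations = [0, 1]  # e.g. 0 -> indication, 1 -> interactsWith (assumed)

    tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2)  # num_labels = size of relation set

    loader = DataLoader(SimpleQADataset(questions, relations, tokenizer),
                        batch_size=2, shuffle=True)
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    model.train()
    for epoch in range(3):  # a few epochs suffice for small fine-tuning sets
        for batch in loader:
            optimizer.zero_grad()
            loss = model(**batch).loss
            loss.backward()
            optimizer.step()

At inference time, the predicted relation plus an entity-linked subject would be turned into a graph query whose object is the answer; that lookup step is outside this sketch.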

Original language: English
Pages (from-to): 289-300
Number of pages: 12
Journal: CEUR Workshop Proceedings
Volume: 2771
Publication status: Published - 2020
Event: 28th Irish Conference on Artificial Intelligence and Cognitive Science, AICS 2020 - Dublin, Ireland
Duration: 7 Dec 2020 - 8 Dec 2020

Keywords

  • BERT
  • Knowledge Graph
  • Question Answering
  • Transfer Learning
