
Turing AI Fellowship: Event-Centric Framework for Natural Language Understanding

Funder: UK Research and Innovation (UKRI)
Project code: EP/V020579/1
Funded under: EPSRC
Funder contribution: 1,269,620 GBP

Description

Natural language understanding (NLU) aims to allow computers to understand text automatically. NLU may seem easy to humans, but it is extremely difficult for computers because of the variety, ambiguity, subtlety, and expressiveness of human languages. Recent efforts in NLU have largely been exemplified by tasks such as natural language inference, reading comprehension, and question answering. A common practice is to pre-train a language model such as BERT on large corpora to learn word representations and then fine-tune it on task-specific data. Although BERT and its successors have achieved state-of-the-art performance on many NLP tasks, pre-trained language models have been found to reason mostly about the surface form of entity names and to fail to capture rich factual knowledge. Moreover, NLU models built on such pre-trained language models are susceptible to adversarial attacks: even a small perturbation of an input (e.g., paraphrasing questions and/or answers in QA tasks) can cause a dramatic drop in performance, showing that such models largely rely on shallow cues.

In human reading, successful comprehension depends on the construction of an event structure that represents what is happening in the text, often referred to as the situation model in cognitive psychology. The situation model also involves integrating prior knowledge with information presented in the text for reasoning and inference. Fine-tuning pre-trained language models for reading comprehension does not help build such effective cognitive models of text, and comprehension suffers as a result.

In this fellowship, I aim to develop a knowledge-aware and event-centric framework for natural language understanding, in which event representations are learned from text with the incorporation of prior background and common-sense knowledge; event graphs are built on the fly as reading progresses (see the sketch below); and the comprehension model self-evolves to understand new information. I will primarily focus on reading comprehension, and my goal is to enable computers to solve a variety of cognitive tasks that mimic human-like cognitive capabilities, bringing us a step closer to human-like intelligence.
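The abstract does not specify an implementation, but the idea of an event graph that grows as reading progresses can be sketched as follows. This is an illustrative sketch only: the `Event` and `EventGraph` classes and the toy subject-verb-object extraction rule are assumptions made for exposition, not part of the fellowship's actual framework.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple


@dataclass(frozen=True)
class Event:
    """A single event mention: a predicate with its participants."""
    predicate: str
    agent: Optional[str] = None
    patient: Optional[str] = None


@dataclass
class EventGraph:
    """An event graph built incrementally as sentences are read.

    Nodes are events; edges here only record narrative order ("next").
    An event-centric NLU system would add richer typed relations
    (temporal, causal, coreference) and integrate background knowledge.
    """
    events: List[Event] = field(default_factory=list)
    edges: List[Tuple[int, int, str]] = field(default_factory=list)

    def add_event(self, event: Event) -> int:
        """Add an event node and link it to the previous one in reading order."""
        self.events.append(event)
        idx = len(self.events) - 1
        if idx > 0:
            self.edges.append((idx - 1, idx, "next"))
        return idx


def extract_event(sentence: str) -> Event:
    """Toy subject-verb-object extraction based on token positions.
    A real system would use a trained event-extraction model instead."""
    tokens = sentence.rstrip(".").split()
    agent = tokens[0] if tokens else None
    predicate = tokens[1] if len(tokens) > 1 else "unknown"
    patient = " ".join(tokens[2:]) or None
    return Event(predicate=predicate, agent=agent, patient=patient)


if __name__ == "__main__":
    text = [
        "Alice opened the door.",
        "Alice entered the room.",
        "Bob greeted Alice.",
    ]
    graph = EventGraph()
    for sentence in text:  # the graph grows on the fly as reading progresses
        graph.add_event(extract_event(sentence))

    for i, event in enumerate(graph.events):
        print(i, event)
    print(graph.edges)
```

In the fellowship's framing, such a graph would additionally be enriched with prior background and common-sense knowledge before reasoning and inference are performed over it.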
