Topic aware probing: From sentence length prediction to idiom identification, how reliant are neural language models on topic?

  • Vasudevan Nedumpozhimana
  • John D. Kelleher

Research output: Contribution to journal › Article › peer-review

Abstract

Transformer-based neural language models achieve state-of-the-art performance on a variety of natural language processing tasks. However, it remains an open question to what extent these models rely on word-order/syntactic information versus word co-occurrence/topic-based information when processing natural language. This work contributes to that debate by asking whether these models primarily use topic as a signal. We explore the relationship between the performance of Transformer-based models (BERT and RoBERTa) on a range of English probing tasks, from simple lexical tasks such as sentence length prediction to complex semantic tasks such as idiom token identification, and the sensitivity of these tasks to topic information. To this end, we propose a novel probing method that we call topic-aware probing. Our initial results indicate that Transformer-based models encode both topic and non-topic information in their intermediate layers, but also that their facility for distinguishing idiomatic usage is primarily based on their ability to identify and encode topic. Furthermore, our analysis of these models' performance on other standard probing tasks suggests that tasks that are relatively insensitive to topic information are also relatively difficult for these models.
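The probing setup the abstract refers to can be illustrated with a minimal sketch: extract representations from an intermediate Transformer layer and train a lightweight classifier to predict a surface property such as sentence length. The model name, layer index, pooling choice, and toy data below are illustrative assumptions, not the paper's exact experimental configuration.

```python
# Minimal layer-wise probing sketch, assuming Hugging Face `transformers`
# and scikit-learn are installed. The task (recovering a coarse sentence
# length label from an intermediate BERT layer) mirrors the simplest probe
# mentioned in the abstract; all specifics here are illustrative.
import numpy as np
import torch
from sklearn.linear_model import LogisticRegression
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

def layer_embedding(sentence: str, layer: int) -> np.ndarray:
    """Mean-pool the token vectors from one intermediate layer."""
    inputs = tokenizer(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).hidden_states[layer]  # (1, seq_len, dim)
    return hidden.mean(dim=1).squeeze(0).numpy()

# Toy probe: bin sentences as "short" vs. "long" and test whether a linear
# classifier can recover the bin from a single layer's pooled embedding.
sentences = [
    "The cat sat.",
    "Dogs bark loudly at night.",
    "A very long sentence with many more words than the others here.",
    "Short one.",
    "This sentence contains exactly eight separate words today.",
    "It rains.",
]
labels = [int(len(s.split()) > 5) for s in sentences]  # crude length bins

X = np.stack([layer_embedding(s, layer=6) for s in sentences])
probe = LogisticRegression(max_iter=1000).fit(X, labels)
print("probe accuracy on layer 6:", probe.score(X, labels))
```

Sweeping `layer` over the available hidden states (0-12 for BERT-base) would yield the kind of layer-by-layer comparison the abstract alludes to; the topic-aware variant proposed in the article additionally relates each probe's performance to its sensitivity to topic information.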

Original language: English
Pages (from-to): 936-964
Number of pages: 29
Journal: Natural Language Processing
Volume: 31
Issue number: 3
Publication status: Published - 1 May 2025

Keywords

  • idiom token identification
  • neural language model
  • semantics
  • topic modelling
  • transformer
