PIQA: Reasoning about physical commonsense in natural language
To apply eyeshadow without a brush, should I use a cotton swab or a toothpick? Questions
requiring this kind of physical commonsense pose a challenge to today's natural language …
Experience grounds language
Language understanding research is held back by a failure to relate language to the
physical world it describes and to the social interactions it facilitates. Despite the incredible …
oLMpics: On what language model pre-training captures
Recent success of pre-trained language models (LMs) has spurred widespread interest in
the language capabilities that they possess. However, efforts to understand whether LM …
Representing numbers in NLP: a survey and a vision
NLP systems rarely give special consideration to numbers found in text. This starkly
contrasts with the consensus in neuroscience that, in the brain, numbers are represented …
Birds have four legs?! NumerSense: Probing numerical commonsense knowledge of pre-trained language models
Recent works show that pre-trained language models (PTLMs), such as BERT, possess
certain commonsense and factual knowledge. They suggest that it is promising to use …
Things not written in text: Exploring spatial commonsense from visual signals
Spatial commonsense, the knowledge about spatial position and relationship between
objects (like the relative size of a lion and a girl, and the position of a boy relative to a bicycle …
Do language embeddings capture scales?
Pretrained Language Models (LMs) have been shown to possess significant linguistic,
common sense, and factual knowledge. One form of knowledge that has not been studied …
Temporal common sense acquisition with minimal supervision
Temporal common sense (e.g., duration and frequency of events) is crucial for understanding
natural language. However, its acquisition is challenging, partly because such information is …
Commonsense reasoning for natural language processing
Commonsense knowledge, such as knowing that “bumping into people annoys them” or
“rain makes the road slippery”, helps humans navigate everyday situations seamlessly. Yet …
TransOMCS: From linguistic graphs to commonsense knowledge
Commonsense knowledge acquisition is a key problem for artificial intelligence.
Conventional methods of acquiring commonsense knowledge generally require laborious …