Multimodal integration of human-like attention in visual question answering
Human-like attention as a supervisory signal to guide neural attention has shown significant
promise but is currently limited to unimodal integration, even for inherently multimodal tasks …
Eyes Show the Way: Modelling Gaze Behaviour for Hallucination Detection
Detecting hallucinations in natural language processing (NLP) is a critical undertaking that
demands a deep understanding of both the semantic and pragmatic aspects of languages …
A cross-lingual comparison of human and model relative word importance
F Morger, S Brandl, L Beinborn… - Proceedings of the 2022 …, 2022 - aclanthology.org
Relative word importance is a key metric for natural language processing. In this work, we
compare human and model relative word importance to investigate if pretrained neural …
An Eye Opener Regarding Task-Based Text Gradient Saliency
Eye movements in reading reveal humans' cognitive processes involved in language
understanding. The duration a reader's eyes fixate on a word has been used as a measure …
Gaze-infused BERT: Do human gaze signals help pre-trained language models?
This research delves into the intricate connection between self-attention mechanisms in
large-scale pre-trained language models, like BERT, and human gaze patterns, with the aim …
Seeing Eye to AI: Human Alignment via Gaze-Based Response Rewards for Large Language Models
Advancements in Natural Language Processing (NLP) have led to the emergence of Large
Language Models (LLMs) such as GPT, Llama, Claude, and Gemini, which excel across a …
Every word counts: A multilingual analysis of individual human alignment with model attention
S Brandl, N Hollenstein - arXiv preprint arXiv:2210.04963, 2022 - arxiv.org
Human fixation patterns have been shown to correlate strongly with Transformer-based
attention. Those correlation analyses are usually carried out without taking into account …
EMTeC: A Corpus of Eye Movements on Machine-Generated Texts
The Eye Movements on Machine-Generated Texts Corpus (EMTeC) is a naturalistic eye-
movements-while-reading corpus of 107 native English speakers reading machine …
Evaluating Webcam-based Gaze Data as an Alternative for Human Rationale Annotations
Rationales in the form of manually annotated input spans usually serve as ground truth
when evaluating explainability methods in NLP. They are, however, time-consuming and …
Self-Attention in Transformer Networks Explains Monkeys' Gaze Pattern in Pac-Man Game
Z Lin, Y Li, T Yang - arXiv preprint arXiv:2406.14100, 2024 - arxiv.org
We proactively direct our eyes and attention to collect information during problem solving
and decision making. Understanding gaze patterns is crucial for gaining insights into the …