On position embeddings in BERT

B Wang, L Shang, C Lioma, X Jiang… - International …, 2020 - drive.google.com
Various Position Embeddings (PEs) have been proposed in Transformer-based
architectures (e.g., BERT) to model word order. These are empirically driven and perform well …
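
The snippet does not show which PE variants the paper studies; as background only, here is a minimal NumPy sketch of the standard fixed sinusoidal position encoding from the original Transformer (an assumption for illustration, not one of the paper's proposals):

```python
import numpy as np

def sinusoidal_position_embeddings(max_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal PEs (Vaswani et al., 2017), shape (max_len, d_model)."""
    positions = np.arange(max_len)[:, None]            # (max_len, 1)
    dims = np.arange(d_model)[None, :]                 # (1, d_model)
    angle_rates = 1.0 / np.power(10000.0, (2 * (dims // 2)) / d_model)
    angles = positions * angle_rates
    pe = np.empty((max_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])              # even dimensions: sine
    pe[:, 1::2] = np.cos(angles[:, 1::2])              # odd dimensions: cosine
    return pe

# these vectors are added to the token embeddings before the first encoder layer
pe = sinusoidal_position_embeddings(max_len=128, d_model=64)
```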

Two simple ways to learn individual fairness metrics from data

D Mukherjee, M Yurochkin… - … on Machine Learning, 2020 - proceedings.mlr.press
Individual fairness is an intuitive definition of algorithmic fairness that addresses some of the
drawbacks of group fairness. Despite its benefits, it depends on a task specific fair metric that …
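
The snippet mentions the dependence on a task-specific fair metric; below is a minimal sketch of the individual-fairness (Lipschitz) condition it refers to, with a hypothetical Mahalanobis matrix standing in for the learned fair metric (an illustration of the definition, not the paper's estimation procedure):

```python
import numpy as np

def fair_distance(x1, x2, M):
    """Fair metric d(x1, x2) = sqrt((x1 - x2)^T M (x1 - x2)) for a PSD matrix M."""
    diff = x1 - x2
    return float(np.sqrt(diff @ M @ diff))

def violates_individual_fairness(f, x1, x2, M, L=1.0):
    """Check the Lipschitz condition |f(x1) - f(x2)| <= L * d(x1, x2)."""
    return abs(f(x1) - f(x2)) > L * fair_distance(x1, x2, M)

# toy usage with a linear scorer and an identity fair metric (both hypothetical)
rng = np.random.default_rng(0)
w = rng.normal(size=5)
f = lambda x: float(w @ x)
x1, x2 = rng.normal(size=5), rng.normal(size=5)
print(violates_individual_fairness(f, x1, x2, M=np.eye(5)))
```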

Frontiers: Determining the validity of large language models for automated perceptual analysis

P Li, N Castelo, Z Katona, M Sarvary - Marketing Science, 2024 - pubsonline.informs.org
This paper explores the potential of large language models (LLMs) to substitute for human
participants in market research. Such LLMs can be used to generate text given a prompt. We …

Improved confidence bounds for the linear logistic model and applications to bandits

KS Jun, L Jain, B Mason… - … Conference on Machine …, 2021 - proceedings.mlr.press
We propose improved fixed-design confidence bounds for the linear logistic model. Our
bounds significantly improve upon the state-of-the-art bound by Li et al. (2017) via recent …
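
The improved bounds themselves are not reproduced in the snippet; as a rough point of comparison, here is a sketch of a classical confidence half-width for the logistic MLE based on the observed Fisher information (a generic asymptotic interval, not the bound proposed in the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_mle(X, y, iters=200, lr=0.5):
    """Plain gradient ascent on the logistic log-likelihood (illustrative, not tuned)."""
    theta = np.zeros(X.shape[1])
    for _ in range(iters):
        p = sigmoid(X @ theta)
        theta += lr * X.T @ (y - p) / len(y)
    return theta

def fisher_confidence_width(X, theta_hat, x, z=1.96):
    """Approximate 95% half-width for x^T theta via the observed information matrix."""
    p = sigmoid(X @ theta_hat)
    H = (X * (p * (1 - p))[:, None]).T @ X      # H = X^T diag(p(1-p)) X
    var = x @ np.linalg.solve(H, x)             # x^T H^{-1} x
    return z * np.sqrt(var)
```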

Insights into ordinal embedding algorithms: A systematic evaluation

LC Vankadara, M Lohaus, S Haghiri, FU Wahab… - Journal of Machine …, 2023 - jmlr.org
The objective of ordinal embedding is to find a Euclidean representation of a set of abstract
items, using only answers to triplet comparisons of the form "Is item i closer to item j or item …
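
The triplet query format quoted in the snippet can be made concrete: a short sketch of answering such queries from a Euclidean embedding, and of a hinge-style triplet loss of the kind many ordinal embedding methods minimize (a generic illustration, not any specific algorithm evaluated in the paper):

```python
import numpy as np

def triplet_answer(X, i, j, k):
    """Answer 'Is item i closer to item j than to item k?' from an embedding X of shape (n, d)."""
    return np.linalg.norm(X[i] - X[j]) < np.linalg.norm(X[i] - X[k])

def triplet_hinge_loss(X, triplets, margin=0.1):
    """Hinge penalty for triplets (i, j, k) encoding that d(i, j) should be smaller than d(i, k)."""
    loss = 0.0
    for i, j, k in triplets:
        d_ij = np.linalg.norm(X[i] - X[j])
        d_ik = np.linalg.norm(X[i] - X[k])
        loss += max(0.0, margin + d_ij - d_ik)
    return loss
```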

One for All: Simultaneous Metric and Preference Learning over Multiple Users

G Canal, B Mason, RK Vinayak, R Nowak - NeurIPS, 2022 - proceedings.neurips.cc
This paper investigates simultaneous preference and metric learning from a crowd of
respondents. A set of items represented by d-dimensional feature vectors and paired …
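
The paired-comparison model is cut off in the snippet; as one common way to formalize such data, here is a sketch of a Bradley-Terry-style choice probability in which a user has an ideal point and items are compared under a shared Mahalanobis metric (an assumption about the setup for illustration, not the paper's exact model):

```python
import numpy as np

def preference_probability(x_a, x_b, ideal_point, M):
    """P(user prefers item a over item b), preferring items closer to the ideal point under PSD metric M."""
    d_a = (x_a - ideal_point) @ M @ (x_a - ideal_point)   # squared distance to item a
    d_b = (x_b - ideal_point) @ M @ (x_b - ideal_point)   # squared distance to item b
    return 1.0 / (1.0 + np.exp(d_a - d_b))                # logistic link on the distance gap
```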

Dimensions underlying the representational alignment of deep neural networks with humans

FP Mahner, L Muttenthaler, U Güçlü… - arXiv preprint arXiv …, 2024 - arxiv.org
Determining the similarities and differences between humans and artificial intelligence is an
important goal both in machine learning and cognitive neuroscience. However, similarities …

Generalization bounds for graph embedding using negative sampling: Linear vs hyperbolic

A Suzuki, A Nitanda, L Xu… - Advances in Neural …, 2021 - proceedings.neurips.cc
Graph embedding, which represents real-world entities in a mathematical space, has
enabled numerous applications such as analyzing natural languages, social networks …
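
Negative sampling, named in the title, can be illustrated with the usual skip-gram-style objective for a single observed edge (a generic Euclidean-embedding sketch; the paper's linear-vs-hyperbolic comparison is not reproduced here):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def negative_sampling_loss(z_u, z_v, negative_samples):
    """Loss for an observed edge (u, v) against embeddings of sampled non-neighbors of u."""
    pos = -np.log(sigmoid(z_u @ z_v))                                     # pull endpoints together
    neg = -sum(np.log(sigmoid(-z_u @ z_n)) for z_n in negative_samples)   # push negatives apart
    return pos + neg
```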

Lens depth function and k-relative neighborhood graph: versatile tools for ordinal data analysis

M Kleindessner, U Von Luxburg - Journal of Machine Learning Research, 2017 - jmlr.org
In recent years it has become popular to study machine learning problems in a setting of
ordinal distance information rather than numerical distance measurements. By ordinal …

Foundations of comparison-based hierarchical clustering

D Ghoshdastidar, M Perrot… - Advances in neural …, 2019 - proceedings.neurips.cc
We address the classical problem of hierarchical clustering, but in a framework where one
does not have access to a representation of the objects or their pairwise similarities. Instead …
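
Since the snippet breaks off before describing the alternative input, here is a sketch of one comparison-based setup: an oracle answering quadruplet queries "is the pair (i, j) more similar than the pair (k, l)?", from which a crude similarity score is accumulated and handed to standard average-linkage clustering (an illustrative framework assumption, not the authors' specific algorithm or guarantees):

```python
import numpy as np
from itertools import combinations
from scipy.cluster.hierarchy import linkage
from scipy.spatial.distance import squareform

def comparison_scores(n, quadruplet_oracle):
    """Score each pair (i, j) by how often the oracle declares it the more similar pair."""
    wins = np.zeros((n, n))
    pairs = list(combinations(range(n), 2))
    for (i, j), (k, l) in combinations(pairs, 2):
        if quadruplet_oracle(i, j, k, l):   # True if (i, j) is more similar than (k, l)
            wins[i, j] += 1; wins[j, i] += 1
        else:
            wins[k, l] += 1; wins[l, k] += 1
    return wins

def comparison_based_dendrogram(n, quadruplet_oracle):
    """Average-linkage clustering on dissimilarities derived from comparison counts."""
    wins = comparison_scores(n, quadruplet_oracle)
    dissim = wins.max() - wins
    np.fill_diagonal(dissim, 0.0)
    return linkage(squareform(dissim, checks=False), method="average")
```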