On Position Embeddings in BERT
Various Position Embeddings (PEs) have been proposed in Transformer-based
architectures (e.g., BERT) to model word order. These are empirically-driven and perform well …
Two simple ways to learn individual fairness metrics from data
D Mukherjee, M Yurochkin… - … on Machine Learning, 2020 - proceedings.mlr.press
Individual fairness is an intuitive definition of algorithmic fairness that addresses some of the
drawbacks of group fairness. Despite its benefits, it depends on a task-specific fair metric that …
Frontiers: Determining the validity of large language models for automated perceptual analysis
This paper explores the potential of large language models (LLMs) to substitute for human
participants in market research. Such LLMs can be used to generate text given a prompt. We …
Improved confidence bounds for the linear logistic model and applications to bandits
We propose improved fixed-design confidence bounds for the linear logistic model. Our
bounds significantly improve upon the state-of-the-art bound by Li et al. (2017) via recent …
Insights into ordinal embedding algorithms: A systematic evaluation
The objective of ordinal embedding is to find a Euclidean representation of a set of abstract
items, using only answers to triplet comparisons of the form "Is item i closer to item j or item …
One for All: Simultaneous Metric and Preference Learning over Multiple Users
This paper investigates simultaneous preference and metric learning from a crowd of
respondents. A set of items represented by d-dimensional feature vectors and paired …
Dimensions underlying the representational alignment of deep neural networks with humans
FP Mahner, L Muttenthaler, U Güçlü… - arXiv preprint arXiv …, 2024 - arxiv.org
Determining the similarities and differences between humans and artificial intelligence is an
important goal both in machine learning and cognitive neuroscience. However, similarities …
Generalization bounds for graph embedding using negative sampling: Linear vs hyperbolic
Graph embedding, which represents real-world entities in a mathematical space, has
enabled numerous applications such as analyzing natural languages, social networks …
Lens depth function and k-relative neighborhood graph: versatile tools for ordinal data analysis
M Kleindessner, U Von Luxburg - Journal of Machine Learning Research, 2017 - jmlr.org
In recent years it has become popular to study machine learning problems in a setting of
ordinal distance information rather than numerical distance measurements. By ordinal …
Foundations of comparison-based hierarchical clustering
D Ghoshdastidar, M Perrot… - Advances in neural …, 2019 - proceedings.neurips.cc
We address the classical problem of hierarchical clustering, but in a framework where one
does not have access to a representation of the objects or their pairwise similarities. Instead …