Jackson Petty
Ph.D. Student, NYU
Verified email at nyu.edu - Homepage
Title
Cited by
Year
GPQA: A Graduate-Level Google-Proof Q&A Benchmark
D Rein, BL Hou, AC Stickland, J Petty, RY Pang, J Dirani, J Michael, ...
arXiv preprint arXiv:2311.12022, 2023
Cited by 77 · 2023
(QA)^2: Question Answering with Questionable Assumptions
N Kim, PM Htut, S Bowman, J Petty
Proceedings of the 61st Annual Meeting of the Association for Computational …, 2023
Cited by 21* · 2023
Debate helps supervise unreliable experts
J Michael, S Mahdi, D Rein, J Petty, J Dirani, V Padmakumar, ...
arXiv preprint arXiv:2311.08702, 2023
Cited by 18 · 2023
Transformers generalize linearly
J Petty, R Frank
arXiv preprint arXiv:2109.12036, 2021
Cited by 17 · 2021
The illusion of state in state-space models
W Merrill, J Petty, A Sabharwal
arXiv preprint arXiv:2404.08819, 2024
Cited by 11 · 2024
How abstract is linguistic generalization in large language models? Experiments with argument structure
M Wilson, J Petty, R Frank
Transactions of the Association for Computational Linguistics 11, 1377-1395, 2023
Cited by 11 · 2023
In-context Learning Generalizes, But Not Always Robustly: The Case of Syntax
A Mueller, A Webson, J Petty, T Linzen
arXiv preprint arXiv:2311.07811, 2023
Cited by 3 · 2023
Do language models learn position-role mappings?
J Petty, M Wilson, R Frank
arXiv preprint arXiv:2202.03611, 2022
Cited by 3 · 2022
The Optimal Double Bubble for Density rᵖ
J Hirsch, K Li, J Petty, C Xue
Rose-Hulman Undergraduate Mathematics Journal 22 (2), 2021
Cited by 3 · 2021
The Impact of Depth on Compositional Generalization in Transformer Language Models
J Petty, S van Steenkiste, I Dasgupta, F Sha, D Garrette, T Linzen
Proceedings of the 2024 Conference of the North American Chapter of the …, 2024
Cited by 2 · 2024
The impact of depth and width on transformer language model generalization
J Petty, S van Steenkiste, I Dasgupta, F Sha, D Garrette, T Linzen
arXiv preprint arXiv:2310.19956, 2023
Cited by 2 · 2023
Sequence-to-sequence networks learn the meaning of reflexive anaphora
R Frank, J Petty
arXiv preprint arXiv:2011.00682, 2020
Cited by 2 · 2020
Certain hyperbolic regular polygonal tiles are isoperimetric
J Hirsch, K Li, J Petty, C Xue
Geometriae Dedicata, 1-13, 2021
Cited by 1 · 2021
How Does Code Pretraining Affect Language Model Task Performance?
J Petty, S van Steenkiste, T Linzen
arXiv preprint arXiv:2409.04556, 2024
2024
The Optimal Double Bubble for Density 𝑟ᵖ
J Hirsch, K Li, J Petty, C Xue
Rose-Hulman Undergraduate Mathematics Journal 22 (2), 4, 2021
2021
Optimal monohedral tilings of hyperbolic surfaces
L Di Giosia, J Habib, J Hirsch, L Kenigsberg, K Li, D Pittman, J Petty, ...
arXiv preprint arXiv:1911.04476, 2019
2019
Probing language models’ knowledge of position-role mappings with novel word learning
M Wilson, J Petty, R Frank
Articles 1–17