Specter: Document-level representation learning using citation-informed transformers

A Cohan, S Feldman, I Beltagy, D Downey, DS Weld - arXiv preprint arXiv …, 2020 - arxiv.org
Representation learning is a critical ingredient for natural language processing systems.
Recent Transformer language models like BERT learn powerful textual representations, but
these models are targeted towards token- and sentence-level training objectives and do not
leverage information on inter-document relatedness, which limits their document-level
representation power. For applications on scientific documents, such as classification and
recommendation, the embeddings power strong performance on end tasks. We propose …
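The "citation-informed" training signal referenced in the title is, per the paper, a triplet margin loss: a query paper's embedding is pulled closer to a paper it cites (positive) than to an unrelated paper (negative). Below is a minimal plain-Python sketch of that objective; the toy two-dimensional vectors and the margin value of 1.0 are illustrative, not the paper's actual 768-dimensional Transformer embeddings.

```python
import math

def l2_distance(u, v):
    # Euclidean distance between two embedding vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def triplet_margin_loss(query, positive, negative, margin=1.0):
    # Loss reaches zero once the cited (positive) paper sits closer
    # to the query than the uncited (negative) paper by at least `margin`.
    return max(
        l2_distance(query, positive) - l2_distance(query, negative) + margin,
        0.0,
    )

# Toy triplet: the positive is already much closer than the negative,
# so the margin constraint is satisfied and the loss is zero.
print(triplet_margin_loss([1.0, 0.0], [1.0, 0.1], [0.0, 1.0]))  # → 0.0

# Degenerate triplet: positive and negative are equidistant,
# so the loss equals the full margin.
print(triplet_margin_loss([0.0, 0.0], [1.0, 0.0], [1.0, 0.0]))  # → 1.0
```

In the paper, positives are drawn from the query's citation graph, while negatives are either random papers or harder examples chosen to be near misses; this sketch leaves that sampling out.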
