Graph representation learning in biomedicine and healthcare

MM Li, K Huang, M Zitnik - Nature Biomedical Engineering, 2022 - nature.com
Networks—or graphs—are universal descriptors of systems of interacting elements. In
biomedicine and healthcare, they can represent, for example, molecular interactions …

Knowledge graphs: Opportunities and challenges

C Peng, F Xia, M Naseriparsa, F Osborne - Artificial Intelligence Review, 2023 - Springer
With the explosive growth of artificial intelligence (AI) and big data, it has become vitally
important to organize and represent the enormous volume of knowledge appropriately. As …

[PDF][PDF] A survey of knowledge graph techniques

Z Xu, Y Sheng, L He, Y Wang - Journal of University of Electronic Science and Technology of China, 2016 - researchgate.net
Knowledge graph technology is an important component of artificial intelligence. The knowledge bases it builds, equipped with semantic processing and open interconnection capabilities, can deliver application value in intelligent information services such as intelligent search, question answering, and personalized recommendation …

Holistic evaluation of language models

P Liang, R Bommasani, T Lee, D Tsipras… - arXiv preprint arXiv …, 2022 - arxiv.org
Language models (LMs) are becoming the foundation for almost all major language
technologies, but their capabilities, limitations, and risks are not well understood. We present …

Unifying large language models and knowledge graphs: A roadmap

S Pan, L Luo, Y Wang, C Chen… - IEEE Transactions on …, 2024 - ieeexplore.ieee.org
Large language models (LLMs), such as ChatGPT and GPT4, are making new waves in the
field of natural language processing and artificial intelligence, due to their emergent ability …

Deep bidirectional language-knowledge graph pretraining

M Yasunaga, A Bosselut, H Ren… - Advances in …, 2022 - proceedings.neurips.cc
Pretraining a language model (LM) on text has been shown to help various downstream
NLP tasks. Recent works show that a knowledge graph (KG) can complement text data …

Knowledge graph contrastive learning for recommendation

Y Yang, C Huang, L Xia, C Li - … of the 45th international ACM SIGIR …, 2022 - dl.acm.org
Knowledge Graphs (KGs) have been utilized as useful side information to improve
recommendation quality. In those recommender systems, knowledge graph information …
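The snippet above describes knowledge graphs acting as side information in a recommender system. Below is a minimal sketch of that idea, assuming a dot-product scorer and mean-pooled entity embeddings; all names, shapes, and data are illustrative assumptions, not the paper's model.

```python
# Minimal sketch of using a knowledge graph as side information for
# recommendation: an item's representation is enriched with embeddings of
# the KG entities it links to before scoring it against a user.
# All names, shapes, and data here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
dim = 8

user_emb = rng.normal(size=dim)
item_emb = rng.normal(size=dim)
# KG entities connected to the item (e.g. attributes, related concepts).
entity_embs = rng.normal(size=(3, dim))

# Enrich the item with the mean of its linked entities' embeddings.
item_with_kg = item_emb + entity_embs.mean(axis=0)

# Dot-product relevance score between the user and the KG-enriched item.
score = float(user_emb @ item_with_kg)
print(f"user-item score with KG side information: {score:.3f}")
```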

LinkBERT: Pretraining language models with document links

M Yasunaga, J Leskovec, P Liang - arXiv preprint arXiv:2203.15827, 2022 - arxiv.org
Language model (LM) pretraining can learn various knowledge from text corpora, helping
downstream tasks. However, existing methods such as BERT model a single document, and …
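The snippet notes that existing methods such as BERT pretrain on one document at a time, while the title points to pretraining with document links. A hedged sketch of how linked documents might be paired into a single pretraining input, assuming a toy corpus and hypothetical helper names; this is not the paper's implementation.

```python
# Illustrative sketch of building pretraining inputs from document links:
# instead of always pairing a segment with its own continuation (the
# single-document case), a segment can be paired with text from a document
# it links to. All names and data are hypothetical assumptions.
import random
from typing import Dict, List, Tuple

documents: Dict[str, str] = {
    "doc_a": "Graph neural networks operate on relational structure ...",
    "doc_b": "Knowledge graphs store entities and typed relations ...",
}
links: Dict[str, List[str]] = {"doc_a": ["doc_b"], "doc_b": []}


def make_segment_pair(doc_id: str) -> Tuple[str, str]:
    """Pair a document segment with a segment from a linked document when possible."""
    anchor = documents[doc_id]
    linked = links.get(doc_id, [])
    if linked:
        # Cross-document pair: exposes the model to knowledge spread across links.
        partner = documents[random.choice(linked)]
    else:
        # Fall back to the single-document case.
        partner = anchor
    return anchor, partner


print(make_segment_pair("doc_a"))
```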

Long range graph benchmark

VP Dwivedi, L Rampášek, M Galkin… - Advances in …, 2022 - proceedings.neurips.cc
Graph Neural Networks (GNNs) that are based on the message passing (MP)
paradigm generally exchange information between 1-hop neighbors to build node …
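The snippet summarizes the message passing (MP) paradigm, in which each round exchanges information only between 1-hop neighbors. A minimal sketch of one such round, assuming a dictionary adjacency list and mean aggregation; function and variable names are illustrative, not the benchmark's code.

```python
# One round of 1-hop message passing: each node aggregates its immediate
# neighbors' features and combines them with its own state.
from typing import Dict, List

import numpy as np


def message_passing_round(adjacency: Dict[int, List[int]],
                          features: Dict[int, np.ndarray]) -> Dict[int, np.ndarray]:
    """Update each node by aggregating its 1-hop neighbors' features."""
    updated = {}
    for node, neighbors in adjacency.items():
        if neighbors:
            # Messages come only from immediate (1-hop) neighbors.
            neighbor_feats = np.stack([features[n] for n in neighbors])
            aggregated = neighbor_feats.mean(axis=0)
        else:
            aggregated = np.zeros_like(features[node])
        # Combine the node's own state with the aggregated neighborhood message.
        updated[node] = 0.5 * features[node] + 0.5 * aggregated
    return updated


# Reaching a dependency k hops away requires stacking k such rounds,
# which is why long-range interactions are hard for shallow MP-GNNs.
adjacency = {0: [1], 1: [0, 2], 2: [1]}
features = {i: np.ones(4) * i for i in adjacency}
for _ in range(2):  # two rounds: information travels at most 2 hops
    features = message_passing_round(adjacency, features)
```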

Compute trends across three eras of machine learning

J Sevilla, L Heim, A Ho, T Besiroglu… - … Joint Conference on …, 2022 - ieeexplore.ieee.org
Compute, data, and algorithmic advances are the three fundamental factors that drive
progress in modern Machine Learning (ML). In this paper we study trends in the most readily …