What does Chinese BERT learn about syntactic knowledge?

J Zheng, Y Liu - PeerJ Computer Science, 2023 - peerj.com
Pre-trained language models such as Bidirectional Encoder Representations from
Transformers (BERT) have been applied to a wide range of natural language processing …

Minimally-supervised relation induction from pre-trained language model

L Sun, Y Shen, W Lu - Findings of the Association for …, 2022 - aclanthology.org
Relation induction is a highly practical task in the Natural Language Processing (NLP) area. In
practical application scenarios, people want to induce more entity pairs having the same …

MLPs Compass: What is learned when MLPs are combined with PLMs?

L Zhou, W Chen, Y Cao, D Zeng… - ICASSP 2024-2024 …, 2024 - ieeexplore.ieee.org
While Transformer-based pre-trained language models and their variants exhibit strong
semantic representation capabilities, the question of comprehending the information gain …

Social media Q&A text semantic similarity and corporate fraud: evidence from the Shanghai Stock Exchange E-interaction platform in China

Q Xu, N Xie, C Jiang, S Yang - Asia-Pacific Journal of Accounting …, 2024 - Taylor & Francis
We develop a social media Q&A text semantic similarity (QATSS) measure to distinguish the
quality of management responses on the Shanghai Stock Exchange E-interaction (SSEEI) …

Exploring the Word Structure of Ancient Chinese Encoded in BERT Models

J Zheng, J Sun - 2023 16th International Conference on …, 2023 - ieeexplore.ieee.org
In recent years, ancient Chinese processing has gradually become an emerging field. With
the advent of pre-trained language models such as BERT, various language models …

The role of inferences in opinion mining: applications to Chinese-language social networks

L Yan - 2021 - theses.hal.science
This thesis examines linguistic inference for opinion mining in a corpus of Chinese
tourist reviews. Existing techniques that are well …