Authors
Amin Beheshti, Jian Yang, Quan Z Sheng, Boualem Benatallah, Fabio Casati, Schahram Dustdar, Hamid Reza Motahari Nezhad, Xuyun Zhang, Shan Xue
Publication date
2023/7/2
Conference
2023 IEEE International Conference on Web Services (ICWS)
Pages
731-739
Publisher
IEEE
Description
Generative Pre-trained Transformer (GPT) is a state-of-the-art machine learning model capable of generating human-like text through natural language processing (NLP). GPT is trained on massive amounts of text data and uses deep learning techniques to learn patterns and relationships within the data, enabling it to generate coherent and contextually appropriate text. This position paper proposes using GPT technology to generate new process models when/if needed. We introduce ProcessGPT as a new technology that has the potential to enhance decision-making in data-centric and knowledge-intensive processes. ProcessGPT can be designed by training a generative pre-trained transformer model on a large dataset of business process data. This model can then be fine-tuned on specific process domains and trained to generate process flows and make decisions based on context and user input. The model …
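The abstract describes training a generative pre-trained transformer on business process data and fine-tuning it for specific process domains. The sketch below illustrates that general idea, assuming process traces have been serialized as plain text (one activity sequence per line); the base model (GPT-2), the file name process_traces.txt, and the hyperparameters are illustrative assumptions, not details from the paper.

```python
# Minimal fine-tuning sketch: adapt a pretrained causal language model to a
# corpus of serialized business process traces (assumptions noted above).
from transformers import (
    GPT2LMHeadModel,
    GPT2TokenizerFast,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from datasets import load_dataset

# Start from a general-purpose pretrained transformer.
tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Hypothetical corpus: each line is one serialized trace, e.g.
# "receive_order -> check_credit -> approve -> ship -> invoice".
dataset = load_dataset("text", data_files={"train": "process_traces.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = dataset["train"].map(tokenize, batched=True, remove_columns=["text"])

# Causal language modeling: the model learns to predict the next activity token.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="processgpt-sketch",
        num_train_epochs=3,
        per_device_train_batch_size=8,
    ),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()

# After fine-tuning, prompt with a partial trace and let the model suggest
# the next steps of the process flow.
prompt = tokenizer("receive_order -> check_credit ->", return_tensors="pt")
generated = model.generate(**prompt, max_new_tokens=20, do_sample=True)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```

This is only a sketch of the fine-tuning pattern the abstract alludes to; a production system would add context conditioning, evaluation on held-out process logs, and domain-specific decision logic.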
Total citations
Scholar articles
A Beheshti, J Yang, QZ Sheng, B Benatallah, F Casati… - 2023 IEEE International Conference on Web Services …, 2023