Generative Pretrained Transformer
A type of Large Language Model based on the transformer architecture and pre-trained on large datasets of unlabelled text. These models gained a very high public profile in early 2023 for their ability to generate novel, human-like content.
OpenAI, which created the first Generative Pretrained Transformer model, asserted in 2023 that "GPT" is a brand owned by the company.
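The core of the transformer architecture mentioned above is scaled dot-product self-attention; in GPT-style (decoder-only) models it is applied with a causal mask so each token attends only to earlier tokens. A minimal sketch in NumPy (the weight matrices here are random placeholders, not trained parameters):

```python
import numpy as np

def softmax(x, axis=-1):
    # subtract the row max for numerical stability
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Scaled dot-product self-attention with a causal (GPT-style) mask."""
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    # mask out future positions so token i only sees tokens 0..i
    future = np.triu(np.ones_like(scores, dtype=bool), k=1)
    scores = np.where(future, -1e9, scores)
    return softmax(scores) @ v

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8                      # toy sizes for illustration
x = rng.normal(size=(seq_len, d_model))      # stand-in token embeddings
Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
print(out.shape)  # (4, 8): one output vector per input token
```

Because of the causal mask, the first token's output depends only on itself; a full GPT stacks many such attention layers (with multiple heads and feed-forward blocks) and is trained to predict the next token.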
Criticism
Critics of these models have asserted that their unsupervised use poses a number of risks, especially in areas such as:
- plagiarism
- fake news
- perpetuating systemic biases present in the training content, e.g. a hegemonic viewpoint
- “In accepting large amounts of web text as ‘representative’ of ‘all’ of humanity, we risk perpetuating dominant viewpoints, increasing power imbalances and further reifying inequality” — On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?
GPT-type models are a main focus of calls for AI Regulation.