Large Language Model
A Neural Network trained on large quantities of unlabelled text using self-supervised learning.
Though trained on simple tasks such as predicting the next word in a sentence, neural language models with sufficient training and parameter counts are found to capture much of the syntax and semantics of human language. In addition, large language models demonstrate considerable general knowledge about the world and are able to "memorize" a great quantity of facts during training.
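The next-word-prediction objective can be made concrete with a short sketch. The example below assumes the Hugging Face transformers library and the public GPT-2 checkpoint purely as stand-ins; any causal (autoregressive) language model is trained the same way.

```python
# Minimal sketch of the self-supervised next-token-prediction objective.
# Assumes the Hugging Face `transformers` library and the small public
# GPT-2 checkpoint as illustrative stand-ins.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text = "Large language models are trained to predict the next"
inputs = tokenizer(text, return_tensors="pt")

# Passing the input ids as labels makes the model compute the standard
# self-supervised loss: cross-entropy between its prediction at each
# position and the token that actually comes next.
outputs = model(**inputs, labels=inputs["input_ids"])
print(f"next-token cross-entropy loss: {outputs.loss.item():.3f}")

# The same logits answer "which token does the model expect next?"
with torch.no_grad():
    logits = model(**inputs).logits
next_id = logits[0, -1].argmax().item()
print("most likely next token:", tokenizer.decode([next_id]))
```

No labelled data appears anywhere in this loop: the text itself supplies both the inputs and the targets, which is what makes the training self-supervised.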
Large Language Models (LLMs) attracted intense public attention in late 2022 and early 2023, in particular through the combination of Generative Pre-trained Transformer (GPT) models with chatbot-style interfaces, often augmented with internet search capability and/or access to private or business data.
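The "augmented" pattern usually amounts to retrieving relevant text and prepending it to the user's question before the model answers. The sketch below is schematic only: `search_documents` and `generate` are hypothetical placeholders for a search backend (web or private document store) and an LLM completion call, not any particular product's API.

```python
# Schematic sketch of a chatbot augmented with external data:
# retrieved passages are prepended to the user's question so the model
# can answer from that context rather than from its training data alone.
# `search_documents` and `generate` are hypothetical placeholders.
from typing import List


def search_documents(query: str) -> List[str]:
    """Hypothetical search over the web or a private document store."""
    raise NotImplementedError


def generate(prompt: str) -> str:
    """Hypothetical call to a large language model."""
    raise NotImplementedError


def answer_with_context(question: str) -> str:
    # Retrieve passages relevant to the question and build a prompt
    # that asks the model to answer using only that context.
    passages = search_documents(question)
    context = "\n".join(passages)
    prompt = (
        "Answer the question using the context below.\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return generate(prompt)
```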