ChatGPT: The Most Advanced AI Chatbot by OpenAI
Updated: 5 July
OpenAI has been in the headlines since last month with the launch of ChatGPT, an advanced AI-based chatbot that can answer queries and generate relevant text. Since its release, it has gained huge traction and sparked discussion all over the world. In this article, we introduce ChatGPT in detail.
Introduction to ChatGPT
ChatGPT is a chatbot launched by OpenAI on 30 November 2022. It is a powerful language model that can perform a wide range of natural language processing tasks such as text generation, question answering, text completion, text summarization, language translation, dialogue generation, and text-to-speech.
It is built on OpenAI’s GPT-3 (Generative Pre-trained Transformer 3) language model, which uses deep learning to produce human-like text. The quality of the text produced by ChatGPT is so good that it is difficult to tell whether it was written by a machine or by a human (Tom Brown).
Steps to use ChatGPT
1. Go to openai.com
2. Under the “Featured” section at the bottom left, select ChatGPT.
3. Enter your questions or prompts.
Note: Due to the surge in users, you might see the message “ChatGPT is at capacity”. If you can log in to ChatGPT, you may experience slower response generation (based on a comparison of current personal experience with usage in early December 2022).
Some usage examples of ChatGPT
1. ChatGPT can write code. Here is an example.
2. Write a research note for BMW stock.
The above note can be used by sell-side analysts.
3. A poem combining Rudyard Kipling’s “If” and “Gunga Din”.
Though it can recreate the poem with rhyming words, ChatGPT is smart enough to mention its limitations.
It is important to note that queries should be phrased clearly to get good results.
Tasks performed by ChatGPT:
1. Text to Speech: By training on speech data, ChatGPT can generate speech in a given voice.
2. Dialogue generation: Using the conversation data, it can generate contextually appropriate and coherent responses in a conversation.
3. Question answering: It can answer questions by extracting relevant information from the input text.
4. Text classification: It can classify text into positive or negative sentiment, spam or not spam, and more.
5. Text completion: Based on the context of the input, it can complete a given text by predicting the subsequent word or phrase.
6. Text generation: It can generate coherent and contextually correct text, which can be used for tasks such as writing code and essays.
7. Language translation: It can be used to translate from one language to another.
8. Text summarization: It can summarize a text by identifying and extracting the most important information.
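Text completion (item 5) is the core mechanic behind most of these tasks: the model repeatedly predicts the most likely next word given the words so far. The sketch below is a deliberately tiny bigram model in Python to illustrate that idea; GPT-3 itself uses a transformer neural network over learned embeddings, not word counts, so treat this only as a toy analogy:

```python
from collections import Counter, defaultdict

def train_bigram(corpus):
    """Count how often each word follows each other word."""
    model = defaultdict(Counter)
    words = corpus.split()
    for prev, nxt in zip(words, words[1:]):
        model[prev][nxt] += 1
    return model

def complete(model, prompt, n_words=3):
    """Greedily extend the prompt with the most likely next word."""
    words = prompt.split()
    for _ in range(n_words):
        followers = model.get(words[-1])
        if not followers:
            break  # no continuation seen in training
        words.append(followers.most_common(1)[0][0])
    return " ".join(words)

corpus = "the cat sat on the mat and the cat sat on the sofa"
model = train_bigram(corpus)
print(complete(model, "the cat", n_words=3))  # the cat sat on the
```

A large language model does the same loop at vastly greater scale, predicting over a whole vocabulary conditioned on the entire preceding context rather than just the previous word.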
Technical Principles of ChatGPT
ChatGPT is a language model that uses deep learning techniques to generate text in response to prompts. The model's architecture is based on GPT-3, which is a transformer model that utilizes self-attention to process and generate text.
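To make self-attention concrete, here is a minimal single-head sketch in NumPy. It is an illustrative toy, not OpenAI's implementation; real GPT models use many attention heads across dozens of layers, plus learned token embeddings and positional information:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Single-head self-attention over a sequence of token vectors X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv          # queries, keys, values
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # every token scores every token
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V                        # mix value vectors by attention

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                   # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)  # (4, 8)
```

The key property is that every output vector is a weighted mixture of all input positions, which is how the model lets each word "look at" the rest of the text when deciding what comes next.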
GPT-3 is a neural network made up of multiple layers of interconnected nodes, each of which is responsible for processing a specific aspect of the input text, such as meaning, syntax, or context. The nodes in the network work together to produce a cohesive and grammatically correct output.
The GPT-3 architecture is known for its capability to learn from vast amounts of data, and ChatGPT has been trained on a substantial corpus of text data that covers various subjects and styles. This allows the model to produce responses that are highly pertinent to the prompt and demonstrate a level of knowledge and comprehension that is comparable to that of a human.
The OpenAI GPT-3 model was trained on a massive 45 TB of text data from various sources, such as Wikipedia and books, according to its creators. The different datasets used to train the model are listed below (Kindra Cooper, 2021):
- The Common Crawl corpus is a vast collection of data accumulated over 8 years of web crawling. It includes raw web page data, metadata extracts, and text extracts with minimal filtering.
- WebText2 comprises the text of web pages linked from Reddit posts with at least 3 upvotes.
- Books1 and Books2 are two corpora of internet-based books.
- Wikipedia pages in English are also included in the training corpus.
The “weight in training mix” of each dataset refers to the proportion of training examples drawn from it, which need not match the dataset’s raw size.
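This weighting can be illustrated with a simple sampler. The proportions below are the approximate, rounded figures reported for GPT-3 in the original paper (Brown et al., 2020) and are an assumption of this sketch rather than values stated above:

```python
import random

# Approximate training-mix weights reported for GPT-3 (rounded figures
# from Brown et al., 2020 -- an assumption here, not from the text above).
mix = {
    "Common Crawl": 0.60,
    "WebText2": 0.22,
    "Books1": 0.08,
    "Books2": 0.08,
    "Wikipedia": 0.03,
}

random.seed(42)
# Each training example is drawn from a dataset in proportion to its
# weight, so smaller high-quality sources (e.g. Wikipedia) are sampled
# far more often than their raw size alone would suggest.
batch = random.choices(list(mix), weights=list(mix.values()), k=10)
print(batch)
```

In practice this means Common Crawl, despite dominating the raw data, does not dominate training to the same degree, because the cleaner datasets are deliberately oversampled.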
Who are the investors?
Investors in OpenAI are Y Combinator, Reid Hoffman Foundation (Reid Hoffman is LinkedIn’s founder), Khosla Ventures (Vinod Khosla), Mathew Brown Companies, Microsoft, Tiger Global Management, Andreessen Horowitz, Sequoia Capital, Adam Juegos and Bedrock Capital (Crunchbase).
Much of the computing power was provided by Microsoft’s cloud service – Azure. In 2019, Microsoft invested $1 billion and is currently in talks to invest $10 billion, valuing OpenAI at $29 billion (Reuters, 2023).
Elon Musk was one of the co-founders of OpenAI and helped establish the company in December 2015. However, he stepped down from OpenAI’s board due to potential future conflicts of interest with Tesla’s AI development (Elon Musk’s history with OpenAI—the maker of A.I. chatbot ChatGPT—as told by ChatGPT itself, 2022).
What is the cost of using ChatGPT?
Currently, using ChatGPT is free for users. A professional version is in the works that will offer higher limits and faster performance than the free version, which is currently slowed by the huge surge in users (Greg Brockman).
Elon Musk asked OpenAI CEO Sam Altman about the average cost per chat; Altman said it is in the single-digit cents (in $) per chat (Twitter). An analyst at Morgan Stanley estimated that, owing to the higher cost of natural language processing, answering a query with ChatGPT costs seven times as much as a typical internet search (Richard Waters, Tabby Kinder, 2023).
What is the impact of ChatGPT?
According to Eric Boyd, head of AI platforms at Microsoft, these AI platforms are going to change the way people interact with computers: talking to a computer will revolutionize the everyday experience of using technology. Microsoft's cloud customers have been able to pay for access to GPT-3, a text-generating AI model, since 2021. GitHub, Microsoft's service for developers, has turned Codex, a system that suggests the next lines of code for software developers to write, into a product. OpenAI's focus is largely on the development of large language models, which are trained on enormous amounts of text. This approach to AI, which differs from the traditional machine learning techniques that have been prevalent in the past decade, has led to systems that are more versatile and have greater commercial potential. Microsoft is looking to use the technology in a wide range of products, such as Microsoft Office and other productivity applications (Richard Waters, Tabby Kinder, 2023).
Much of the public interest stems from ChatGPT’s ability to generate text, but its biggest impact might come from its ability to interpret it. These models are making semantic search engines feasible, promising far more personal and relevant content for users. Google is developing its own LaMDA foundation model, while Microsoft’s Bing may use OpenAI’s technology in the near future (John Thornhill, 2022).
If you have any questions or need assistance, please feel free to contact us. Our experts will be happy to help you. Please also visit our homepage at www.idea-bf.de.
Crunchbase. (n.d.). OpenAI company financials. Retrieved from https://www.crunchbase.com/organization/openai/company_financials
Reuters. (2023, January 10). Retrieved from https://www.reuters.com/technology/microsoft-talks-invest-10-bln-chatgpt-owner-semafor-2023-01-10/
Bengio, Y., Ducharme, R., Vincent, P., & Jauvin, C. (2003). A neural probabilistic language model. Journal of Machine Learning Research, 3, 1137–1155.
Dai, Z., Yang, Z., Yang, Y., Carbonell, J., Le, Q. V., & Salakhutdinov, R. (2019). Transformer-XL: Attentive language models beyond a fixed-length context. arXiv preprint arXiv:1901.02860.
Elon Musk’s history with OpenAI—the maker of A.I. chatbot ChatGPT—as told by ChatGPT itself. (2022, December 12). Retrieved from Fortune: https://fortune.com/2022/12/11/elon-musk-history-with-chatgpt-maker-openai-as-told-by-the-ai-chatbot-itself/
Greg Brockman. (n.d.). [Tweet]. Retrieved from Twitter: https://twitter.com/gdb/status/1612986134048698369
John Thornhill. (2022, December 9). ChatGPT is less worried about itself than we are. (EU). Financial Times.
Kindra Cooper. (2021, November 1). OpenAI GPT-3: Everything You Need to Know. Retrieved from Springboard: https://www.springboard.com/blog/data-science/machine-learning-gpt-3-open-ai/
Richard Waters, Tabby Kinder. (2023, January 17). Microsoft's planned $10bn punt on research outfit OpenAI in spotlight. (EU Edition). Financial Times.
Tom Brown, et al. (2020). Language Models are Few-Shot Learners. Retrieved from Analytics India Mag: https://analyticsindiamag.com/open-ai-gpt-3-language-model/
Elon Musk. (n.d.). [Tweet]. Retrieved from Twitter: https://twitter.com/elonmusk/status/1599669552081960960