Overview - ChatGPT and Generative AI
ChatGPT and Generative AI
● What is ChatGPT?
● What are some applications of ChatGPT? What can it be used for?
● OpenAI, Microsoft and their role in ChatGPT
● The Golden Age of Generative AI - Popular Examples
● Transformers and a High-level Understanding of ChatGPT
● Prompt Engineering
● Limitations of ChatGPT
● Beyond ChatGPT
State-of-the-art language model with billions of parameters
● Content Creation
● Sentiment Analysis
OpenAI was both the developer and the trainer of the ChatGPT model, using vast amounts of text scraped from the web.
Chat: An interface that allows users to converse with the model in a to-and-fro manner. Conversational AI interfaces and chatbots have been a captivating application of AI for various business use cases.
Generative (G): A class of Deep Learning and AI models that are trained to generate data.
Pre-trained (P): A training technique for AI models, where a model is first trained from scratch on a large initial set of data, and this “pre-trained” model is then further fine-tuned on smaller, task-specific datasets.
Transformer (T): The Deep Learning architecture that these models are built on.
Et voilà!
Improved Algorithms
2 Developments in Deep Learning algorithmic research, such as GANs, Stable Diffusion and Transformer-based models, have enabled even more accurate and diverse outputs.
Abundant Data
3 The explosive growth of the internet and the quantity & quality of data
freely available on the web have provided all the training data
needed for Generative AI.
A Multitude of Applications
4 Generative AI is being used to automate a wide range of processes across industry verticals, such as synthetic protein creation, image generation, programming copilots and chatbots.
Pre-Trained models
Pre-training is a technique used with models already trained on large data volumes, to “transfer” their knowledge to more specific tasks, which may not have the luxury of large data sets to train on.
Large Generic Corpus of Data → Pre-trained Language Model → (fine-tuning on a Smaller Specific Dataset for some task) → Fine-tuned Language Model
GPT 3.5, one of OpenAI’s Large Language Models (LLMs), was trained on a massive corpus of text from the internet, spanning billions of web pages.
GPT 3.5, the pre-trained model, was then fine-tuned on a smaller Q&A dataset to map questions to answers. This is the basis of how ChatGPT was created.
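The pre-train-then-fine-tune idea above can be sketched with a deliberately tiny stand-in model, nothing like GPT's real architecture: a character-bigram counter is "pre-trained" on a larger generic text, then the same counts are fine-tuned on a small task-specific text. The `train_bigrams` and `most_likely_next` helpers are hypothetical, purely for illustration:

```python
# Toy illustration of pre-training followed by fine-tuning.
from collections import Counter

def train_bigrams(text, counts=None):
    """Count adjacent character pairs; pass existing counts to fine-tune."""
    counts = Counter() if counts is None else counts
    for a, b in zip(text, text[1:]):
        counts[(a, b)] += 1
    return counts

def most_likely_next(counts, char):
    """Return the most frequent character seen after `char` in training."""
    successors = {b: n for (a, b), n in counts.items() if a == char}
    return max(successors, key=successors.get) if successors else None

# Phase 1: "pre-train" on a larger generic corpus.
pretrained = train_bigrams("the cat sat on the mat and then the dog ran")

# Phase 2: fine-tune a copy of the pre-trained counts on a small,
# task-specific corpus; predictions shift toward the new task.
finetuned = train_bigrams("thx thx thx thx thx", counts=Counter(pretrained))
```

After fine-tuning, the model's prediction for what follows 'h' changes from the generic 'e' (as in "the") to the task-specific 'x', which is the essence of transfer: generic knowledge first, task adaptation second.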
Questions → Chain of Encoders → Embeddings → Chain of Decoders → Answers
The model asks the user for feedback; this feedback is back-propagated and used to fine-tune the weights of the network.
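The feedback loop described here can be illustrated with a minimal, hypothetical sketch: a single scalar weight nudged by gradient descent toward a user-preferred output. Real ChatGPT fine-tuning (RLHF) is far more involved; this only shows the direction of the weight update:

```python
# Toy illustration of back-propagating user feedback into a weight
# (one scalar here; real models update billions of parameters).
def feedback_step(weight, x, preferred, lr=0.1):
    """One gradient-descent step on squared error vs. the preferred output."""
    output = weight * x
    grad = 2 * (output - preferred) * x  # d/dw of (w*x - preferred)^2
    return weight - lr * grad

w = 0.0  # initial weight
for _ in range(50):  # repeated rounds of feedback
    w = feedback_step(w, x=1.0, preferred=2.0)
# w converges toward the value whose output matches the user's feedback
```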
Prompt → Generative Language Models → Generated Text
● Creative Writing
● Text Summarization
● Idea Generation
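The prompt → generated-text flow above can be sketched with the OpenAI Python SDK. The `build_request` helper, the system message, and the model name are illustrative assumptions, not part of the original material, and an actual call requires an `OPENAI_API_KEY` in the environment:

```python
# Assemble a chat-completion payload for a single user prompt.
# build_request is a hypothetical helper; the model name is an example.
def build_request(prompt, model="gpt-3.5-turbo"):
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": prompt},
        ],
    }

# To actually generate text (requires the `openai` package and an API key):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(**build_request("Write a haiku about AI."))
# print(resp.choices[0].message.content)
```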
● ChatGPT has been trained on a vast amount of data, including text from the internet and books, to improve its ability to understand and respond to natural language.
● Generative AI has the potential to revolutionize various fields such as language, image and music generation, and is an area of growing interest and research.