Generative AI Unit 1 2 3 Questions
What is AI?
Artificial Intelligence (AI) is the field of computer science concerned with building systems that perform tasks normally requiring human intelligence, such as reasoning, learning, and understanding language.
A Brief History of AI
What is Generative AI?
Generative AI is a subset of AI that focuses on creating new content, such as text, images, or code. It uses algorithms to learn patterns from existing data and generate new, original content.
AI Prompt Writing
A prompt is a text input that guides an AI model to generate specific content. Effective
prompt writing is crucial for getting the desired output from generative AI models.
Types of Prompts
Descriptive Prompts: Provide a clear and concise description of the desired output.
Instructive Prompts: Give specific instructions on what to generate, such as style,
tone, or format.
Example-Based Prompts: Provide examples of the desired output to guide the
model.
Combination Prompts: Combine descriptive, instructive, and example-based prompts for more complex outputs (hypothetical examples of each type are sketched after this list).
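As a concrete illustration, the short Python sketch below simply stores one hypothetical prompt of each type in a dictionary; the wording is invented for demonstration and is not tied to any particular model.

    # Illustrative examples of the four prompt types (hypothetical wording).
    prompt_examples = {
        "descriptive": "A watercolor painting of a quiet fishing village at sunrise.",
        "instructive": "Summarize the following paragraph in two sentences, in a formal tone.",
        "example_based": (
            "Translate English to French.\n"
            "sea -> mer\n"
            "mountain -> montagne\n"
            "forest ->"
        ),
        "combination": (
            "You are a travel writer. In an upbeat tone, write a 100-word description "
            "of a coastal town, following the style of this example: "
            "'Nestled between tall cliffs, the town wakes slowly...'"
        ),
    }

    for kind, prompt in prompt_examples.items():
        print(f"{kind}:\n{prompt}\n")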
What is Text-to-Text Generative AI?
Text-to-text generative AI models take text input and generate text output. This can include
tasks like translation, summarization, question answering, and creative writing.
Tips for Writing Effective Prompts
1. Be Specific: The more specific your prompt, the better the output.
2. Use Clear and Concise Language: Avoid ambiguity and unnecessary complexity.
3. Provide Context: Give the model relevant background information.
4. Experiment with Different Prompts: Try different phrasing and styles to see what
works best.
5. Be Patient: Sometimes it takes multiple attempts to get the desired output.
Generative language models are a type of AI model that can generate human-quality text.
Some of the most popular examples include:
Google Bard
Google Bard is a powerful language model developed by Google AI. It can be used for a variety of tasks, including writing different kinds of creative content, translating languages, and answering questions in an informative way.
Ethics in AI
Bias and Fairness: AI models can perpetuate biases present in the data they are
trained on.
Privacy: AI systems can collect and analyze large amounts of personal data.
Job Displacement: AI could potentially automate many jobs, leading to job loss.
Autonomous Weapons: AI-powered weapons raise concerns about the potential for
misuse.
Unit 2: Prompt Engineering - NLP and ML Foundations
Techniques for Prompt Engineering
Prompt engineering is the art of crafting effective prompts to guide AI models to generate
desired outputs. Here are some techniques:
1. Descriptive Prompts: Clearly describe the desired output, such as "Write a poem
about a lonely robot."
2. Instructive Prompts: Provide specific instructions, like "Write a Python script to
calculate factorial."
3. Example-Based Prompts: Give examples of the desired output, such as "Write a
summary of this article: [link]."
4. Combination Prompts: Combine descriptive, instructive, and example-based
prompts for more complex tasks.
5. Iterative Refinement: Continuously refine prompts based on the model's output until you reach the desired result (a minimal sketch of such a loop follows this list).
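As a rough sketch of iterative refinement, the loop below tries successively more specific prompts until the output passes a simple check. The generate and looks_good functions are placeholders for a real model call and a real acceptance test; they are not actual library functions.

    # Hypothetical sketch of iterative prompt refinement.
    def generate(prompt):
        """Placeholder for a real text-generation call (e.g. an LLM API)."""
        return f"[model output for: {prompt!r}]"

    def looks_good(output):
        """Placeholder acceptance test; in practice this is often a human review."""
        return "ValueError" in output

    prompts = [
        "Write a Python script to calculate factorial.",
        "Write a Python function factorial(n) with input validation and a docstring.",
        "Write a recursive factorial(n) that raises ValueError for negative inputs.",
    ]

    for prompt in prompts:
        output = generate(prompt)
        print(output)
        if looks_good(output):
            break  # stop refining once the output meets the acceptance test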
What is NLP?
Natural Language Processing (NLP) is a field of artificial intelligence that focuses on the
interaction between computers and human language. It involves tasks like understanding,
interpreting, and generating human language.
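For instance, one of the most basic NLP steps is tokenization: splitting raw text into word units. The tiny sketch below does this with a regular expression rather than a full NLP library, purely for illustration.

    import re

    # Minimal tokenization example: split a sentence into lowercase word tokens.
    text = "Natural Language Processing helps computers understand human language."
    tokens = re.findall(r"[a-z']+", text.lower())
    print(tokens)
    # ['natural', 'language', 'processing', 'helps', 'computers', 'understand', 'human', 'language']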
What is ML?
Machine Learning (ML) is a subset of AI that involves training algorithms on large datasets
to make predictions or decisions without explicit programming.
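As a toy example of this idea, the snippet below fits a linear regression with scikit-learn (assuming it is installed): the rule that y is roughly twice x is never written into the program; the model estimates it from the data.

    import numpy as np
    from sklearn.linear_model import LinearRegression

    # Toy dataset where y is roughly twice x; the relationship is learned, not hand-coded.
    X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0]])
    y = np.array([2.1, 3.9, 6.2, 8.0, 9.9])

    model = LinearRegression()
    model.fit(X, y)

    print(model.coef_[0])          # learned slope, close to 2
    print(model.predict([[6.0]]))  # prediction for an unseen input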
By understanding these NLP and ML foundations and the principles of prompt engineering, you can effectively harness the power of AI to achieve your desired outcomes.
Goal: To create a collection of input texts that help generative AI models produce high-quality, relevant outputs.
Purpose: To improve the model's ability to respond to a wide range of queries, learn from diverse data, and adapt to minimize biases.
Technique: To choose the most appropriate words, phrases, formats, and symbols to guide the AI.
Application: Can be used in generators like ChatGPT or DALL-E, or by AI engineers when refining large language models (LLMs).
Use different prompt patterns: Try least-to-most prompting, which asks the model to list sub-problems and solve them in sequence. You can also try few-shot prompting, which uses in-context learning by letting the model see worked examples before the new input.
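A minimal, hypothetical few-shot prompt might look like the string below: a few worked examples are placed in front of the new input so the model can infer the pattern in context. The reviews and labels are invented for illustration.

    # Hypothetical few-shot prompt: worked examples precede the new input.
    few_shot_prompt = """Classify the sentiment of each review as Positive or Negative.

    Review: "The battery lasts all day and the screen is gorgeous."
    Sentiment: Positive

    Review: "It stopped working after a week and support never replied."
    Sentiment: Negative

    Review: "Setup was quick and the sound quality is excellent."
    Sentiment:"""

    print(few_shot_prompt)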
Natural language processing (NLP) and machine learning (ML) are both subfields of artificial intelligence (AI) with distinct capabilities and use cases.
Foundation Models
Neural networks: Foundation models are based on complex neural networks, such as transformers, generative adversarial networks (GANs), and variational autoencoders (VAEs).
Self-supervised learning: Foundation models use self-supervised learning to create labels from the input data itself (a toy sketch of this idea follows below).
General purpose: Foundation models are designed to be general purpose and can be applied to a wide range of tasks.
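To make "creating labels from input data" concrete, the toy sketch below turns plain sentences into masked-word training pairs: the hidden word itself becomes the label, so no human annotation is needed. This is a conceptual illustration only, not how production foundation models are built.

    import random

    # Toy self-supervised labeling: hide one word per sentence and use the
    # hidden word itself as the training label.
    sentences = [
        "foundation models learn general language patterns",
        "self supervised learning creates labels from raw text",
    ]

    random.seed(0)
    training_pairs = []
    for sentence in sentences:
        words = sentence.split()
        i = random.randrange(len(words))      # pick a position to mask
        masked = words.copy()
        label = masked[i]
        masked[i] = "[MASK]"
        training_pairs.append((" ".join(masked), label))

    for model_input, label in training_pairs:
        print(model_input, "->", label)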
While foundation models can be powerful tools, they also raise ethical
concerns. For example, they can sometimes generate false or inaccurate
answers, and they can be misused to create harmful content.
Natural language processing (NLP) tasks are techniques that break down
human language into smaller parts that computers can understand. NLP is
a subfield of computer science and artificial intelligence that helps
computers process data in natural language. Some NLP tasks include:
Part-of-speech tagging: Tags each word in a sentence with its part of speech, such as noun, verb, adjective, or adverb (see the sketch below).
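If the NLTK library is installed, part-of-speech tagging can be sketched as below. The resource names passed to nltk.download are the commonly used tokenizer and tagger data, though exact names can differ between NLTK versions.

    import nltk

    # One-time downloads of tokenizer and tagger data (names may vary by NLTK version).
    nltk.download("punkt")
    nltk.download("averaged_perceptron_tagger")

    sentence = "The quick brown fox jumps over the lazy dog"
    tokens = nltk.word_tokenize(sentence)
    print(nltk.pos_tag(tokens))
    # e.g. [('The', 'DT'), ('quick', 'JJ'), ('brown', 'NN'), ('fox', 'NN'), ('jumps', 'VBZ'), ...]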
NLP tasks can be used to automate tasks like customer support, data
entry, and document handling. They can also be used in search results,
predictive text, and language translation.
Common ways to refine a prompt include:
Adding more context: Providing additional information to the model can improve its understanding of the task.
Using specific keywords: Including keywords can help the model focus on the
desired output.
Adjusting the prompt length: Shorter prompts can be more concise, while longer
prompts can provide more context.
Experimenting with different phrasing: Trying different ways of expressing the
same idea can yield different results.
Providing examples: Giving the model examples of the desired output can help it
learn the pattern.
Using chain-of-thought reasoning: Breaking down complex tasks into smaller steps can help the model reason through the problem (a hypothetical example follows this list).
Incorporating feedback: Using feedback from previous outputs to refine the prompt
and improve future results.
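As a hypothetical illustration of chain-of-thought prompting, the string below explicitly asks the model to reason step by step before giving an answer; the word problem is invented for demonstration.

    # Hypothetical chain-of-thought prompt: ask for intermediate reasoning steps.
    cot_prompt = """Q: A shop sells pencils in boxes of 12. A school orders 7 boxes
    and hands out 50 pencils. How many pencils are left?
    Let's think step by step:
    1. Work out the total number of pencils ordered.
    2. Subtract the number of pencils handed out.
    3. State the final answer on its own line."""

    print(cot_prompt)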
Filtering and Post-Processing
Filtering and post-processing are techniques used to refine the model's output and improve its quality. This can involve removing irrelevant or unsafe content, enforcing length and formatting constraints, and deduplicating or re-ranking candidate outputs, as in the sketch below.
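A minimal post-processing pass might look like the sketch below: trim whitespace, drop candidates that are too short or contain blocked terms, and remove exact duplicates. The length threshold and blocklist are arbitrary placeholders, not recommended values.

    # Simple post-processing over candidate model outputs (thresholds are placeholders).
    BLOCKED_TERMS = {"lorem", "ipsum"}   # placeholder blocklist
    MIN_LENGTH = 20                      # minimum characters to keep

    def post_process(candidates):
        cleaned, seen = [], set()
        for text in candidates:
            text = text.strip()
            if len(text) < MIN_LENGTH:
                continue                 # too short to be useful
            if any(term in text.lower() for term in BLOCKED_TERMS):
                continue                 # contains a blocked term
            if text in seen:
                continue                 # exact duplicate
            seen.add(text)
            cleaned.append(text)
        return cleaned

    print(post_process(["  Hi  ", "A sufficiently long, useful answer.", "A sufficiently long, useful answer."]))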
Use Cases and Applications
Prompt engineering and tuning techniques have a wide range of applications, including:
Content generation: Creating articles, blog posts, and other creative content.
Code generation: Writing code snippets and entire programs.
Translation: Translating text from one language to another.
Summarization: Summarizing long documents into shorter versions.
Question answering: Answering questions posed in natural language.
Pre-training
Pre-training involves training a model on a massive amount of text data to learn general language patterns. This can significantly improve the model's performance on downstream tasks.
Designing Effective Prompts
Here are some tips for designing effective prompts: be specific, use clear and concise language, provide relevant context and examples, and iterate on the prompt based on the output you get.
By understanding these techniques and best practices, you can effectively leverage prompt
engineering to unlock the full potential of AI models.
Gradient descent: An optimization algorithm that iteratively adjusts a model's parameters to minimize a loss function (a toy example follows below).
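A toy version of gradient descent on a single parameter is sketched below; the loss function, learning rate, and number of steps are arbitrary choices for illustration.

    # Toy gradient descent: minimize the loss L(w) = (w - 3)^2.
    def grad(w):
        return 2 * (w - 3)   # derivative of (w - 3)^2 with respect to w

    w = 0.0                  # initial parameter value
    learning_rate = 0.1

    for step in range(50):
        w -= learning_rate * grad(w)   # move against the gradient

    print(w)  # approaches the optimum w = 3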
Prompt Tuning
Task-specific context: Prompt tuning provides the model with task-specific context by using prompts that are either human-engineered or AI-generated.
Cost-effective: Prompt tuning is more cost-effective than other methods like model
or prefix tuning.
Corrects model behavior: Prompt tuning can correct the model's behavior, such as mitigating bias (a conceptual sketch of soft-prompt tuning follows below).
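Conceptually, prompt tuning keeps the model's own weights frozen and learns a small set of "soft prompt" vectors that are prepended to the input embeddings. The numpy sketch below only illustrates the shape bookkeeping of that idea; no real model or training is involved, and all sizes are toy values.

    import numpy as np

    # Conceptual sketch of prompt tuning: trainable soft-prompt vectors are
    # prepended to frozen input embeddings. No real training happens here.
    embedding_dim = 8        # toy embedding size
    num_soft_tokens = 4      # number of trainable prompt vectors
    sequence_length = 10     # length of the tokenized input

    rng = np.random.default_rng(0)
    soft_prompt = rng.normal(size=(num_soft_tokens, embedding_dim))       # would be trained
    input_embeddings = rng.normal(size=(sequence_length, embedding_dim))  # stays frozen

    # The model would receive the concatenated sequence; during prompt tuning,
    # only soft_prompt would be updated by gradient descent.
    model_input = np.concatenate([soft_prompt, input_embeddings], axis=0)
    print(model_input.shape)  # (14, 8)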
For example, when using a model like GPT-4 to generate a news article,
you might start the prompt with a headline and a summary to provide more
context for the model.
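Written out as a literal prompt string, that idea might look like the hypothetical example below; the headline and summary are invented.

    # Hypothetical news-article prompt: a headline and summary supply context.
    article_prompt = """Headline: Local Library Launches 24-Hour Reading Marathon
    Summary: Volunteers will read aloud in shifts to raise funds for new children's books.

    Write a 300-word news article based on the headline and summary above,
    in a neutral, factual tone."""

    print(article_prompt)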