PromptEngg Mod1
Prompt engineering is the practice of designing and refining prompts (inputs) to optimize the
responses generated by AI language models like GPT. Key aspects include:
1. Instruction Clarity: Ensuring that the prompt clearly communicates the task to the
model. For example, instead of saying, "Explain photosynthesis," you might say,
"Explain the process of photosynthesis in simple terms suitable for a 10-year-old."
2. Contextual Information: Providing necessary context within the prompt to help the
model understand the scenario. For example, specifying the format or tone: "Write a
formal letter to a potential business partner."
3. Examples and Constraints: Including examples or specifying constraints can help
the model focus on what is required. For example, "Generate a list of three potential
titles for a science fiction novel set on Mars."
4. Iterative Refinement: Often, prompts need to be refined through trial and error. This
iterative process helps in identifying the best phrasing or structure that yields the
desired response from the model.
5. Task-Specific Prompts: Tailoring prompts for specific tasks such as summarization,
translation, question answering, or code generation to get the best possible output for
that particular use case.
Importance:
Prompt engineering is crucial because the quality of the output generated by an AI model is
heavily dependent on the input prompt. Well-engineered prompts can lead to more useful,
accurate, and contextually appropriate outputs, making AI models more effective and reliable
for various applications.
PROMPTING AN LLM
Prompting a Large Language Model (LLM) involves crafting a clear and specific input to guide the
model in generating the desired response. Effective prompting is crucial because it directly
influences the quality and relevance of the output. Here’s how to approach prompting an
LLM:
1. Understand the Model’s Capabilities
• General Knowledge: LLMs are trained on vast datasets and can provide information
on a wide range of topics.
• Language Understanding: They can comprehend and generate text in multiple
languages and styles.
• Contextual Awareness: LLMs can maintain context over a series of interactions but
may struggle with long or complex dialogues.
2. Types of Prompts
4. Advanced Techniques
• Few-Shot Prompting: Provide a few examples of the desired output within the
prompt to guide the model (a code sketch follows this list).
o Example: “Translate the following phrases to French: ‘Hello’ becomes
‘Bonjour.’ ‘Goodbye’ becomes...”
• Chain of Thought Prompting: Ask the LLM to explain its reasoning process before
giving the final answer.
o Example: “How would you calculate the area of a circle? First, explain the
formula, then calculate for a radius of 5.”
• Role-Playing Prompts: Assume a role to get responses tailored to a specific
perspective.
o Example: “As a legal advisor, how would you handle a contract dispute
between two companies?”
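The sketch below shows, in plain Python, how the few-shot and chain-of-thought prompts above
might be assembled; the ask_llm function is a hypothetical stand-in for whatever model client is
actually used:
def ask_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call; replace with your client.
    return f"<model response to: {prompt[:40]}...>"

# Few-shot prompting: show the pattern, then leave the last item for the model.
few_shot_prompt = (
    "Translate the following phrases to French:\n"
    "'Hello' becomes 'Bonjour.'\n"
    "'Goodbye' becomes 'Au revoir.'\n"
    "'Thank you' becomes"
)

# Chain-of-thought prompting: ask for the reasoning before the final answer.
cot_prompt = (
    "How would you calculate the area of a circle? "
    "First, explain the formula, then calculate it for a radius of 5."
)

for prompt in (few_shot_prompt, cot_prompt):
    print(ask_llm(prompt))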
5. Evaluating and Refining Prompts
• Test Different Variations: Experiment with different phrasings to see which yields
the best response.
• Feedback Loop: Use the output to refine the prompt iteratively.
• Avoid Ambiguity: If the response is off-target, check if the prompt could be
interpreted in multiple ways.
PROMPT FORMATTING
Scenario: You want the LLM to write a summary of a complex article about quantum
computing.
1. Basic Structure
Example:
2. Labelling Sections
Example:
3. Using Delimiters
• Delimiters like brackets [ ], braces { }, or quotes " " can help clarify specific
sections of the prompt, especially when dealing with input data, expected outputs, or
examples.
Example:
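For instance, a delimited prompt for the quantum computing scenario above could be built in
Python roughly as follows; the article text is a placeholder, not real input:
# Placeholder for the full article text.
article = "Quantum computers use qubits, which can hold a 0 and a 1 at the same time..."

# Triple quotes act as delimiters so the model can separate instructions from data.
prompt = (
    "Summarize the article enclosed in triple quotes in 3-4 sentences, "
    "using language a general reader can follow.\n\n"
    '"""\n' + article + '\n"""'
)
print(prompt)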
4. Using Bullet Points or Numbered Lists
• For prompts that require multiple inputs or steps, use bullet points or numbered lists.
This format is especially helpful for tasks involving multiple requirements or
instructions.
Example:
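Continuing the same scenario, the requirements can be spelled out as a numbered list before the
input data; this is a sketch, not a prescribed format:
requirements = [
    "Summarize the article in 3-4 sentences.",
    "Avoid technical jargon; briefly define any unavoidable terms.",
    "End with one sentence on why quantum computing matters.",
]

article = "<full quantum computing article text>"  # placeholder input

# Numbering the requirements makes it harder for the model to skip one.
prompt = "Follow these instructions for the article below:\n"
prompt += "\n".join(f"{i}. {req}" for i, req in enumerate(requirements, start=1))
prompt += "\n\nArticle:\n" + article
print(prompt)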
5. Providing Examples
• Providing examples within the prompt can help the LLM understand the format and
style you expect. This is particularly useful for tasks like translation, code generation,
or creative writing.
Example:
6. Avoiding Ambiguity
Example:
7. Length Management
• Keep prompts concise, especially when interacting with models that have token
limits. However, include enough information to avoid vague or incomplete responses.
Example:
8. Question Formatting
• When asking questions, be direct and ensure the question is fully formed. Use proper
punctuation and specify the type of answer expected (e.g., short answer, multiple
choice).
Example:
9. Multimodal Prompts
• If your LLM supports multimodal input (text, images, etc.), clearly separate and label
different types of input.
Example:
10. Iterative Prompting
• If the task is complex, use iterative prompting by breaking down the task into smaller,
sequential prompts.
Example:
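A rough Python sketch of iterative prompting for the same scenario, where each step's output
becomes the next step's input; ask_llm is again a hypothetical placeholder:
def ask_llm(prompt: str) -> str:
    # Hypothetical stand-in for a real model call.
    return f"<model response to: {prompt[:40]}...>"

# Break the task into sequential prompts and carry the output forward.
steps = [
    "List the three main ideas of the text below:\n{context}",
    "Rewrite each idea below as one plain-language sentence:\n{context}",
    "Combine the sentences below into a single 3-4 sentence summary:\n{context}",
]

context = "<full quantum computing article text>"  # placeholder input
for step in steps:
    context = ask_llm(step.format(context=context))

print(context)  # output of the final step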
ELEMENTS OF A PROMPT
1. Instruction or Task
• Definition: The core directive that tells the model what to do.
• Purpose: This element defines the action you want the LLM to take, whether it’s
generating text, answering a question, summarizing information, or performing
another task.
• Example: "Summarize the following text."
2. Context or Background
3. Input Data
• Definition: The data or content that the model should process or analyze.
• Purpose: Specifies the information the LLM needs to work with, such as a text
passage, code snippet, or list of items.
• Example: "The quick brown fox jumps over the lazy dog."
4. Output Specification
5. Examples or References
6. Constraints or Guidelines
8. Target Audience
9. Iterative Instructions
• Definition: Steps or instructions for refining the output based on initial responses.
• Purpose: Guides the model through a process of refining or improving its response
through iterations.
• Example: "If the first explanation is too technical, simplify it further."
BEST PRACTICES FOR WRITING PROMPTS
1. Be Clear and Specific
• Tip: Make your prompt as clear and unambiguous as possible. Avoid vague or overly
broad questions.
• Example: Instead of asking, “Tell me about energy,” specify, “Explain the difference
between renewable and non-renewable energy.”
2. State the Task Explicitly
• Tip: Clearly state what you want the LLM to do. Whether it’s summarizing, listing,
comparing, or generating content, the task should be explicit.
• Example: “List three benefits of renewable energy” is more effective than simply
saying, “Talk about renewable energy.”
3. Provide Context
• Tip: Give the necessary background information or context to help the model
understand the scenario.
• Example: “Assume the reader is a high school student learning about climate change.
Explain the greenhouse effect.”
4. Specify the Output Format
• Tip: Specify the desired format, length, or style of the output. This helps in getting a
response that matches your needs.
• Example: “Write a 200-word summary of the provided article.”
5. Use Examples
• Tip: Providing examples can guide the model in understanding the desired output
format or style, especially in complex tasks.
• Example: “Translate the sentence, ‘Hello, how are you?’ into Spanish. Example:
‘Good morning’ becomes ‘Buenos días.’”
6. Break Down Complex Tasks
• Tip: If the task is complex, break it down into smaller, simpler steps. Use iterative
prompting to refine the response.
• Example: First, ask, “What is blockchain?” and then follow up with, “Explain how
blockchain is used in cryptocurrencies.”
7. Avoid Ambiguity
• Tip: Avoid prompts that can be interpreted in multiple ways. Clarify any potentially
confusing language.
• Example: Instead of “What is Python?” specify, “What is Python, the programming
language?”
8. Be Concise
• Tip: While providing enough detail is important, being concise helps in keeping the
model focused on the task at hand.
• Example: “Summarize the main points of this article in 3-4 sentences.”
9. Consider the Model’s Limitations
• Tip: Keep in mind the token limit, context retention, and potential biases of the
model. Avoid overly long or complex prompts that might exceed the model’s
capabilities.
• Example: If discussing a complex topic, split it into smaller prompts instead of one
long one.
10. Iterate and Refine
• Tip: Experiment with different phrasings or structures to see which yields the best
results. Refine the prompt based on the output.
• Example: If the initial response is off-target, try rephrasing or adding more context.
11. Use Role-Based Prompts
• Tip: Frame the prompt from a particular perspective or role to get a tailored response.
• Example: “As a legal advisor, how would you advise a client in a breach of contract
case?”
12. Use Conditional Instructions
• Tip: Provide instructions on how the model should proceed based on certain
conditions or responses.
• Example: “If the user asks for more details, provide a longer explanation. Otherwise,
keep it brief.”
13. Set Constraints
• Tip: Define any constraints, such as word count, format, or style, to ensure the output
meets your requirements.
• Example: “Write a formal letter of recommendation in less than 150 words.”
14. Consider the Target Audience
• Tip: Tailor the prompt to generate responses appropriate for the intended audience,
whether it’s beginners, experts, children, etc.
• Example: “Explain quantum computing in simple terms that a high school student
could understand.”
Specificity is key when designing prompts for Large Language Models (LLMs) because it
directly influences the accuracy, relevance, and usefulness of the model's output. Here’s how
you can incorporate specificity into your prompts:
1. Use Explicit Instructions
• Tip: Be explicit about what you want the model to do. Avoid general instructions that
could be interpreted in multiple ways.
• Specific Example: Instead of saying, “Describe AI,” use “Describe the difference
between supervised and unsupervised learning in AI.”
2. Narrow the Scope
• Tip: Narrow down the topic to avoid broad or overly general responses. Define
exactly what aspect of the subject you want the model to focus on.
• Specific Example: Instead of “Explain climate change,” try “Explain how greenhouse
gases contribute to global warming.”
3. Specify the Format
• Tip: Indicate the format in which you want the information presented, whether it’s a
list, a paragraph, a step-by-step guide, or a comparison.
• Specific Example: “List three key features of Python programming in bullet points.”
4. Define the Length
• Tip: Provide clear instructions on the length of the response, whether it’s in terms of
word count, sentences, or paragraphs.
• Specific Example: “Summarize the article in 150-200 words.”
5. Identify the Audience
• Tip: Specify who the response is intended for, which can influence the tone,
complexity, and content of the response.
• Specific Example: “Explain the concept of blockchain to a high school student.”
6. Contextualize the Prompt
• Tip: Provide any necessary background or context that the model needs to understand
to produce a relevant response.
• Specific Example: “Given that the company’s revenue has decreased by 20% over
the last year, suggest three strategies to improve sales.”
7. Use Precise Language
• Tip: Avoid vague terms and use precise language to guide the model towards the
exact response you need.
• Specific Example: Instead of “What are the benefits of exercise?” use “What are the
cardiovascular benefits of regular aerobic exercise?”
8. Include Examples
• Tip: Provide examples within the prompt to clarify the expected response, especially
if the task is complex or abstract.
• Specific Example: “Write a metaphor for sadness. For example, ‘Sadness is a dark
cloud that lingers in the sky.’ Now, create your own metaphor.”
9. Use Conditional Statements
• Tip: Use “if-then” statements to guide the model’s response based on different
scenarios or to handle multiple parts of a task.
• Specific Example: “If the customer asks about the return policy, explain the standard
procedure. If they ask about warranty coverage, explain that instead.”
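As a sketch, such conditional instructions could be embedded in a support prompt like this; the
policy details are invented purely for illustration:
# Policy details below are made up for the example.
system_prompt = (
    "You are a customer-support assistant.\n"
    "If the customer asks about the return policy, explain the standard "
    "30-day return procedure.\n"
    "If the customer asks about warranty coverage, explain the one-year "
    "limited warranty instead.\n"
    "For anything else, ask a clarifying question."
)

customer_message = "Can I still send this back? I bought it two weeks ago."

# The final prompt pairs the conditional instructions with the actual query.
prompt = system_prompt + "\n\nCustomer: " + customer_message
print(prompt)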
10. Break Down Complex Tasks
• Tip: If the task is complicated, break it down into smaller, more manageable parts and
specify each step clearly.
• Specific Example: “First, define what a neural network is. Then, explain how neural
networks are used in image recognition. Finally, discuss the challenges of training
neural networks.”
11. Eliminate Ambiguity
• Tip: Remove any potential for multiple interpretations by being as precise as possible.
• Specific Example: Instead of “Write about renewable energy,” use “Write about the
advantages of solar power over fossil fuels in energy production.”
12. Ask Direct, Focused Questions
• Tip: Formulate questions that are direct and have a clear focus to avoid broad or
unfocused answers.
• Specific Example: “What were the three main causes of the 2008 financial crisis?”
13. Specify the Methodology or Framework
• Tip: For tasks requiring specific methodologies or frameworks, guide the model on
how to approach the query.
• Specific Example: “Using the SWOT analysis framework, evaluate the strengths and
weaknesses of Company X in the current market.”
14. Refine the Prompt
• Tip: If the initial prompt doesn’t yield the desired result, refine the prompt with more
specific instructions.
• Specific Example: “If the summary is too long, shorten it to include only the key
points about the financial impact.”
Examples of prompts:
1. Text Summarization: "Summarize the following article in 3-4 sentences."
o Article: "The Industrial Revolution began in the late 18th century and had a
profound impact on society. It introduced new manufacturing processes,
leading to significant economic and social changes. Cities grew rapidly as
people moved from rural areas to urban centers in search of work, which also
led to the development of new technologies and infrastructure. However, it
also resulted in challenging living conditions and environmental issues."
2. Information Extraction: "Extract the key details from this text, including names,
dates, and locations."
o Text: "Marie Curie was born on November 7, 1867, in Warsaw, Poland. She
conducted much of her groundbreaking research on radioactivity at the
University of Paris."
3. Code Explanation: "Explain what the following Python function does."
def factorial(n):
    if n == 0 or n == 1:
        return 1
    else:
        return n * factorial(n - 1)
These prompts demonstrate different ways in which AI can be used to process and interact
with text.