
CBSE Class IX
Artificial Intelligence Supplement: Answer Key

Part-B: Subject Specific Skills

1. AI Reflection, Project Cycle and Ethics


Exercise
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz
A. 1. a 2. d 3. c 4. b
B. 1. Evaluation 2. Receiver Operating Characteristic (ROC)
3. Privacy 4. Inclusion
C. 1. False 2. False 3. True 4. True
SECTION B (Subjective Type Questions)
A. 1. At the deployment phase of the AI project cycle, several measures are implemented to ensure
that the AI model can operate effectively in a real-world setting. This involves integrating
the model into existing systems or applications, such as creating Application Programming
Interfaces (APIs) or embedding it directly into software. It also includes setting up the
necessary infrastructure, like servers or cloud services, to support the model. Once integrated,
the model needs to be able to process new data and provide predictions.
2. Monitoring tools are established to track the model’s performance and ensure it works
correctly. Recording and reporting are also important to capture data on how the model is
performing and to identify any issues that might arise. This phase is crucial for making the
AI model functional and useful for end-users.
B. 1. The Receiver Operating Characteristic (ROC) curve is a graphical representation that illustrates
the performance of a binary classifier system at varying threshold values. It plots the True
Positive Rate (TPR) against the False Positive Rate (FPR) at various threshold settings.
This curve plots two parameters:
• True Positive Rate (Sensitivity) is the proportion of actual positive cases that are correctly identified by the classifier.

TPR = TP / (TP + FN)

• False Positive Rate is the proportion of actual negative cases that are incorrectly classified as positive.

FPR = FP / (FP + TN)

To generate an ROC curve, you need to perform the following tasks:
• Vary the threshold of your classifier, usually ranging from 0 to 1, and calculate TPR and FPR at each threshold.
• Plot these TPR and FPR values on a graph. TPR is plotted on the y-axis, and FPR is plotted on the x-axis.
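As a minimal sketch of these two tasks, scikit-learn's roc_curve can vary the threshold internally and return the FPR/TPR pairs for plotting; the labels and scores below are hypothetical, not from the book.

```python
import matplotlib.pyplot as plt
from sklearn.metrics import roc_curve

y_true  = [0, 0, 1, 1, 0, 1, 1, 0, 1, 0]                        # actual classes
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.9, 0.7, 0.3, 0.6, 0.5]   # classifier scores

# roc_curve varies the decision threshold and returns FPR and TPR at each setting
fpr, tpr, thresholds = roc_curve(y_true, y_score)

plt.plot(fpr, tpr, marker="o")        # TPR on the y-axis, FPR on the x-axis
plt.xlabel("False Positive Rate")
plt.ylabel("True Positive Rate")
plt.title("ROC curve")
plt.show()
```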
2. The differences between ethics and morals are as follows:
• Definition: Ethics are rules provided by an external source; morals are principles regarding right and wrong held by an individual.
• Source: Ethics come from institutions, organisations, and societal norms; morals come from personal beliefs, cultural norms, and religious teachings.
• Application: Ethics apply to specific situations and professional practices; morals guide personal behaviour and conduct.
• Objective: Ethics maintain order and fairness in society; morals foster personal integrity and alignment with personal values.
• Examples: Ethics include medical ethics, business ethics, and legal ethics; morals include personal beliefs about honesty, integrity, and kindness.
• Origin: Ethics are external and often codified; morals are internal and subjective.
• Scope: Ethics are consistent within a profession or society; morals vary between individuals.
• Enforcement: Ethics are enforced by external bodies (e.g., professional organisations, legal systems); morals are self-governed and enforced by individual conscience.
• Flexibility: Ethics can change over time to reflect new norms or societal changes; morals are more stable over time but can evolve with personal growth.

2. Data Literacy
Exercise
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz
A. 1. a 2. c 3. d 4. a 5. b 6. d
7. a 8. c 9. a 10. b
B. 1. Information literacy 2. Critical thinking 3. The DIKW pyramid
4. Secondary data sources 5. Data acquisition
6. Natural Language Processing (NLP) 7. Attributes
8. Skills 9. A strong password 10. Data backup
C. 1. False 2. True 3. True 4. False 5. False
6. False 7. True 8. False 9. True 10. False
D. 1. Computer Vision - b. Image Data
2. NLP - e. Qualitative Data
3. Textual Data - d. Data history
4. Sources of data - a. Data scraping
5. Data Discovery - c. Dataset search
SECTION B (Subjective Type Questions)
A. 1. i. A data pyramid, also known as the DIKW pyramid, represents the hierarchical relationship
between data, information, knowledge, and wisdom. It illustrates how raw data can
be processed to extract useful information, which in turn can lead to the formation of
knowledge and ultimately wisdom.
ii. ● A: Data
● B: Information
● C: Knowledge
● D: Wisdom
iii. ● Data: Raw facts and figures without context.
● Information: Data processed and organized to be meaningful.
● Knowledge: Information analyzed and applied to make decisions.
● Wisdom: The ability to make sound judgments and decisions based on knowledge.
2. It enables individuals to make informed decisions by understanding and interpreting data
accurately.

It enhances critical thinking skills, allowing individuals to question assumptions and analyze
data effectively.
3. Pie charts visually represent the proportions of different categories within a dataset, making
it easier to compare and understand the relative sizes of each category.
4. Data interpretation provides insights and context to raw data, enabling decision-makers to
understand trends, patterns, and correlations, which leads to more accurate and effective
decisions.

Training Data: It is the data on which we train our AI model and is used to fit the model's parameters. In training data, the expected output is available to the model.

Testing Data: It is used to check the performance of an AI model. Testing data is unseen data for which the model has to make predictions.
5. Quantitative Data vs Qualitative Data:
• Quantitative data is depicted in numerical terms; qualitative data is not depicted in numerical terms.
• Quantitative data can be shown in numbers and variables like ratio, percentage, and more; qualitative data could be about the behavioural attributes of a person or thing.
• Examples of quantitative data: 100%, 1:3, 123. Examples of qualitative data: loud behaviour, fair skin, soft quality, and more.
6. Data processing involves tasks to refine raw data for analysis or application, including
cleaning, organising, transforming, and summarising information. It ensures data accuracy,
relevance, and accessibility for effective decision-making and analysis. It is crucial across
various sectors like business, science, and technology, facilitating better utilisation of data
assets. Data processing helps computers make sense of raw data; the use of computers to perform different operations on data falls under data processing.
7. Kaggle is an online platform for data science and machine learning competitions. It provides
datasets, code, and community discussions, allowing data enthusiasts to practice and improve
their skills, collaborate with others, and gain exposure to real-world problems.
8. The data literacy framework provides a comprehensive and structured approach to developing the skills needed to use data efficiently and with full awareness at every level. Each level builds upon the previous one, fostering a deeper and more capable way of working with data.
9. This means that the development and enhancement of data literacy skills is not a static, one-time event. Instead, these skills evolve through continuous cycles of learning, application, and refinement.
10. ● Understanding what data you have collected, how it is handled, processed, used, and where it is stored.
● Only necessary data required for a project should be collected.
B. 1. 
● Take steps to understand and avoid any preferences or partiality in data

● Take necessary permissions before collecting or using an individual's data
● Explain how you intend to use the collected data and do not hide intentions
● Protect the identity of the person who is the source of data
● Take responsibility for your actions in case of misuse of data
2. Cyber attacks are becoming more frequent as a result of the growing volume of data stored in the cloud. Given the volume of traffic being produced, the best course of action is to regulate and secure the transmission of private or sensitive data wherever it is known to exist. Avoid entering sensitive information, such as your address, PAN, or Aadhaar number, on unrecognised and unsafe websites.
The most likely reasons why data security is more important now are:
● A persistent fear that everyone is impacted by cyberattacks.
● Rapid technological advancements will increase the frequency of cyberattacks.
3. Data backup refers to the process of creating copies of data to ensure that it can be
restored in the event of data loss due to natural disasters, accidents, cyber-attacks, or other
unexpected events. Sometimes physical backup media are kept secure in access-controlled environments. Another method of securing data is cloud backup, which is considered more reliable.
4. AI systems often rely on vast amounts of data for training and operation. Unauthorised access
and tampering could lead to inaccurate AI models and compromised outcomes. Many AI
applications process sensitive data, such as personal, financial, or health-related information.
Strong data security measures can stop data breaches and unauthorised access.
5. ● Use strong, unique passwords with a mix of characters for each account.

● Activate Two-Factor Authentication (2FA) for added security.

● Download software from trusted sources only and scan files before opening.
6. Numeric data can be further classified as Continuous Data and Discrete Data:
● Continuous data can take any numeric value within a specified range; discrete data refers to distinct single values, i.e., whole numbers without decimal parts that represent distinct categories or values.
● Continuous data is measurable; discrete data is countable.
● Continuous data can be infinitely subdivided and often includes decimal points; discrete data cannot be subdivided meaningfully.
● Continuous data is often analysed using statistical techniques such as mean, median, standard deviation, and correlation; discrete data is analysed using frequency distributions, bar charts, and probability distributions.
● Examples of continuous data: dimensions of a classroom, height, weight, temperature, time, etc. Examples of discrete data: number of girls and boys in a class, number of subjects in class 9, a count of anything.
7. Natural Language Processing (NLP)
NLP is a subfield of AI that enables computers to understand and process human language.
Types of Data:


● Textual data: Articles, emails, social media posts.

● Audio data: Spoken language recordings transcribed into text.
Computer Vision

Computer Vision uses AI to help computers interpret images and videos.
Types of Data:


● Image data: Photos, satellite images, medical scans.

● Video data: Recorded videos.
Statistical Data

Statistical data analysis involves interpreting data to find patterns and insights for decision-
making.
Types of Data:


● Numeric data: Data from tables and spreadsheets.

● Time series data: Data recorded at specific time intervals, like stock prices and weather
data.
C. Competency-based/Application-based questions:
1. To organise and clean the dataset containing errors, duplicates, and missing values for the
co-curricular activity choices, you can follow these steps:
i. Inspect the Data: Review for errors, duplicates, and missing values.
ii. Remove Duplicates: Eliminate duplicate entries.
iii. Handle Missing Values: Impute or remove rows with missing data.
iv. Correct Errors: Fix invalid activity choices and ensure consistency.
v. Standardize Data: Ensure uniform format for activity names.
vi. Validate Choices: Ensure students select exactly 3 valid activities.
vii. Final Review: Verify the cleaned data is accurate and reliable.

The techniques and methods used to address these issues are:
• Spreadsheets: Tools like Excel or Google Sheets can handle duplicate removal, data cleaning, and formatting.
• Python (pandas): For large datasets, using Python's pandas library can automate the cleaning process through functions like isnull(), fillna(), and drop_duplicates().
Once cleaned, review the dataset thoroughly to ensure that no errors remain and that all
values are reliable and useful for analysis.
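A minimal sketch of such a cleaning workflow with the pandas functions mentioned above is shown below; the file name, column names, and list of valid activities are hypothetical placeholders, not part of the original question.

```python
import pandas as pd

df = pd.read_csv("activity_choices.csv")                  # hypothetical file name

df = df.drop_duplicates()                                  # step ii: remove duplicate entries
df = df.dropna(subset=["student_id", "activity"])          # step iii: drop rows with missing values
df["activity"] = df["activity"].str.strip().str.title()    # step v: standardise activity names

valid = {"Music", "Dance", "Chess", "Football"}            # hypothetical list of valid activities
df = df[df["activity"].isin(valid)]                        # step iv: remove invalid choices

# step vi: keep only students who selected exactly 3 distinct valid activities
counts = df.groupby("student_id")["activity"].nunique()
df = df[df["student_id"].isin(counts[counts == 3].index)]

print(df.head())                                           # step vii: final review
```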
2. a. Quantitative data interpretation involves numerical data that can be measured and
quantified, while qualitative data interpretation involves descriptive data that can be
observed but not measured. Quantitative data interpretation methods include statistical
analysis and graphical representation, which provide objective, precise, and comparable
results. However, they may not capture the full context or nuances of the data and
require a good understanding of statistical methods. On the other hand, qualitative data
interpretation methods such as content analysis and thematic analysis provide in-depth
insights and a deeper understanding of context. They capture the complexity of human
experiences and perceptions but can be subjective, harder to generalize, and time-
consuming.
b. Quantitative data interpretation involves using descriptive statistics like mean, median,
mode, and standard deviation, as well as inferential statistics like hypothesis testing and
regression analysis. Visualization techniques such as bar charts, histograms, and scatter
plots are also commonly used. These methods offer objectivity, generalizability, and
precision but may overlook context and complexity.
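As a quick illustration of the descriptive statistics mentioned above, Python's built-in statistics module can compute them directly; the scores below are made-up sample values.

```python
import statistics as st

scores = [72, 85, 85, 90, 61, 78, 95, 88]   # hypothetical scores

print(st.mean(scores))     # average value
print(st.median(scores))   # middle value
print(st.mode(scores))     # most frequent value
print(st.stdev(scores))    # spread around the mean
```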
Qualitative data interpretation involves methods like content analysis, which includes
coding textual data into manageable categories and identifying patterns, themes, and
relationships. Thematic analysis develops themes from the data and analyzes them to
interpret meanings and insights, while narrative analysis focuses on the stories and
personal accounts in the data. These methods provide depth of understanding and
context-rich insights but are subjective, limited in generalizability, and time-consuming.
By employing these methods and techniques, one can effectively interpret both quantitative
and qualitative data, leveraging their respective strengths while being mindful of their
limitations.
3. To present the company's sales performance across different regions during a quarterly
review meeting, I would use bar graphs, line charts, and pie charts to convey trends and
patterns effectively.
A bar graph will be used to compare total sales across different regions, highlighting which
regions are performing well and which are lagging. This visualization will help stakeholders
quickly assess regional performance and facilitate discussions on strategic adjustments.
A line chart will show sales trends over time for each region. This will help stakeholders
observe how sales have fluctuated throughout the year, identify any seasonal patterns, and

understand long-term trends. By seeing the sales trajectory, stakeholders can better predict
future performance and make data-driven decisions.
A pie chart will illustrate the percentage share of total sales by region, providing a quick
visual overview of the sales distribution. This will help stakeholders understand the relative
importance of each region to the company's overall sales.
Using these visualizations, stakeholders will gain a comprehensive understanding of the sales
performance across different regions, enabling them to make informed decisions based on
clear, visual data insights.
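A minimal sketch of these three visualizations with matplotlib follows; the regions, sales figures, and monthly values are assumed sample data, not actual company figures.

```python
import matplotlib.pyplot as plt

regions = ["North", "South", "East", "West"]
sales   = [120, 95, 150, 80]                                  # assumed quarterly totals
monthly = {"North": [35, 40, 45], "South": [30, 32, 33]}      # assumed monthly figures

fig, (ax1, ax2, ax3) = plt.subplots(1, 3, figsize=(12, 4))

ax1.bar(regions, sales)                                       # compare totals across regions
ax1.set_title("Total sales by region")

for region, values in monthly.items():                        # sales trend over time
    ax2.plot(["Jan", "Feb", "Mar"], values, marker="o", label=region)
ax2.set_title("Monthly sales trend")
ax2.legend()

ax3.pie(sales, labels=regions, autopct="%1.0f%%")             # share of total sales
ax3.set_title("Sales share by region")

plt.tight_layout()
plt.show()
```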

3. Maths For AI (Statistics & Probability)


Exercise
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz

A. 1. b 2. c 3. b 4. b 5. d
6. d 7. c 8. b
B. 1. AI 2. Meteorologists 3. Equally likely
4. Zero 5. Probabilistic 6. Equal 7. Probability 8. Epidemiologists
C. 1. True 2. False 3. True 4. False 5. True
6. False 7. True 8. True
D. 1. d 2. b
3. e (mapping function) (This is misprinted in your book, please correct it)
4. a 5. c
SECTION B (Subjective Type Questions)
A. 1. Mathematics and AI are interconnected fields, with Mathematics supplying the theoretical
foundations for many AI algorithms. Patterns are repeating designs or sequences that can
be observed in numbers, shapes, images, languages, or objects in our surroundings. They
follow a specific order or arrangement, making them easily recognisable. Mathematics aids
in the study of these patterns. These patterns allow you to solve puzzles. They help identify
an order or arrangement in lists of images or numbers. They are present everywhere around
us.
2. Patterns are regular and repeated ways in which data or events occur. For example, the
sequence of even numbers (2, 4, 6, 8) or the seasonal patterns in weather data.
3. Equal probability events are events that have the same chance of occurring. For example,

when flipping a fair coin, the probability of getting heads or tails is equal.
4. Collecting data is the first step in statistics and involves gathering relevant information from
various sources to analyze and draw conclusions.
5. Two applications of statistics in real life are:

● Analyzing consumer behavior in marketing to improve product sales.

● Assessing the effectiveness of medical treatments in healthcare.
6. The probability of wearing a white dress is 3/13.
7. One use of statistics in disaster management is to analyze past disaster data to predict and
prepare for future events.
8. One use of probability in finance is to assess the risk of investment portfolios and predict
future market trends.
B. 1. "Statistics is used for collecting, exploring, and analyzing the data." Statistics involves
several key steps: First, data collection involves gathering relevant information from various
sources such as surveys, experiments, or observational studies. For instance, if a company
wants to understand customer satisfaction, it would collect data through customer feedback
surveys. Next, exploring the data involves summarizing and visualizing it to uncover patterns
and trends. This could mean creating charts or tables to see the distribution of satisfaction
levels. Finally, analyzing the data involves applying statistical methods to draw conclusions
and make predictions. For example, statistical tests might reveal that customers who receive
timely support are more satisfied. Thus, statistics helps in making informed decisions based
on data.
2. Three uses of statistics in education:

● Analysing test scores and grades to evaluate student learning, identify areas for
improvement, and allocate resources effectively.

● Using data to identify gaps in the curriculum and areas where students need more support.

● Analysing how students and teachers use educational technology for future
implementations.
3. Concept of probability with a deck of 52 cards: Probability measures the likelihood of an
event occurring. In a standard deck of 52 cards, there are 4 suits (hearts, diamonds, clubs,
spades) with 13 cards each. If you want to calculate the probability of drawing a card from
a particular suit, say hearts, you would use the formula for probability:
Probability = Number of favourable outcomes / Total number of outcomes = 13/52 = 1/4
So, the probability of drawing a heart from the deck is 1/4, or 25%.
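The same calculation can be checked in Python with the Fraction type (a small sketch):

```python
from fractions import Fraction

favourable = 13                     # hearts in a standard deck
total = 52                          # cards in the deck

p_heart = Fraction(favourable, total)
print(p_heart, float(p_heart))      # 1/4 0.25
```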
4. Likelihood of an event with examples: The term "likely" describes events that have a high
chance of occurring but are not guaranteed. For example, consider the likelihood of drawing
a card from a standard deck of 52 cards and it being a face card (Jack, Queen, or King). There
are 12 face cards in a deck:

Probability = Number of face cards / Total number of cards = 12/52
This probability suggests that while it’s not certain, it’s relatively likely to draw a face card
compared to other outcomes.
5. Role of probability in estimating road traffic:
● Predicting Peak Traffic Hours: Probability models help forecast times of day when traffic congestion is most likely based on historical data. For instance, if data shows heavy traffic between 8-9 AM, probability helps in planning routes to avoid congestion.
● Traffic Light Timing: Probability helps in optimizing traffic light timings to minimize delays. If traffic data indicates high vehicle volume at certain times, light cycles can be adjusted to improve flow.
● Assessing Traffic Jam Risks: Probability estimates the likelihood of traffic jams during special events or adverse weather conditions. For example, if an event is expected to draw large crowds, probability models can predict increased traffic and help in managing road usage.
6. Likely, unlikely, impossible, and equal probability events:
● Tossing a Coin: Equal probability (both heads and tails have a 50% chance of occurring).
● Rolling an 8 on a Standard Die: Impossible (a standard die only has faces numbered 1 to 6).
● Throwing Ten 5's in a Row: Unlikely (the probability is very low; the chance of getting a 5 on a single throw is 1/6, so ten in a row has probability (1/6)^10, computed in the sketch after this list).
● Drawing a Card of Any Suit: Likely (every card drawn from the deck will be of one of the four suits, so it's guaranteed that a suit will be drawn).
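For the "ten 5's in a row" case, the exact chance can be computed as follows (a quick sketch assuming independent throws of a fair die):

```python
p_single = 1 / 6                    # chance of a 5 on one throw
p_ten_in_a_row = p_single ** 10     # independent throws multiply
print(p_ten_in_a_row)               # ≈ 1.65e-08, far less than one in ten million
```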
7. Examples of impossible and equal probability events:
● Impossible Events:

§ Rolling a 7 on a Standard Die: A standard die has only six faces, so rolling a 7 is
impossible.
§ Drawing a Card from an Empty Deck: If a deck has no cards, drawing one is impossible.
● Equal Probability Events:

§ Tossing a Fair Coin: Each side (heads or tails) has an equal chance of landing face up.
§ Rolling a Fair Six-Sided Die: Each number (1 through 6) has an equal chance of
appearing.
8. Certain Events and Likely Events with Examples:
● Certain Events: These are events that are guaranteed to happen. For example, the sun rising in the east each morning is a certain event.
● Likely Events: These are events that have a high chance of occurring but are not guaranteed. For example, during the winter season in a cold climate, it is likely to snow, but it is not certain every day.

C. Competency-based/Application-based questions:
1. How will the student use the estimated probabilities to prepare for the exam?
The student can prioritize studying the topics with the highest probabilities of appearing on
the exam. For instance, Topic A with a probability of 0.8 and Topic D with a probability of 0.7
are more likely to be on the exam, so the student should focus more on these topics. This
targeted preparation can increase the chances of performing well in the exam by ensuring
the student is well-prepared for the most likely topics.
2. Role of statistics in launching a new smartphone:
Statistics help the company analyze market research data to understand consumer
preferences, potential demand, and market trends. By using statistical techniques such as
surveys, regression analysis, and forecasting models, the company can make data-driven
decisions about product features, pricing, and marketing strategies. This reduces the risk of
product failure and helps in aligning the product with market needs.
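As one illustration of such forecasting, a simple linear trend can be fitted to hypothetical survey data with NumPy; the figures below are invented for the sketch, not real market data.

```python
import numpy as np

quarters = np.array([1, 2, 3, 4])
interest = np.array([120, 150, 180, 230])              # assumed pre-order interest per quarter

slope, intercept = np.polyfit(quarters, interest, 1)   # least-squares straight line
print(slope * 5 + intercept)                           # projected interest for quarter 5 (≈ 260)
```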
3. Applications of probability in predicting earthquakes:
● Seismic Risk Assessment: Probability models estimate the likelihood of earthquakes occurring in different regions based on historical data and fault lines. This helps in identifying areas at higher risk and planning for mitigation.
● Aftershock Forecasting: After a significant earthquake, probability models predict the likelihood and intensity of aftershocks, helping in emergency response and preparedness.
4. Examples of probability theory in artificial intelligence:
● Spam Filtering: AI uses probability to classify emails as spam or not based on the likelihood of certain words or patterns appearing in spam emails. This helps in effectively filtering out unwanted messages (a toy calculation follows this list).
● Recommendation Systems: Probability models predict which products or content a user
is likely to be interested in based on their past behavior and preferences, enhancing the
accuracy of recommendations.
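A toy Bayes' rule calculation for the spam-filtering case mentioned above; all probabilities here are assumed purely for illustration.

```python
# P(spam | word) = P(word | spam) * P(spam) / P(word)
p_spam = 0.4                  # assumed prior probability that an email is spam
p_word_given_spam = 0.7       # assumed chance the word "offer" appears in spam
p_word_given_ham = 0.1        # assumed chance it appears in legitimate mail

p_word = p_word_given_spam * p_spam + p_word_given_ham * (1 - p_spam)
p_spam_given_word = p_word_given_spam * p_spam / p_word
print(round(p_spam_given_word, 3))   # ≈ 0.824, so the email is likely spam
```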

4. Introduction to Generative AI
Exercise
Unsolved Questions
SECTION A (Objective Type Questions)
Quiz
A. 1. c 2. b 3. c 4. c 5. b
6. c 7. b 8. a 9. b

B. 1. Unsupervised 2. Ethical 3. Variational Autoencoders
4. Bias 5. Generative AI 6. Deepfakes
7. Sequential 8. Real 9. Images 10. Runway ML
C. 1. True 2. True 3. False 4. True 5. True
6. False 7. False 8. True 9. False 10. False
D. 1. e 2. c 3. a 4. b 5. d
SECTION B (Subjective Type Questions)
A. 1. Autoencoders compress data into a latent space and then reconstruct the input data, focusing
on dimensionality reduction. VAEs, on the other hand, generate new data points by learning
the distribution of the input data and sampling from this distribution.
2. Example 1: VAE for generating handwritten digits (MNIST dataset).
Example 2: VAE for generating faces using the CelebA dataset.

3. AlphaGo was introduced in October 2015.
4. Example 1: GPT-3 (by OpenAI)
Example 2: DALL-E (by OpenAI)
Example 3: StyleGAN (by NVIDIA)
Example 4: Artbreeder
5. Generative AI: Used for creating new content such as text, images, music, and videos. Examples include generating realistic human faces, writing essays, and composing music.
Conventional AI: Typically used for tasks such as classification, regression, prediction, and optimization. Examples include fraud detection, recommendation systems, and speech recognition.
6. Generative AI models require large datasets to capture the variability and complexity of
the data distribution they aim to model. The quality and diversity of the generated outputs
depend on the richness of the training data. Insufficient data can lead to overfitting and
poor generalization, resulting in less realistic or diverse generated content.
7. Generative AI models can perpetuate and amplify biases present in the training data. If the
training data contains biased representations or stereotypes, the generated outputs can
reflect and reinforce these biases. This can lead to unfair or harmful outcomes, especially in
sensitive applications like hiring, law enforcement, or healthcare.
8. Feature 1: Allows users to blend and evolve images to create unique artworks.
Feature 2: Provides sliders for users to adjust different attributes of images, such as age, gender, and art style.
9. Consideration 1: The potential for misuse in creating deepfakes, which can be used for malicious purposes such as misinformation, fraud, and invasion of privacy.
Consideration 2: The need for transparency and accountability in the use of generative AI, ensuring that users are aware when they are interacting with AI-generated content.

10. Generative AI can pose privacy risks by generating realistic synthetic data that can be used
to impersonate individuals or reconstruct private information. Additionally, if generative
models are trained on sensitive data without proper anonymization, they can inadvertently
leak confidential information, leading to data breaches and misuse.
B. 1. Generative AI has transformed music creation and production. Tools like OpenAI's MuseNet
can compose music in various genres, while Amper Music generates custom tracks by setting
parameters such as mood and tempo, making music production more accessible. Platforms
like Endel create personalized soundscapes based on user activities, enhancing listening
experiences. In addition, AI can suggest different arrangements and instrumentations, aiding
composers in exploring new sounds. AI also improves old recordings by removing noise and
filling gaps. AIVA (Artificial Intelligence Virtual Artist) is an example, composing symphonic
music for films, ads, and games.
2. Autoencoders (AEs) are neural networks that learn to compress data into a latent space and
then reconstruct it. They are used for tasks like dimensionality reduction and feature learning.
Key features include:
Dimensionality Reduction: Compresses data to lower dimensions for easier visualization
● 
and reduced computational cost.
● Data Denoising: Removes noise from data, improving quality.

● Anomaly Detection: Identifies outliers by reconstructing normal data poorly.

● Feature Learning: Learns useful features for tasks like classification.

Examples include image compression, denoising photographs, and fraud detection.
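A minimal PyTorch sketch of an autoencoder, assuming flattened 28x28 images; the layer sizes and dummy batch are illustrative, not a specific model from the book.

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        # Encoder: compress the input into a small latent vector (dimensionality reduction)
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim))
        # Decoder: reconstruct the input from the latent vector
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, input_dim), nn.Sigmoid())

    def forward(self, x):
        z = self.encoder(x)        # compressed representation
        return self.decoder(z)     # reconstruction

model = AutoEncoder()
x = torch.rand(16, 784)                     # dummy batch of 16 flattened images
loss = nn.MSELoss()(model(x), x)            # reconstruction error; high values can flag anomalies
loss.backward()
```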
3. [Diagram: a Recurrent Neural Network, showing an input layer feeding a hidden layer with a recurrence (feedback) connection, and the hidden layer feeding the output layer.]
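A minimal NumPy sketch of one recurrent step for such a network; the layer sizes and random weights are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
W_xh = rng.normal(size=(4, 3))     # input-to-hidden weights (hidden size 4, input size 3)
W_hh = rng.normal(size=(4, 4))     # hidden-to-hidden (recurrence) weights
W_hy = rng.normal(size=(2, 4))     # hidden-to-output weights

h = np.zeros(4)                            # initial hidden state
for x in rng.normal(size=(5, 3)):          # a toy sequence of 5 input vectors
    h = np.tanh(W_xh @ x + W_hh @ h)       # hidden layer with recurrence
    y = W_hy @ h                           # output layer
print(y)
```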

4. a. Generative AI image generation tools: Midjourney and Magic Studio


b. Generative AI text generation tools: Notion AI and Compose
c. Generative AI audio generation tools: FineShare and Boomy AI
5. Architecture: Generates multiple design variations and optimizes plans, enhancing creativity
and efficiency.

Coding: Generates boilerplate code and suggests completions, increasing productivity and reducing development time.
Music: Composes original music and generates background scores, providing new tools for creativity.
Content Creation: Produces high-quality written content, realistic images, and videos, enabling quick production of diverse media.

C. Competency-based/Application-based questions:
1. Verify Sources: Ensure the AI-generated content is cross-referenced with credible sources.
Avoid Plagiarism: Use AI tools to generate ideas and outlines but write the essay in your own words.
Understand the Topic: Use AI for research and learning, but make sure you understand the material thoroughly.
Cite Properly: If using AI-generated content, ensure it is properly cited to avoid plagiarism.
Ethical Use: Avoid using AI to generate the entire essay; instead, use it to enhance your understanding and provide additional perspectives.
2. Generative AI can revolutionize the creative industry by enabling the generation of unique
and innovative designs. In art, AI can create new styles and compositions, offering artists
novel ideas and expanding their creative horizons. In fashion, AI can design clothing and
accessories, predict trends, and customize designs for individual preferences. This technology
fosters creativity by providing diverse and original concepts that may not have been conceived
by human designers alone.
3. To ensure responsible use of generative AI, it is crucial to establish guidelines and regulations
that balance benefits and risks. These guidelines could include:
Transparency: Ensuring AI-generated content is clearly labeled.
Accountability: Implementing accountability measures for creators and users of AI.
Bias Mitigation: Developing methods to detect and mitigate biases in AI-generated content.
Privacy Protection: Safeguarding personal data used in AI training and generation.
Ethical Standards: Encouraging adherence to ethical standards in AI development and usage.
Balancing the potential benefits and risks involves fostering innovation while protecting
individuals and society from potential harm.
4. Parameter 1: Human-Like Response
ChatGPT: Highly conversational and natural language generation.
Gemini: Focuses on natural language understanding with precise and coherent responses.
Copilot: Provides code suggestions with a conversational aspect for coding assistance.
Parameter 2: Training Dataset and Underlying Technology
ChatGPT: Trained on a diverse dataset using GPT architecture.

Gemini: Uses proprietary datasets and technology optimized for dialogue.
Copilot: Based on OpenAI Codex, trained on a large dataset of code from GitHub.
Parameter 3: Authenticity of Response
ChatGPT: High-quality responses but may occasionally generate plausible-sounding incorrect
information.
Gemini: Emphasizes accurate and reliable information.
Copilot: Focused on accurate code generation and documentation.
Parameter 4: Access to the Internet
ChatGPT: No real-time internet access.
Gemini: Typically does not access the internet in real-time.
Copilot: No real-time internet access, trained on static data.
Parameter 5: User Friendliness and Interface
ChatGPT: User-friendly with intuitive interfaces across platforms.
Gemini: Designed for ease of use in conversational contexts.
Copilot: Integrated into code editors for seamless coding assistance.
Parameter 6: Text Processing: Summarisation, Paragraph Writing, Etc.
ChatGPT: Excellent at summarization and generating coherent paragraphs.
Gemini: Strong in generating concise and relevant text.
Copilot: Primarily focuses on code-related text generation.
Parameter 7: Charges and Price
ChatGPT: Various pricing tiers, including free access with limitations.
Gemini: Pricing depends on usage and integration specifics.
Copilot: Subscription-based model, typically around $10/month.
5. Idea Generation: Use AI to generate initial concepts and inspiration for the bridge design.
Sketch Variations: Generate multiple design variations to explore different possibilities quickly.
Refinement: Use AI-generated sketches as a base to refine and develop unique designs.
Collaborative Tool: Collaborate with classmates or mentors to review and improve AI-
generated concepts.
Ethical Use: Ensure originality by not solely relying on AI-generated designs; use them as a
tool for inspiration and enhancement.
