AWS Certified Data Analytics Specialty Exam Guide
Introduction
The AWS Certified Data Analytics – Specialty (DAS-C01) exam is intended for
individuals who perform a data analytics role. The exam validates a candidate’s ability
to use AWS services to design, build, secure, and maintain analytics solutions that
provide insight from data.
The exam also validates a candidate’s ability to complete the following tasks:
• Define AWS data analytics services and understand how they integrate with
each other.
• Explain how AWS data analytics services fit in the data lifecycle of collection,
storage, processing, and visualization.
Job tasks that are out of scope for the target candidate
The following list contains job tasks that the target candidate is not expected to be
able to perform. This list is non-exhaustive. These tasks are out of scope for the exam:
Refer to the Appendix for a list of in-scope AWS services and features and a list of
out-of-scope AWS services and features.
Response types
There are two types of questions on the exam:
• Multiple choice: Has one correct response and three incorrect responses
(distractors)
• Multiple response: Has two or more correct responses out of five or more
response options
Select one or more responses that best complete the statement or answer the
question. Distractors, or incorrect answers, are response options that a candidate with
incomplete knowledge or skill might choose. Distractors are generally plausible
responses that match the content area.
Unanswered questions are scored as incorrect; there is no penalty for guessing. The
exam includes 50 questions that affect your score.
Unscored content
The exam includes 15 unscored questions that do not affect your score. AWS collects
information about performance on these unscored questions to evaluate these
questions for future use as scored questions. These unscored questions are not
identified on the exam.
Exam results
The AWS Certified Data Analytics – Specialty (DAS-C01) exam has a pass or fail
designation. The exam is scored against a minimum standard established by AWS
professionals who follow certification industry best practices and guidelines.
Your results for the exam are reported as a scaled score of 100–1,000. The minimum
passing score is 750. Your score shows how you performed on the exam as a whole
and whether you passed. Scaled scoring models help equate scores across multiple
exam forms that might have slightly different difficulty levels.
Each section of the exam has a specific weighting, so some sections have more
questions than other sections have. Your exam results report may include a table of
section-level classifications; this general information highlights your strengths and
weaknesses. Use caution when you interpret section-level feedback.
Content outline
This exam guide includes weightings, content domains, and task statements for the
exam. This guide does not provide a comprehensive list of the content on the exam.
However, additional context for each task statement is available to help you prepare
for the exam.
Domain 1: Collection
Task Statement 1.1: Determine the operational characteristics of the collection
system.
• Confirm that data loss is within tolerance limits in the event of failures.
• Evaluate costs associated with data acquisition, transfer, and provisioning
from various sources into the collection system (for example, networking,
bandwidth, ETL, data migration).
• Assess the failure scenarios that the collection system may experience, and
take remediation actions based on impact (see the sketch after this list).
• Determine data persistence at various points of data capture.
• Identify the latency characteristics of the collection system.
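For example, a collection pipeline built on Amazon Kinesis Data Streams can make its data-loss tolerance explicit by retrying only the records that fail and routing anything still unsent to a dead-letter store. The following Python sketch uses boto3; the stream name, record shape, region, and retry budget are illustrative assumptions, not exam content:

import json
import time

import boto3  # AWS SDK for Python

# Hypothetical stream name and region, for illustration only.
STREAM_NAME = "example-clickstream"
kinesis = boto3.client("kinesis", region_name="us-east-1")

def put_with_retries(records, max_attempts=3):
    """Send records to Kinesis, retrying only the failed subset.

    Bounding the retries makes the data-loss tolerance explicit:
    anything still failing after max_attempts should be routed to a
    dead-letter store for later remediation.
    """
    entries = [
        {"Data": json.dumps(r).encode("utf-8"), "PartitionKey": r["user_id"]}
        for r in records
    ]
    for attempt in range(max_attempts):
        response = kinesis.put_records(StreamName=STREAM_NAME, Records=entries)
        if response["FailedRecordCount"] == 0:
            return []
        # Keep only the entries that failed (for example, throttling).
        entries = [
            entry
            for entry, result in zip(entries, response["Records"])
            if "ErrorCode" in result
        ]
        time.sleep(2 ** attempt)  # exponential backoff between attempts
    return entries  # unrecovered records; send these to a dead-letter queue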
Task Statement 1.3: Select a collection system that addresses the key properties of
data, such as order, format, and compression.
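As an illustration of the format and compression properties, the sketch below converts raw JSON records into Snappy-compressed Parquet and carries ordering in a timestamp column rather than relying on arrival order. It assumes pandas with a Parquet engine (for example, pyarrow) is installed; the record shape is hypothetical:

import pandas as pd

# Hypothetical batch of records drained from the collection system.
records = [
    {"user_id": "u1", "event": "click", "ts": "2023-01-01T00:00:00Z"},
    {"user_id": "u2", "event": "view", "ts": "2023-01-01T00:00:01Z"},
]

df = pd.DataFrame(records)
# Order is preserved explicitly by the 'ts' column, not by file position.
df = df.sort_values("ts")
# A columnar format plus Snappy compression shrinks storage and speeds
# up downstream analytical scans.
df.to_parquet("events.parquet", compression="snappy")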
Domain 2: Storage and Data Management
Task Statement 2.1: Determine the operational characteristics of the storage solution
for analytics.
Task Statement 2.3: Select appropriate data layout, schema, structure, and format.
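For instance, a partitioned, columnar layout can be declared once through an Amazon Athena DDL statement so that queries prune the partitions they do not need. The following Python sketch uses boto3; the database, table, and S3 bucket names are illustrative assumptions:

import boto3

athena = boto3.client("athena", region_name="us-east-1")  # region assumed

# Table layout: Parquet files partitioned by date so queries scan only
# the S3 objects they need. All names below are hypothetical.
DDL = """
CREATE EXTERNAL TABLE IF NOT EXISTS analytics_db.events (
    user_id string,
    event   string,
    ts      timestamp
)
PARTITIONED BY (dt string)
STORED AS PARQUET
LOCATION 's3://example-data-lake/events/'
"""

athena.start_query_execution(
    QueryString=DDL,
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)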
Task Statement 2.4: Define data lifecycles based on usage patterns and business
requirements.
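A data lifecycle of this kind is often expressed as an Amazon S3 lifecycle configuration that tiers colder objects down and expires them after the retention window. The sketch below is a minimal example with boto3; the bucket name, prefix, and day thresholds are assumptions for illustration:

import boto3

s3 = boto3.client("s3")

# Illustrative lifecycle: keep recent data hot, tier colder data down,
# and expire it once the business retention window has passed.
s3.put_bucket_lifecycle_configuration(
    Bucket="example-data-lake",  # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "tier-and-expire-events",
                "Status": "Enabled",
                "Filter": {"Prefix": "events/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
                "Expiration": {"Days": 365},
            }
        ]
    },
)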
Domain 3: Processing
Task Statement 3.2: Design a solution to transform and prepare data for analysis.
• Apply appropriate ETL and ELT techniques for batch workloads and
real-time workloads.
• Implement failover, scaling, and replication mechanisms.
• Implement techniques to address concurrency needs.
• Implement techniques to improve cost-optimization efficiencies.
• Orchestrate workflows.
• Aggregate and enrich data for downstream consumption (see the sketch
after this list).
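As one concrete shape this work can take, the following PySpark sketch joins events against a user table to enrich them, then aggregates per day for downstream consumers. The S3 paths and column names (including the 'segment' attribute) are hypothetical; in an AWS Glue job, the same logic would typically run through a GlueContext:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("enrich-events").getOrCreate()

# Hypothetical inputs: raw events and a user dimension table.
events = spark.read.parquet("s3://example-data-lake/events/")
users = spark.read.parquet("s3://example-data-lake/users/")

# Enrich each event with user attributes, then aggregate per day and
# per user segment for downstream consumption (for example, a dashboard).
daily = (
    events.join(users, "user_id", "left")
    .withColumn("dt", F.to_date("ts"))
    .groupBy("dt", "segment")
    .agg(F.count("*").alias("event_count"))
)

# Partitioning the output by date keeps downstream scans cheap.
daily.write.mode("overwrite").partitionBy("dt").parquet(
    "s3://example-data-lake/daily_events/"
)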
Domain 4: Analysis and Visualization
Task Statement 4.2: Select the appropriate data analysis solution for a given scenario.
Task Statement 4.3: Select the appropriate data visualization solution for a given
scenario.
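For a scenario in which ad hoc SQL analysis fits best, Amazon Athena is a common choice; the Python sketch below starts a query with boto3, polls until it completes, and prints the result rows. The database, table, and output location are illustrative assumptions:

import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")  # region assumed

# Hypothetical ad hoc analysis query against the daily aggregates.
query = athena.start_query_execution(
    QueryString="SELECT dt, event_count FROM analytics_db.daily_events LIMIT 10",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
query_id = query["QueryExecutionId"]

# Poll until the query leaves the QUEUED/RUNNING states.
while True:
    status = athena.get_query_execution(QueryExecutionId=query_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    result = athena.get_query_results(QueryExecutionId=query_id)
    for row in result["ResultSet"]["Rows"]:
        print([col.get("VarCharValue") for col in row["Data"]])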
Appendix
In-scope AWS services and features
The following list contains AWS services and features that are in scope for the exam.
This list is non-exhaustive and is subject to change. AWS offerings appear in
categories that align with the offerings’ primary functions:
Analytics:
• Amazon Athena
• Amazon CloudSearch
• Amazon EMR
• AWS Glue
• Amazon Kinesis
• AWS Lake Formation
• Amazon Managed Streaming for Apache Kafka (Amazon MSK)
• Amazon OpenSearch Service
• Amazon QuickSight
Application Integration:
• Amazon MQ
• Amazon Simple Notification Service (Amazon SNS)
• Amazon Simple Queue Service (Amazon SQS)
• AWS Step Functions
Management and Governance:
• AWS CloudFormation
• AWS CloudTrail
• Amazon CloudWatch
• AWS Trusted Advisor
Machine Learning:
• Amazon SageMaker
Security, Identity, and Compliance:
• AWS Artifact
• AWS Certificate Manager (ACM)
• AWS CloudHSM
• Amazon Cognito
• AWS IAM Identity Center (successor to AWS Single Sign-On)
• AWS Identity and Access Management (IAM)
• AWS Key Management Service (AWS KMS)
• Amazon Macie
• AWS Secrets Manager
Storage:
• Amazon S3
• Amazon S3 Glacier
Out-of-scope AWS services and features
The following list contains AWS services and features that are out of scope for the
exam. This list is non-exhaustive and is subject to change. AWS offerings that are
entirely unrelated to the target job roles for the exam are excluded from this list.
Among the out-of-scope categories are Internet of Things and Media Services
offerings.
Survey
How useful was this exam guide? Let us know by taking our survey.