Amazon SAA-C02 AWS Certified Solutions Architect – Associate Online Training
The questions for SAA-C02 were last updated on Feb 28, 2026.
- Exam Code: SAA-C02
- Exam Name: AWS Certified Solutions Architect – Associate
- Certification Provider: Amazon
- Latest update: Feb 28, 2026
A company is implementing new data retention policies for all databases that run on Amazon RDS DB instances. The company must retain daily backups for a minimum period of 2 years. The backups must be consistent and restorable.
Which solution should a solutions architect recommend to meet these requirements?
- A . Create a backup vault in AWS Backup to retain RDS backups. Create a new backup plan with a daily schedule and an expiration period of 2 years after creation. Assign the RDS DB instances to the backup plan.
- B . Configure a backup window for the RDS DB instances for daily snapshots. Assign a snapshot retention policy of 2 years to each RDS DB instance. Use Amazon Data Lifecycle Manager (Amazon DLM) to schedule snapshot deletions.
- C . Configure database transaction logs to be automatically backed up to Amazon CloudWatch Logs with an expiration period of 2 years
- D . Configure an AWS Database Migration Service (AWS DMS) replication task. Deploy a replication instance, and configure a change data capture (CDC) task to stream database changes to Amazon S3 as the target Configure S3 Lifecycle policies to delete the snapshots after 2 years.
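The AWS Backup plan described in option A can be sketched as the request payload for the CreateBackupPlan API. This is a minimal sketch; the plan, rule, and vault names are hypothetical:

```python
# Sketch of an AWS Backup plan for daily RDS backups retained for 2 years.
# Plan, rule, and vault names are hypothetical; only the payload is built here.
backup_plan = {
    "BackupPlanName": "rds-2-year-retention",
    "Rules": [
        {
            "RuleName": "daily-rds-backup",
            "TargetBackupVaultName": "rds-backup-vault",
            "ScheduleExpression": "cron(0 5 * * ? *)",  # every day at 05:00 UTC
            "Lifecycle": {"DeleteAfterDays": 730},      # expire after ~2 years
        }
    ],
}
# A caller would pass this to:
# boto3.client("backup").create_backup_plan(BackupPlan=backup_plan)
```

Assigning the RDS DB instances to the plan would then be done with a backup selection, either by resource ARN or by tag.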
A company is hosting an application in its own data center. The application uses Amazon S3 for data storage. The application transfers several hundred terabytes of data every month to and from Amazon S3. The company needs to minimize the cost of this data transfer.
Which solution meets this requirement?
- A . Establish an AWS Direct Connect connection between the AWS Region in use and the company's data center. Route traffic to Amazon S3 over the Direct Connect connection.
- B . Establish an AWS Site-to-Site VPN connection between the company’s data center and a VPC in the AWS Region in use. Create a VPC endpoint for Amazon S3 in the VPC. Route traffic to Amazon S3 over the VPN connection to the S3 endpoint.
- C . Create an AWS Storage Gateway file gateway. Deploy the software appliance in the company's data center. Configure the application to use the file gateway to store and retrieve files.
- D . Create an FTPS server by using AWS Transfer Family. Configure the application to use the FTPS server to store and retrieve files.
A company needs a storage solution for an application that runs on a high performance computing (HPC) cluster. The cluster is hosted on AWS Fargate for Amazon Elastic Container Service (Amazon ECS). The company needs a mountable file system that provides concurrent access to files while delivering hundreds of GBps of throughput at sub-millisecond latencies.
Which solution meets these requirements?
- A . Create an Amazon FSx for Lustre file share for the application data. Create an IAM role that allows Fargate to access the FSx for Lustre file share.
- B . Create an Amazon Elastic File System (Amazon EFS) file share for the application data. Create an IAM role that allows Fargate to access the EFS file share.
- C . Create an Amazon S3 bucket for the application data. Create an S3 bucket policy that allows Fargate to access the S3 bucket.
- D . Create an Amazon Elastic Block Store (Amazon EBS) Provisioned IOPS SSD (io2) volume for the application data. Create an IAM role that allows Fargate to access the volume.
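As context for option B: Fargate tasks mount an Amazon EFS file system through the task definition's volume configuration. A minimal sketch of the relevant fragments; the volume name, file system ID, and mount path are hypothetical:

```python
# Sketch of the volume portion of an ECS task definition that mounts an
# Amazon EFS file system into a Fargate task. Names and IDs are hypothetical.
task_definition_volumes = [
    {
        "name": "shared-data",
        "efsVolumeConfiguration": {
            "fileSystemId": "fs-0123456789abcdef0",  # hypothetical EFS ID
            "transitEncryption": "ENABLED",
        },
    }
]
# Each container then references the volume by name in its mount points.
container_mount_points = [
    {"sourceVolume": "shared-data", "containerPath": "/mnt/data"}
]
# These fragments would be supplied to ecs.register_task_definition(...)
```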
A company has applications that are deployed in multiple AWS Regions. The applications use an architecture that is based on Amazon EC2, Amazon Elastic Block Store (Amazon EBS), Amazon Elastic File System (Amazon EFS), and Amazon DynamoDB.
The company lacks a mechanism for centralized data backup. A solutions architect must centralize data backup with the least possible operational effort.
What should the solutions architect do to meet these requirements?
- A . Tag all resources by project. Use AWS Systems Manager to set up snapshots by project, and set up DynamoDB incremental backups.
- B . Tag all resources by project. Create backup plans in AWS Backup to back up the data by tag name according to each project’s needs.
- C . Tag all resources by project. Create an AWS Lambda function to run on a schedule and take snapshots of each EC2 instance, EBS volume, and EFS file system by project. Configure the function to invoke DynamoDB on-demand backup.
- D . Use AWS CloudFormation to create a template for every new project so that all resources can be recreated at any time. Set the template to take daily snapshots of each EC2 instance, EBS volume, and EFS file system. Set the template to use DynamoDB on-demand backup for daily backups.
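The tag-based assignment in option B maps to an AWS Backup resource selection attached to a backup plan. A minimal sketch; the selection name, IAM role ARN, and tag key/value are hypothetical:

```python
# Sketch of an AWS Backup selection that assigns resources to a backup plan
# by a project tag. The role ARN, tag key/value, and names are hypothetical.
backup_selection = {
    "SelectionName": "project-alpha-resources",
    "IamRoleArn": "arn:aws:iam::123456789012:role/AWSBackupDefaultServiceRole",
    "ListOfTags": [
        {
            "ConditionType": "STRINGEQUALS",
            "ConditionKey": "project",
            "ConditionValue": "alpha",
        }
    ],
}
# A caller would pass this (with the plan's ID) to:
# boto3.client("backup").create_backup_selection(
#     BackupPlanId=plan_id, BackupSelection=backup_selection)
```

One selection per project tag lets each project keep its own schedule and retention with no custom scripting.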
An application runs on Amazon EC2 instances across multiple Availability Zones. The instances run in an Amazon EC2 Auto Scaling group behind an Application Load Balancer. The application performs best when the CPU utilization of the EC2 instances is at or near 40%.
What should a solutions architect do to maintain the desired performance across all instances in the group?
- A . Use a simple scaling policy to dynamically scale the Auto Scaling group.
- B . Use a target tracking policy to dynamically scale the Auto Scaling group.
- C . Use an AWS Lambda function to update the desired Auto Scaling group capacity.
- D . Use scheduled scaling actions to scale up and scale down the Auto Scaling group.
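A target tracking policy (option B) has Auto Scaling continuously adjust capacity toward a metric target. A minimal sketch of the parameters for a 40% average-CPU target; the group and policy names are hypothetical:

```python
# Sketch of the parameters for a target tracking scaling policy that keeps
# the group's average CPU utilization near 40%. Names are hypothetical.
policy_params = {
    "AutoScalingGroupName": "web-asg",
    "PolicyName": "keep-cpu-near-40",
    "PolicyType": "TargetTrackingScaling",
    "TargetTrackingConfiguration": {
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 40.0,
    },
}
# boto3.client("autoscaling").put_scaling_policy(**policy_params)
```

Auto Scaling creates and manages the scale-out and scale-in CloudWatch alarms itself, so no thresholds need to be hand-tuned.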
A company runs a web application that is backed by Amazon RDS. A new database administrator caused data loss by accidentally editing information in a database table. To help recover from this type of incident, the company wants the ability to restore the database to its state from 5 minutes before any change within the last 30 days.
Which feature should the solutions architect include in the design to meet this requirement?
- A . Read replicas
- B . Manual snapshots
- C . Automated backups
- D . Multi-AZ deployments
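As background for option C: RDS automated backups enable point-in-time restore to any second within the configured retention window (up to 35 days). A minimal sketch of restoring to 5 minutes before a bad change; the instance identifiers and incident time are hypothetical:

```python
from datetime import datetime, timedelta, timezone

# Sketch of an RDS point-in-time restore. With a 30-day backup retention
# period, the instance can be restored to any second in that window.
# Identifiers and the incident time are hypothetical.
bad_edit_time = datetime(2024, 6, 1, 12, 0, 0, tzinfo=timezone.utc)
restore_params = {
    "SourceDBInstanceIdentifier": "prod-db",
    "TargetDBInstanceIdentifier": "prod-db-restored",
    "RestoreTime": bad_edit_time - timedelta(minutes=5),
}
# boto3.client("rds").restore_db_instance_to_point_in_time(**restore_params)
```

The 30-day window itself comes from setting the instance's BackupRetentionPeriod to 30. Note that the restore produces a new DB instance rather than rewinding the original.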
A company is running an application on Amazon EC2 instances. Traffic to the workload increases substantially during business hours and decreases afterward. The CPU utilization of an EC2 instance is a strong indicator of end-user demand on the application. The company has configured an Auto Scaling group to have a minimum group size of 2 EC2 instances and a maximum group size of 10 EC2 instances.
The company is concerned that the current scaling policy that is associated with the Auto Scaling group might not be correct. The company must avoid over-provisioning EC2 instances and incurring unnecessary costs.
What should a solutions architect recommend to meet these requirements?
- A . Configure Amazon EC2 Auto Scaling to use a scheduled scaling plan and launch an additional 8 EC2 instances during business hours.
- B . Configure AWS Auto Scaling to use a scaling plan that enables predictive scaling. Configure predictive scaling with a scaling mode of forecast and scale, and to enforce the maximum capacity setting during scaling.
- C . Configure a step scaling policy to add 4 EC2 instances at 50% CPU utilization and add another 4 EC2 instances at 90% CPU utilization. Configure scale-in policies to perform the reverse and remove EC2 instances based on the two values.
- D . Configure AWS Auto Scaling to have a desired capacity of 5 EC2 instances, and disable any existing scaling policies. Monitor the CPU utilization metric for 1 week. Then create dynamic scaling policies that are based on the observed values.
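The predictive scaling plan in option B corresponds to a scaling policy of type PredictiveScaling. A minimal sketch in "forecast and scale" mode that honors the group's maximum capacity; the names and target value are hypothetical:

```python
# Sketch of a predictive scaling policy in "forecast and scale" mode that
# respects the group's maximum capacity. Names and values are hypothetical.
policy_params = {
    "AutoScalingGroupName": "web-asg",
    "PolicyName": "predict-business-hours",
    "PolicyType": "PredictiveScaling",
    "PredictiveScalingConfiguration": {
        "MetricSpecifications": [
            {
                "TargetValue": 40.0,
                "PredefinedMetricPairSpecification": {
                    "PredefinedMetricType": "ASGCPUUtilization"
                },
            }
        ],
        "Mode": "ForecastAndScale",                    # forecast and act on it
        "MaxCapacityBreachBehavior": "HonorMaxCapacity",  # never exceed max size
    },
}
# boto3.client("autoscaling").put_scaling_policy(**policy_params)
```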
A company's packaged application dynamically creates and returns single-use text files in response to user requests. The company is using Amazon CloudFront for distribution, but wants to further reduce data transfer costs. The company cannot modify the application's source code.
What should a solutions architect do to reduce costs?
- A . Use Lambda@Edge to compress the files as they are sent to users.
- B . Enable Amazon S3 Transfer Acceleration to reduce the response times.
- C . Enable caching on the CloudFront distribution to store generated files at the edge.
- D . Use Amazon S3 multipart uploads to move the files to Amazon S3 before returning them to users.
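Both compression (option A's effect) and edge caching (option C) are controlled on a CloudFront cache behavior. A minimal sketch of the relevant fields in a distribution config; the origin ID and TTL values are illustrative, not a recommendation for either option:

```python
# Sketch of the CloudFront cache behavior fields relevant here: Compress
# toggles automatic compression of eligible responses, and the TTLs govern
# how long objects may be cached at the edge. Values are illustrative.
cache_behavior = {
    "TargetOriginId": "app-origin",            # hypothetical origin ID
    "ViewerProtocolPolicy": "redirect-to-https",
    "Compress": True,       # serve compressed responses when clients accept them
    "MinTTL": 0,            # edge caching window
    "DefaultTTL": 86400,
    "MaxTTL": 31536000,
}
```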
A solutions architect must provide a fully managed replacement for an on-premises solution that allows employees and partners to exchange files. The solution must be easily accessible to employees connecting from on-premises systems, remote employees, and external partners.
Which solution meets these requirements?
- A . Use AWS Transfer for SFTP to transfer files into and out of Amazon S3.
- B . Use AWS Snowball Edge for local storage and large-scale data transfers.
- C . Use Amazon FSx to store and transfer files to make them available remotely.
- D . Use AWS Storage Gateway to create a volume gateway to store and transfer files to Amazon S3.
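For context on option A: AWS Transfer Family exposes a managed SFTP endpoint in front of Amazon S3. A minimal sketch of the server parameters; the choice of service-managed users and a public endpoint is an assumption for the example:

```python
# Sketch of the parameters for a managed SFTP server backed by Amazon S3.
# Service-managed users and a public endpoint are assumptions here.
server_params = {
    "Protocols": ["SFTP"],
    "Domain": "S3",                              # store files in Amazon S3
    "IdentityProviderType": "SERVICE_MANAGED",
    "EndpointType": "PUBLIC",
}
# boto3.client("transfer").create_server(**server_params)
```

Each employee or partner would then be created as a user mapped to an S3 bucket or prefix.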
What should a solutions architect do to ensure that all objects uploaded to an Amazon S3 bucket are encrypted?
- A . Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set.
- B . Update the bucket policy to deny if the PutObject does not have an s3:x-amz-acl header set to private.
- C . Update the bucket policy to deny if the PutObject does not have an aws:SecureTransport header set to true.
- D . Update the bucket policy to deny if the PutObject does not have an x-amz-server-side-encryption header set.
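The header check in option D is typically expressed as a Deny statement with a Null condition on s3:x-amz-server-side-encryption. A minimal sketch; the bucket name and statement ID are hypothetical:

```python
import json

# Sketch of a bucket policy that denies s3:PutObject requests that do not
# carry the x-amz-server-side-encryption header. Bucket name is hypothetical.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyUnencryptedObjectUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::example-bucket/*",
            "Condition": {
                # "Null": true matches requests where the key is absent
                "Null": {"s3:x-amz-server-side-encryption": "true"}
            },
        }
    ],
}
# boto3.client("s3").put_bucket_policy(
#     Bucket="example-bucket", Policy=json.dumps(bucket_policy))
```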