Exam4Training

Amazon DVA-C01 AWS Certified Developer – Associate Online Training

Question #34

Which of the following additional metadata columns does a Stream contain that can be used to build efficient data science pipelines by transforming only new or modified data? (Select three.)

  • A . METADATA$ACTION
  • B . METADATA$FILE_ID
  • C . METADATA$ISUPDATE
  • D . METADATA$DELETE
  • E . METADATA$ROW_ID

Correct Answer: A, C, E

Explanation:

A stream stores an offset for the source object, not any actual table columns or data. When queried, a stream accesses and returns the historic data in the same shape as the source object (i.e., the same column names and ordering) with the following additional columns:

METADATA$ACTION

Indicates the DML operation (INSERT or DELETE) recorded.

METADATA$ISUPDATE

Indicates whether the operation was part of an UPDATE statement. Updates to rows in the source object are represented as a pair of DELETE and INSERT records in the stream, with the metadata column METADATA$ISUPDATE set to TRUE on both records.

Note that streams record the differences between two offsets. If a row is added and then updated within the current offset, the delta change is a single new row, and its METADATA$ISUPDATE value is FALSE.

METADATA$ROW_ID

Specifies the unique and immutable ID for the row, which can be used to track changes to a specific row over time.
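These metadata columns are what make incremental ("process only what changed") pipelines possible. A minimal Python sketch that simulates stream rows as dictionaries to show the filtering logic (the data is illustrative; this is not a real Snowflake connection):

```python
# Simulated stream output: each row carries the source columns plus the
# METADATA$ACTION, METADATA$ISUPDATE, and METADATA$ROW_ID columns.
stream_rows = [
    {"id": 1, "val": "a",  "METADATA$ACTION": "INSERT", "METADATA$ISUPDATE": False, "METADATA$ROW_ID": "r1"},
    {"id": 2, "val": "b",  "METADATA$ACTION": "DELETE", "METADATA$ISUPDATE": True,  "METADATA$ROW_ID": "r2"},
    {"id": 2, "val": "b2", "METADATA$ACTION": "INSERT", "METADATA$ISUPDATE": True,  "METADATA$ROW_ID": "r2"},
    {"id": 3, "val": "c",  "METADATA$ACTION": "DELETE", "METADATA$ISUPDATE": False, "METADATA$ROW_ID": "r3"},
]

def new_or_modified(rows):
    """Keep only brand-new rows and the new image of updated rows.

    INSERT + ISUPDATE=False -> newly added row
    INSERT + ISUPDATE=True  -> the post-update image of a changed row
    DELETE records are skipped: they are either removals or the
    pre-update image of a change.
    """
    return [r for r in rows if r["METADATA$ACTION"] == "INSERT"]

changed = new_or_modified(stream_rows)
```

Here `changed` keeps the new row `id=1` and the updated image of `id=2`, which is exactly the "new/modified data only" subset the question describes.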


Question #9

Rebuild the environment with the new load balancer type.

Correct Answer: B

Explanation:

https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.managing.elb.html

By default, Elastic Beanstalk creates an Application Load Balancer for your environment when you enable load balancing with the Elastic Beanstalk console or the EB CLI. It configures the load balancer to listen for HTTP traffic on port 80 and forward this traffic to instances on the same port. You can choose the type of load balancer that your environment uses only during environment creation. Later, you can change settings to manage the behavior of your running environment’s load balancer, but you can’t change its type.
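Because the type can only be chosen at creation time, it is typically pinned in the environment's saved configuration or an `.ebextensions` file. A hedged sketch using the documented `aws:elasticbeanstalk:environment` namespace (the value shown is illustrative):

```yaml
option_settings:
  aws:elasticbeanstalk:environment:
    # Must be set when the environment is created; it cannot be
    # changed on a running environment.
    LoadBalancerType: application   # alternatives: classic, network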

Question #10

Queries to an Amazon DynamoDB table are consuming a large amount of read capacity. The table has a significant number of large attributes. The application does not need all of the attribute data.

How can DynamoDB costs be minimized while maximizing application performance?

  • A . Batch all the writes, and perform the write operations when no or few reads are being performed.
  • B . Create a global secondary index with a minimum set of projected attributes.
  • C . Implement exponential backoffs in the application.
  • D . Load balance the reads to the table using an Application Load Balancer.

Correct Answer: B

Explanation:

A global secondary index can project only the minimum set of attributes the application actually needs, so queries against the index read much smaller items and consume far less read capacity than reading the full items from the base table.

Question #11

A Developer wants to use AWS X-Ray to trace a user request end-to-end through the software stack. The Developer made the necessary changes in the application, tested it, and found that the application is able to send the traces to AWS X-Ray. However, when the application is deployed to an EC2 instance, the traces are not available.

Which of the following could create this situation? (Select two.)

  • A . The traces are reaching X-Ray, but the Developer does not have access to view the records.
  • B . The X-Ray daemon is not installed on the EC2 instance.
  • C . The X-Ray endpoint specified in the application configuration is incorrect.
  • D . The instance role does not have “xray:BatchGetTraces” and “xray:GetTraceGraph” permissions.
  • E . The instance role does not have “xray:PutTraceSegments” and “xray:PutTelemetryRecords” permissions.

Correct Answer: B, E
Question #12

A developer is designing an AWS Lambda function that creates temporary files that are less than 10 MB during execution. The temporary files will be accessed and modified multiple times during execution. The developer has no need to save or retrieve these files in the future.

Where should the temporary file be stored?

  • A . the /tmp directory
  • B . Amazon EFS
  • C . Amazon EBS
  • D . Amazon S3

Correct Answer: A
Question #13

A Developer is writing a REST service that will add items to a shopping list. The service is built on Amazon API Gateway with AWS Lambda integrations. The shopping list items are sent as query string parameters in the method request.

How should the Developer convert the query string parameters to arguments for the Lambda function?

  • A . Enable request validation
  • B . Include the Amazon Resource Name (ARN) of the Lambda function
  • C . Change the integration type
  • D . Create a mapping template

Correct Answer: D

Explanation:

https://docs.aws.amazon.com/apigateway/latest/developerguide/integrating-api-with-aws-services-lambda.html#api-as-lambda-proxy-expose-get-method-with-query-strings-to-call-lambda-function

Question #14

A Developer wants to encrypt new objects that are being uploaded to an Amazon S3 bucket by an application. There must be an audit trail of who has used the key during this process. There should be no change to the performance of the application.

Which type of encryption meets these requirements?

  • A . Server-side encryption using S3-managed keys
  • B . Server-side encryption with AWS KMS-managed keys
  • C . Client-side encryption with a client-side symmetric master key
  • D . Client-side encryption with AWS KMS-managed keys

Correct Answer: B
Question #15

A Developer has developed a web application and wants to deploy it quickly on a Tomcat server on AWS. The Developer wants to avoid having to manage the underlying infrastructure.

What is the easiest way to deploy the application, based on these requirements?

  • A . AWS CloudFormation
  • B . AWS Elastic Beanstalk
  • C . Amazon S3
  • D . AWS CodePipeline

Correct Answer: B
Question #16

A company is building a compute-intensive application that will run on a fleet of Amazon EC2 instances. The application uses attached Amazon EBS disks for storing data. The application will process sensitive information and all the data must be encrypted.

What should a developer do to ensure the data is encrypted on disk without impacting performance?

  • A . Configure the Amazon EC2 instance fleet to use encrypted EBS volumes for storing data
  • B . Add logic to write all data to an encrypted Amazon S3 bucket
  • C . Add a custom encryption algorithm to the application that will encrypt and decrypt all data
  • D . Create a new Amazon Machine Image (AMI) with an encrypted root volume and store the data to ephemeral disks.

Correct Answer: A
Question #17

A Developer is creating an application that needs to locate the public IPv4 address of the Amazon EC2 instance on which it runs.

How can the application locate this information?

  • A . Get the instance metadata by retrieving http://169.254.169.254/latest/meta-data/.
  • B . Get the instance user data by retrieving http://169.254.169.254/latest/user-data/.
  • C . Get the application to run IFCONFIG to get the public IP address.
  • D . Get the application to run IPCONFIG to get the public IP address.

Correct Answer: A
Question #18

Which of the following are good use cases for how Amazon ElastiCache can help an application? (Select TWO.)

  • A . Improve the performance of S3 PUT operations
  • B . Improve the latency of deployments performed by AWS CodeDeploy
  • C . Improve latency and throughput for read-heavy application workloads.
  • D . Reduce the time required to merge AWS CodeCommit branches
  • E . Improve performance of compute-intensive applications.

Correct Answer: C, E
Question #19

Where should the appspec.yml file be placed in order for AWS CodeDeploy to work?

  • A . In the root of the application source code directory structure
  • B . In the bin folder along with all the compiled code
  • C . In an S3 bucket
  • D . In the same folder as the application configuration files

Correct Answer: A
Question #20

Company D is running their corporate website on Amazon S3, accessed from http://www.companyd.com. Their marketing team has published new web fonts to a separate S3 bucket accessed by the S3 endpoint https://s3-us-west-1.amazonaws.com/cdfonts. While testing the new web fonts, Company D recognized the web fonts are being blocked by the browser.

What should Company D do to prevent the web fonts from being blocked by the browser?

  • A . Enable versioning on the cdfonts bucket for each web font
  • B . Create a policy on the cdfonts bucket to enable access to everyone
  • C . Add the Content-MD5 header to the request for webfonts in the cdfonts bucket from the website
  • D . Configure the cdfonts bucket to allow cross-origin requests by creating a CORS configuration

Correct Answer: D

Explanation:

https://docs.aws.amazon.com/AmazonS3/latest/dev/cors.html
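For reference, a minimal CORS configuration on the cdfonts bucket might look like the following (the origin is taken from the question; the header and max-age values are illustrative):

```json
[
  {
    "AllowedOrigins": ["http://www.companyd.com"],
    "AllowedMethods": ["GET"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3000
  }
]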

Question #21

A company uses a third-party tool to build, bundle, and package its applications on premises and store them locally. The company uses Amazon EC2 instances to run its front-end applications.

How can an application be deployed from the source control system onto the EC2 instances?

  • A . Use AWS CodeDeploy and point it to the local storage to directly deploy a bundle in a zip, tar, or tar.gz format
  • B . Upload the bundle to an Amazon S3 bucket and specify the S3 location when doing a deployment using AWS CodeDeploy
  • C . Create a repository using AWS CodeCommit to automatically trigger a deployment to the EC2 instances
  • D . Use AWS CodeBuild to automatically deploy the latest build to the latest EC2 instances

Correct Answer: B
Question #22

An application deployed on AWS Elastic Beanstalk experiences increased error rates during deployments of new application versions, resulting in service degradation for users. The Development team believes that this is because of the reduction in capacity during the deployment steps. The team would like to change the deployment policy configuration of the environment to an option that maintains full capacity during deployment while using the existing instances.

Which deployment policy will meet these requirements while using the existing instances?

  • A . All at once
  • B . Rolling
  • C . Rolling with additional batch
  • D . Immutable

Correct Answer: D

Explanation:

https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/using-features.rolling-version-deploy.html


Question #24

A gaming company is developing a mobile game application for iOS® and Android® platforms. This mobile game securely stores user data locally on the device. The company wants to allow users to use multiple devices for the game, which requires user data synchronization across devices.

Which service should be used to synchronize user data across devices without the need to create a backend application?

  • A . AWS Lambda
  • B . Amazon S3
  • C . Amazon DynamoDB
  • D . Amazon Cognito

Correct Answer: D
Question #25

An AWS Lambda function generates a 3MB JSON file and then uploads it to an Amazon S3 bucket daily. The file contains sensitive information, so the Developer must ensure that it is encrypted before uploading to the bucket.

Which of the following modifications should the Developer make to ensure that the data is encrypted before uploading it to the bucket?

  • A . Use the default AWS KMS customer master key for S3 in the Lambda function code.
  • B . Use the S3 managed key and call the GenerateDataKey API to encrypt the file.
  • C . Use the GenerateDataKey API, then use that data key to encrypt the file in the Lambda function code.
  • D . Use a custom KMS customer master key created for S3 in the Lambda function code.

Correct Answer: C
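The pattern behind option C is envelope encryption: request a data key, encrypt the payload locally, then upload only ciphertext. A minimal Python sketch of the idea, with `os.urandom` standing in for the KMS GenerateDataKey call and a toy XOR cipher standing in for a real algorithm (do not use XOR in production; a real client would use an authenticated cipher such as AES-GCM):

```python
import os

def generate_data_key(num_bytes: int = 32) -> bytes:
    # Stand-in for the KMS GenerateDataKey API: returns a fresh
    # plaintext data key (KMS would also return an encrypted copy
    # of the key to store alongside the object).
    return os.urandom(num_bytes)

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "cipher" for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = generate_data_key()
plaintext = b'{"user": "alice", "ssn": "hidden"}'
ciphertext = xor_cipher(plaintext, key)   # encrypt before the S3 upload
restored = xor_cipher(ciphertext, key)    # XOR is its own inverse
```

The important property is that the object leaving the Lambda function is already encrypted; S3 only ever sees ciphertext.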
Question #26

A developer needs to modify an application architecture to meet new functional requirements. Application data is stored in Amazon DynamoDB and processed for analysis in a nightly batch. The system analysts do not want to wait until the next day to view the processed data and have asked to have it available in near-real time.

Which application architecture pattern would enable the data to be processed as it is received?

  • A . Event driven
  • B . Client served driven
  • C . Fan-out driven
  • D . Schedule driven

Correct Answer: A
Question #27

A developer must extend an existing application that is based on the AWS Serverless Application Model (AWS SAM). The developer has used the AWS SAM CLI to create the project. The project contains different AWS Lambda functions.

Which combination of commands must the developer use to redeploy the AWS SAM application? (Select TWO.)

  • A . sam init
  • B . sam validate
  • C . sam build
  • D . sam deploy
  • E . sam publish

Correct Answer: C, D

Explanation:

sam init scaffolds a brand-new project. To redeploy an existing AWS SAM application, the developer rebuilds the artifacts with sam build and then pushes them with sam deploy.
Question #28

A company is using Amazon API Gateway to manage access to a set of microservices implemented as AWS Lambda functions. Following a bug report, the company makes a minor breaking change to one of the APIs.

In order to avoid impacting existing clients when the new API is deployed, the company wants to allow clients six months to migrate from v1 to v2.

Which approach should the Developer use to handle this change?

  • A . Update the underlying Lambda function and provide clients with the new Lambda invocation URL.
  • B . Use API Gateway to automatically propagate the change to clients, specifying 180 days in the phased deployment parameter.
  • C . Use API Gateway to deploy a new stage named v2 to the API and provide users with its URL.
  • D . Update the underlying Lambda function, create an Amazon CloudFront distribution with the updated Lambda function as its origin.

Correct Answer: C
Question #29

A developer uses Amazon S3 buckets for static website hosting. The developer creates one S3 bucket for the code and another S3 bucket for the assets, such as image and video files. Access is denied when a user attempts to access the assets bucket from the code bucket, with the website application showing a 403 error.

How should the developer solve this issue?

  • A . Create an IAM role and apply it to the assets bucket for the code bucket to be granted access
  • B . Edit the bucket policy of the assets bucket to open access to all principals
  • C . Edit the cross-origin resource sharing (CORS) configuration of the assets bucket to allow any origin to access the assets
  • D . Change the code bucket to use AWS Lambda functions instead of static website hosting.

Correct Answer: C
Question #30

A Developer needs to design an application running on AWS that will be used to consume Amazon SQS messages that range from 1 KB up to 1 GB in size.

How should the Amazon SQS messages be managed?

  • A . Use Amazon S3 and the Amazon SQS CLI.
  • B . Use Amazon S3 and the Amazon SQS Extended Client Library for Java.
  • C . Use Amazon EBS and the Amazon SQS CLI.
  • D . Use Amazon EFS and the Amazon SQS CLI.

Correct Answer: B

Explanation:

Reference: https://docs.aws.amazon.com/AWSSimpleQueueService/latest/SQSDeveloperGuide/sqslimits.html
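The Extended Client Library implements the claim-check pattern: payloads over the SQS size limit are written to S3 and the queue message carries only a pointer. A small Python sketch of the idea, using a dictionary to stand in for the S3 bucket (the names and limit handling here are illustrative, not the library's real API):

```python
# Claim-check pattern used by the SQS Extended Client Library (sketch):
# payloads over the SQS limit go to S3, and the queue carries a pointer.
SQS_LIMIT = 256 * 1024  # SQS maximum message size: 256 KB

object_store = {}  # stands in for an S3 bucket

def send(payload: bytes, key: str) -> dict:
    if len(payload) > SQS_LIMIT:
        object_store[key] = payload   # upload the large body to "S3"
        return {"s3_pointer": key}    # queue message is just a reference
    return {"body": payload}          # small payloads travel inline

def receive(message: dict) -> bytes:
    if "s3_pointer" in message:
        return object_store[message["s3_pointer"]]
    return message["body"]

small = send(b"x" * 100, "k1")
large = send(b"y" * (1024 * 1024), "k2")  # 1 MB payload routed via "S3"
```

This is why option B pairs Amazon S3 with the Extended Client Library: SQS alone cannot carry a 1 GB message.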

Question #31

A company needs a version control system for collaborative software development.

Features of the system must include the following:

✑ Support for batches of changes across multiple files

✑ Parallel branching

✑ Version tracking

Which AWS service will meet these requirements?

  • A . AWS CodePipeline
  • B . Amazon S3
  • C . AWS CodeBuild
  • D . AWS CodeCommit

Correct Answer: D

Explanation:

https://docs.aws.amazon.com/codecommit/latest/userguide/welcome.html

Question #32

A Developer has setup an Amazon Kinesis Stream with 4 shards to ingest a maximum of 2500 records per second. A Lambda function has been configured to process these records.

In which order will these records be processed?

  • A . Lambda will receive each record in the reverse order it was placed into the stream following a LIFO (last-in, first-out) method
  • B . Lambda will receive each record in the exact order it was placed into the stream following a FIFO (first-in, first-out) method.
  • C . Lambda will receive each record in the exact order it was placed into the shard following a FIFO (first-in, first-out) method. There is no guarantee of order across shards.
  • D . The Developer can select FIFO (first-in, first-out), LIFO (last-in, first-out), random, or request a specific record using the getRecords API.

Correct Answer: C
Question #33

An Amazon RDS database instance is used by many applications to look up historical data. The query rate is relatively constant. When the historical data is updated each day, the resulting write traffic slows the read query performance and affects all application users.

What can be done to eliminate the performance impact on application users?

  • A . Make sure Amazon RDS is Multi-AZ so it can better absorb increased traffic.
  • B . Create an RDS Read Replica and direct all read traffic to the replica.
  • C . Implement Amazon ElastiCache in front of Amazon RDS to buffer the write traffic.
  • D . Use Amazon DynamoDB instead of Amazon RDS to buffer the read traffic.

Correct Answer: B

Explanation:

https://aws.amazon.com/rds/details/read-replicas/

Question #34

A Developer is writing transactions into a DynamoDB table called “SystemUpdates” that has 5 write capacity units.

Which option has the highest read throughput?

  • A . Eventually consistent reads of 5 read capacity units reading items that are 4 KB in size
  • B . Strongly consistent reads of 5 read capacity units reading items that are 4 KB in size
  • C . Eventually consistent reads of 15 read capacity units reading items that are 1 KB in size
  • D . Strongly consistent reads of 15 read capacity units reading items that are 1 KB in size

Correct Answer: A

Explanation:

With eventually consistent reads, one read capacity unit supports two reads per second of items up to 4 KB. Option A therefore sustains 5 × 2 × 4 KB = 40 KB of reads per second, more than any other option (B: 20 KB/s, C: 30 KB/s, D: 15 KB/s).
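The comparison follows from DynamoDB's provisioned-capacity arithmetic: one RCU covers one strongly consistent read per second of an item up to 4 KB, or two eventually consistent reads of the same size. A small Python sketch of that rule applied to each option:

```python
import math

def read_throughput_kb(rcu: int, item_kb: float, strongly_consistent: bool) -> float:
    """KB/s of reads sustainable at a given RCU provision."""
    rcu_per_read = math.ceil(item_kb / 4)  # reads are billed in 4 KB units
    if not strongly_consistent:
        rcu_per_read /= 2                  # eventual consistency halves the cost
    reads_per_second = rcu / rcu_per_read
    return reads_per_second * item_kb

options = {
    "A": read_throughput_kb(5,  4, strongly_consistent=False),
    "B": read_throughput_kb(5,  4, strongly_consistent=True),
    "C": read_throughput_kb(15, 1, strongly_consistent=False),
    "D": read_throughput_kb(15, 1, strongly_consistent=True),
}
best = max(options, key=options.get)
```

Running this shows option A sustaining the highest read throughput in KB/s.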
Question #35

A company is using Amazon API Gateway to manage its public-facing API. The CISO requires that the APIs be used by test account users only .

What is the MOST secure way to restrict API access to users of this particular AWS account?

  • A . Client-side SSL certificates for authentication
  • B . API Gateway resource policies
  • C . Cross-origin resource sharing (CORS)
  • D . Usage plans

Correct Answer: B

Explanation:

An API Gateway resource policy can restrict invocation to principals from the test AWS account (for example, with an aws:PrincipalAccount condition). Usage plans and API keys meter and throttle usage but do not authenticate callers, so they are not a secure access control.
Question #36

A developer has written a multi-threaded application that is running on a fleet of Amazon EC2 instances. The operations team has requested a graphical method to monitor the number of running threads over time.

What is the MOST efficient way to fulfill this request?

  • A . Periodically send the thread count to AWS X-Ray segments, then generate a service graph on demand
  • B . Create a custom Amazon CloudWatch metric and periodically perform a PutMetricData call with the current thread count.
  • C . Periodically log thread count data to Amazon S3. Use Amazon Kinesis to process the data into a graph.
  • D . Periodically write the current thread count to a table using Amazon DynarnoDB and use Amazon CloudFront to create a graph

Correct Answer: B

Explanation:

Publishing the thread count as a custom CloudWatch metric with periodic PutMetricData calls gives the operations team a ready-made time-series graph in the CloudWatch console, with no extra processing pipeline to build.
Question #37

A developer is writing an AWS Lambda function. The developer wants to log key events that occur during the Lambda function's execution and include a unique identifier to associate the events with a specific function invocation.

Which of the following will help the developer accomplish this objective?

  • A . Obtain the request identifier from the Lambda context object Architect the application to write logs to the console.
  • B . Obtain the request identifier from the Lambda event object Architect the application to write logs to a file
  • C . Obtain the request identifier from the Lambda event object Architect the application to write logs to the console
  • D . Obtain the request identifier from the Lambda context object Architect the application to write logs to a file.

Correct Answer: A
Question #38

A global company has an application running on Amazon EC2 instances that serves image files from Amazon S3. User requests from the browser are causing high traffic, which results in degraded performance.

Which optimization solution should a Developer implement to increase application performance?

  • A . Create multiple prefixes in the S3 bucket to increase the request rate
  • B . Create an Amazon ElastiCache cluster to cache and serve frequently accessed items.
  • C . Use Amazon CloudFront to serve the content of images stored in Amazon S3.
  • D . Submit a ticket to AWS support to request a rate limit increase for the S3 bucket.

Correct Answer: C

Explanation:

Amazon CloudFront caches the S3-hosted images at edge locations close to the global user base, reducing both the load on the origin and the latency users experience.
Question #39

An application uses Amazon Kinesis Data Streams to ingest and process large streams of data records in real time. Amazon EC2 instances consume and process the data from the shards of the Kinesis data stream by using the Amazon Kinesis Client Library (KCL). The application handles the failure scenarios and does not require standby workers. The application reports that a specific shard is receiving more data than expected. To adapt to the changes in the rate of data flow, the "hot" shard is resharded.

Assuming that the initial number of shards in the Kinesis data stream is 4, and after resharding the number of shards increased to 6, what is the maximum number of EC2 instances that can be deployed to process data from all the shards?

  • A . 12
  • B . 6
  • C . 4
  • D . 1

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

Typically, when you use the KCL, you should ensure that the number of instances does not exceed the number of shards (except for failure standby purposes). Each shard is processed by exactly one KCL worker and has exactly one corresponding record processor, so you never need multiple instances to process one shard. However, one worker can process any number of shards, so it is fine if the number of shards exceeds the number of instances.

https://docs.aws.amazon.com/streams/latest/dev/kinesis-record-processor-scaling.html
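The one-worker-per-shard ceiling described above can be illustrated with a toy lease assignment. This is a deliberate simplification of how the KCL actually balances leases, with hypothetical shard and worker names:

```python
def assign_shards(shard_ids, worker_ids):
    """Round-robin shard-to-worker assignment (a simplification of KCL leases).
    Each shard gets exactly one worker; a worker may own several shards."""
    assignment = {}
    for i, shard in enumerate(shard_ids):
        assignment[shard] = worker_ids[i % len(worker_ids)]
    return assignment

shards = [f"shardId-{n:012d}" for n in range(6)]  # after resharding: 6 shards
workers = [f"worker-{n}" for n in range(6)]       # max useful instances = 6
leases = assign_shards(shards, workers)
```

With 6 shards, a 7th worker would never receive a lease, which is why 6 is the maximum number of useful EC2 instances.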

Question #40

A supplier is writing a new RESTful API for customers to query the status of orders. The customers requested the following API endpoint.

http://www.supplierdomain.com/status/customerID

Which of the following application designs meet the requirements? (Select two.)

  • A . Amazon SQS; Amazon SNS
  • B . Elastic Load Balancing; Amazon EC2
  • C . Amazon ElastiCache; Amazon Elasticsearch Service
  • D . Amazon API Gateway; AWS Lambda
  • E . Amazon S3; Amazon CloudFront

Reveal Solution Hide Solution

Correct Answer: B,D
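A minimal sketch of option D: an AWS Lambda handler behind an API Gateway proxy integration serving GET /status/{customerID}. The in-memory order table and status values are invented for illustration:

```python
import json

# Hypothetical in-memory order store standing in for a real backend.
ORDERS = {"12345": "SHIPPED", "67890": "PROCESSING"}

def handler(event, context):
    # API Gateway passes the path parameter from /status/{customerID}.
    customer_id = event["pathParameters"]["customerID"]
    status = ORDERS.get(customer_id)
    if status is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {
        "statusCode": 200,
        "body": json.dumps({"customerID": customer_id, "status": status}),
    }
```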

Question #41

A development team is creating a new application designed to run on AWS. While the test and production environments will run on Amazon EC2 instances, developers will each run their own environment on their laptops.

Which of the following is the simplest and MOST secure way to access AWS services from the local development machines?

  • A . Use an IAM role to assume a role and execute API calls using the role.
  • B . Create an IAM user to be shared with the entire development team, provide the development team with the access key.
  • C . Create an IAM user for each developer on the team: provide each developer with a unique access key
  • D . Set up a federation through an Amazon Cognito user pool.

Reveal Solution Hide Solution

Correct Answer: C
Question #42

A Developer must repeatedly and consistently deploy a serverless RESTful API on AWS.

Which techniques will work? (Choose two.)

  • A . Define a Swagger file. Use AWS Elastic Beanstalk to deploy the Swagger file.
  • B . Define a Swagger file. Use AWS CodeDeploy to deploy the Swagger file.
  • C . Deploy a SAM template with an inline Swagger definition.
  • D . Define a Swagger file. Deploy a SAM template that references the Swagger file.
  • E . Define an inline Swagger definition in a Lambda function. Invoke the Lambda function.

Reveal Solution Hide Solution

Correct Answer: C,D
C,D

Explanation:

https://aws.amazon.com/about-aws/whats-new/2017/02/aws-serverless-application-model-aws-sam-supports-inline-swagger-and-aws-cloudformation-intrinsic-functions/


Question #43

An ecommerce startup is preparing for an annual sales event As the traffic to the company’s application increases, the development team wants to be notified when the Amazon EC2 instance’s CPU utilization exceeds 80%.

Which solution will meet this requirement?

  • A . Create a custom Amazon CloudWatch alarm that sends a notification to an Amazon SNS topic when the CPU utilization exceeds 80%.
  • B . Create a custom AWS CloudTrail alarm that sends a notification to an Amazon SNS topic when the CPU utilization exceeds 80%
  • C . Create a cron job on the EC2 instance that executes the –describe-instance-information command on the host instance every 15 minutes and sends the results to an Amazon SNS topic
  • D . Create an AWS Lambda function that queries the AWS CloudTrail logs for the CPUUtilization metric every 15 minutes and sends a notification to an Amazon SNS topic when the CPU utilization exceeds 80%

Reveal Solution Hide Solution

Correct Answer: A
Question #44

A Developer is testing a Docker-based application that uses the AWS SDK to interact with Amazon

DynamoDB. In the local development environment, the application has used IAM access keys. The application is now ready for deployment onto an ECS cluster.

How should the application authenticate with AWS services in production?

  • A . Configure an ECS task IAM role for the application to use
  • B . Refactor the application to call AWS STS AssumeRole based on an instance role
  • C . Configure AWS access key/secret access key environment variables with new credentials
  • D . Configure the credentials file with a new access key/secret access key

Reveal Solution Hide Solution

Correct Answer: A
A

Explanation:

https://docs.aws.amazon.com/AmazonECS/latest/developerguide/task_IAM_role.html#:~:targetText=Amazon%20ECS%20Task%20Role,Task%20Role%20trust%20relationship%20policy.

Question #45

A developer wants the ability to roll back to a previous version of an AWS Lambda function in the event of errors caused by a new deployment.

How can the developer achieve this with MINIMAL impact on users?

  • A . Change the application to use an alias that points to the current version. Deploy the new version of the code. Update the alias to use the newly deployed version. If too many errors are encountered, point the alias back to the previous version.
  • B . Change the application to use an alias that points to the current version. Deploy the new version of the code. Update the alias to direct 10% of users to the newly deployed version. If too many errors are encountered, send 100% of traffic to the previous version.
  • C . Do not make any changes to the application. Deploy the new version of the code. If too many errors are encountered, point the application back to the previous version using the version number in the Amazon Resource Name (ARN).
  • D . Create three aliases: new, existing, and router. Point the existing alias to the current version. Have the router alias direct 100% of users to the existing alias. Update the application to use the router alias. Deploy the new version of the code. Point the new alias to this version. Update the router alias to direct 10% of users to the new alias. If too many errors are encountered, send 100% of traffic to the existing alias.

Reveal Solution Hide Solution

Correct Answer: A
Question #46

A company is developing a serverless ecommerce web application. The application needs to make coordinated, all-or-nothing changes to multiple items in the company’s inventory table in Amazon DynamoDB.

Which solution will meet these requirements?

  • A . Enable transactions for the DynamoDB table. Use the BatchWriteItem operation to update the items.
  • B . Use the TransactWriteItems operation to group the changes. Update the items in the table.
  • C . Set up a FIFO queue using Amazon SQS. Group the changes in the queue. Update the table based on the grouped changes.
  • D . Create a transaction table in an Amazon Aurora DB cluster to manage the transactions. Write a backend process to sync the Aurora DB table and the DynamoDB table.

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

TransactWriteItems is a synchronous write operation that groups up to 25 action requests; either all of them succeed or none are applied. The BatchWriteItem operation puts or deletes multiple items in one or more tables, but without that all-or-nothing guarantee.

https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/DynamoDB.html
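As a sketch of how such a request might be assembled, the helper below builds the parameter dictionary that boto3's transact_write_items accepts. The Inventory/Orders table names, keys, and expressions are hypothetical, and the 25-action cap matches the limit cited above:

```python
def build_transaction(order_id, items):
    """Build a TransactWriteItems request body: all updates succeed or none do.
    Table and attribute names here are hypothetical."""
    actions = [{
        "Update": {
            "TableName": "Inventory",
            "Key": {"sku": {"S": sku}},
            "UpdateExpression": "SET stock = stock - :q",
            "ConditionExpression": "stock >= :q",
            "ExpressionAttributeValues": {":q": {"N": str(qty)}},
        }
    } for sku, qty in items]
    actions.append({
        "Put": {"TableName": "Orders", "Item": {"order_id": {"S": order_id}}}
    })
    if len(actions) > 25:  # cap per transaction, per the explanation above
        raise ValueError("too many actions for one transaction")
    return {"TransactItems": actions}
```

The resulting dictionary would be passed to `dynamodb_client.transact_write_items(**build_transaction(...))`.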

Question #47

A legacy service has an XML-based SOAP interface. The Developer wants to expose the functionality of the service to external clients with Amazon API Gateway.

Which technique will accomplish this?

  • A . Create a RESTful API with the API Gateway; transform the incoming JSON into a valid XML message for the SOAP interface using mapping templates.
  • B . Create a RESTful API with the API Gateway; pass the incoming JSON to the SOAP interface through an Application Load Balancer.
  • C . Create a RESTful API with the API Gateway; pass the incoming XML to the SOAP interface through an Application Load Balancer.
  • D . Create a RESTful API with the API Gateway; transform the incoming XML into a valid message for the
    SOAP interface using mapping templates.

Reveal Solution Hide Solution

Correct Answer: A
A

Explanation:

https://blog.codecentric.de/en/2016/12/serverless-soap-legacy-api-integration-java-aws-lambda-aws-api-gateway/
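Mapping templates are written in VTL, but the transform itself is easy to show in Python. The sketch below mirrors what a template for answer A would do; the SOAP element names are invented:

```python
def json_to_soap(payload):
    """Python stand-in for an API Gateway VTL mapping template that wraps
    an incoming JSON body into a SOAP envelope (element names hypothetical)."""
    fields = "".join(f"<{k}>{v}</{k}>" for k, v in payload.items())
    return (
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f"<soap:Body><GetOrderStatus>{fields}</GetOrderStatus></soap:Body>"
        "</soap:Envelope>"
    )
```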

Question #48

A Developer executed an AWS CLI command and received the error shown below:

What action should the Developer perform to make this error human-readable?

  • A . Make a call to AWS KMS to decode the message.
  • B . Use the AWS STS decode-authorization-message API to decode the message.
  • C . Use an open source decoding library to decode the message.
  • D . Use the AWS IAM decode-authorization-message API to decode this message.

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

https://docs.aws.amazon.com/cli/latest/reference/sts/decode-authorization-message.html

The message is encoded because the details of the authorization status can constitute privileged information that the user who requested the operation should not see. To decode an authorization status message, a user must be granted permissions via an IAM policy to request the DecodeAuthorizationMessage (sts:DecodeAuthorizationMessage) action.

Question #49

Which of the following services are key/value stores? Choose 3 answers

  • A . Amazon ElastiCache
  • B . Simple Notification Service
  • C . DynamoDB
  • D . Simple Workflow Service
  • E . Simple Storage Service

Reveal Solution Hide Solution

Correct Answer: A,C,E
Question #50

A developer is building an application that needs to store data in Amazon S3. Management requires that the data be encrypted before it is sent to Amazon S3 for storage. The encryption keys need to be managed by the security team.

Which approach should the developer take to meet these requirements?

  • A . Implement server-side encryption using customer-provided encryption keys (SSE-C).
  • B . Implement server-side encryption by using a client-side master key.
  • C . Implement client-side encryption using an AWS KMS managed customer master key (CMK).
  • D . Implement Client-side encryption using Amazon S3 managed keys.

Reveal Solution Hide Solution

Correct Answer: C
C

Explanation:

Reference: https://aws.amazon.com/s3/faqs/

Question #51

A developer is writing a web application that must share secure documents with end users. The documents are stored in a private Amazon S3 bucket. The application must allow only authenticated users to download specific documents when requested, and only for a duration of 15 minutes.

How can the developer meet these requirements?

  • A . Copy the documents to a separate S3 bucket that has a lifecycle policy for deletion after 15 minutes
  • B . Create a presigned S3 URL using the AWS SDK with an expiration time of 15 minutes
  • C . Use server-side encryption with AWS KMS managed keys (SSE-KMS) and download the documents using HTTPS
  • D . Modify the S3 bucket policy to only allow specific users to download the documents Revert the change after 15 minutes.

Reveal Solution Hide Solution

Correct Answer: B
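In practice the presigned URL comes from the SDK (for example, boto3's generate_presigned_url with ExpiresIn=900). The stdlib sketch below is not real SigV4 signing; it only illustrates the idea that the expiry time is part of the signed query string, so the link self-invalidates after 15 minutes:

```python
import hashlib
import hmac
import time
from urllib.parse import urlencode

SECRET = b"hypothetical-signing-key"  # SigV4 really derives this from AWS credentials

def presign(bucket, key, expires_in=900, now=None):
    """Simplified sketch of a presigned URL: the expiry is baked into the
    signed query string, so tampering with it breaks the signature."""
    now = int(now if now is not None else time.time())
    expires = now + expires_in
    to_sign = f"GET\n{bucket}\n{key}\n{expires}".encode()
    sig = hmac.new(SECRET, to_sign, hashlib.sha256).hexdigest()
    qs = urlencode({"Expires": expires, "Signature": sig})
    return f"https://{bucket}.s3.amazonaws.com/{key}?{qs}"
```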
Question #52

A Developer wants to upload data to Amazon S3 and must encrypt the data in transit.

Which of the following solutions will accomplish this task? (Choose two.)

  • A . Set up hardware VPN tunnels to a VPC and access S3 through a VPC endpoint
  • B . Set up Client-Side Encryption with an AWS KMS-Managed Customer Master Key
  • C . Set up Server-Side Encryption with AWS KMS-Managed Keys
  • D . Transfer the data over an SSL connection
  • E . Set up Server-Side Encryption with S3-Managed Keys

Reveal Solution Hide Solution

Correct Answer: B,D
B,D

Explanation:

https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingEncryption.html

Question #53

An application contains two components: one component to handle HTTP requests, and another component to handle background processing tasks. Each component must scale independently. The developer wants to deploy this application using AWS Elastic Beanstalk.

How should this application be deployed, based on these requirements?

  • A . Deploy the application in a single Elastic Beanstalk environment
  • B . Deploy each component in a separate Elastic Beanstalk environment
  • C . Use multiple Elastic Beanstalk environments for the HTTP component but one environment for the background task component
  • D . Use multiple Elastic Beanstalk environments for the background task component but one environment for the HTTP component

Reveal Solution Hide Solution

Correct Answer: B
Question #54

An AWS Lambda function accesses two Amazon DynamoDB tables. A developer wants to improve the performance of the Lambda function by identifying bottlenecks in the function.

How can the developer inspect the timing of the DynamoDB API calls?

  • A . Add DynamoDB as an event source to the Lambda function. View the performance with Amazon CloudWatch metrics
  • B . Place an Application Load Balancer (ALB) in front of the two DynamoDB tables. Inspect the ALB logs
  • C . Limit Lambda to no more than five concurrent invocations Monitor from the Lambda console
  • D . Enable AWS X-Ray tracing for the function. View the traces from the X-Ray service.

Reveal Solution Hide Solution

Correct Answer: D
Question #55

A company is migrating its on-premises database to Amazon RDS for MySQL. The company has read-heavy workloads, and wants to make sure it refactors its code to achieve optimum read performance for its queries.

How can this objective be met?

  • A . Add database retries to effectively use RDS with vertical scaling
  • B . Use RDS with multi-AZ deployment
  • C . Add a connection string to use an RDS read replica for read queries
  • D . Add a connection string to use a read replica on an EC2 instance.

Reveal Solution Hide Solution

Correct Answer: C
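Refactoring for answer C can be as small as routing read-only statements to the replica's connection string. The endpoints below are placeholders, not real hosts:

```python
# Hypothetical endpoints; RDS gives each read replica its own endpoint.
PRIMARY = "mydb.abc123.us-east-1.rds.amazonaws.com"
REPLICA = "mydb-replica.abc123.us-east-1.rds.amazonaws.com"

def endpoint_for(query):
    """Route read-only statements to the read replica, writes to the primary."""
    verb = query.lstrip().split(None, 1)[0].upper()
    return REPLICA if verb == "SELECT" else PRIMARY
```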
Question #56

A Developer is writing a serverless application that requires that an AWS Lambda function be invoked every 10 minutes.

What is an automated and serverless way to trigger the function?

  • A . Deploy an Amazon EC2 instance based on Linux, and edit its /etc/crontab file by adding a command to periodically invoke the Lambda function.
  • B . Configure an environment variable named PERIOD for the Lambda function. Set the value to 600.
  • C . Create an Amazon CloudWatch Events rule that triggers on a regular schedule to invoke the Lambda function.
  • D . Create an Amazon SNS topic that has a subscription to the Lambda function with a 600-second timer.

Reveal Solution Hide Solution

Correct Answer: C
C

Explanation:

Reference: https://aws.amazon.com/blogs/architecture/a-serverless-solution-for-invoking-aws-lambda-at-asub-minute-frequency/

Question #57

A company is using AWS CloudFormation templates to deploy AWS resources. The company needs to update one of its AWS CloudFormation stacks.

What can the company do to find out how the changes will impact the resources that are running?

  • A . Investigate the change sets
  • B . Investigate the stack policies
  • C . Investigate the Metadata section.
  • D . Investigate the Resources section.

Reveal Solution Hide Solution

Correct Answer: A
Question #58

A company is migrating from a monolithic architecture to a microservices-based architecture. The Developers need to refactor the application so that the many microservices can asynchronously communicate with each other without impacting performance.

Use of which managed AWS services will enable asynchronous message passing? (Choose two.)

  • A . Amazon SQS
  • B . Amazon Cognito
  • C . Amazon Kinesis
  • D . Amazon SNS
  • E . Amazon ElastiCache

Reveal Solution Hide Solution

Correct Answer: A,D
Question #59

An organization must store thousands of sensitive audio and video files in an Amazon S3 bucket.

Organizational security policies require that all data written to this bucket be encrypted.

How can compliance with this policy be ensured?

  • A . Use AWS Lambda to send notifications to the security team if unencrypted objects are put in the bucket.
  • B . Configure an Amazon S3 bucket policy to prevent the upload of objects that do not contain the x-amz-server-side-encryption header.
  • C . Create an Amazon CloudWatch event rule to verify that all objects stored in the Amazon S3 bucket are encrypted.
  • D . Configure an Amazon S3 bucket policy to prevent the upload of objects that contain the x-amz-server-side-encryption header.

Reveal Solution Hide Solution

Correct Answer: B
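The bucket policy behind answer B can be sketched as follows; the Null condition denies any PutObject request that omits the x-amz-server-side-encryption header (the bucket name is hypothetical):

```python
import json

def build_encryption_policy(bucket):
    """Bucket policy denying any PutObject that omits the
    x-amz-server-side-encryption header."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyUnencryptedUploads",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": f"arn:aws:s3:::{bucket}/*",
            # Null:true matches requests where the header is absent.
            "Condition": {"Null": {"s3:x-amz-server-side-encryption": "true"}},
        }],
    }
```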
Question #60

A developer is creating a serverless web application and maintains different branches of code. The developer wants to avoid updating the Amazon API Gateway target endpoint each time a new code push is performed.

What solution would allow the developer to perform a code push efficiently, without the need to update the API Gateway?

  • A . Associate different AWS Lambda functions to an API Gateway target endpoint
  • B . Create different stages in API Gateway, then associate API Gateway with AWS Lambda
  • C . Create aliases and versions In AWS Lambda.
  • D . Tag the AWS Lambda functions with different names

Reveal Solution Hide Solution

Correct Answer: C

Question #61

An application is being developed to audit several AWS accounts. The application will run in Account A and must access AWS services in Accounts B and C.

What is the MOST secure way to allow the application to call AWS services in each audited account?

  • A . Configure cross-account roles in each audited account. Write code in Account A that assumes those roles
  • B . Use S3 cross-region replication to communicate among accounts, with Amazon S3 event notifications to trigger Lambda functions
  • C . Deploy an application in each audited account with its own role. Have Account A authenticate with the application
  • D . Create an IAM user with an access key in each audited account. Write code in Account A that uses those access keys

Reveal Solution Hide Solution

Correct Answer: A
A

Explanation:

https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_cross-account-with-roles.html

Question #62

A developer wants to send multi-value headers to an AWS Lambda function that is registered as a target with an Application Load Balancer (ALB).

What should the developer do to achieve this?

  • A . Place the Lambda function and target group in the same account
  • B . Send the request body to the Lambda function with a size less than 1 MB
  • C . Include the Base64 encoding status, status code, status description, and headers in the Lambda function
  • D . Enable the multi-value headers on the ALB

Reveal Solution Hide Solution

Correct Answer: D
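With multi-value headers enabled on the target group (answer D), the Lambda function can receive and return several values for one header. A minimal response shape, with made-up cookie values:

```python
def handler(event, context):
    """Response shape a Lambda target returns to an ALB when
    multi-value headers are enabled on the target group."""
    return {
        "statusCode": 200,
        "statusDescription": "200 OK",
        "isBase64Encoded": False,
        "multiValueHeaders": {
            "Set-Cookie": ["session=abc", "theme=dark"],  # several values, one header
            "Content-Type": ["text/plain"],
        },
        "body": "hello",
    }
```

Without the target-group setting, the ALB would instead use the single-value `headers` field and drop duplicates.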
Question #63

The development team is working on an API that will be served from Amazon API Gateway. The API will be served from three environments: development, test, and production. API Gateway is configured to use 237 GB of cache in all three stages.

Which is the MOST cost-efficient deployment strategy?

  • A . Create a single API Gateway with all three stages.
  • B . Create three API Gateways, one for each stage in a single AWS account.
  • C . Create an API Gateway in three separate AWS accounts.
  • D . Enable the cache for development and test environments only when needed.

Reveal Solution Hide Solution

Correct Answer: D
Question #64

An application is running on an EC2 instance. The Developer wants to store an application metric in Amazon CloudWatch.

What is the best practice for implementing this requirement?

  • A . Use the PUT Object API call to send data to an S3 bucket. Use an event notification to invoke a Lambda function to publish data to CloudWatch.
  • B . Publish the metric data to an Amazon Kinesis Stream using a PutRecord API call.
    Subscribe a Lambda function that publishes data to CloudWatch.
  • C . Use the CloudWatch PutMetricData API call to submit a custom metric to CloudWatch.
    Provide the required credentials to enable the API call.
  • D . Use the CloudWatch PutMetricData API call to submit a custom metric to CloudWatch.
    Launch the EC2 instance with the required IAM role to enable the API call.

Reveal Solution Hide Solution

Correct Answer: D
D

Explanation:

https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_use_switch-role-ec2.html

Question #65

A Developer is making changes to a custom application that is currently using AWS Elastic Beanstalk.

After the Developer completes the changes, what solutions will update the Elastic Beanstalk environment with the new application version? (Choose two.)

  • A . Package the application code into a .zip file, and upload, then deploy the packaged application from the AWS Management Console
  • B . Package the application code into a .tar file, create a new application version from the AWS Management Console, then update the environment by using AWS CLI
  • C . Package the application code into a .tar file, and upload and deploy the packaged application from the AWS Management Console
  • D . Package the application code into a .zip file, create a new application version from the packaged application by using AWS CLI, then update the environment by using AWS CLI
  • E . Package the application code into a .zip file, create a new application version from the AWS Management Console, then rebuild the environment by using AWS CLI

Reveal Solution Hide Solution

Correct Answer: C,D
Question #66

A gaming application stores scores for players in an Amazon DynamoDB table that has four attributes: user_id, user_name, user_score, and user_rank. The users are allowed to update their names only. A user is authenticated by web identity federation.

Which set of conditions should be added in the policy attached to the role for the dynamodb:PutItem API call?

A)

B)

C)

D)

  • A . Option A
  • B . Option B
  • C . Option C
  • D . Option D

Reveal Solution Hide Solution

Correct Answer: A
Question #67

A software company needs to make sure user-uploaded documents are securely stored in Amazon S3. The documents must be encrypted at rest in Amazon S3. The company does not want to manage the security infrastructure in-house, but the company still needs extra protection to ensure it has control over its encryption keys due to industry regulations.

Which encryption strategy should a developer use to meet these requirements?

  • A . Server-side encryption with Amazon S3 managed keys (SSE-S3)
  • B . Server-side encryption with customer-provided encryption keys (SSE-C)
  • C . Server-side encryption with AWS KMS managed keys (SSE-KMS)
  • D . Client-side encryption

Reveal Solution Hide Solution

Correct Answer: C
Question #68

A company’s ecommerce website is experiencing massive traffic spikes, which are causing performance problems in the company database. Users are reporting that accessing the website takes a long time.

A developer wants to implement a caching layer using Amazon ElastiCache. The website is required to be responsive no matter which product a user views, and the updates to product information and prices must be strongly consistent.

Which cache writing policy will satisfy these requirements?

  • A . Write to the cache directly and sync the backend at a later time.
  • B . Write to the backend first and wait for the cache to expire.
  • C . Write to the cache and the backend at the same time.
  • D . Write to the backend first and invalidate the cache.

Reveal Solution Hide Solution

Correct Answer: D
Question #69

A website’s page load times are gradually increasing as more users access the system at the same time. Analysis indicates that a user profile is being loaded from a database in all the web pages being visited by each user and this is increasing the database load and the page load latency. To address this issue the Developer decides to cache the user profile data.

Which caching strategy will address this situation MOST efficiently?

  • A . Create a new Amazon EC2 Instance and run a NoSQL database on it. Cache the profile data within this database using the write-through caching strategy.
  • B . Create an Amazon ElastiCache cluster to cache the user profile data. Use a cache-aside caching strategy.
  • C . Use a dedicated Amazon RDS instance for caching profile data. Use a write-through caching strategy.
  • D . Create an ElastiCache cluster to cache the user profile data. Use a write-through caching strategy.

Reveal Solution Hide Solution

Correct Answer: B
B

Explanation:

https://docs.aws.amazon.com/AmazonElastiCache/latest/mem-ug/Strategies.html
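The cache-aside (lazy-loading) read path from answer B can be modeled in a few lines. The dictionary cache and database below stand in for ElastiCache and the real profile store:

```python
class ProfileService:
    """Cache-aside: check the cache first, fall back to the database on a
    miss, then populate the cache so later reads skip the database."""
    def __init__(self, db):
        self.db = db          # stand-in for the user-profile database
        self.cache = {}       # stand-in for an ElastiCache cluster
        self.db_reads = 0

    def get_profile(self, user_id):
        if user_id in self.cache:
            return self.cache[user_id]       # cache hit: no database load
        self.db_reads += 1                   # cache miss: one database read
        profile = self.db[user_id]
        self.cache[user_id] = profile
        return profile
```

Only the first page view per user hits the database; every subsequent page load is served from the cache, which is exactly what the latency analysis calls for.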

Question #70

How is provisioned throughput affected by the chosen consistency model when reading data from a DynamoDB table?

  • A . Strongly consistent reads use the same amount of throughput as eventually consistent reads
  • B . Strongly consistent reads use more throughput than eventually consistent reads.
  • C . Strongly consistent reads use less throughput than eventually consistent reads
  • D . Strongly consistent reads use variable throughput depending on read activity

Reveal Solution Hide Solution

Correct Answer: B
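The arithmetic behind answer B: one strongly consistent read of an item up to 4 KB costs 1 read capacity unit, while an eventually consistent read costs half that. A small helper makes the comparison concrete:

```python
import math

def read_capacity_units(item_size_bytes, strongly_consistent):
    """RCUs for one read: 1 RCU per 4 KB chunk for strongly consistent reads,
    half that for eventually consistent reads."""
    four_kb_chunks = math.ceil(item_size_bytes / 4096)
    return four_kb_chunks if strongly_consistent else four_kb_chunks / 2
```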

Question #71

An application running on EC2 instances is storing data in an S3 bucket. Security policy mandates that all data must be encrypted in transit.

How can the Developer ensure that all traffic to the S3 bucket is encrypted?

  • A . Install certificates on the EC2 instances.
  • B . Create a bucket policy that allows traffic where SecureTransport is true.
  • C . Create an HTTPS redirect on the EC2 instances.
  • D . Create a bucket policy that denies traffic where SecureTransport is false.

Reveal Solution Hide Solution

Correct Answer: D
D

Explanation:

https://aws.amazon.com/blogs/security/how-to-use-bucket-policies-and-apply-defense-in-depth-to-help-secure-your-amazon-s3-data/
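Answer D can be sketched as a bucket policy whose Bool condition denies every request made without TLS (the bucket name is hypothetical):

```python
def build_tls_only_policy(bucket):
    """Bucket policy denying all access over plain HTTP,
    i.e. requests where aws:SecureTransport is false."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [f"arn:aws:s3:::{bucket}", f"arn:aws:s3:::{bucket}/*"],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }],
    }
```

An explicit Deny overrides any Allow, so even otherwise-authorized clients are refused unless they connect over HTTPS.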

Question #72

A company is running an application built on AWS Lambda functions. One Lambda function has performance issues when it has to download a 50MB file from the Internet in every execution. This function is called multiple times a second.

What solution would give the BEST performance increase?

  • A . Cache the file in the /tmp directory
  • B . Increase the Lambda maximum execution time
  • C . Put an Elastic Load Balancer in front of the Lambda function
  • D . Cache the file in Amazon S3

Reveal Solution Hide Solution

Correct Answer: A
A

Explanation:

https://docs.aws.amazon.com/lambda/latest/dg/runtimes-context.html
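Answer A relies on /tmp surviving across warm invocations of the same Lambda execution environment. A sketch of the pattern, with the 50 MB download replaced by a tiny stub:

```python
import os
import tempfile

# Hypothetical downloader; the real function fetches ~50 MB over the Internet.
def download_from_internet():
    return b"x" * 1024

CALLS = {"downloads": 0}
CACHE_PATH = os.path.join(tempfile.gettempdir(), "large-file.bin")

def get_file():
    """Reuse /tmp across warm invocations: download only when the cached
    copy is missing, so repeated calls skip the slow network fetch."""
    if not os.path.exists(CACHE_PATH):
        CALLS["downloads"] += 1
        with open(CACHE_PATH, "wb") as f:
            f.write(download_from_internet())
    with open(CACHE_PATH, "rb") as f:
        return f.read()
```

The cache is per execution environment, so a cold start still pays for one download, but the many warm invocations per second do not.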

Question #73

A developer needs to deploy a new version to an AWS Elastic Beanstalk application.

How can the developer accomplish this task?

  • A . Upload and deploy the new application version in the Elastic Beanstalk console
  • B . Use the eb init CLI command to deploy a new version
  • C . Terminate the current Elastic Beanstalk environment and create a new one
  • D . Modify the .ebextensions folder to add a source option to services

Reveal Solution Hide Solution

Correct Answer: A
Question #74

An advertising company has a dynamic website with heavy traffic. The company wants to migrate the website infrastructure to AWS to handle everything except website development.

Which solution BEST meets these requirements?

  • A . Use AWS VM Import to migrate a web server image to AWS. Launch the image on a compute-optimized Amazon EC2 instance.
  • B . Launch multiple Amazon Lightsail instances behind a load balancer. Set up the website on those instances.
  • C . Deploy the website code in an AWS Elastic Beanstalk environment. Use Auto Scaling to scale the number of instances.
  • D . Use Amazon S3 to host the website. Use Amazon CloudFront to deliver the content at scale.

Reveal Solution Hide Solution

Correct Answer: C