Amazon DVA-C02 Exam Dumps

07 Jan

Description

Genuine Exam Dumps For DVA-C02:

Prepare Yourself Expertly for DVA-C02 Exam:

Our team of highly skilled and experienced professionals is dedicated to delivering up-to-date and precise study materials in PDF format to our customers. We deeply value both your time and financial investment, and we have spared no effort to provide you with the highest quality work. We ensure that our students consistently achieve a score of more than 95% in the Amazon DVA-C02 exam. We provide only authentic and reliable study material. Our team of professionals works diligently to keep the material updated and notifies students promptly whenever there is any change in the DVA-C02 dumps file. The Amazon DVA-C02 exam question answers and DVA-C02 dumps we offer are as genuine as studying the actual exam content.

24/7 Friendly Approach:

You can reach out to our agents at any time for guidance; we are available 24/7. Our agents will provide you with any information you need and answer any questions you have. We are here to provide you with the complete study material file you need to pass your DVA-C02 exam with extraordinary marks.

Quality Exam Dumps for Amazon DVA-C02:

Pass4surexams provides trusted study material. If you want to achieve sweeping success in your exam, sign up for the complete preparation at Pass4surexams, and we will provide you with genuine material that will help you succeed with distinction. Our experts work tirelessly for our customers, ensuring a seamless journey to passing the Amazon DVA-C02 exam on the first attempt. We have already helped a lot of students ace IT certification exams with our genuine DVA-C02 Exam Question Answers. Don't wait: join us today to collect your favorite certification exam study material and get your dream job quickly.

90 Days Free Updates for Amazon DVA-C02 Exam Question Answers and Dumps:

Enroll with confidence at Pass4surexams, and not only will you access our comprehensive Amazon DVA-C02 exam question answers and dumps, but you will also benefit from a remarkable offer: 90 days of free updates. In the dynamic landscape of certification exams, our commitment to your success doesn't waver. If there are any changes or updates to the Amazon DVA-C02 exam content during the 90-day period, rest assured that our team will promptly notify you and provide the latest study materials, ensuring you are thoroughly prepared for success in your exam.

Amazon DVA-C02 Real Exam Questions:

Quality is the heart of our service; that's why we offer our students real exam questions with 100% passing assurance on the first attempt. Our DVA-C02 dumps PDF has been crafted by experienced experts and is modeled exactly on the real exam questions you will face to earn your certification.

Amazon DVA-C02 Sample Questions

Question # 1
A company has an application that runs across multiple AWS Regions. The application is experiencing performance issues at irregular intervals. A developer must use AWS X-Ray to implement distributed tracing for the application to troubleshoot the root cause of the performance issues. What should the developer do to meet this requirement?

A. Use the X-Ray console to add annotations for AWS services and user-defined services.
B. Use the Region annotation that X-Ray adds automatically for AWS services. Add the Region annotation for user-defined services.
C. Use the X-Ray daemon to add annotations for AWS services and user-defined services.
D. Use the Region annotation that X-Ray adds automatically for user-defined services. Configure X-Ray to add the Region annotation for AWS services.

Answer: B
Explanation: AWS X-Ray automatically adds a Region annotation for AWS services that are integrated with X-Ray. This annotation indicates the AWS Region where the service is running. The developer can use this annotation to filter and group traces by Region and identify any performance issues related to cross-Region calls. The developer can also add a Region annotation for user-defined services by using the X-Ray SDK. This option enables the developer to implement distributed tracing for the application that runs across multiple AWS Regions. References: AWS X-Ray Annotations; AWS X-Ray Concepts.

Question # 2
A company is using AWS CloudFormation to deploy a two-tier application. The application will use Amazon RDS as its backend database. The company wants a solution that will randomly generate the database password during deployment. The solution also must automatically rotate the database password without requiring changes to the application. What is the MOST operationally efficient solution that meets these requirements?

A. Use an AWS Lambda function as a CloudFormation custom resource to generate and rotate the password.
B. Use an AWS Systems Manager Parameter Store resource with the SecureString data type to generate and rotate the password.
C. Use a cron daemon on the application's host to generate and rotate the password.
D. Use an AWS Secrets Manager resource to generate and rotate the password.

Answer: D
Explanation: This solution will meet the requirements by using AWS Secrets Manager, which is a service that helps protect secrets such as database credentials by encrypting them with AWS Key Management Service (AWS KMS) and enabling automatic rotation of secrets. The developer can use an AWS Secrets Manager resource in the AWS CloudFormation template, which enables creating and managing secrets as part of a CloudFormation stack. The developer can use an AWS::SecretsManager::Secret resource type to generate and rotate the password for accessing the RDS database during deployment. The developer can also define an AWS::SecretsManager::RotationSchedule resource for the secret, which defines how often to rotate the secret and which Lambda function to use for rotation logic. Option A is not optimal because it will use an AWS Lambda function as a CloudFormation custom resource, which may introduce additional complexity and overhead for creating and managing a custom resource and implementing rotation logic. Option B is not optimal because it will use an AWS Systems Manager Parameter Store resource with the SecureString data type, which does not support automatic rotation of secrets. Option C is not optimal because it will use a cron daemon on the application's host to generate and rotate the password, which may incur more costs and require more maintenance for running and securing a host. References: AWS Secrets Manager; AWS::SecretsManager::Secret.
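The resources described above can be sketched as a minimal CloudFormation template, shown here as a Python dict for clarity. The logical IDs (DBSecret, DBSecretRotation) and property values are illustrative assumptions; the resource types and property names follow the documented AWS::SecretsManager::Secret and AWS::SecretsManager::RotationSchedule schemas, and a real rotation schedule also needs a rotation Lambda (RotationLambdaARN or HostedRotationLambda).

```python
import json

# Minimal sketch, not a complete stack: a secret whose password is randomly
# generated at deployment, plus a rotation schedule for it.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DBSecret": {
            "Type": "AWS::SecretsManager::Secret",
            "Properties": {
                "Description": "Randomly generated RDS password",
                "GenerateSecretString": {
                    "SecretStringTemplate": '{"username": "admin"}',
                    "GenerateStringKey": "password",
                    "PasswordLength": 32,
                    "ExcludeCharacters": '"@/\\',
                },
            },
        },
        "DBSecretRotation": {
            "Type": "AWS::SecretsManager::RotationSchedule",
            "Properties": {
                "SecretId": {"Ref": "DBSecret"},
                # A real schedule also requires a rotation Lambda
                # (RotationLambdaARN or HostedRotationLambda), omitted here.
                "RotationRules": {"AutomaticallyAfterDays": 30},
            },
        },
    },
}

print(json.dumps(template, indent=2))
```

Because the password is generated by the service and rotation is driven by the schedule, the application never needs a code change when the password changes; it simply fetches the current secret value at runtime.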

Question # 3
A developer is designing a serverless application for a game in which users register and log in through a web browser. The application makes requests on behalf of users to a set of AWS Lambda functions that run behind an Amazon API Gateway HTTP API. The developer needs to implement a solution to register and log in users on the application's sign-in page. The solution must minimize operational overhead and must minimize ongoing management of user identities. Which solution will meet these requirements?

A. Create Amazon Cognito user pools for external social identity providers. Configure IAM roles for the identity pools.
B. Program the sign-in page to create users' IAM groups with the IAM roles attached to the groups.
C. Create an Amazon RDS for SQL Server DB instance to store the users and manage the permissions to the backend resources in AWS.
D. Configure the sign-in page to register and store the users and their passwords in an Amazon DynamoDB table with an attached IAM policy.

Question # 4
A developer needs to build an AWS CloudFormation template that self-populates the AWS Region variable that deploys the CloudFormation template. What is the MOST operationally efficient way to determine the Region in which the template is being deployed?

A. Use the AWS::Region pseudo parameter.
B. Require the Region as a CloudFormation parameter.
C. Find the Region from the AWS::StackId pseudo parameter by using the Fn::Split intrinsic function.
D. Dynamically import the Region by referencing the relevant parameter in AWS Systems Manager Parameter Store.
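Option A's pseudo parameter can be sketched as follows: a template fragment, shown as a Python dict, that exposes the deployment Region through an output. No input parameter is required; CloudFormation resolves the Ref at deployment time. The output name is an illustrative assumption.

```python
import json

# Minimal template fragment: the AWS::Region pseudo parameter self-populates
# with the Region in which the stack is deployed.
template = {
    "Outputs": {
        "DeployedRegion": {
            "Description": "Region this stack was deployed in",
            "Value": {"Ref": "AWS::Region"},
        }
    }
}

print(json.dumps(template, indent=2))
```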

Question # 5
An Amazon Simple Queue Service (Amazon SQS) queue serves as an event source for an AWS Lambda function. In the SQS queue, each item corresponds to a video file that the Lambda function must convert to a smaller resolution. The Lambda function is timing out on longer video files, but the Lambda function's timeout is already configured to its maximum value. What should a developer do to avoid the timeouts without additional code changes?

A. Increase the memory configuration of the Lambda function.
B. Increase the visibility timeout on the SQS queue.
C. Increase the instance size of the host that runs the Lambda function.
D. Use multi-threading for the conversion.

Answer: A
Explanation: Increasing the memory configuration of the Lambda function will also increase the CPU and network throughput available to the function. This can improve the performance of the video conversion process and reduce the execution time of the function. This solution does not require any code changes or additional resources. It is also recommended to follow the best practices for preventing Lambda function timeouts. References: Troubleshoot Lambda function invocation timeout errors | AWS re:Post.

Question # 6
A company needs to deploy all its cloud resources by using AWS CloudFormation templates. A developer must create an Amazon Simple Notification Service (Amazon SNS) automatic notification to help enforce this rule. The developer creates an SNS topic and subscribes the email address of the company's security team to the SNS topic. The security team must receive a notification immediately if an IAM role is created without the use of CloudFormation. Which solution will meet this requirement?

A. Create an AWS Lambda function to filter events from CloudTrail if a role was created without CloudFormation. Configure the Lambda function to publish to the SNS topic. Create an Amazon EventBridge schedule to invoke the Lambda function every 15 minutes.
B. Create an AWS Fargate task in Amazon Elastic Container Service (Amazon ECS) to filter events from CloudTrail if a role was created without CloudFormation. Configure the Fargate task to publish to the SNS topic. Create an Amazon EventBridge schedule to run the Fargate task every 15 minutes.
C. Launch an Amazon EC2 instance that includes a script to filter events from CloudTrail if a role was created without CloudFormation. Configure the script to publish to the SNS topic. Create a cron job to run the script on the EC2 instance every 15 minutes.
D. Create an Amazon EventBridge rule to filter events from CloudTrail if a role was created without CloudFormation. Specify the SNS topic as the target of the EventBridge rule.

Answer: D
Explanation: Creating an Amazon EventBridge rule is the most efficient and scalable way to monitor and react to events from CloudTrail, such as the creation of an IAM role without CloudFormation. EventBridge allows you to specify a filter pattern to match the events you are interested in, and then specify an SNS topic as the target to send notifications. This solution does not require any additional resources or code, and it can trigger notifications in near real-time. The other solutions involve creating and managing additional resources, such as Lambda functions, Fargate tasks, or EC2 instances, and they rely on polling CloudTrail events every 15 minutes, which can introduce delays and increase costs. References: Using Amazon EventBridge rules to process AWS CloudTrail events; Using AWS CloudFormation to create and manage AWS Batch resources; How to use AWS CloudFormation to configure auto scaling for Amazon Cognito and AWS AppSync; Using AWS CloudFormation to automate the creation of AWS WAF web ACLs, rules, and conditions.
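An EventBridge event pattern for the rule described above might look like the following sketch, shown as a Python dict. Matching the IAM CreateRole call via CloudTrail is standard; the userIdentity.invokedBy anything-but matcher for excluding CloudFormation-initiated calls is an assumption to illustrate the filtering idea, and the field names should be verified against real CloudTrail events before use.

```python
import json

# Sketch of an EventBridge event pattern: match CloudTrail's record of the
# IAM CreateRole API call, excluding calls invoked by CloudFormation.
event_pattern = {
    "source": ["aws.iam"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["iam.amazonaws.com"],
        "eventName": ["CreateRole"],
        "userIdentity": {
            # Assumption: invokedBy identifies the calling service when a role
            # is created by CloudFormation; verify against your events.
            "invokedBy": [{"anything-but": "cloudformation.amazonaws.com"}]
        },
    },
}

print(json.dumps(event_pattern, indent=2))
```

The rule's target would be the SNS topic, so a matching event produces a near-real-time notification with no polling infrastructure.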

Question # 7
A company has an application that is hosted on Amazon EC2 instances. The application stores objects in an Amazon S3 bucket and allows users to download objects from the S3 bucket. A developer turns on S3 Block Public Access for the S3 bucket. After this change, users report errors when they attempt to download objects. The developer needs to implement a solution so that only users who are signed in to the application can access objects in the S3 bucket. Which combination of steps will meet these requirements in the MOST secure way? (Select TWO.)

A. Create an EC2 instance profile and role with an appropriate policy. Associate the role with the EC2 instances.
B. Create an IAM user with an appropriate policy. Store the access key ID and secret access key on the EC2 instances.
C. Modify the application to use the S3 GeneratePresignedUrl API call.
D. Modify the application to use the S3 GetObject API call and to return the object handle to the user.
E. Modify the application to delegate requests to the S3 bucket.

Answer: A,C
Explanation: The most secure way to allow the EC2 instances to access the S3 bucket is to use an EC2 instance profile and role with an appropriate policy that grants the necessary permissions. This way, the EC2 instances can use temporary security credentials that are automatically rotated and do not need to store any access keys on the instances. To allow the users who are signed in to the application to download objects from the S3 bucket, the application can use the S3 GeneratePresignedUrl API call to create a pre-signed URL that grants temporary access to a specific object. The pre-signed URL can be returned to the user, who can then use it to download the object within a specified time period. References: Use Amazon S3 with Amazon EC2; How to Access AWS S3 Bucket from EC2 Instance In a Secured Way; Sharing an Object with Others.

Question # 8
A company runs a payment application on Amazon EC2 instances behind an Application Load Balancer. The EC2 instances run in an Auto Scaling group across multiple Availability Zones. The application needs to retrieve application secrets during the application startup and export the secrets as environment variables. These secrets must be encrypted at rest and need to be rotated every month. Which solution will meet these requirements with the LEAST development effort?

A. Save the secrets in a text file and store the text file in Amazon S3. Provision a customer managed key. Use the key for secret encryption in Amazon S3. Read the contents of the text file and export them as environment variables. Configure S3 Object Lambda to rotate the text file every month.
B. Save the secrets as strings in AWS Systems Manager Parameter Store and use the default AWS Key Management Service (AWS KMS) key. Configure an Amazon EC2 user data script to retrieve the secrets during the startup and export them as environment variables. Configure an AWS Lambda function to rotate the secrets in Parameter Store every month.
C. Save the secrets as base64 encoded environment variables in the application properties. Retrieve the secrets during the application startup. Reference the secrets in the application code. Write a script to rotate the secrets saved as environment variables.
D. Store the secrets in AWS Secrets Manager. Provision a new customer master key. Use the key to encrypt the secrets. Enable automatic rotation. Configure an Amazon EC2 user data script to programmatically retrieve the secrets during the startup and export them as environment variables.

Answer: D
Explanation: AWS Secrets Manager is a service that enables the secure management and rotation of secrets, such as database credentials, API keys, or passwords. By using Secrets Manager, the company can avoid hardcoding secrets in the application code or properties files, and instead retrieve them programmatically during the application startup. Secrets Manager also supports automatic rotation of secrets by using AWS Lambda functions or built-in rotation templates. The company can provision a customer master key (CMK) to encrypt the secrets and use the AWS SDK or CLI to export the secrets as environment variables. References: What Is AWS Secrets Manager? – AWS Secrets Manager; Rotating Your AWS Secrets Manager Secrets – AWS Secrets Manager; Retrieving a Secret – AWS Secrets Manager.
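The "retrieve at startup and export as environment variables" step can be sketched as follows. The sample response dict below is hard-coded to mimic the shape of a Secrets Manager GetSecretValue response (an assumption for illustration); in option D the instance's user data script would fetch the real response with the AWS CLI or SDK instead.

```python
import json
import os

# Hard-coded stand-in for a GetSecretValue-style response; a real startup
# script would obtain this from Secrets Manager.
response = {"SecretString": json.dumps({"DB_USER": "app", "DB_PASSWORD": "s3cr3t"})}

# Parse the secret JSON and export each key/value pair as an environment
# variable for the application process.
secrets = json.loads(response["SecretString"])
for key, value in secrets.items():
    os.environ[key] = value

print(os.environ["DB_USER"])
```

Because the application only reads environment variables, rotation in Secrets Manager requires no application code changes; the next startup simply exports the new values.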

Question # 9
A developer is creating a simple proof-of-concept demo by using AWS CloudFormation and AWS Lambda functions. The demo will use a CloudFormation template to deploy an existing Lambda function. The Lambda function uses deployment packages and dependencies stored in Amazon S3. The developer defined an AWS::Lambda::Function resource in a CloudFormation template. The developer needs to add the S3 bucket to the CloudFormation template. What should the developer do to meet these requirements with the LEAST development effort?

A. Add the function code in the CloudFormation template inline as the Code property.
B. Add the function code in the CloudFormation template as the ZipFile property.
C. Find the S3 key for the Lambda function. Add the S3 key as the ZipFile property in the CloudFormation template.
D. Add the relevant key and bucket to the S3Bucket and S3Key properties in the CloudFormation template.

Answer: D
Explanation: The easiest way to add the S3 bucket to the CloudFormation template is to use the S3Bucket and S3Key properties of the AWS::Lambda::Function resource. These properties specify the name of the S3 bucket and the location of the .zip file that contains the function code and dependencies. This way, the developer does not need to modify the function code or upload it to a different location. The other options are either not feasible or not efficient. The Code property can only be used for inline code, not for code stored in S3. The ZipFile property can only be used for code that is less than 4096 bytes, not for code that has dependencies. Finding the S3 key for the Lambda function and adding it as the ZipFile property would not work, as the ZipFile property expects a base64-encoded .zip file, not an S3 location. References: AWS::Lambda::Function – AWS CloudFormation; Deploying Lambda functions as .zip file archives; AWS Lambda Function Code – AWS CloudFormation.
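Option D can be sketched as the following template fragment, shown as a Python dict. The bucket name, key, role ARN, handler, and runtime are placeholders; the S3Bucket and S3Key property names follow the documented AWS::Lambda::Function Code schema.

```python
import json

# Minimal sketch: a Lambda function whose deployment package lives in S3,
# referenced via Code.S3Bucket and Code.S3Key.
template = {
    "Resources": {
        "DemoFunction": {
            "Type": "AWS::Lambda::Function",
            "Properties": {
                "Handler": "index.handler",
                "Runtime": "python3.12",
                "Role": "arn:aws:iam::123456789012:role/demo-role",  # placeholder
                "Code": {
                    "S3Bucket": "my-deployment-bucket",   # placeholder bucket
                    "S3Key": "packages/function.zip",     # placeholder key
                },
            },
        }
    }
}

print(json.dumps(template, indent=2))
```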

Question # 10
A company has a web application that is hosted on Amazon EC2 instances. The EC2 instances are configured to stream logs to Amazon CloudWatch Logs. The company needs to receive an Amazon Simple Notification Service (Amazon SNS) notification when the number of application error messages exceeds a defined threshold within a 5-minute period. Which solution will meet these requirements?

A. Rewrite the application code to stream application logs to Amazon SNS. Configure an SNS topic to send a notification when the number of errors exceeds the defined threshold within a 5-minute period.
B. Configure a subscription filter on the CloudWatch Logs log group. Configure the filter to send an SNS notification when the number of errors exceeds the defined threshold within a 5-minute period.
C. Install and configure the Amazon Inspector agent on the EC2 instances to monitor for errors. Configure Amazon Inspector to send an SNS notification when the number of errors exceeds the defined threshold within a 5-minute period.
D. Create a CloudWatch metric filter to match the application error pattern in the log data. Set up a CloudWatch alarm based on the new custom metric. Configure the alarm to send an SNS notification when the number of errors exceeds the defined threshold within a 5-minute period.

Answer: D
Explanation: The best solution is to create a CloudWatch metric filter to match the application error pattern in the log data. This will allow you to create a custom metric that tracks the number of errors in your application. You can then set up a CloudWatch alarm based on this metric and configure it to send an SNS notification when the number of errors exceeds a defined threshold within a 5-minute period. This solution does not require any changes to your application code or installing any additional agents on your EC2 instances. It also leverages the existing integration between CloudWatch and SNS for sending notifications. References: Create Metric Filters – Amazon CloudWatch Logs; Creating Amazon CloudWatch Alarms – Amazon CloudWatch; How to send alert based on log message on CloudWatch – Stack Overflow.
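The two pieces of option D can be sketched as boto3-style request parameters, shown as plain dicts so no AWS call is made. The log group name, metric names, namespace, filter pattern, threshold, and topic ARN are placeholder assumptions; Period=300 provides the 5-minute evaluation window.

```python
# Sketch: a metric filter that counts log events matching "ERROR" ...
metric_filter_params = {
    "logGroupName": "/app/web",          # placeholder log group
    "filterName": "app-errors",
    "filterPattern": "ERROR",            # placeholder error pattern
    "metricTransformations": [{
        "metricName": "ApplicationErrors",
        "metricNamespace": "WebApp",
        "metricValue": "1",              # emit 1 per matching log event
    }],
}

# ... and an alarm on that custom metric over a 5-minute period, notifying SNS.
alarm_params = {
    "AlarmName": "app-error-threshold",
    "Namespace": "WebApp",
    "MetricName": "ApplicationErrors",
    "Statistic": "Sum",
    "Period": 300,                       # 5-minute window
    "EvaluationPeriods": 1,
    "Threshold": 10,                     # example threshold
    "ComparisonOperator": "GreaterThanThreshold",
    "AlarmActions": ["arn:aws:sns:us-east-1:123456789012:app-alerts"],  # placeholder
}

print(alarm_params["Period"])
```

These dicts correspond to the CloudWatch Logs put_metric_filter and CloudWatch put_metric_alarm calls; neither requires touching the application or the instances.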

Question # 11
A developer designed an application on an Amazon EC2 instance. The application makes API requests to objects in an Amazon S3 bucket. Which combination of steps will ensure that the application makes the API requests in the MOST secure manner? (Select TWO.)

A. Create an IAM user that has permissions to the S3 bucket. Add the user to an IAM group.
B. Create an IAM role that has permissions to the S3 bucket.
C. Add the IAM role to an instance profile. Attach the instance profile to the EC2 instance.
D. Create an IAM role that has permissions to the S3 bucket. Assign the role to an IAM group.
E. Store the credentials of the IAM user in the environment variables on the EC2 instance.

Answer: B,C
Explanation: Create an IAM role that has permissions to the S3 bucket, add the IAM role to an instance profile, and attach the instance profile to the EC2 instance. We first need to create an IAM role with permissions to read and, if needed, write to the specific S3 bucket. Then, we attach the role to the EC2 instance through an instance profile. In this way, the EC2 instance has the permissions to access the specified S3 bucket without storing any long-term credentials on the instance.
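The role described above needs a trust policy that allows EC2 to assume it; a minimal sketch follows. The policy shape is the standard IAM trust-policy document; the permissions policy granting S3 access would be attached separately.

```python
import json

# Sketch: trust policy letting the EC2 service assume the role. Attaching
# the role to an instance profile then gives the instance temporary,
# automatically rotated credentials.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "ec2.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

print(json.dumps(trust_policy, indent=2))
```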

Question # 12
A developer is using AWS Step Functions to automate a workflow. The workflow defines each step as an AWS Lambda function task. The developer notices that runs of the Step Functions state machine fail in the GetResource task with either an IllegalArgumentException error or a TooManyRequestsException error. The developer wants the state machine to stop running when the state machine encounters an IllegalArgumentException error. The state machine needs to retry the GetResource task one additional time after 10 seconds if the state machine encounters a TooManyRequestsException error. If the second attempt fails, the developer wants the state machine to stop running. How can the developer implement the Lambda retry functionality without adding unnecessary complexity to the state machine?

A. Add a Delay task after the GetResource task. Add a catcher to the GetResource task. Configure the catcher with an error type of TooManyRequestsException. Configure the next step to be the Delay task. Configure the Delay task to wait for an interval of 10 seconds. Configure the next step to be the GetResource task.
B. Add a catcher to the GetResource task. Configure the catcher with an error type of TooManyRequestsException, an interval of 10 seconds, and a maximum attempts value of 1. Configure the next step to be the GetResource task.
C. Add a retrier to the GetResource task. Configure the retrier with an error type of TooManyRequestsException, an interval of 10 seconds, and a maximum attempts value of 1.
D. Duplicate the GetResource task. Rename the new GetResource task to TryAgain. Add a catcher to the original GetResource task. Configure the catcher with an error type of TooManyRequestsException. Configure the next step to be TryAgain.

Answer: C
Explanation: The best way to implement the Lambda retry functionality is to use the Retry field in the state definition of the GetResource task. The Retry field allows the developer to specify an array of retriers, each with an error type, an interval, and a maximum number of attempts. By setting the error type to TooManyRequestsException, the interval to 10 seconds, and the maximum attempts to 1, the developer can achieve the desired behavior of retrying the GetResource task once after 10 seconds if it encounters a TooManyRequestsException error. If the retry fails, the state machine will stop running. If the GetResource task encounters an IllegalArgumentException error, the state machine will also stop running without retrying, as this error type is not specified in the Retry field. References: Error handling in Step Functions; Handling Errors, Retries, and adding Alerting to Step Function State Machine Executions; The Jitter Strategy for Step Functions Error Retries on the New Workflow Studio.
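The state definition described above can be sketched in Amazon States Language, shown here as a Python dict. The Lambda ARN is a placeholder; the Retry field names (ErrorEquals, IntervalSeconds, MaxAttempts) follow the documented ASL error-handling syntax.

```python
import json

# Sketch of the GetResource task state: retry TooManyRequestsException once
# after 10 seconds; any other error (e.g. IllegalArgumentException) fails
# the state machine immediately.
get_resource_state = {
    "Type": "Task",
    "Resource": "arn:aws:lambda:us-east-1:123456789012:function:GetResource",  # placeholder
    "Retry": [{
        "ErrorEquals": ["TooManyRequestsException"],
        "IntervalSeconds": 10,
        "MaxAttempts": 1,
    }],
    "End": True,
}

print(json.dumps(get_resource_state, indent=2))
```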

Question # 13
A developer creates a static website for their department. The developer deploys the static assets for the website to an Amazon S3 bucket and serves the assets with Amazon CloudFront. The developer uses origin access control (OAC) on the CloudFront distribution to access the S3 bucket. The developer notices users can access the root URL and specific pages but cannot access directories without specifying a file name. For example, /products/index.html works, but /products returns an error. The developer needs to enable accessing directories without specifying a file name without exposing the S3 bucket publicly. Which solution will meet these requirements?

A. Update the CloudFront distribution's settings so that index.html is set as the default root object.
B. Update the Amazon S3 bucket settings and enable static website hosting. Specify index.html as the index document. Update the S3 bucket policy to enable access. Update the CloudFront distribution's origin to use the S3 website endpoint.
C. Create a CloudFront function that examines the request URL and appends index.html when directories are being accessed. Add the function as a viewer request CloudFront function to the CloudFront distribution's behavior.
D. Create a custom error response on the CloudFront distribution with the HTTP error code set to the HTTP 404 Not Found response code and the response page path to /index.html. Set the HTTP response code to the HTTP 200 OK response code.

Answer: A
Explanation: The simplest and most efficient way to enable accessing directories without specifying a file name is to update the CloudFront distribution's settings to set index.html as the default root object. This will instruct CloudFront to return the index.html object when a user requests the root URL or a directory URL for the distribution. This solution does not require enabling static website hosting on the S3 bucket, creating a CloudFront function, or creating a custom error response. References: Specifying a default root object; cloudfront-default-root-object-configured; How to set up CloudFront default root object?; Ensure a default root object is configured for AWS CloudFront.
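The setting from option A can be sketched as the relevant fragment of a CloudFront distribution configuration, shown as a Python dict. A real DistributionConfig also requires origins, cache behaviors, and other fields, which are omitted here as out of scope.

```python
# Minimal sketch: the distribution-level default root object setting.
# With this set, a request for the distribution's root URL is served
# index.html instead of producing an error.
distribution_config = {
    "DefaultRootObject": "index.html",
    # ... origins, default cache behavior, and other required fields omitted ...
}

print(distribution_config["DefaultRootObject"])
```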

Question # 14
A company has an existing application that has hardcoded database credentials. A developer needs to modify the existing application. The application is deployed in two AWS Regions with an active-passive failover configuration to meet the company's disaster recovery strategy. The developer needs a solution to store the credentials outside the code. The solution must comply with the company's disaster recovery strategy. Which solution will meet these requirements in the MOST secure way?

A. Store the credentials in AWS Secrets Manager in the primary Region. Enable secret replication to the secondary Region. Update the application to use the Amazon Resource Name (ARN) based on the Region.
B. Store credentials in AWS Systems Manager Parameter Store in the primary Region. Enable parameter replication to the secondary Region. Update the application to use the Amazon Resource Name (ARN) based on the Region.
C. Store credentials in a config file. Upload the config file to an S3 bucket in the primary Region. Enable Cross-Region Replication (CRR) to an S3 bucket in the secondary Region. Update the application to access the config file from the S3 bucket based on the Region.
D. Store credentials in a config file. Upload the config file to an Amazon Elastic File System (Amazon EFS) file system. Update the application to use the Amazon EFS file system Regional endpoints to access the config file in the primary and secondary Regions.

Answer: A
Explanation: AWS Secrets Manager is a service that allows you to store and manage secrets, such as database credentials, API keys, and passwords, in a secure and centralized way. It also provides features such as automatic secret rotation, auditing, and monitoring. By using AWS Secrets Manager, you can avoid hardcoding credentials in your code, which is a bad security practice and makes it difficult to update them. You can also replicate your secrets to another Region, which is useful for disaster recovery purposes. To access your secrets from your application, you can use the ARN of the secret, which is a unique identifier that includes the Region name. This way, your application can use the appropriate secret based on the Region where it is deployed. References: AWS Secrets Manager; Replicating and sharing secrets; Using your own encryption keys.
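Option A can be sketched as boto3-style CreateSecret parameters plus the Region-specific ARN the application would compute, shown as plain Python so no AWS call is made. The secret name, Regions, account ID, and ARN suffix are placeholders; Secrets Manager appends a random 6-character suffix to real secret ARNs.

```python
# Sketch: create the secret in the primary Region and replicate it to the
# secondary Region in one call.
create_secret_params = {
    "Name": "prod/app/db-credentials",                       # placeholder name
    "SecretString": '{"username":"app","password":"example"}',
    "AddReplicaRegions": [{"Region": "us-west-2"}],          # secondary Region
}

def secret_arn(region, account_id, name, suffix="AbCdEf"):
    # Secrets Manager ARNs embed the Region, so each Region's replica has its
    # own ARN. The 6-character suffix is assigned by the service; this one is
    # made up for illustration.
    return f"arn:aws:secretsmanager:{region}:{account_id}:secret:{name}-{suffix}"

# The application picks the ARN for whichever Region it is running in.
print(secret_arn("us-east-1", "123456789012", "prod/app/db-credentials"))
```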

Question # 15
A developer must use multi-factor authentication (MFA) to access data in an Amazon S3 bucket that is in another AWS account. Which AWS Security Token Service (AWS STS) API operation should the developer use with the MFA information to meet this requirement?

A. AssumeRoleWithWebIdentity
B. GetFederationToken
C. AssumeRoleWithSAML
D. AssumeRole

Answer: D
Explanation: The AssumeRole API operation returns a set of temporary security credentials that can be used to access resources in another AWS account. The developer can specify the MFA device serial number and the MFA token code in the request parameters. This option enables the developer to use MFA to access data in an S3 bucket that is in another AWS account. The other options are not relevant or effective for this scenario. References: AssumeRole; Requesting Temporary Security Credentials.
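The AssumeRole request with MFA can be sketched as boto3-style parameters, shown as a plain dict so no AWS call is made. The role ARN, MFA device ARN, and token code are placeholders; SerialNumber and TokenCode are the documented STS AssumeRole parameters that carry the MFA information.

```python
# Sketch: parameters for sts.assume_role(...) when MFA is required by the
# cross-account role's trust policy.
assume_role_params = {
    "RoleArn": "arn:aws:iam::210987654321:role/cross-account-s3-read",  # placeholder
    "RoleSessionName": "dev-mfa-session",
    "SerialNumber": "arn:aws:iam::123456789012:mfa/developer",  # MFA device (placeholder)
    "TokenCode": "123456",        # current one-time code from the device
    "DurationSeconds": 3600,      # temporary credentials valid for 1 hour
}

print(assume_role_params["RoleArn"])
```

The response's temporary credentials (access key, secret key, session token) are then used to call S3 in the other account.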

Question # 16
A developer is working on an ecommerce platform that communicates with several third-party payment processing APIs. The third-party payment services do not provide a test environment. The developer needs to validate the ecommerce platform's integration with the third-party payment processing APIs. The developer must test the API integration code without invoking the third-party payment processing APIs. Which solution will meet these requirements?

A. Set up an Amazon API Gateway REST API with a gateway response configured for status code 200. Add response templates that contain sample responses captured from the real third-party API.
B. Set up an AWS AppSync GraphQL API with a data source configured for each third-party API. Specify an integration type of Mock. Configure integration responses by using sample responses captured from the real third-party API.
C. Create an AWS Lambda function for each third-party API. Embed responses captured from the real third-party API. Configure Amazon Route 53 Resolver with an inbound endpoint for each Lambda function's Amazon Resource Name (ARN).
D. Set up an Amazon API Gateway REST API for each third-party API. Specify an integration request type of Mock. Configure integration responses by using sample responses captured from the real third-party API.

Answer: D
Explanation: Amazon API Gateway can mock responses for testing purposes without requiring any integration backend. This allows the developer to test the API integration code without invoking the third-party payment processing APIs. The developer can configure integration responses by using sample responses captured from the real third-party API. References: Mocking Integration Responses in API Gateway; Set up Mock Integrations for an API in API Gateway.
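A mock integration can be sketched as boto3-style API Gateway parameters, shown as plain dicts so no AWS call is made. The API, resource, and response payload values are placeholders; the MOCK integration type and the `{"statusCode": 200}` request template follow API Gateway's documented mock-integration setup.

```python
# Sketch: a MOCK integration whose request template selects the status code...
put_integration_params = {
    "restApiId": "abc123",        # placeholder API ID
    "resourceId": "res456",       # placeholder resource ID
    "httpMethod": "POST",
    "type": "MOCK",
    "requestTemplates": {"application/json": '{"statusCode": 200}'},
}

# ...and an integration response returning a canned payload captured from the
# real third-party API (payload values are placeholders).
put_integration_response_params = {
    "restApiId": "abc123",
    "resourceId": "res456",
    "httpMethod": "POST",
    "statusCode": "200",
    "responseTemplates": {
        "application/json": '{"paymentId": "p-123", "status": "APPROVED"}'
    },
}

print(put_integration_params["type"])
```

Pointing the platform's payment-API base URL at this mock endpoint lets the integration code run end to end without ever reaching the real processors.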

Question # 17
An AWS Lambda function requires read access to an Amazon S3 bucket and requires read/write access to an Amazon DynamoDB table. The correct IAM policy already exists. What is the MOST secure way to grant the Lambda function access to the S3 bucket and the DynamoDB table?

A. Attach the existing IAM policy to the Lambda function.
B. Create an IAM role for the Lambda function. Attach the existing IAM policy to the role. Attach the role to the Lambda function.
C. Create an IAM user with programmatic access. Attach the existing IAM policy to the user. Add the user access key ID and secret access key as environment variables in the Lambda function.
D. Add the AWS account root user access key ID and secret access key as encrypted environment variables in the Lambda function.

Answer: B
Explanation: The most secure way to grant the Lambda function access to the S3 bucket and the DynamoDB table is to create an IAM role for the Lambda function and attach the existing IAM policy to the role. This way, you can use the principle of least privilege and avoid exposing any credentials in your function code or environment variables. You can also leverage the temporary security credentials that AWS provides to the Lambda function when it assumes the role. This solution follows the best practices for working with AWS Lambda functions and designing and architecting with DynamoDB. References: Best practices for working with AWS Lambda functions; Best practices for designing and architecting with DynamoDB.

Question # 18
A company hosts a client-side web application for one of its subsidiaries on Amazon S3. The web application can be accessed through Amazon CloudFront from https://www.example.com. After a successful rollout, the company wants to host three more client-side web applications for its remaining subsidiaries on three separate S3 buckets. To achieve this goal, a developer moves all the common JavaScript files and web fonts to a central S3 bucket that serves the web applications. However, during testing, the developer notices that the browser blocks the JavaScript files and web fonts. What should the developer do to prevent the browser from blocking the JavaScript files and web fonts?

A. Create four access points that allow access to the central S3 bucket. Assign an access point to each web application bucket.
B. Create a bucket policy that allows access to the central S3 bucket. Attach the bucket policy to the central S3 bucket.
C. Create a cross-origin resource sharing (CORS) configuration that allows access to the central S3 bucket. Add the CORS configuration to the central S3 bucket.
D. Create a Content-MD5 header that provides a message integrity check for the central S3 bucket. Insert the Content-MD5 header for each web application request.

Answer: C
Explanation: This is a common issue. By default, browsers block web applications from loading resources from other origins, with a few exceptions. You must configure CORS on the resources to be accessed. See https://docs.aws.amazon.com/AmazonS3/latest/userguide/cors.html
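A CORS configuration on the central S3 bucket might look like the sketch below. The subsidiary domains listed in `AllowedOrigins` are placeholders; in practice they would be the CloudFront domains serving the four web applications:

```json
[
  {
    "AllowedOrigins": [
      "https://www.example.com",
      "https://subsidiary-a.example.com",
      "https://subsidiary-b.example.com",
      "https://subsidiary-c.example.com"
    ],
    "AllowedMethods": ["GET"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3000
  }
]
```

With this configuration applied, S3 returns the `Access-Control-Allow-Origin` header for matching requests, and the browser stops blocking the shared JavaScript files and web fonts.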

Question # 19
A company runs an application on AWS. The application stores data in an Amazon DynamoDB table. Some queries are taking a long time to run. These slow queries involve an attribute that is not the table's partition key or sort key. The amount of data that the application stores in the DynamoDB table is expected to increase significantly. A developer must increase the performance of the queries. Which solution will meet these requirements?

A. Increase the page size for each request by setting the Limit parameter to be higher than the default value. Configure the application to retry any request that exceeds the provisioned throughput.
B. Create a global secondary index (GSI). Set the query attribute to be the partition key of the index.
C. Perform a parallel scan operation by issuing individual scan requests. In the parameters, specify the segment for the scan requests and the total number of segments for the parallel scan.
D. Turn on read capacity auto scaling for the DynamoDB table. Increase the maximum read capacity units (RCUs).

Answer: B
Explanation: Creating a global secondary index (GSI) is the best solution to improve the performance of queries that involve an attribute that is not the table's partition key or sort key. A GSI allows you to define an alternate key for your table and query the data using that key. This way, you avoid scanning the entire table and reduce the latency and cost of your queries. You should also follow the best practices for designing and using GSIs in DynamoDB.
References: Working with Global Secondary Indexes – Amazon DynamoDB; DynamoDB Performance & Latency – Everything You Need To Know.
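As a sketch of what adding such a GSI involves, the helper below builds the parameters for DynamoDB's UpdateTable operation. The table name, attribute name, and index name are hypothetical; the resulting dictionary would typically be passed to a boto3 client as `client.update_table(**params)` (for a provisioned-capacity table, a `ProvisionedThroughput` entry would also be required inside `Create`):

```python
def build_gsi_update(table_name: str, attr_name: str, index_name: str) -> dict:
    """Build UpdateTable parameters that add a GSI keyed on attr_name.

    attr_name is the attribute the slow queries filter on; making it the
    GSI partition key lets those reads use Query instead of Scan.
    """
    return {
        "TableName": table_name,
        # Any attribute used in an index key schema must be declared here.
        "AttributeDefinitions": [
            {"AttributeName": attr_name, "AttributeType": "S"},
        ],
        "GlobalSecondaryIndexUpdates": [
            {
                "Create": {
                    "IndexName": index_name,
                    "KeySchema": [
                        {"AttributeName": attr_name, "KeyType": "HASH"},
                    ],
                    # Project all attributes so queries need no follow-up reads.
                    "Projection": {"ProjectionType": "ALL"},
                }
            }
        ],
    }


params = build_gsi_update("Orders", "status", "status-index")
print(params["GlobalSecondaryIndexUpdates"][0]["Create"]["IndexName"])  # → status-index
```

Once the index is active, the application queries the GSI by the new partition key rather than scanning the base table.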

Question # 20
A developer is creating an AWS Lambda function that searches for items from an Amazon DynamoDB table that contains customer contact information. The DynamoDB table items have the customer's email_address as the partition key and additional properties such as customer_type, name, and job_title. The Lambda function runs whenever a user types a new character into the customer_type text input. The developer wants the search to return partial matches of the email_address property for a particular customer_type. The developer does not want to recreate the DynamoDB table. What should the developer do to meet these requirements?

A. Add a global secondary index (GSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key. Perform a query operation on the GSI by using the begins_with key condition expression with the email_address property.
B. Add a global secondary index (GSI) to the DynamoDB table with email_address as the partition key and customer_type as the sort key. Perform a query operation on the GSI by using the begins_with key condition expression with the email_address property.
C. Add a local secondary index (LSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key. Perform a query operation on the LSI by using the begins_with key condition expression with the email_address property.
D. Add a local secondary index (LSI) to the DynamoDB table with job_title as the partition key and email_address as the sort key. Perform a query operation on the LSI by using the begins_with key condition expression with the email_address property.

Answer: A
Explanation: By adding a global secondary index (GSI) to the DynamoDB table with customer_type as the partition key and email_address as the sort key, the developer can perform a query operation on the GSI using the begins_with key condition expression with the email_address property. This returns partial matches of the email_address property for a specific customer_type. (Options C and D are invalid because an LSI must share the base table's partition key and can only be created when the table is created, which would require recreating the table.)
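A minimal sketch of the Query request for answer A follows. The table name and index name are hypothetical; the dictionary would typically be passed to a boto3 client as `client.query(**query)`:

```python
def build_search_query(customer_type: str, email_prefix: str) -> dict:
    """Build Query parameters that match a customer_type exactly and
    email addresses by prefix, using the GSI from answer A."""
    return {
        "TableName": "CustomerContacts",
        "IndexName": "customer_type-email_address-index",
        # Equality on the GSI partition key, begins_with on the sort key.
        "KeyConditionExpression":
            "customer_type = :ct AND begins_with(email_address, :prefix)",
        "ExpressionAttributeValues": {
            ":ct": {"S": customer_type},
            ":prefix": {"S": email_prefix},
        },
    }


query = build_search_query("premium", "ali")
print(query["KeyConditionExpression"])
# → customer_type = :ct AND begins_with(email_address, :prefix)
```

Because begins_with can only be applied to a sort key, placing email_address in the sort-key position (with customer_type as the partition key) is what makes the prefix search possible in a single Query.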