
Unlock Cloud Security Mastery: Amazon AWS Certified Security - Specialty SCS-C02 Prep

Ready to fortify your cloud security expertise and stand out in the competitive IT landscape? Our cutting-edge AWS Certified Security - Specialty SCS-C02 practice questions are your secret weapon. Designed by industry veterans, these materials go beyond mere memorization, diving deep into real-world scenarios you'll face as a cloud security specialist. Whether you prefer studying on the go with our PDF format, enjoy interactive learning through our web-based platform, or crave the robust features of our desktop software, we've got you covered. Don't let imposter syndrome hold you back – join thousands of successful candidates who've aced the exam using our resources. With the rising demand for cloud security professionals in fields like DevSecOps and compliance, your Amazon certification could be the key to unlocking lucrative opportunities. Time is ticking – start your journey to becoming an AWS security guru today!

Question 1

A company manages multiple AWS accounts using AWS Organizations. The company's security team notices that some member accounts are not sending AWS CloudTrail logs to a centralized Amazon S3 logging bucket. The security team wants to ensure there is at least one trail configured for all existing accounts and for any account that is created in the future.

Which set of actions should the security team implement to accomplish this?


Correct : C

The correct answer is C. Edit the existing trail in the Organizations management account and apply it to the organization.

The reason is that this is the simplest and most effective way to ensure that there is at least one trail configured for all existing accounts and for any account that is created in the future. According to the AWS documentation, "If you have created an organization in AWS Organizations, you can create a trail that logs all events for all AWS accounts in that organization. This is sometimes called an organization trail." The documentation also states that "The management account for the organization can edit an existing trail in their account, and apply it to an organization, making it an organization trail. Organization trails log events for the management account and all member accounts in the organization." Therefore, by editing the existing trail in the management account and applying it to the organization, the security team can ensure that all accounts are sending CloudTrail logs to a centralized S3 logging bucket.
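
For illustration, converting an existing trail into an organization trail can be done with a single API call from the management account. This is a minimal boto3 sketch: the trail name is a placeholder, and it assumes trusted access for CloudTrail has already been enabled in AWS Organizations.

    import boto3

    # Run with credentials for the Organizations management account.
    # "management-trail" is a placeholder for the name of the existing trail.
    cloudtrail = boto3.client("cloudtrail")

    cloudtrail.update_trail(
        Name="management-trail",
        IsOrganizationTrail=True,  # turn the existing trail into an organization trail
    )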

The other options are incorrect because:

A) Create a new trail and configure it to send CloudTrail logs to Amazon S3. Use Amazon EventBridge to send a notification if a trail is deleted or stopped. This option is not sufficient to ensure that there is at least one trail configured for all accounts, because it does not prevent users from deleting or stopping the trail in their accounts. Even if EventBridge sends a notification, the security team would have to manually restore or restart the trail, which is not efficient or scalable.

B) Deploy an AWS Lambda function in every account to check if there is an existing trail and create a new trail, if needed. This option is not optimal because it requires deploying and maintaining a Lambda function in every account, which adds complexity and cost. Moreover, it does not prevent users from deleting or stopping the trail after it is created by the Lambda function.

D) Create an SCP to deny the cloudtrail:Delete and cloudtrail:Stop actions. Apply the SCP to all accounts. This option is not sufficient to ensure that there is at least one trail configured for all accounts, because it does not create or apply a trail in the first place. It only prevents users from deleting or stopping an existing trail, but it does not guarantee that a trail exists in every account.
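
If the security team also wants the guardrail that option D describes as a complement to the organization trail, a rough sketch of such an SCP follows. It uses the actual action names cloudtrail:DeleteTrail and cloudtrail:StopLogging; the policy name and description are placeholders.

    import json
    import boto3

    organizations = boto3.client("organizations")

    # SCP that blocks member accounts from deleting or stopping trails.
    scp_document = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Deny",
                "Action": ["cloudtrail:DeleteTrail", "cloudtrail:StopLogging"],
                "Resource": "*",
            }
        ],
    }

    organizations.create_policy(
        Name="deny-cloudtrail-tampering",
        Description="Prevent member accounts from deleting or stopping trails",
        Type="SERVICE_CONTROL_POLICY",
        Content=json.dumps(scp_document),
    )

The policy would still need to be attached to the organization root or the relevant OUs (for example, with organizations.attach_policy), and, as noted above, it does not by itself ensure that a trail exists in any account.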


Question 2

A company's data scientists want to create artificial intelligence and machine learning (AI/ML) training models by using Amazon SageMaker. The training models will use large datasets in an Amazon S3 bucket. The datasets contain sensitive information.

On average, the data scientists need 30 days to train models. The S3 bucket has been secured appropriately. The company's data retention policy states that all data that is older than 45 days must be removed from the S3 bucket.

Which action should a security engineer take to enforce this data retention policy?


Correct : A

The correct answer is A. Configure an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days.

The reason is that this is the simplest and most effective way to enforce the data retention policy. According to the AWS documentation, "To manage your objects so that they are stored cost effectively throughout their lifecycle, configure their Amazon S3 Lifecycle. An S3 Lifecycle configuration is a set of rules that define actions that Amazon S3 applies to a group of objects. There are two types of actions: Transition actions and Expiration actions." The documentation also states that "Expiration actions define when objects expire. Amazon S3 deletes expired objects on your behalf." Therefore, by configuring an S3 Lifecycle rule on the S3 bucket to delete objects after 45 days, the security engineer can ensure that the data is removed from the S3 bucket according to the company's policy.
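
As a minimal sketch of such a rule (the bucket name is a placeholder), an expiration action of 45 days can be applied to every object in the bucket:

    import boto3

    s3 = boto3.client("s3")

    # Expire (delete) all objects 45 days after creation.
    s3.put_bucket_lifecycle_configuration(
        Bucket="sagemaker-training-datasets",
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "expire-after-45-days",
                    "Filter": {},            # empty filter: apply to every object
                    "Status": "Enabled",
                    "Expiration": {"Days": 45},
                }
            ]
        },
    )

Because the rule expires objects 45 days after creation, the data scientists' 30-day training window is unaffected while the 45-day retention limit is still enforced.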

The other options are incorrect because:

B) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an S3 event notification to invoke the Lambda function for each PutObject operation. This option is not optimal because it requires deploying and maintaining a Lambda function, which adds complexity and cost. Moreover, it does not guarantee that the data is deleted exactly after 45 days, since the Lambda function is triggered only when a new object is put into the S3 bucket. If there are no new objects for a long period of time, the Lambda function will not run and the data will not be deleted.

C) Create an AWS Lambda function to check the last-modified date of the S3 objects and delete objects that are older than 45 days. Create an Amazon EventBridge rule to invoke the Lambda function each month. This option is not optimal because it requires deploying and maintaining a Lambda function, which adds complexity and cost. Moreover, it does not guarantee that the data is deleted promptly after 45 days, since the Lambda function runs only once a month; objects that cross the 45-day threshold shortly after a monthly run will not be deleted until the next run.

D) Configure S3 Intelligent-Tiering on the S3 bucket to automatically transition objects to another storage class. This option is not sufficient to enforce the data retention policy, because it does not delete the data from the S3 bucket. It only moves the data to a less expensive storage class based on access patterns. According to the AWS documentation, "S3 Intelligent-Tiering optimizes storage costs by automatically moving data between two access tiers, frequent access and infrequent access, when access patterns change." However, this feature does not expire or delete the data after a certain period of time.


Question 3

A company that operates in a hybrid cloud environment must meet strict compliance requirements. The company wants to create a report that includes evidence from on-premises workloads alongside evidence from AWS resources. A security engineer must implement a solution to collect, review, and manage the evidence to demonstrate compliance with company policy.

Which solution will meet these requirements?


Correct : A

The correct answer is A. Create an assessment in AWS Audit Manager from a prebuilt or custom framework, upload manual evidence from the on-premises workloads, and generate an assessment report after Audit Manager collects evidence from the AWS resources.

The reason is that this solution meets the requirements of collecting, reviewing, and managing evidence from both on-premises and AWS resources to demonstrate compliance with company policy. According to the AWS documentation, "AWS Audit Manager helps you continuously audit your AWS usage to simplify how you manage risk and compliance with regulations and industry standards. AWS Audit Manager makes it easier to evaluate whether your policies, procedures, and activities---also known as controls---are operating as intended." The documentation also states that "In addition to the evidence that Audit Manager collects from your AWS environment, you can also upload and centrally manage evidence from your on-premises or multicloud environment." Therefore, by creating an assessment in AWS Audit Manager, the security engineer can use a prebuilt or custom framework that contains the relevant controls for the company policy, upload manual evidence from the on-premises workloads, and add the evidence to the assessment. After Audit Manager collects the necessary evidence from the AWS resources, the security engineer can generate an assessment report that includes all the evidence from both sources.
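
A rough boto3 sketch of this flow is shown below; the framework ID, account ID, role ARN, S3 locations, and control identifiers are all placeholders that would come from the company's own Audit Manager configuration.

    import boto3

    auditmanager = boto3.client("auditmanager")

    # Create the assessment from a prebuilt or custom framework.
    assessment = auditmanager.create_assessment(
        name="hybrid-compliance-assessment",
        assessmentReportsDestination={
            "destinationType": "S3",
            "destination": "s3://compliance-reports-bucket",
        },
        scope={"awsAccounts": [{"id": "111122223333"}]},
        roles=[{"roleType": "PROCESS_OWNER",
                "roleArn": "arn:aws:iam::111122223333:role/AuditOwner"}],
        frameworkId="<framework-id>",
    )

    # Attach manually collected on-premises evidence (already uploaded to S3)
    # to a specific control in the assessment.
    auditmanager.batch_import_evidence_to_assessment_control(
        assessmentId=assessment["assessment"]["metadata"]["id"],
        controlSetId="<control-set-id>",
        controlId="<control-id>",
        manualEvidence=[{"s3ResourcePath": "s3://onprem-evidence-bucket/patching-report.pdf"}],
    )

Once the AWS-sourced evidence has accumulated alongside the imported on-premises evidence, the assessment report can be generated from the same assessment (for example, with create_assessment_report).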

The other options are incorrect because:

B) Install the Amazon CloudWatch agent on the on-premises workloads. Use AWS Config to deploy a conformance pack from a sample conformance pack template or a custom YAML template. Generate an assessment report after AWS Config identifies noncompliant workloads and resources. This option is not sufficient to meet the requirements, because it does not collect or manage the evidence from both sources. It only monitors and evaluates the configuration compliance of the workloads and resources using AWS Config rules. According to the AWS documentation, "A conformance pack is a collection of AWS Config rules and remediation actions that can be easily deployed as a single entity in an account and a Region or across an organization in AWS Organizations." However, a conformance pack does not provide a way to upload or include manual evidence from the on-premises workloads, nor does it generate an assessment report that contains all the evidence.

C) Set up the appropriate security standard in AWS Security Hub. Upload manual evidence from the on-premises workloads. Wait for Security Hub to collect the evidence from the AWS resources. Download the list of controls as a .csv file. This option is not optimal to meet the requirements, because it does not provide a comprehensive or audit-ready report that contains all the evidence. It only provides a list of controls and their compliance status in .csv format. According to the AWS documentation, "Security Hub provides you with a comprehensive view of your security state within AWS and helps you check your environment against security industry standards and best practices." However, Security Hub does not provide a way to upload or include manual evidence from the on-premises workloads, nor does it generate an assessment report that contains all the evidence.

D) Install the Amazon CloudWatch agent on the on-premises workloads. Create a CloudWatch dashboard to monitor the on-premises workloads and the AWS resources. Run a query on the workloads and resources. Download the results. This option is not sufficient to meet the requirements, because it does not collect or manage the evidence from both sources. It only monitors and analyzes the metrics and logs of the workloads and resources using CloudWatch. According to the AWS documentation, "Amazon CloudWatch is a monitoring and observability service built for DevOps engineers, developers, site reliability engineers (SREs), and IT managers." However, CloudWatch does not provide a way to upload or include manual evidence from the on-premises workloads, nor does it generate an assessment report that contains all the evidence.


Question 4

A company that uses AWS Organizations is using AWS IAM Identity Center (AWS Single Sign-On) to administer access to AWS accounts. A security engineer is creating a custom permission set in IAM Identity Center. The company will use the permission set across multiple accounts. An AWS managed policy and a customer managed policy are attached to the permission set. The security engineer has full administrative permissions and is operating in the management account.

When the security engineer attempts to assign the permission set to an IAM Identity Center user who has access to multiple accounts, the assignment fails.

What should the security engineer do to resolve this failure?


Correct : A

https://docs.aws.amazon.com/singlesignon/latest/userguide/howtocmp.html

"Before you assign your permission set with IAM policies, you must prepare your member account. The name of an IAM policy in your member account must be a case-sensitive match to the name of the policy in your management account. IAM Identity Center fails to assign the permission set if the policy doesn't exist in your member account."

In other words, the customer managed policy that the permission set references must first be created, with an exactly matching name, in every member account where the permission set will be assigned.
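
As a hedged illustration, the missing policy could be created in a member account with a call such as the following; the policy name and statement are placeholders, and the name must match the customer managed policy attached to the permission set exactly.

    import json
    import boto3

    # Run with credentials for the member account that the assignment targets.
    iam = boto3.client("iam")

    iam.create_policy(
        PolicyName="SecurityAuditSupplement",   # must match the permission set's policy name exactly
        PolicyDocument=json.dumps({
            "Version": "2012-10-17",
            "Statement": [
                {"Effect": "Allow", "Action": "s3:GetBucketPolicy", "Resource": "*"}
            ],
        }),
    )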


Question 5

A company wants to receive automated email notifications when AWS access keys from developer AWS accounts are detected on code repository sites.

Which solution will provide the required email notifications?


Correct : A

To receive automated email notifications when AWS access keys are detected on code repository sites, use Amazon EventBridge with Amazon GuardDuty findings. Specifically, create an EventBridge rule that matches GuardDuty findings of the UnauthorizedAccess:IAMUser/InstanceCredentialExfiltration type, which indicates potential unauthorized use or exposure of AWS credentials. When such a finding is detected, EventBridge can trigger an action that publishes a notification through Amazon Simple Notification Service (Amazon SNS). By configuring an SNS topic with an email subscription, stakeholders are promptly informed of such security incidents. This approach uses AWS's native security and monitoring services to provide timely alerts with minimal operational overhead, so the company can respond quickly to potential security breaches involving exposed AWS credentials.
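
A minimal sketch of that wiring in boto3 follows; the topic name, email address, and rule name are assumptions made for illustration.

    import json
    import boto3

    sns = boto3.client("sns")
    events = boto3.client("events")

    # SNS topic with an email subscription for the notifications.
    topic_arn = sns.create_topic(Name="exposed-credentials-alerts")["TopicArn"]
    sns.subscribe(TopicArn=topic_arn, Protocol="email", Endpoint="security-team@example.com")

    # EventBridge rule that matches the GuardDuty credential-exfiltration finding type.
    events.put_rule(
        Name="guardduty-credential-exfiltration",
        EventPattern=json.dumps({
            "source": ["aws.guardduty"],
            "detail-type": ["GuardDuty Finding"],
            "detail": {"type": [{"prefix": "UnauthorizedAccess:IAMUser/InstanceCredentialExfiltration"}]},
        }),
    )

    # Send matching findings to the SNS topic.
    events.put_targets(
        Rule="guardduty-credential-exfiltration",
        Targets=[{"Id": "notify-security-team", "Arn": topic_arn}],
    )

The SNS topic's access policy must also allow EventBridge (events.amazonaws.com) to publish to it, and the email subscription must be confirmed before notifications are delivered.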

