Your customer wants to consolidate their log streams (access logs, application logs, security logs, etc.) into one single system. Once consolidated, the customer wants to analyze these logs in real time based on heuristics.

From time to time, the customer needs to validate heuristics, which requires going back to data samples extracted from the last 12 hours.

What is the best approach to meet your customer’s requirements?
A. Send all the log events to Amazon SQS.
B. Set up an Auto Scaling group of EC2 servers to consume the logs and apply the heuristics.
C. Send all the log events to Amazon Kinesis; develop a client process to apply heuristics on the logs.
D. Configure Amazon CloudTrail to receive custom logs; use EMR to apply heuristics to the logs.
E. Set up an Auto Scaling group of EC2 syslogd servers, store the logs on S3, and use EMR to apply heuristics on the logs.

Answer: C

Explanation:

The throughput of an Amazon Kinesis stream is designed to scale without limits by increasing the number of shards within a stream.
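For illustration, here is a minimal boto3 sketch (the stream name "consolidated-logs" is a placeholder assumption, not part of the question) that raises a stream's capacity through the UpdateShardCount API:

```python
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Double a 2-shard stream's capacity; each open shard adds 1 MB/s
# (or 1,000 records/s) of write throughput and 2 MB/s of read throughput.
kinesis.update_shard_count(
    StreamName="consolidated-logs",  # placeholder stream name
    TargetShardCount=4,
    ScalingType="UNIFORM_SCALING",  # the only scaling type the API accepts
)
```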

However, there are certain limits you should keep in mind while using Amazon Kinesis Streams:

By default, records of a stream are accessible for up to 24 hours from the time they are added to the stream. You can raise this limit to 7 days by enabling extended data retention.
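The default 24-hour window already covers the customer's 12-hour lookback; if more headroom were needed, retention can be extended with a single API call. A minimal boto3 sketch, reusing the assumed stream name:

```python
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Extend retention from the 24-hour default to the 7-day maximum
# described above; the 12-hour lookback fits well within either window.
kinesis.increase_stream_retention_period(
    StreamName="consolidated-logs",  # placeholder stream name
    RetentionPeriodHours=168,
)
```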

The maximum size of a data blob (the data payload before Base64-encoding) within one record is 1 megabyte (MB).

Each shard can support up to 1000 PUT records per second.
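Producers typically batch writes and spread partition keys across shards to stay under that per-shard rate. A hedged sketch using the PutRecords API (the stream name and log fields are illustrative assumptions):

```python
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Fabricated sample events standing in for the consolidated log streams.
log_events = [{"source": "access", "msg": f"event {i}"} for i in range(500)]

# PutRecords accepts up to 500 records per call; varied partition keys let
# Kinesis spread the load across shards, since the 1000-records-per-second
# write limit applies per shard, not per stream.
response = kinesis.put_records(
    StreamName="consolidated-logs",  # placeholder stream name
    Records=[
        {"Data": json.dumps(event).encode(), "PartitionKey": f"{event['source']}-{i}"}
        for i, event in enumerate(log_events)
    ],
)
print("Failed records:", response["FailedRecordCount"])
```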

For more information about other API-level limits, see Amazon Kinesis Streams Limits.
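Tying this back to option C: the client process that applies heuristics can simply replay the retained records. A minimal consumer sketch, assuming a single-shard stream, JSON-encoded events, and a placeholder heuristic:

```python
import json

import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Start from the oldest retained record, which (even with default
# retention) includes everything from the last 12 hours.
shard_id = kinesis.describe_stream(StreamName="consolidated-logs")[
    "StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName="consolidated-logs",  # placeholder stream name
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

while iterator:
    batch = kinesis.get_records(ShardIterator=iterator, Limit=100)
    for record in batch["Records"]:
        event = json.loads(record["Data"])
        # Placeholder heuristic: flag any event mentioning an auth failure.
        if "auth failure" in event.get("msg", ""):
            print("ALERT:", event)
    if not batch["Records"] and batch["MillisBehindLatest"] == 0:
        break  # caught up with the tip of the stream
    iterator = batch.get("NextShardIterator")
```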
