June 5, 2018

AWS Logging and Analysis - Part 3 - Exploring Raw Logs in S3 and CloudWatch Logs Log Group

by 4hathacker  |  in AWS Logging and Analysis at  7:59 PM

Hi folks!!!

Welcome to AWS Logging and Analysis Quest...


In the previous article, we configured universal (all-region) logging of AWS CloudTrail events to a CloudWatch Logs log group and an S3 bucket. In this article, we will explore the working configuration and the structure in which the logs are stored.

Let us first check the S3 bucket named "aws-logs-bucket123", which we configured in the previous post.

1. Go to,
  
Services --> Storage --> S3 --> aws-logs-bucket123 --> AWSLogs --> your_account_id --> CloudTrail --> region_name --> 2018 --> 06 --> 04
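Since the key prefix under which CloudTrail delivers its files follows this fixed Year/Month/Day pattern, it can also be built programmatically. Below is a minimal Python sketch; the account ID used is a placeholder, not a real account.

```python
from datetime import date

def cloudtrail_prefix(account_id, region, day):
    """Build the S3 key prefix where CloudTrail delivers log files
    for a given account, region, and day."""
    return (f"AWSLogs/{account_id}/CloudTrail/{region}/"
            f"{day.year}/{day.month:02d}/{day.day:02d}/")

# Placeholder account ID; region and date match this walkthrough.
prefix = cloudtrail_prefix("123456789012", "ap-south-1", date(2018, 6, 4))
print(prefix)  # AWSLogs/123456789012/CloudTrail/ap-south-1/2018/06/04/
```

Such a prefix is handy when listing or downloading a day's worth of logs with the AWS CLI or an SDK.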



2. Since we have enabled universal logging, we can see logs coming in from different regions. For this scenario, I have selected the "ap-south-1" region. The console will look like the screenshot below.



3. The log files saved in S3 are in ".json.gz" format, i.e. gzip-compressed JSON. Each file is a JSON document with a single top-level key, "Records", whose value is a list of event dictionaries (key-value pairs). Download and open a file, and extract some details such as "eventTime", "eventSource", "eventName", "awsRegion", and "eventID".

I have extracted some details from the log as:

a) "eventTime":"2018-06-04T14:23:45Z",
b) "eventSource":"s3.amazonaws.com",
c) "eventName":"PutObject",
d) "awsRegion":"ap-south-1",
e) "eventID": "f27b2290-a72a-4193-8293-74b359a5d94f"
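Because the downloaded file is just gzip-compressed JSON, this extraction can be done with the Python standard library alone. The sketch below builds an in-memory ".json.gz" file from a sample record mirroring the values above, then reads the same fields back; no real log file is assumed.

```python
import gzip
import io
import json

# A minimal CloudTrail-style document, mirroring the fields extracted above.
sample = {"Records": [{
    "eventTime": "2018-06-04T14:23:45Z",
    "eventSource": "s3.amazonaws.com",
    "eventName": "PutObject",
    "awsRegion": "ap-south-1",
    "eventID": "f27b2290-a72a-4193-8293-74b359a5d94f",
}]}

# Simulate a downloaded ".json.gz" file in memory.
buf = io.BytesIO()
with gzip.open(buf, "wt") as f:
    json.dump(sample, f)

# Reading it back: decompress, parse, and pull out the fields of interest.
buf.seek(0)
with gzip.open(buf, "rt") as f:
    records = json.load(f)["Records"]

for r in records:
    print(r["eventTime"], r["eventSource"], r["eventName"])
```

For a real file, replace the in-memory buffer with `gzip.open("path/to/logfile.json.gz", "rt")`.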

Now, we will go through the CloudWatch Logs log group and try to find the same log event using the details extracted above.

1. Go to,

Services --> Management Tools --> CloudWatch --> Logs --> log_group_name --> accountId_CloudTrail_region

In this scenario, the default log group name is "CloudTrail/DefaultLogGroup" and the region is "ap-south-1".



2. Here we can see the same raw logs that we saw in the S3 bucket. I have set the time range to the last 5 minutes and opened the first log event.



3. Now let's use the filter bar to search the events. Try the extracted attribute values one by one, e.g. "2018-06-04T14:23:45Z".



A more precise way of doing the same, using the filter and pattern syntax defined in the AWS documentation, is:
a) {$.eventTime="2018-06-04T14:23:45Z"}
b) {$.eventSource="s3.amazonaws.com"}
and so on for the other attributes.

4. We can also combine filters using the AND (&&) and OR (||) operators, e.g. {($.eventTime="2018-06-04T14:23:45Z") && ($.eventSource="s3.amazonaws.com")}
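To see how such patterns select events, here is a toy Python matcher for this small subset of the syntax (quoted equality checks on top-level keys, combined with && or ||). It is only an illustration of the selection logic; the real CloudWatch filter syntax supports far more (nested keys, wildcards, numeric comparisons, etc.).

```python
import re

def matches(pattern, event):
    """Toy matcher for patterns like {($.key="value") && ($.other="value")}.
    Handles only quoted equality on top-level keys, joined by && or ||."""
    inner = pattern.strip().strip("{}")
    if "&&" in inner:
        return all(matches("{%s}" % part, event) for part in inner.split("&&"))
    if "||" in inner:
        return any(matches("{%s}" % part, event) for part in inner.split("||"))
    m = re.match(r'\s*\(?\s*\$\.(\w+)\s*=\s*"([^"]*)"\s*\)?\s*', inner)
    key, value = m.group(1), m.group(2)
    return event.get(key) == value

event = {"eventTime": "2018-06-04T14:23:45Z",
         "eventSource": "s3.amazonaws.com"}
print(matches('{($.eventTime="2018-06-04T14:23:45Z") '
              '&& ($.eventSource="s3.amazonaws.com")}', event))  # True
```

Against the CloudTrail event above, the combined filter matches only when every clause matches, which is exactly why adding clauses narrows the result set in the console.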



5. While applying filters, one must think about selectivity. "eventID" is always unique, so it pinpoints a single event. "eventTime" may be unique for a particular event and works well in conjunction with other attribute filters. Filters built from the other attributes above, however, may return a large set of raw event logs.

In this post, we have seen a number of attributes in the log events on the basis of which we can segregate events and perform automated tasks. For this purpose, AWS CloudWatch provides the Metric Filters and Alarms features.

Nitin Sharma's DEV Profile