  • Syed Hasan

How To Search AWS CloudTrail Logs

If you're looking for an easy CloudTrail log analysis explainer, look no further. In this blog we'll show you where your CloudTrail logs are stored, how to search and analyze logs with JQ, and how to use Gigasheet to search your AWS CloudTrail logs without any coding or configuration.


On-premises infrastructure is slowly reaching a dead end as cloud adoption continues. AWS, GCP, and Azure are at the forefront of this shift. However, as organizations make this move, one concern that persists is the security of their cloud infrastructure.


Luckily, all three providers do a tremendous job of logging user activity in the cloud. In this article, I’m going to focus on Amazon Web Services and the logs it generates for security analysts to review. In a later series, we might explore other cloud platforms as well.

With that said, let’s first discuss the service AWS uses to log user activities - CloudTrail.

What is AWS CloudTrail?

AWS CloudTrail is a service from AWS that supports governance, risk management, and compliance (GRC) checks on the cloud. CloudTrail is enabled on all AWS accounts by default to ensure operational security. The service records every action performed on the cloud as an ‘event’.

CloudTrail can log all actions taken by:

  • Users

  • Roles

  • AWS Services

These actions can be taken through any of the following modes of accessing cloud infrastructure on AWS:

  • Management Console (the web)

  • Command-line Interface (AWSCLI)

  • AWS SDK (development kits)

  • APIs

If you’re an incident responder performing root-cause analysis, CloudTrail logs can quickly help pinpoint the initial point of intrusion. By picking up a trail of abnormal API calls, you can quickly uncover compromised users or rogue insiders. These logs can be particularly helpful to threat hunters as well. By looking for known patterns of API calls, you can quickly identify suspicious activity and take swift action.
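To make the hunting idea above concrete, here’s a minimal Python sketch of matching CloudTrail events against a watchlist of API calls. The event names are real CloudTrail `eventName` values, but the watchlist, the sample records, and the IP addresses are made up for illustration; which calls you actually hunt for depends on your environment.

```python
import json

# Hypothetical watchlist: API calls often seen during reconnaissance.
SUSPICIOUS_EVENTS = {"GetCallerIdentity", "ListBuckets", "DescribeInstances"}

def find_suspicious(records):
    """Return the records whose eventName appears on the watchlist."""
    return [r for r in records if r.get("eventName") in SUSPICIOUS_EVENTS]

# A tiny, made-up log in CloudTrail's JSON shape (a top-level Records array).
sample = {"Records": [
    {"eventName": "GetCallerIdentity", "sourceIPAddress": "203.0.113.5"},
    {"eventName": "ConsoleLogin", "sourceIPAddress": "198.51.100.7"},
]}

hits = find_suspicious(sample["Records"])
print(json.dumps(hits, indent=2))
```

In practice you’d run this over every record in every log file for the period under investigation, then pivot on the source IPs and user identities of any hits.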

Interested in a more detailed overview of how CloudTrail works? Give this a read. Let’s switch our focus to accessing CloudTrail logs now.


Where are AWS CloudTrail logs stored?

By default, CloudTrail is enabled for all AWS accounts and continuously logs activity performed through the management console, command-line interface, and SDKs. Head over to CloudTrail and open the Event History tab; it will pull up all activity on your AWS account from the last 90 days.

Event History in AWS CloudTrail

However, a trail still needs to be created if you wish to retain events beyond 90 days’ worth of logs. Here’s an excellent article from Amazon itself on how to create trails in AWS CloudTrail.

Now that you’ve configured a trail, it’s time to access your CloudTrail logs. Head over to S3 and find the bucket configured against your trail. CloudTrail logs are nested under the bucket in the following order of folders:

  • Bucket name and prefix of the trail (e.g. ‘cloudsecurity’)

  • AWSLogs

  • AccountID

  • CloudTrail

  • Region

  • Year

  • Month

  • Day

  • Logfile (.gzip archive)
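Putting those folders together, a full object key looks something like this (the bucket name matches the example above; the account ID, region, date, and unique string are placeholders):

```
s3://cloudsecurity/AWSLogs/123456789012/CloudTrail/us-east-1/2021/07/14/123456789012_CloudTrail_us-east-1_20210714T0015Z_aBcDeF12.json.gz
```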

CloudTrail Logs in S3

Once you’re at the end of the trail, you’ll see a gzipped log file with a filename conforming to the following format:

{ACCOUNT_ID}_CloudTrail_{REGION}_{DATETIME}_{UNIQUESTRING}.json.gz.
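If you’re sorting through many of these files, the filename itself tells you the account, region, and timestamp before you even open it. A small Python sketch of pulling those fields out with a regular expression; the sample filename (account ID and unique string) is made up:

```python
import re

# Pattern for the CloudTrail filename format shown above:
# {ACCOUNT_ID}_CloudTrail_{REGION}_{DATETIME}_{UNIQUESTRING}.json.gz
PATTERN = re.compile(
    r"(?P<account_id>\d+)_CloudTrail_(?P<region>[a-z0-9-]+)_"
    r"(?P<datetime>\d{8}T\d{4}Z)_(?P<unique>[A-Za-z0-9]+)\.json\.gz"
)

# Fake account ID and unique string, purely for illustration.
name = "123456789012_CloudTrail_us-east-1_20210714T0015Z_aBcDeF12.json.gz"
parts = PATTERN.match(name).groupdict()
print(parts)
```

This makes it easy to, say, group downloaded files by region or filter to a single day before doing any heavier analysis.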

Let’s download it and analyze CloudTrail logs with Gigasheet and JQ.
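Before reaching for jq, you can sanity-check a downloaded file with a few lines of Python. This sketch assumes the standard CloudTrail layout (a gzipped JSON object with a top-level `Records` array); the sample file it writes is made up so the snippet is self-contained, but in practice `path` would be the `.json.gz` you pulled from S3.

```python
import gzip
import json
import os
import tempfile

def load_cloudtrail(path):
    """Read a gzipped CloudTrail log file and return its list of events."""
    with gzip.open(path, "rt", encoding="utf-8") as f:
        return json.load(f).get("Records", [])

# Write a tiny made-up log file so this example runs on its own.
sample = {"Records": [{"eventName": "ConsoleLogin", "awsRegion": "us-east-1"}]}
path = os.path.join(tempfile.mkdtemp(), "sample.json.gz")
with gzip.open(path, "wt", encoding="utf-8") as f:
    json.dump(sample, f)

events = load_cloudtrail(path)
print(len(events), events[0]["eventName"])
```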

Analyzing CloudTrail Logs with JQ

CloudTrail log files store events in JSON format. To easily view and analyze JSON log files, we can use Gigasheet (more on that later) or a more powerful command-line utility like JQ. JQ is a lightweight command-line processor for JSON files that lets us view, modify, and transform our data as needed.


Looking for an easier method, with no coding or installation? Skip ahead.

Here, I’ll be using the FLAWS2 dataset released by Scott Piper of Summit Route. You can access the exercises released by Scott and download the logs from the S3 service. Start by decompressing the downloaded log files. Once done, you’ll have a few JSON files. Let’s run a simple JQ query to see what data is returned by the processor: