Did you know that Amazon Web Services has more than a million active users? Enterprise customers make up nearly 10% of all AWS users, while the rest are small and medium businesses (SMBs).
With more and more companies moving to cloud computing, we’re witnessing on-demand cloud computing platforms like AWS, IBM Cloud, Microsoft Azure, Oracle Cloud Infrastructure (Gen 2), and many more leading the revolution.
And while transitioning to the cloud has helped companies save money and resources, organizations still need an easy way to analyze and understand the activities and events happening in their AWS environment.
Whether you want to monitor user changes, security risks, or compliance, AWS CloudTrail is Amazon Web Services’ integrated service for risk management, compliance, and governance checks in the cloud. Learn more about AWS CloudTrail here.
And while you can monitor and filter AWS CloudTrail log data using Amazon CloudWatch, there’s a catch – it’s complicated if you have no experience running command-line queries or lack technical expertise.
AWS CloudWatch Logs Insights is an integrated interactive log analytics capability that allows AWS users to query the CloudTrail logs and analyze the trend of API activities. In simple words, you can use AWS CloudWatch Logs Insights to search and analyze your AWS CloudTrail log data.
To dive deep into the data, though, you need to familiarize yourself with Amazon CloudWatch query commands. Following are a few sample queries we fetched from the AWS Docs for CloudTrail logs:
AWS CloudTrail Log Queries
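For reference, here are two representative CloudWatch Logs Insights queries for CloudTrail logs. The first is adapted from the sample queries in the AWS docs; the second is a sketch that pulls recent AssumeRole calls – the exact fields you select will depend on what your logs contain:

```
# Number of log entries by service, event type, and Region
stats count(*) by eventSource, eventName, awsRegion

# The 25 most recent AssumeRole events
filter eventName = "AssumeRole"
| fields @timestamp, sourceIPAddress, userIdentity.arn
| sort @timestamp desc
| limit 25
```

The log group and time range aren’t part of the query itself – you pick those in the CloudWatch console before running it.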
Learn more about how to monitor and analyze AWS CloudTrail Log Data in Amazon CloudWatch here.
But if you’re someone with little-to-no technical experience and prefer to filter data the old-fashioned way (with spreadsheets!), you can always use Microsoft Excel or Google Sheets.
First, it’s important to note that AWS CloudTrail log events are stored in JSON format. You can process them with jq – a lightweight, flexible command-line JSON processor – but if you have little-to-no experience writing jq queries, you’ll have a difficult time extracting the data you want.
Two good alternatives that let you analyze and filter data in the good old-fashioned manner are Microsoft Excel and Google Sheets.
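To see what that JSON looks like and how it maps to spreadsheet rows, here’s a minimal Python sketch. The field names follow CloudTrail’s record format, but the sample record and its values are made up. It flattens a log file’s Records array into CSV text you could open in any spreadsheet:

```python
import csv
import io
import json

# A tiny stand-in for a CloudTrail log file: one file = one JSON object
# with a "Records" array of event objects.
SAMPLE = json.dumps({
    "Records": [
        {
            "eventTime": "2023-01-01T12:00:00Z",
            "eventName": "AssumeRole",
            "eventSource": "sts.amazonaws.com",
            "sourceIPAddress": "203.0.113.10",  # documentation IP, made up
        }
    ]
})

# The columns we want in the spreadsheet; any other record fields are dropped.
FIELDS = ["eventTime", "eventName", "eventSource", "sourceIPAddress"]

def cloudtrail_to_csv(raw_json: str) -> str:
    """Flatten the Records array into CSV text, one row per event."""
    records = json.loads(raw_json)["Records"]
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=FIELDS, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return out.getvalue()

print(cloudtrail_to_csv(SAMPLE))
```

Real CloudTrail records are heavily nested (userIdentity, requestParameters, and so on), so a production version would also need to flatten those inner objects – which is exactly the kind of busywork the approaches below avoid.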
Large files are always a problem in spreadsheets, whether you’re using Microsoft Excel or Google Sheets. Still, when it comes to simplicity for monitoring and filtering your data, nothing beats a spreadsheet. But there are potential performance issues with both options that you should know about.
You can directly open your AWS CloudTrail log data file using Excel. However, the data will be all messed up. Here’s what it’ll look like:
Opening a JSON file in Excel
To ensure that your AWS CloudTrail data is structured correctly in Excel, you need to import it using Excel’s built-in Get & Transform (Power Query) functionality.

Here’s how to do it in recent versions of Excel:

1. Go to the Data tab and choose Get Data > From File > From JSON, then select your CloudTrail log file.
2. In the Power Query Editor, click the Records list, convert it to a table, and expand the record columns you want.
3. Click Close & Load to bring the structured data into a worksheet.
Lastly, if your CloudTrail log file is large – filled with tens of thousands of rows – you may see Microsoft Excel freeze or even crash your computer, especially on a low-end PC. If you’re getting the “Excel (Not Responding)” error, chances are your spreadsheet file is too big for Excel. And no machine can get around Excel’s hard limit of 1,048,576 rows per worksheet.
And the same can be said for Google Sheets.
Just like Microsoft Excel, loading data from a JSON file into Google Sheets is complicated. And if your CloudTrail log file is large – filled with thousands of rows – your browser may freeze or crash while you work in Google Sheets. If that’s happening, then again – chances are your spreadsheet file is too big.
So, what do you do when your CloudTrail log file is too large for both Microsoft Excel and Google Sheets?
Don’t want to go through the trouble of importing data from your JSON file into Microsoft Excel or Google Sheets? Or maybe your file is simply too large for either program to process?
Let Gigasheet simplify the process for you.
From CSV to JSON to PCAP to ZIP, Gigasheet lets you upload files in numerous formats. You can open files with millions of rows without any technical difficulties, and you can narrow down your search and get the exact data you need with Gigasheet’s Filter, Group by Column, Pivot Mode, and other features.
We loaded a sample AWS CloudTrail log file into Gigasheet. You can upload your CloudTrail log file either by adding the S3 Bucket link where your CloudTrail logs are stored or by uploading the file from your local system. Once uploaded, Gigasheet will process your file in no time. Here’s what the file will look like inside Gigasheet:
AWS CloudTrail File Inside Gigasheet
You can manually arrange the position of the columns by dragging Column Group names – depending on how you want to analyze the data. Let’s swap Event Time & Event Version.
Before Swapping Column Group
After Swapping Column Group
NOTE - We performed the above step just to show you how you can arrange column groups in Gigasheet.
Next, you can individually select the rows and get a detailed view of them in the right panel.
How to Use Gigasheet
Let’s filter the events by their name, “AssumeRole.”
So, we’ll click on “Filter” and add the filter:
Filtering Data by EventName
Click on “Apply” to apply the filter. You can also save the filter – so that you can quickly apply it next time.
Here’s the filtered data:
AssumeRole Filter Results
You can add different filters (one or more than one at a time) to narrow down your search.
Let’s find the entries with Source IP Address as “126.96.36.199.”
Adding SourceIPAddress Filter Inside Gigasheet
Here are the results:
How to Apply Filter in Gigasheet
Now, let’s add another filter:
How to Filter Data by Adding Two or More Filters
These are the events with the Event Name “GetObject” and Source IP Address “126.96.36.199.” We have set the “AND” condition. You can also use the “OR” condition – depending on your requirements.
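In code terms, the difference between the two conditions is just “all conditions must match” versus “any condition may match.” Here’s a rough Python sketch with made-up rows (these aren’t the IPs or counts from our file):

```python
# Made-up CloudTrail-style rows; only the two fields we filter on.
events = [
    {"eventName": "GetObject", "sourceIPAddress": "198.51.100.7"},
    {"eventName": "AssumeRole", "sourceIPAddress": "198.51.100.7"},
    {"eventName": "GetObject", "sourceIPAddress": "203.0.113.9"},
]

def matches(event, conditions, mode="AND"):
    """Check an event against {field: value} conditions with AND/OR logic."""
    checks = (event.get(field) == value for field, value in conditions.items())
    return all(checks) if mode == "AND" else any(checks)

conds = {"eventName": "GetObject", "sourceIPAddress": "198.51.100.7"}
and_hits = [e for e in events if matches(e, conds, "AND")]
or_hits = [e for e in events if matches(e, conds, "OR")]
print(len(and_hits), len(or_hits))  # prints: 1 3
```

With “AND,” only the row matching both the event name and the IP survives; with “OR,” any row matching either condition does – which is why “OR” always returns at least as many rows.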
Now, let’s remove these filters and group our data by IP address:
How to Group Data
Here are the results:
Grouping Data by Column
Now, when we expand the IP address “184.108.40.206,” we get all the entries with that source IP address. You can also add another layer of grouping. We’ve grouped our data by “Source IP Address” – now let’s add a second layer, “Event Name.”
How to Group Data by Adding Multiple Layers of Grouping
So our data is grouped first by “Source IP Address” and then by “Event Name.”
Here are the results:
Grouping Data Results
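Conceptually, this two-level grouping is a nested tally – rows keyed first by source IP, then by event name. A small Python sketch with made-up rows:

```python
from collections import defaultdict

# Made-up rows standing in for CloudTrail events.
events = [
    {"sourceIPAddress": "198.51.100.7", "eventName": "GetObject"},
    {"sourceIPAddress": "198.51.100.7", "eventName": "GetObject"},
    {"sourceIPAddress": "198.51.100.7", "eventName": "AssumeRole"},
    {"sourceIPAddress": "203.0.113.9", "eventName": "GetObject"},
]

# First level: source IP; second level: event name; value: row count.
grouped = defaultdict(lambda: defaultdict(int))
for event in events:
    grouped[event["sourceIPAddress"]][event["eventName"]] += 1

for ip, by_name in grouped.items():
    for name, count in by_name.items():
        print(ip, name, count)
```

Expanding a group in the Gigasheet UI corresponds to drilling into one of these inner dictionaries to see the rows behind each count.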
As simple as that!
You can also remove Column Groups from the right panel. If you want to remove the column “Event Time,” just untick the box “Event Time” from the right panel.
Removing Column Inside Gigasheet
You can also use Pivot mode to group your data with ease. Learn how to use Pivot mode here. And you can perform arithmetic calculations, tap into cross-file VLookup, enrich data by column, and do much more with Gigasheet.
Large spreadsheets are always a problem. We understand.
If you’ve been struggling to analyze AWS CloudTrail files and get your hands on the exact data you want – without running technical queries or facing issues like your program freezing or crashing – Gigasheet is the answer.