Stream logs to Amazon S3
DataStream 2 supports sending log files to Amazon Simple Storage Service (Amazon S3). Amazon S3 is an object storage service that lets you organize your data and configure fine-tuned access controls to meet your specific business, organizational, and compliance requirements.
DataStream uploads logs to Amazon S3 in a GZIP-compressed file.
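Because logs arrive as GZIP-compressed objects, a downloaded file can be decompressed with standard tools. A minimal Python sketch, where the file name and log line are hypothetical placeholders:

```python
import gzip

# Hypothetical local copy of a log object downloaded from the bucket
path = "ak-logs.gz"

# Write a small sample record so this sketch is self-contained
with gzip.open(path, "wt", encoding="utf-8") as f:
    f.write('{"cp": "12345", "statusCode": "200"}\n')

# DataStream delivers GZIP-compressed files; read them line by line
with gzip.open(path, "rt", encoding="utf-8") as f:
    for line in f:
        print(line.strip())
```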
For security reasons, DataStream sends logs over TLS even if Amazon S3 policies allow insecure requests.
Before you begin
- Create an Identity and Access Management (IAM) user in AWS. See Overview of access management: permissions and policies.
- Create a dedicated storage bucket in an AWS region. See Create storage buckets.
- Grant the user or role that accesses the bucket the appropriate permissions to the bucket contents, including write access for uploading log files.
- Make note of the access key ID and secret access key associated with your account. See Understanding and getting your security credentials.
- Set up and manage server-side encryption (SSE) in the bucket's settings. See Server-side encryption for Amazon S3.
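A policy granting write access to the log folder might look like the following sketch. The bucket name and folder are hypothetical placeholders, and your setup may require additional actions; check the AWS IAM documentation for the full set of S3 permissions.

```python
import json

# Minimal sketch of an IAM policy document allowing log uploads.
# "example-log-bucket" and "logs/diagnostics" are hypothetical names.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": "arn:aws:s3:::example-log-bucket/logs/diagnostics/*",
        }
    ],
}

print(json.dumps(policy, indent=2))
```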
- In Destination, select S3.
- In Name, enter a human-readable description for the destination.
- In Bucket, enter the name of the bucket you created in the S3 account where you want to store logs.
- In Folder path, provide the path to the folder within the bucket where you want to store logs, for example logs/diagnostics. If the folders don't exist in the bucket, Amazon S3 creates them. You can use Dynamic time variables in folder paths.
- In Region, enter the AWS region code where the bucket resides, for example us-east-1.
- In Access key ID, enter the access key ID associated with the Amazon S3 bucket.
- In Secret access key, enter the secret access key associated with the Amazon S3 bucket.
Tip: You can check your authentication details in the .csv file that you saved when creating your access key. If you didn’t download the .csv file, or if you lost it, you may need to delete the existing access key and add a new one. See Managing access keys (console) in AWS.
- Click Validate & Save to validate the connection to the destination and save the details you provided.
As part of this validation process, the system uses the provided access key ID and secret access key to create an akamai_write_test_2147483647.txt file in your S3 folder. This file appears only if the validation process is successful and you have access to the Amazon S3 bucket and folder that you're trying to send logs to.
- Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic time variables, but shouldn't contain the . (dot) character, as it may result in errors and data loss. File name suffixes don't support dynamic time variables and can't contain the ? character. See AWS object naming conventions.
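To illustrate how a time-based prefix and a static suffix combine into an object name, here is a sketch using plain strftime formatting. Note this is not DataStream's own dynamic time variable syntax, and the prefix and suffix values are hypothetical:

```python
from datetime import datetime, timezone

# Hypothetical prefix template and suffix; the real Filename field uses
# DataStream's dynamic time variables rather than strftime codes
prefix_template = "ak-%Y-%m-%d-%H%M"
suffix = "-edge"

ts = datetime(2024, 5, 1, 12, 30, tzinfo=timezone.utc)
filename = ts.strftime(prefix_template) + suffix + ".gz"

# Avoid the characters the docs warn about: "." in prefixes,
# "?" in suffixes
assert "." not in ts.strftime(prefix_template)
assert "?" not in suffix

print(filename)  # ak-2024-05-01-1230-edge.gz
```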
- Optionally, change the Push frequency to receive bundled logs to your destination every 30 or 60 seconds.
- Click Next.