Stream logs to Amazon S3

DataStream 2 supports sending log files to Amazon Simple Storage Service (Amazon S3). Amazon S3 is an object storage service that lets you organize your data and configure fine-grained access controls to meet your specific business, organizational, and compliance requirements.

DataStream uploads logs to Amazon S3 as GZIP-compressed files.
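Once a log file lands in your bucket, you need to decompress it before processing. A minimal sketch of handling a downloaded object with Python's standard gzip module follows; the log records shown are placeholders, not the actual DataStream log schema:

```python
import gzip
import io

# Simulate a GZIP-compressed log object as DataStream delivers it to S3.
# These JSON lines are illustrative only, not the real DataStream format.
log_lines = b'{"cp": "123456", "statusCode": "200"}\n{"cp": "123456", "statusCode": "404"}\n'
compressed = gzip.compress(log_lines)

# After downloading the object from the bucket, decompress and read it
# line by line as plain text:
with gzip.open(io.BytesIO(compressed), "rt", encoding="utf-8") as fh:
    for line in fh:
        print(line.strip())
```

The same approach works on files saved to disk: pass the file path to gzip.open instead of an in-memory buffer.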

For security reasons, DataStream sends logs over TLS even if Amazon S3 policies allow insecure requests.

Before you begin

How to

  1. In Destination, select S3.
  2. In Name, enter a human-readable description for the destination.
  3. In Bucket, enter the name of the bucket you created in the S3 account where you want to store logs.
  4. In Folder path, provide the path to the folder within the bucket where you want to store logs. If the folders don't exist in the bucket, Amazon creates them—for example, logs or logs/diagnostics. You can use Dynamic time variables in folder paths.
    Note: Amazon treats objects that end with / as folders. For example, if you start your path with / like /logs, Amazon creates two folders in your bucket. The first one is named / and it contains the logs folder. See Using folders in AWS and AWS bucket naming rules.
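To see how a time-based folder path plays out, here is a small sketch using Python's strftime tokens as a stand-in. This is only an analogy: DataStream's own dynamic time variable syntax is covered in the Dynamic time variables documentation and may differ from strftime; the function name here is the sketch's own:

```python
from datetime import datetime, timezone

def expand_folder_path(template: str, when: datetime) -> str:
    """Expand strftime-style tokens in a folder path template.

    Illustrative only -- DataStream's dynamic time variable syntax
    may differ; see the Dynamic time variables documentation.
    """
    # Strip a leading "/", which Amazon would otherwise treat as an
    # extra folder with an empty name at the top of the bucket.
    return when.strftime(template).lstrip("/")

ts = datetime(2024, 5, 1, 13, 0, tzinfo=timezone.utc)
print(expand_folder_path("logs/%Y/%m/%d", ts))  # logs/2024/05/01
```

Partitioning log folders by date like this keeps per-folder object counts manageable and makes it easy to prune or query a specific day's logs.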
  5. In Region, enter the AWS region code where the bucket resides. For example, ap-south-1.
  6. In Access key ID, enter the access key associated with the Amazon S3 bucket.
  7. In Secret access key, enter the secret key associated with the Amazon S3 bucket.
    Tip: You can check your authentication details in the .csv file that you saved when creating your access key. If you didn’t download the .csv file, or if you lost it, you may need to delete the existing access key and add a new one. See Managing access keys (console) in AWS.
  8. Click Validate & Save to validate the connection to the destination, and save the details you provided.
    As part of this validation, the system uses the access key ID and secret you provided to create an akamai_write_test_2147483647.txt file in your S3 folder. You can see this file only if validation succeeds and you have access to the Amazon S3 bucket and folder you're sending logs to.
  9. Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic time variables.
    Don't use the . character in file name prefixes, as it may result in errors and data loss. File name suffixes don't support dynamic time variables and can't contain the ., /, %, or ? characters. See AWS object naming conventions.
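The character rules above are easy to check before you save the configuration. A minimal sketch, with function names of its own invention (this is not a DataStream API):

```python
# Characters that file name suffixes can't contain, per the rules above.
SUFFIX_FORBIDDEN = set("./%?")

def is_valid_suffix(suffix: str) -> bool:
    """Return True if the suffix avoids all forbidden characters."""
    return not any(ch in SUFFIX_FORBIDDEN for ch in suffix)

def is_safe_prefix(prefix: str) -> bool:
    """Return True if the prefix avoids the "." character."""
    return "." not in prefix

print(is_valid_suffix("gz"))       # True
print(is_valid_suffix("log.gz"))   # False: contains "."
print(is_safe_prefix("ds2-logs"))  # True
```

Because "%" is among the forbidden suffix characters, strftime-style time tokens are rejected as a side effect, matching the rule that suffixes don't support dynamic time variables.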
  10. Optionally, change the Push frequency to receive bundled logs to your destination every 30 or 60 seconds.
  11. Click Next.