Stream logs to Azure Storage

DataStream 2 supports sending log files to Microsoft Azure Blob Storage. Azure Blob Storage is an object storage service for storing arbitrarily large amounts of unstructured data and serving it to users over HTTP and HTTPS.

DataStream uploads logs to Azure Storage as GZIP-compressed files.

For security reasons, DataStream sends logs over TLS even if Azure container policies allow insecure requests.
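
If you want to inspect a delivered log file outside of the Azure Portal, the following sketch uses the azure-storage-blob Python SDK to download one object and decompress it. The container name, blob path, and AZURE_STORAGE_CONNECTION_STRING environment variable are placeholders for illustration, not values DataStream requires.

```python
# Minimal sketch: read one delivered log file back out of Azure Blob Storage.
# Container name, blob path, and the connection-string variable are hypothetical.
import gzip
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
blob = service.get_blob_client(
    container="datastream-logs",              # hypothetical container name
    blob="logs/diagnostics/ak-example-0.gz",  # hypothetical object name
)

# DataStream uploads logs as GZIP-compressed files, so decompress after download.
compressed = blob.download_blob().readall()
for line in gzip.decompress(compressed).decode("utf-8").splitlines():
    print(line)
```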

Before you begin

Note: DataStream 2 supports streaming data only to block blobs at this time. See Block blobs in Azure.
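
If you want to confirm that objects in your container are block blobs, a sketch like the one below checks the blob type with the azure-storage-blob Python SDK. The container and blob names and the connection-string environment variable are placeholders.

```python
# Minimal sketch: check the blob type of a delivered log object.
# Container and blob names are hypothetical.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
props = service.get_blob_client(
    container="datastream-logs",   # hypothetical container name
    blob="logs/ak-example-0.gz",   # hypothetical object name
).get_blob_properties()

# DataStream streams to block blobs, so this comparison should be True.
print(props.blob_type == "BlockBlob")
```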

How to

  1. In Destination, select Azure Storage.
  2. In Name, enter a human-readable description for the endpoint.
  3. In Storage account name, enter the name of the storage account where you want to store logs.
  4. In Container name, enter the name of the container within your account where you want to store logs.
  5. In Path, enter the path to the folder in the container where you want to store logs—for example, logs or logs/diagnostics. If the folders in the path don't exist, Azure creates them for you. You can use Dynamic time variables in folder paths.
    Don't start your path with /. It may cause your logs to be delivered to the wrong destination.
  6. In Access key, enter either of the access keys associated with your Azure storage account.
  7. Click Validate & Save to validate the connection to the destination and save the integration details you provided. To confirm that log files later arrive in your container, see the sketch after this list.
  8. Optionally, in the Delivery options menu, edit the Filename field to change the prefix and suffix for your log files. File name prefixes support Dynamic time variables.
    For file name prefixes, you shouldn't use the . character, as it may result in errors and data loss. File name suffixes don't support dynamic time variables or the ., /, %, and ? characters. See Naming resources in Azure.
  9. Optionally, change the Push frequency to deliver bundled logs to your destination every 30 or 60 seconds.
  10. Click Next.
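
Once the stream is active, a quick way to confirm that log bundles are arriving is to list the objects under the path you configured in step 5. This is a minimal sketch with the azure-storage-blob Python SDK; the container name, path prefix, and connection-string environment variable are placeholders.

```python
# Minimal sketch: list recently delivered log files under the configured folder path.
# Container name and prefix are hypothetical.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
container = service.get_container_client("datastream-logs")  # hypothetical container

# If your path uses dynamic time variables, the logs land in dated folders
# (for example logs/diagnostics/2024/05/01/), so a narrower prefix keeps the
# listing small. The exact layout depends on the path you configured.
prefix = "logs/diagnostics/"
for blob in container.list_blobs(name_starts_with=prefix):
    print(blob.name, blob.size, blob.last_modified)
```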