Export data to Amazon Web Services
- Create an S3 bucket on your AWS account.
- Give Swrve log-export user access to the bucket.
- (Optional) Provide key or key alias for KMS encryption.
Create S3 bucket
To create an S3 bucket:
Step 1: If you haven’t already done so, sign up for a free Amazon S3 account at http://aws.amazon.com/s3/.
Step 2: In the AWS Management Console, go to S3.
Step 3: Select Create Bucket.
Step 4: Enter the bucket name as swrveexternal-<companyname>.
Step 5: Set the bucket region to match the Swrve data center that stores your app's data and content:
- If your app uses our US data center (for example, https://dashboard.swrve.com/), set the region to US Standard.
- If your app uses our EU data center (for example, https://eu-dashboard.swrve.com), set the region to EU-West-1.
Step 6: Select Create.
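If you prefer the command line to the console, the bucket-creation steps above can be sketched with the AWS CLI. This is only a sketch: "acme" is a placeholder company name, and you should pick the region that matches your Swrve data center.

```shell
# Sketch: create the export bucket with the AWS CLI instead of the console.
# "acme" is a placeholder; use your own company name (lower case, no spaces).
bucket="swrveexternal-acme"
region="eu-west-1"   # use us-east-1 (US Standard) for the US data center

# Printed as a dry run; remove the leading "echo" to actually create the bucket.
# Note: for us-east-1, omit the --create-bucket-configuration flag entirely.
echo aws s3api create-bucket --bucket "$bucket" --region "$region" \
  --create-bucket-configuration "LocationConstraint=$region"
```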
Give Swrve permissions to write to the bucket
After you create the S3 bucket, you need to give Swrve’s log-export account permission to write to the bucket by adding a bucket policy. To add a bucket policy:
Step 1: Select the name of the new bucket.
Step 2: On the top right, select Properties.
Step 3: Select Permissions.
Step 4: Select Add bucket policy.
Step 5: Enter the following JSON configuration, replacing <companyname> with your company name (lower case, no spaces or hyphens). Ensure the JSON code matches exactly.
Step 6: Select Save.
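For reference, the policy entered in Step 5 generally takes the shape below. This is only a sketch: the Swrve log-export principal ARN is a placeholder, not the real value — use the exact JSON provided by Swrve or confirmed by your CSM.

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "SwrveLogExportWrite",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<swrve-account-id>:user/log-export" },
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::swrveexternal-<companyname>/*"
    }
  ]
}
```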
To get access to logs:
Step 1: Send a message to email@example.com to request that Swrve enable log export for the required apps.
Step 2: After your CSM confirms log export is enabled, download s3cmd from s3tools.org.
Step 3: Use s3cmd ls to browse your bucket.
Log files are generated for every hour. Files are stored at s3://swrveexternal-<companyname>/app-<app_id>/ and named YYYY-MM-DDTHHZ.log.gz.
For example, if your company name were acme, your app ID 12345, and you wanted the logs for 15-May-2020 from 14:00 to 15:00, you could download them with the following:
s3cmd get s3://swrveexternal-acme/app-12345/2020-05-15T14Z.log.gz
Once the file is downloaded, decompress it with the following:
gunzip 2020-05-15T14Z.log.gz
This results in the plain text file 2020-05-15T14Z.log containing JSON-formatted events.
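The download-and-decompress steps above can be scripted. The sketch below builds the S3 key for each hour of a day and prints the matching s3cmd and gunzip commands as a dry run; the company name acme and app ID 12345 are the placeholders from the example above.

```shell
# Sketch: fetch and decompress one day's hourly log files.
# "acme" and "12345" are the placeholder company name and app ID.
company="acme"
app_id="12345"
day="2020-05-15"

for hour in $(seq -w 0 23); do
  # Note: zero-padded hours (e.g. T04Z) are an assumption; verify the exact
  # naming with "s3cmd ls" against your own bucket.
  key="s3://swrveexternal-${company}/app-${app_id}/${day}T${hour}Z.log.gz"
  # Dry run: remove the leading "echo" to actually download and decompress.
  echo s3cmd get "$key"
  echo gunzip "${day}T${hour}Z.log.gz"
done
```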
If you would like to use AWS Key Management Service (KMS) to secure the export of your data from Swrve to Amazon, contact your CSM at firstname.lastname@example.org. You need to provide the Amazon Resource Name (ARN) for the KMS key or key alias that you want to use to encrypt the data in the S3 bucket.
Export data to Google Cloud Storage
Swrve logs data for your apps in a Cloud Storage Bucket that you access from the Google Cloud Platform Console. This section covers the requirements for setting up raw data export to a Google Cloud Storage bucket.
- In your Google Cloud Platform Console, create a service account that you want to use to write the data to Google Cloud Storage. The service account name usually follows the same syntax as an email address; for example, email@example.com.
- Create the Cloud Storage bucket that you want the data to be written to. The bucket name must be unique, but there are no other requirements on how you name it.
- Give the Swrve Writer user permission on the bucket, including object permissions that allow Swrve to list the contents of the bucket.
- Get the p12 certificate that authenticates the named service account from the Cloud Storage Console.
- Contact your CSM at firstname.lastname@example.org and ask them to enable your app for Google Cloud Storage data export. You need to provide the service account name, bucket name, and p12 certificate.
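The setup steps above could also be scripted with the gcloud and gsutil CLIs. This is a hedged sketch: the project ID, service account name, and bucket name are all placeholders, and the commands are printed as a dry run.

```shell
# Sketch of the Google Cloud setup steps; all names below are placeholders.
project="my-project"
sa_name="swrve-export"
sa_email="${sa_name}@${project}.iam.gserviceaccount.com"
bucket="my-swrve-export-bucket"

# Dry run: remove the leading "echo" from each line to actually run it.
echo gcloud iam service-accounts create "$sa_name" --project "$project"
echo gsutil mb -p "$project" "gs://${bucket}"
echo gcloud iam service-accounts keys create swrve-key.p12 \
  --iam-account "$sa_email" --key-file-type=p12
```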
To access the data logs in Google Cloud Storage:
Step 1: After your CSM confirms log export is enabled, access the Cloud Storage bucket in your Google Cloud Platform Console. The export files in the bucket are automatically organized into folders named by app ID.
Step 2: Double-click an app folder to view the log files within it. Log files are generated for every hour. File names are formatted as YYYY-MM-DDTHHZ.log.gz.
Step 3: Follow the Google Cloud Platform steps for downloading and viewing the log file. The decompressed file is a plain text file named YYYY-MM-DDTHHZ.log, containing JSON-formatted events.
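If you prefer the command line to the console, the same files can be listed and fetched with gsutil. The bucket name, app ID, and file name below are placeholders, and the commands are printed as a dry run.

```shell
# Sketch: browse and download hourly logs from the Cloud Storage bucket.
# Bucket name, app ID, and file name are placeholders.
bucket="my-swrve-export-bucket"
app_id="12345"
file="2020-05-15T14Z.log.gz"

# Dry run: remove the leading "echo" to actually run the commands.
echo gsutil ls "gs://${bucket}/app-${app_id}/"
echo gsutil cp "gs://${bucket}/app-${app_id}/${file}" .
echo gunzip "$file"
```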
- Learn how to parse the raw event logs by understanding the Swrve data schema.
Need a hosted solution?
Most Swrve customers can self-host this pipeline; all you have to do is follow the steps in these tutorials. However, if you prefer a turn-key hosted solution, we do offer one as a professional service. There is an additional fee associated with this service. For more information, contact your CSM at email@example.com.
Need help with Queries?
Swrve support can help you with basic setup and configuration of the pipeline described above. If you need help with your queries, contact our Data Services team at firstname.lastname@example.org. They will help you get the most out of your data with dedicated support, pre-built and custom reports, and dedicated data science hours.