You can transfer data from your Pigment Views, Lists, Metrics, and Tables to Amazon S3 with our REST API.
First, you create an AWS Lambda function that runs a Python script. This script retrieves the Pigment API data and stores it in an Amazon S3 bucket.
Recommended reading
In case you need it, we’ve compiled some useful background information for you.
Before you begin
Before you start your export, we recommend that you complete the following tasks in Amazon S3 and Pigment.
Amazon S3
You need an AWS account with access to Amazon S3 to perform this export.
Consult with your Amazon S3 admin. You may need to obtain permissions and information to successfully complete your export.
Pigment
⚠️ Important
Only Pigment members with the Workspace Admin or Security Admin account type can access the API key management page and manage API keys.
Obtain your Export API key. This is explained in Manage API Keys.
Obtain your View ID. This is explained in How to export data from Pigment with APIs.
Obtain your List, Metric, or Table IDs. This is explained in Export raw data from a Pigment Block.
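Once you have these values, you can optionally sanity-check them before touching AWS by calling the Pigment export endpoint locally. This is a minimal sketch; the key and View ID placeholders are yours to substitute:

import requests

# Placeholder values; replace with your own Export API key and View ID.
EXPORT_API_KEY = "your-pigment-export-api-key"
VIEW_ID = "your-view-id"

response = requests.get(
    f"https://pigment.app/api/export/view/{VIEW_ID}",
    headers={"Authorization": "bearer " + EXPORT_API_KEY},
)
print(response.status_code)   # 200 means the key and View ID are valid
print(response.text[:200])    # first few characters of the exported CSV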
Create an AWS Lambda function
Here’s how you create an AWS Lambda function that loads Pigment API data into an S3 bucket.
ℹ️ Note
In the following example, we use a View ID; however, you can use a List, Metric, or Table ID as appropriate.
In AWS Secrets Manager, create a secret using your Export API key as the value.
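If you prefer to script this step, here is a minimal boto3 sketch. The secret name pigment_export_api_key and the JSON key secret_key are assumptions for illustration; they must match the AWS_SECRET_NAME and SECRET_KEY environment variables you set below:

import json
import boto3

# Assumed names for illustration; align them with the environment
# variables (AWS_SECRET_NAME, SECRET_KEY) configured later.
client = boto3.client("secretsmanager", region_name="us-east-1")
client.create_secret(
    Name="pigment_export_api_key",
    SecretString=json.dumps({"secret_key": "your-pigment-export-api-key"}),
)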
In Lambda, create a new AWS Lambda function, and complete these steps:
In the Function name field, enter your new function name.
In the Runtime menu, select Python 3.7.
Click Configuration.
Enter the following environment variable values:
- VIEW_ID: The View ID of the block you want to export from Pigment. For information on obtaining this ID, see How to export data from Pigment with APIs.
- AWS_SECRET_NAME: The name of the secret that corresponds to the Export API key in AWS Secrets Manager. For example: pigment_export_api_key
- AWS_SECRET_REGION: The region your AWS Secrets Manager entry resides in. For example: us-east-1
- S3_BUCKET_NAME: The name of the S3 bucket that holds your files.
- SECRET_KEY: The key corresponding to your Pigment Export API key entry in AWS Secrets Manager.
These variable names are case-sensitive and must match the names the Python script below reads from os.environ.
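If you manage configuration from code, a boto3 equivalent of this step might look like the following sketch; the function name pigment-export and the sample values are placeholders:

import boto3

lambda_client = boto3.client("lambda")
# Hypothetical function name and sample values; replace with your own.
lambda_client.update_function_configuration(
    FunctionName="pigment-export",
    Environment={
        "Variables": {
            "VIEW_ID": "your-view-id",
            "AWS_SECRET_NAME": "pigment_export_api_key",
            "AWS_SECRET_REGION": "us-east-1",
            "S3_BUCKET_NAME": "your-bucket-name",
            "SECRET_KEY": "secret_key",
        }
    },
)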
In the Configuration pane, click Permissions, and complete the following steps:
Under Role name, click the role used for the export.
Configure AWS IAM so that your Lambda function's execution role can:
- read/write to S3
- read from AWS Secrets Manager
Save your changes.
ℹ️ Note
Depending on your organization's access management policy, you can use either a role or a policy.
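For instance, a minimal inline policy granting only these permissions could be attached with boto3, as sketched below. The role name, bucket name, account ID, and secret ARN are hypothetical placeholders:

import json
import boto3

iam = boto3.client("iam")
# Placeholder names and ARNs; substitute your own resources.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::your-bucket-name/*",
        },
        {
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "arn:aws:secretsmanager:us-east-1:123456789012:secret:pigment_export_api_key-*",
        },
    ],
}
iam.put_role_policy(
    RoleName="your-lambda-execution-role",
    PolicyName="pigment-export-access",
    PolicyDocument=json.dumps(policy),
)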
Click the Code tab, and add the following Python code:
from datetime import datetime
import os
import json
import boto3
import requests
from botocore.exceptions import ClientError

# Init AWS Secrets Manager
def get_secret():
    # Replace with your AWS Secrets Manager details via the environment variables
    SECRET_NAME = os.environ['AWS_SECRET_NAME']
    REGION_NAME = os.environ['AWS_SECRET_REGION']
    SECRET_KEY = os.environ['SECRET_KEY']

    # Create a Secrets Manager client
    session = boto3.session.Session()
    client = session.client(
        service_name='secretsmanager',
        region_name=REGION_NAME
    )

    try:
        get_secret_value_response = client.get_secret_value(
            SecretId=SECRET_NAME
        )
    except ClientError as e:
        # For a list of exceptions thrown, see
        # https://docs.aws.amazon.com/secretsmanager/latest/apireference/API_GetSecretValue.html
        raise e

    # Decrypts secret using the associated KMS key.
    secret_response = get_secret_value_response['SecretString']
    secret_dict = json.loads(secret_response)
    return secret_dict[SECRET_KEY]

# Replace with your S3 bucket name and desired S3 key (object name)
# Adding a folder for each export attempt, feel free to remove if overkill
S3_BUCKET_NAME = os.environ['S3_BUCKET_NAME']
DT = datetime.now()
DT_ISO = DT.strftime("%Y-%m-%dT%H:%M:%SZ")
S3_FILE = f'pigment_exports/{DT_ISO}/export.csv'

def lambda_handler(event, context):
    # Your Pigment block's View ID
    VIEW_ID = os.environ['VIEW_ID']
    # Compose the export API URL
    API_URL = f'https://pigment.app/api/export/view/{VIEW_ID}'
    KEY = get_secret()
    HEADERS = {'Authorization': 'bearer ' + KEY}

    # Initialize S3 client
    s3 = boto3.client('s3')

    try:
        # Make the HTTP GET request to the API
        response = requests.get(API_URL, headers=HEADERS)
        if response.status_code == 200:
            # Save the content to S3
            s3.put_object(Bucket=S3_BUCKET_NAME, Key=S3_FILE, Body=response.content)
            return {
                'statusCode': 200,
                'body': json.dumps('File saved to S3 successfully!')
            }
        else:
            return {
                'statusCode': response.status_code,
                'body': json.dumps('Failed to retrieve data from the API.')
            }
    except Exception as e:
        return {
            'statusCode': 500,
            'body': json.dumps(f'Error: {str(e)}')
        }
Click Deploy and then Test.
Go to your S3 bucket.
Locate the folder that corresponds to the ISO 8601 date and time when you ran the test.
If your test was successful, you’ll find your exported Pigment data in the CSV file.
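If you want to verify the export from code instead of the console, a small sketch like this (with a placeholder bucket name) lists the objects the function wrote:

import boto3

s3 = boto3.client("s3")
# Placeholder bucket name; use the value from S3_BUCKET_NAME.
response = s3.list_objects_v2(Bucket="your-bucket-name", Prefix="pigment_exports/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])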
Schedule regular exports with EventBridge Scheduler
In AWS, go to EventBridge Scheduler and click Create schedule.
In the Schedule name field, enter your schedule name.
Define your schedule, and click Next.
In the Select Target pane, select AWS Lambda.
Select your Lambda function.
Complete the remaining setup tasks.
Your scheduling job is set up and runs at the time you defined, for example 9:00 AM GMT daily.
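As an alternative to the console, the boto3 scheduler client can create the same daily 9:00 AM GMT schedule. This is a sketch; the schedule name, Lambda ARN, and execution role ARN are placeholders, and the role must allow EventBridge Scheduler to invoke your function:

import boto3

scheduler = boto3.client("scheduler")
# Placeholder names and ARNs; substitute your own resources.
scheduler.create_schedule(
    Name="pigment-export-daily",
    ScheduleExpression="cron(0 9 * * ? *)",   # every day at 09:00
    ScheduleExpressionTimezone="Etc/GMT",     # 9:00 AM GMT
    FlexibleTimeWindow={"Mode": "OFF"},
    Target={
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:pigment-export",
        "RoleArn": "arn:aws:iam::123456789012:role/scheduler-invoke-role",
    },
)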