Triggering an ETL from an Email via SES and Lambda


Overview

This example details how the arrival of an email at an Amazon SES-controlled email address can trigger a Matillion job via AWS S3 and a Lambda function. The process extracts any text/CSV attachments from emails sent to the SES email address and loads them into Redshift or Snowflake.

 

Why Do This

Perhaps you’d like a business user to be able to create a standard CSV file and simply email it to Matillion ETL, rather than having to give them access to, and train them on, S3 buckets. Or perhaps you receive a CSV by email from a supplier and would like to simply forward it to Redshift or Snowflake.

 

Prerequisites

This exercise requires some items set up in your AWS account before you begin:

 

Basic Workflow

The basic steps and services involved in this process are:

  1. User sends an email to a specific AWS SES-controlled email address
  2. AWS SES copies the email in plain text format to the S3 Landing bucket
  3. An S3 trigger calls a Lambda function, which scans the email for text/CSV attachments and outputs them to the S3 Process Attachment bucket
  4. The same Lambda function writes a message to an SQS queue containing the parameters required to run a Matillion ETL job (including the bucket and filename written by the Lambda function)
  5. Matillion ETL consumes the message from the SQS queue and runs a job to load the file into Redshift or Snowflake.
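The attachment-scanning step (step 3) can be sketched with Python's standard email module. This is a minimal illustration, not the code from the attached example file (which targets Python 2.7; this sketch uses Python 3), and the function name is hypothetical:

```python
import email


def extract_csv_attachments(raw_email: bytes):
    """Return (filename, content_bytes) pairs for text/csv attachments.

    A minimal sketch of the Lambda's parsing step; the real handler would
    read raw_email from the S3 Landing bucket and write each attachment
    to the S3 Process Attachment bucket.
    """
    msg = email.message_from_bytes(raw_email)
    attachments = []
    for part in msg.walk():
        filename = part.get_filename()
        # Attachments carry a filename and a text/csv content type
        if filename and part.get_content_type() == "text/csv":
            attachments.append((filename, part.get_payload(decode=True)))
    return attachments
```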

The process is summarised in this diagram:
 

Example Steps

Note: example files are attached to this article should you wish to try this yourself.

  1. Configure an email address to be managed by SES as per the AWS documentation 
  2. Configure an SES Receipt Rule with an S3 action (see the AWS documentation) to place all received emails into the first S3 bucket (S3 Landing)
  3. Any emails sent to this address will be dropped into the S3 bucket (mtln-email-test in this example) in plain text format. Note that you may need to amend the bucket permissions to allow the SES process access
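A bucket policy along these lines grants SES write access (the bucket name and account ID are placeholders; check the current AWS SES documentation for the exact condition keys your region requires):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowSESPuts",
      "Effect": "Allow",
      "Principal": { "Service": "ses.amazonaws.com" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::mtln-email-test/*",
      "Condition": { "StringEquals": { "aws:Referer": "111122223333" } }
    }
  ]
}
```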
  4. Within the AWS Console, create a new Lambda function for Python 2.7, ideally using the s3-get-object-python blueprint (the blueprint lets you configure which bucket the trigger fires from and sets up a number of permissions for you). If you are using the blueprint, set these values on the next page:
    1. Name: Matillion_Email_Test
    2. Role: Create new role from template(s)
    3. Role Name: Matillion_Email_Role
    4. Bucket: <the bucket the emails land in>
    5. Event Type: Object Created (All)
    6. Enable Trigger: Yes
  5. Press “Create Function”
  6. Replace the code contents of the Lambda function generated by AWS with the contents of the file lambda_matillion_email_to_execution_example.py
  7. Amend the variables at the top of the Python code to the specifics of your environment:
    1. S3_OUTPUT_BUCKETNAME: The bucket created for processed files to be output to
    2. SQS_MATILLION_QUEUE: The SQS queue Matillion ETL is listening to
    3. MTLN_GROUP: The Project Group Name in Matillion ETL
    4. MTLN_PROJECT: The Project Name in Matillion ETL
    5. MTLN_ENV: The name of the environment to execute against in Matillion ETL
  8. Save the function (note: you may need to amend the Permissions on your S3 Process Attachment bucket to allow the Lambda function to write to it)
  9. Import the appropriate JSON file into your Matillion ETL instance (based on whether you’re running Redshift or Snowflake)
  10. Ensure your Matillion ETL instance is listening to the same SQS Queue as configured in the Python Lambda
  11. That’s it! Test by emailing the Data_matillion_email_to_execute_example.csv file attached to this article to your SES-configured email address. If you are watching Matillion ETL, you should see the imported job run.
  12. A table should be created in the target default schema containing the contents of the CSV, with a unique name.
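The message the Lambda function places on the SQS queue is a small JSON document naming the job to run. As a sketch, it could be built like this; the key names below follow Matillion's queue-message format as I understand it, but verify them against your Matillion version's documentation and the attached example file:

```python
import json


def build_matillion_message(group, project, environment, job, variables=None):
    """Build the JSON body for Matillion ETL's SQS listener.

    Key names are an assumption based on Matillion's queue integration;
    confirm them against your instance before relying on this.
    """
    return json.dumps({
        "group": group,          # MTLN_GROUP
        "project": project,      # MTLN_PROJECT
        "environment": environment,  # MTLN_ENV
        "job": job,
        "variables": variables or {},
    })
```

The Lambda would then send this string as the message body to the queue named in SQS_MATILLION_QUEUE (for example via boto3's SQS send_message call), passing the output bucket and filename as job variables.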
 

Troubleshooting

Most problems you may experience with this will be related to permissions. Ensure your Lambda function has the required permissions to:

  • Read from your source S3 bucket
  • Write to your target S3 bucket
  • Write messages to your SQS queue
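An IAM policy attached to the Lambda execution role covering these three points might look like the following (bucket names, queue name, and account ID are placeholders for your own resources):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::mtln-email-test/*"
    },
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-process-attachment-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": "sqs:SendMessage",
      "Resource": "arn:aws:sqs:us-east-1:111122223333:my-matillion-queue"
    }
  ]
}
```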

Do not hesitate to Contact Support for further assistance.