Amplitude Extract

The Amplitude Extract component uses the Amplitude API to retrieve and store data, to be either referenced by an external table or loaded into a table, depending on your cloud data warehouse. You can then use Transformation Jobs to enrich and manage the data.
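Under the hood, the component drives Amplitude's REST endpoints with your API Key and Secret Key. As a rough illustration, the sketch below shows a direct call to what we understand to be Amplitude's Export API; the endpoint, parameters, and zipped-NDJSON response format are assumptions based on Amplitude's public API documentation, not a description of the component's internals:

```python
# A minimal sketch of the Amplitude Export API call this component wraps.
# Endpoint and response format are assumptions; verify against Amplitude's docs.
import gzip
import io
import zipfile

import requests

API_KEY = "your-amplitude-api-key"        # hypothetical placeholder
SECRET_KEY = "your-amplitude-secret-key"  # hypothetical placeholder

resp = requests.get(
    "https://amplitude.com/api/2/export",
    params={"start": "20230101T00", "end": "20230101T23"},
    auth=(API_KEY, SECRET_KEY),  # HTTP basic auth: API Key as user, Secret Key as password
    timeout=300,
)
resp.raise_for_status()

# The Export API returns a zip archive of gzipped, newline-delimited JSON files.
with zipfile.ZipFile(io.BytesIO(resp.content)) as archive:
    for name in archive.namelist():
        with archive.open(name) as member:
            events = gzip.decompress(member.read()).decode("utf-8")
            print(events.splitlines()[0])  # print the first raw event record
```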

Note: This component may return structured data that requires flattening. For help flattening such data, please read our Extract Nested Data Component documentation.
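If you want to preview what flattening looks like outside of Matillion ETL, the following is a minimal sketch using pandas (an assumption for illustration; the component itself does not use pandas). The field names event_type, event_properties, and user_properties are typical of Amplitude events but should be checked against your actual payload:

```python
# A minimal sketch of flattening nested Amplitude event records with pandas.
import pandas as pd

events = [
    {
        "event_type": "play_song",
        "event_properties": {"song_id": 42, "genre": "jazz"},
        "user_properties": {"plan": "premium"},
    },
]

# json_normalize expands nested dicts into dot-separated columns,
# e.g. event_properties.song_id, user_properties.plan.
flat = pd.json_normalize(events)
print(flat.columns.tolist())
```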


Redshift Properties

Name (String): Input a descriptive name for the component.

Data Source (Select): Please select an Amplitude data source. Available options are "Event Types" and "Events".

API Key (String): Please provide your Amplitude API Key. For help acquiring your Amplitude API Key, please read our Amplitude Authentication Guide.

Secret Key (String): Please provide your Amplitude Secret Key. Secret Keys can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead. For help acquiring your Amplitude Secret Key, please read our Amplitude Authentication Guide.

Start time (Datetime String): Please designate a start time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour; a sketch for building these values follows this table. Note: This parameter is only available when the Data Source parameter is set to "Events".

End time (Datetime String): Please designate an end time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour. Note: This parameter is only available when the Data Source parameter is set to "Events".

Location (Filepath): Provide an S3 bucket path that will be used to store the data. Once on an S3 bucket, the data can be referenced by an external table. A folder will be created at this location with the same name as the target table.

External Schema (Select): Select the table's external schema. To learn more about external schemas, please read our support documentation on Getting Started With Amazon Redshift Spectrum. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see Schema Support.

Target Table (String): Provide a name for the external table to be used. Warning: Upon running the job, this table will be recreated, and any existing table of the same name will be dropped.
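The Start time and End time values must follow the YYYYMMDDT00 hour format exactly. The following is a minimal sketch of generating valid values in Python; the chosen dates are illustrative only:

```python
# Build Start time / End time strings in Amplitude's YYYYMMDDTHH format.
from datetime import datetime, timedelta

def amplitude_hour(dt: datetime) -> str:
    """Format a datetime as YYYYMMDDTHH, e.g. 20230101T00."""
    return dt.strftime("%Y%m%dT%H")

start = datetime(2023, 1, 1, 0)      # 20230101T00
end = start + timedelta(hours=23)    # 20230101T23 (there is no T24)
print(amplitude_hour(start), amplitude_hour(end))
```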

Snowflake Properties

Name (String): Input a descriptive name for the component.

Data Source (Select): Please select an Amplitude data source. Available options are "Event Types" and "Events".

API Key (String): Please provide your Amplitude API Key. For help acquiring your Amplitude API Key, please read our Amplitude Authentication Guide.

Secret Key (String): Please provide your Amplitude Secret Key. Secret Keys can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead. For help acquiring your Amplitude Secret Key, please read our Amplitude Authentication Guide.

Start time (Datetime String): Please designate a start time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour. Note: This parameter is only available when the Data Source parameter is set to "Events".

End time (Datetime String): Please designate an end time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour. Note: This parameter is only available when the Data Source parameter is set to "Events".

Location (Filepath): Provide an S3 bucket path, GCS bucket path, or Azure Blob Storage path that will be used to store the data. Once on an S3 bucket, GCS bucket, or Azure Blob, the data can be referenced by an external table. A folder will be created at this location with the same name as the Target Table.

Integration (Select): Choose your Google Cloud Storage Integration. Integrations are required to permit Snowflake to read data from and write to a Google Cloud Storage bucket, and must be set up in advance of selecting them in Matillion ETL. To learn more about setting up a storage integration, read our Storage Integration Setup Guide.

Warehouse (Select): Choose a Snowflake warehouse that will run the load.

Database (Select): Choose a database to create the new table in.

Schema (Select): Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, please refer to this article.

Target Table (String): Provide a new table name. Warning: Upon running the job, this table will be recreated, and any existing table of the same name will be dropped.
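Once the job has run, you may want to verify the loaded table from outside Matillion ETL. The following is a minimal sketch using the snowflake-connector-python package; the account, warehouse, database, schema, and table names are all hypothetical placeholders:

```python
import snowflake.connector

# All connection values below are hypothetical placeholders.
conn = snowflake.connector.connect(
    user="etl_user",
    password="...",          # prefer the Password Manager / a secrets store
    account="my_account",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # "AMPLITUDE_EVENTS" stands in for whatever you set as Target Table.
    cur.execute('SELECT COUNT(*) FROM "AMPLITUDE_EVENTS"')
    print(cur.fetchone()[0])
finally:
    conn.close()
```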

BigQuery Properties

Name (String): Input a descriptive name for the component.

Data Source (Select): Please select an Amplitude data source. Available options are "Event Types" and "Events".

API Key (String): Please provide your Amplitude API Key. For help acquiring your Amplitude API Key, please read our Amplitude Authentication Guide.

Secret Key (String): Please provide your Amplitude Secret Key. Secret Keys can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead. For help acquiring your Amplitude Secret Key, please read our Amplitude Authentication Guide.

Start time (Datetime String): Please designate a start time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour. Note: This parameter is only available when the Data Source parameter is set to "Events".

End time (Datetime String): Please designate an end time. The format, without whitespace, is YYYYMMDDT00, where T00, T01, T02...T23 indicate the hour (there is no T24). Please select only one hour. Note: This parameter is only available when the Data Source parameter is set to "Events".

Project (Select): The target BigQuery project to load data into.

Dataset (Select): The target BigQuery dataset to load data into.

Target Table (String): Provide a new table name. Warning: Upon running the job, this table will be recreated, and any existing table of the same name will be dropped.

Cloud Storage Staging Area (Filepath): The URL and path of the target Google Cloud Storage bucket to be used for staging the queried data (see the sketch after this table).

Load Options (Multiple Select):
- Clean Cloud Storage Files: Destroy staged files on Cloud Storage after loading data. Default is On.
- Cloud Storage File Prefix: Give staged file names a prefix of your choice. The default setting is an empty field.
- Recreate Target Table: Choose whether the component recreates its target table before the data load. If Off, the component will use an existing table, or create one if it does not exist. Default is On.
- Use Grid Variable: Check this checkbox to use a grid variable. This box is unchecked by default.
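If you leave Clean Cloud Storage Files off, staged files remain in the bucket named by Cloud Storage Staging Area. The following is a minimal sketch for listing them with the google-cloud-storage package; the bucket name and prefix are hypothetical and correspond to the staging settings above:

```python
from google.cloud import storage

client = storage.Client()  # uses your default Google Cloud credentials

# Hypothetical bucket and prefix, matching the Cloud Storage Staging Area
# and Cloud Storage File Prefix settings.
bucket = client.bucket("my-staging-bucket")
for blob in bucket.list_blobs(prefix="amplitude_"):
    print(blob.name, blob.size)
```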