Pardot Extract

The Pardot Extract component retrieves data from your Salesforce Pardot account and loads that data into a table. This action stages the data, so that the table is reloaded each time. Users can then transform their data with Matillion ETL's library of transformation components.

Note: This component may return structured data that requires flattening. Please see the Nested Data Load Component or the Extract Nested Data Component for information on how to accomplish this.
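The staging behaviour described above (the target table is reloaded on every run rather than appended to) can be sketched as follows. This is a minimal illustration using SQLite; the table name and columns are invented for the example and are not the component's actual schema:

```python
import sqlite3

def reload_staging_table(conn, table, rows):
    """Drop and recreate the target table, then load the latest extract.

    Mirrors the component's staging behaviour: each run replaces the
    table's contents entirely, so it never accumulates old rows.
    """
    cur = conn.cursor()
    cur.execute(f"DROP TABLE IF EXISTS {table}")
    cur.execute(f"CREATE TABLE {table} (id INTEGER, email TEXT)")
    cur.executemany(f"INSERT INTO {table} VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
reload_staging_table(conn, "pardot_prospects", [(1, "a@example.com")])
reload_staging_table(conn, "pardot_prospects", [(2, "b@example.com")])
# After the second run the table holds only the latest extract.
```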


Redshift Properties

Property Setting Description
Name String The descriptive name for the component.
Data Source Select Select a data source. Once the Data Source property has been configured, one or more properties specific to that data source will become available to configure. These properties are not optional and must be configured.
Please refer to the "Data Source Properties" table below for guidance with these additional properties.
Auth Method Select Set the authentication method. The only available method is "Pardot".
User Key String The user key string that corresponds to the email address and password credentials that are used to log in to Pardot. For help acquiring a Pardot user key, read our Pardot Authentication Guide.
Email Address String Please provide the email address for your Salesforce Pardot login.
Password String Please provide the password for your Salesforce Pardot login. Passwords can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead.
Page Limit Integer Set the page limit for the number of records to be returned and staged. You can use -1 to attempt to take all available data, but please be warned that this might take some time.
Location Select Provide an S3 Bucket path that will be used to store the data. Once on an S3 bucket, the data can be referenced by an external table. A folder will be created at this location with the same name as the Target Table.
External Schema Select Select the table's external schema. To learn more about external schemas, please consult the Configuring The Matillion ETL Client section of the Getting Started With Amazon Redshift Spectrum documentation.
The special value, [Environment Default], will use the schema defined in the environment.
For more information on using multiple schemas, see Schema Support.
Target Table String Provide a name for the external table to be used.
Warning: This table will be recreated and will drop any existing table of the same name.
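The Page Limit semantics above (a fixed number of pages, or -1 to take everything) can be sketched as a simple paging loop. Here, fetch_page is a hypothetical stand-in for the actual Pardot API call, which Matillion ETL performs internally:

```python
def stage_pages(fetch_page, page_limit):
    """Collect records page by page, mirroring the Page Limit property.

    fetch_page(page_number) returns a list of records for that page,
    or an empty list once the source is exhausted. A page_limit of -1
    keeps fetching until there is no more data.
    """
    records, page = [], 0
    while page_limit == -1 or page < page_limit:
        batch = fetch_page(page)
        if not batch:
            break
        records.extend(batch)
        page += 1
    return records

# Fake source with 5 pages of 2 records each, for illustration.
fake = lambda p: [p * 2, p * 2 + 1] if p < 5 else []
```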

Snowflake Properties

Property Setting Description
Name String The descriptive name for the component.
Data Source Select Select a data source. Once the Data Source property has been configured, one or more properties specific to that data source will become available to configure. These properties are not optional and must be configured.
Please refer to the "Data Source Properties" table below for guidance with these additional properties.
Auth Method Select Set the authentication method. The only available method is "Pardot".
User Key String The user key string that corresponds to the email address and password credentials that are used to log in to Pardot. For help acquiring a Pardot user key, read our Pardot Authentication Guide.
Email Address String Please provide the email address for your Salesforce Pardot login.
Password String Please provide the password for your Salesforce Pardot login. Passwords can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead.
Page Limit Integer Set the page limit for the number of records to be returned and staged. You can use -1 to attempt to take all available data, but please be warned that this might take some time.
Location Select Provide an S3 bucket path, GCS bucket path, or Azure Blob Storage path that will be used to store the data. Once on an S3 bucket, GCS bucket, or Azure Blob, the data can be referenced by an external table. A folder will be created at this location with the same name as the Target Table.
Integration Select Choose your Google Cloud Storage Integration. Integrations are required to permit Snowflake to read data from and write to a Google Cloud Storage bucket. Integrations must be set up in advance of selecting them in Matillion ETL. To learn more about setting up a storage integration, read our Storage Integration Setup Guide.
Warehouse Select Choose a Snowflake warehouse that will run the load.
Database Select Choose a database to create the new table in.
Schema Select Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see Schema Support.
Target Table String Provide a new table name.
Warning: This table will be recreated and will drop any existing table of the same name.

BigQuery Properties

Property Setting Description
Name String The descriptive name for the component.
Data Source Select Select a data source. Once the Data Source property has been configured, one or more properties specific to that data source will become available to configure. These properties are not optional and must be configured.
Please refer to the "Data Source Properties" table below for guidance with these additional properties.
Auth Method Select Set the authentication method. The only available method is "Pardot".
User Key String The user key string that corresponds to the email address and password credentials that are used to log in to Pardot. For help acquiring a Pardot user key, read our Pardot Authentication Guide.
Email Address String Please provide the email address for your Salesforce Pardot login.
Password String Please provide the password for your Salesforce Pardot login. Passwords can be stored inside the component; however, it is highly recommended to use the Password Manager feature instead.
Page Limit Integer Set the page limit for the number of records to be returned and staged. You can use -1 to attempt to take all available data, but please be warned that this might take some time.
Project String The target BigQuery project to load data into.
Dataset String The target BigQuery dataset to load data into.
Target Table String Provide a new table name.
Warning: This table will be recreated and will drop any existing table of the same name.
Cloud Storage Staging Area String The URL and path of the target Google Cloud Storage bucket to be used for staging the queried data.
Load Options Multiple Select Clean Cloud Storage Files: Destroy staged files on Cloud Storage after loading data. Default is On.
Cloud Storage File Prefix: Give staged file names a prefix of your choice. The default setting is an empty field.
Recreate Target Table: Choose whether the component recreates its target table before the data load. If Off, the component will use an existing table or create one if it does not exist. Default is On.
Use Grid Variable: Check this checkbox to use a grid variable. This box is unchecked by default.

Data Source Properties

The following table lists any Data Source that requires one or more unique component properties for configuration. If a Data Source is missing from this table, it does NOT have any unique component properties.

Data Source Property Name Type Description
Email Stats List Email ID Integer A single Email ID value.
Emails Email ID Integer A single Email ID value.
List Created before Date Include records created before a given GNU format date input.
Created after Date Include records created after a given GNU format date input.
Updated before Date Include records updated before a given GNU format date input.
Updated after Date Include records updated after a given GNU format date input.
List Membership Created before Date Include records created before a given GNU format date input.
Created after Date Include records created after a given GNU format date input.
Updated before Date Include records updated before a given GNU format date input.
Updated after Date Include records updated after a given GNU format date input.
Prospects Created before Date Include records created before a given GNU format date input.
Created after Date Include records created after a given GNU format date input.
Updated before Date Include records updated before a given GNU format date input.
Updated after Date Include records updated after a given GNU format date input.
Specific visitor details ID Integer A single ID value or a comma-separated list of visitor ID values.
Visitor Activity Created before Date Include visitor activity records created before a given GNU format date input.
Created after Date Include visitor activity records created after a given GNU format date input.
Updated before Date Include visitor activity records updated before a given GNU format date input.
Updated after Date Include visitor activity records updated after a given GNU format date input.
Visitors Only identified visitors true/false Only include identified visitors in this query.
Created before Date Include records created before a given GNU format date input.
Created after Date Include records created after a given GNU format date input.
Updated before Date Include records updated before a given GNU format date input.
Updated after Date Include records updated after a given GNU format date input.
Visits Created before Date Include records created before a given GNU format date input.
Created after Date Include records created after a given GNU format date input.
Updated before Date Include records updated before a given GNU format date input.
Updated after Date Include records updated after a given GNU format date input.
IDs Integer A single ID value or comma-separated list of ID values.
Visitor IDs Integer A single ID value or comma-separated list of Visitor ID values.
Prospect IDs Integer A single ID value or comma-separated list of Prospect ID values.
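The date filters above can be sketched as query parameters on a Pardot query request. The endpoint URL and parameter names below follow Pardot's legacy API v4 conventions and are shown as an assumption for illustration; Matillion ETL constructs these requests internally. GNU-format date inputs include relative values such as "today", "yesterday", or an explicit "YYYY-MM-DD" timestamp:

```python
import urllib.parse

def build_prospect_query(api_key, user_key, **filters):
    """Assemble a query URL for a hypothetical Pardot prospect query.

    Date filters (created_before, created_after, updated_before,
    updated_after) accept GNU-format date inputs, e.g. "yesterday"
    or "2023-01-31".
    """
    base = "https://pi.pardot.com/api/prospect/version/4/do/query"
    params = {"api_key": api_key, "user_key": user_key, "format": "json"}
    params.update(filters)
    return base + "?" + urllib.parse.urlencode(params)

url = build_prospect_query("apikey", "userkey", created_after="yesterday")
```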

How to Obtain Your User Key

  1. Log in to your Salesforce Pardot account.
  2. Via the left-hand menu, go to Admin → User Management → Users.
  3. Click on the name of the user you want the User Key for.
  4. Copy the entry beside "API User Key".
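Once obtained, the user key is sent alongside the email address and password to authenticate. The sketch below shows how such a login request could be assembled; the endpoint and parameter names follow Pardot's legacy API v4 and are an assumption for illustration only, since Matillion ETL performs this exchange for you:

```python
import urllib.parse

# Legacy Pardot login endpoint (assumed for illustration).
LOGIN_URL = "https://pi.pardot.com/api/login/version/4"

def build_login_request(email, password, user_key):
    """Return the login URL and form-encoded body.

    A successful POST to this endpoint returns an api_key, which must
    accompany the user_key on all subsequent API requests.
    """
    body = urllib.parse.urlencode({
        "email": email,
        "password": password,
        "user_key": user_key,
        "format": "json",
    })
    return LOGIN_URL, body

# Hypothetical credentials, for illustration only.
url, body = build_login_request("user@example.com", "hunter2", "0a1b2c")
```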