Salesforce Output

This article is specific to the following platforms: Snowflake, Redshift, BigQuery, and Synapse.

The Salesforce Output component uses the Salesforce API to write back the contents of a source table (or view) into a table in Salesforce.
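Conceptually, the component reads rows from the source table and pushes them to a Salesforce object through the Salesforce API. The sketch below illustrates that idea only; it is not the component's implementation. It assumes the simple-salesforce Python library, placeholder credentials, and a placeholder target object named Contact.

```python
# A minimal sketch of a Salesforce write-back, assuming the simple-salesforce
# library, placeholder credentials, and a placeholder "Contact" target object.
from simple_salesforce import Salesforce

# Username/password/security token authentication; an OAuth flow could be used
# instead. Passing domain="test" would target a sandbox org rather than a live one.
sf = Salesforce(
    username="user@example.com",
    password="example-password",
    security_token="example-token",
)

# Rows unloaded from the source table, already keyed by Salesforce field names.
rows = [
    {"LastName": "Smith", "Email": "smith@example.com"},
    {"LastName": "Jones", "Email": "jones@example.com"},
]

# Non-bulk path: write records through the REST API, one call per record here
# (the component itself writes up to 200 rows per request when Use Bulk API is No).
for row in rows:
    print(sf.Contact.create(row))
```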

Properties

The tables below describe the Salesforce Output component's setup properties for each supported platform, including any actions required of the user.

Warning: this component is potentially destructive. The output operations performed by this component can delete, overwrite, and truncate target objects within Salesforce, and these operations may be irreversible.

Snowflake Properties

Property Setting Description
Name String Input a human-readable name for the component.
Authentication Method Select Select the authentication method. Users can choose between a username/password combination or OAuth.
Use Sandbox Select No: connect to a live Salesforce account. This is the default setting.
Yes: connect to a sandbox Salesforce account.
This property is only available when Authentication Method is set to "User/Password".
Username String Provide a valid Salesforce username.
This property is only available when Authentication Method is set to "User/Password".
Password String Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended.
This property is only available when Authentication Method is set to "User/Password".
Security Token String Provide a valid Salesforce security token.
This property is only available when Authentication Method is set to "User/Password".
Authentication Select Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide.
This property is only available when Authentication Method is set to "OAuth".
Use Bulk API Select No: write up to 200 rows in real-time. This is the default setting.
Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion.
Connection Options Parameter A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed.
Available parameters are explained in the Data Model.
Value A value for the given parameter.
Database Select Select a Snowflake database. The special value, [Environment Default], is the default setting, and uses the database defined in the Matillion ETL environment.
Schema Select Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment.
Source Table Select Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema.
Target Object Select Select the Salesforce object (table) into which local data will be loaded (input).
Output Operation Select Select the output operation to be performed on the target object. Available operations are Delete, Insert, Update, and Upsert.
Salesforce ID Select Select the unique ID of the row within the Target Object into which the local data will be written.
This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
Column Mappings Source Columns Specify the columns in the source table that will be unloaded (output).
Target Columns Specify the columns in the target object to which the source columns will be written.
On Warnings Dropdown Continue: Loads data despite records that return an error or that are rejected. This is the default setting.
Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job.
Batch Size Integer The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to Yes; the default value is 10,000.
When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default batch size is 2,000. A batching sketch follows this table.
Records Per Ingest Job Integer An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently.
If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default.
If a negative value is specified, the job will fail at runtime.
This property is only available when Use Bulk API is set to Yes.
Auto Debug Select On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component.
Off: select this option to override any debugging connection options.
Debug Level Select Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution:
1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors.
2: Also log cache queries and additional information about the request, if applicable.
3: Also log the body of the request and the response.
4: Also log transport-level communication with the data source. This includes SSL negotiation.
5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
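The Batch Size and Records Per Ingest Job properties above govern how rows are grouped when Use Bulk API is set to Yes. The sketch below shows one way that grouping could look; it assumes the simple-salesforce library, an already authenticated connection named sf, and a placeholder Contact object, and it is not the component's actual batching logic.

```python
# Sketch of the Bulk API path: split the unloaded rows into batches of at most
# 10,000 records and upsert each batch on the Id field. Assumes an authenticated
# simple-salesforce connection `sf`; "Contact" is a placeholder target object.
BATCH_SIZE = 10_000

def batches(records, size=BATCH_SIZE):
    """Yield successive slices of `records`, each no larger than `size`."""
    for start in range(0, len(records), size):
        yield records[start:start + size]

def bulk_upsert(sf, records):
    """Upsert all records in batches and return the per-record results."""
    results = []
    for batch in batches(records):
        # Upsert matches existing rows on Id and inserts those without a match.
        results.extend(sf.bulk.Contact.upsert(batch, "Id"))
    return results
```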

Redshift Properties

Property Setting Description
Name String Input a human-readable name for the component.
Authentication Method Select Select the authentication method. Users can choose between a username/password combination or OAuth.
Use Sandbox Select No: connect to a live Salesforce account. This is the default setting.
Yes: connect to a sandbox Salesforce account.
This property is only available when Authentication Method is set to "User/Password".
Username String Provide a valid Salesforce username.
This property is only available when Authentication Method is set to "User/Password".
Password String Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended.
This property is only available when Authentication Method is set to "User/Password".
Security Token String Provide a valid Salesforce security token.
This property is only available when Authentication Method is set to "User/Password".
Authentication Select Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide.
This property is only available when Authentication Method is set to "OAuth".
Use Bulk API Select No: write up to 200 rows in real-time. This is the default setting.
Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion.
Connection Options Parameter A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed.
Available parameters are explained in the Data Model.
Value A value for the given parameter.
Source Schema Select Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment.
Source Table Select Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema.
Target Object Select Select the Salesforce object (table) into which local data will be loaded (input).
Output Operation Select Select the output operation to be performed on the target object. Available operations are Delete, Insert, Update, and Upsert.
Salesforce ID Select Select the unique ID of the row within the Target Object into which the local data will be written.
This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
Column Mappings Source Columns Specify the columns in the source table that will be unloaded (output).
Target Columns Specify the columns in the target object to which the source columns will be written. A mapping sketch follows this table.
On Warnings Dropdown Continue: Loads data despite records that return an error or that are rejected. This is the default setting.
Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job.
Batch Size Integer The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to Yes; the default value is 10,000.
When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default batch size is 2,000.
Records Per Ingest Job Integer An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently.
If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default.
If a negative value is specified, the job will fail at runtime.
This property is only available when Use Bulk API is set to Yes.
Auto Debug Select On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component.
Off: select this option to override any debugging connection options.
Debug Level Select Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution:
1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors.
2: Also log cache queries and additional information about the request, if applicable.
3: Also log the body of the request and the response.
4: Also log transport-level communication with the data source. This includes SSL negotiation.
5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
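The Column Mappings property pairs source-table columns with fields on the target object. The sketch below shows that remapping in isolation; the column and field names are purely illustrative, and the real mapping is whatever is configured in the component.

```python
# Sketch of the Column Mappings idea: rename source-table columns to the target
# object's field API names before the write. The names below are illustrative only.
COLUMN_MAP = {
    "last_name": "LastName",
    "email_address": "Email",
}

def map_columns(source_rows, column_map=COLUMN_MAP):
    """Return rows keyed by target field names, dropping unmapped columns."""
    return [
        {target: row[source] for source, target in column_map.items() if source in row}
        for row in source_rows
    ]

# Example: one row unloaded from the warehouse source table.
print(map_columns([{"last_name": "Smith", "email_address": "smith@example.com"}]))
# -> [{'LastName': 'Smith', 'Email': 'smith@example.com'}]
```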

BigQuery Properties

Property Setting Description
Name String Input a human-readable name for the component.
Authentication Method Select Select the authentication method. Users can choose between a username/password combination or OAuth.
Use Sandbox Select No: connect to a live Salesforce account. This is the default setting.
Yes: connect to a sandbox Salesforce account.
This property is only available when Authentication Method is set to "User/Password".
Username String Provide a valid Salesforce username.
This property is only available when Authentication Method is set to "User/Password".
Password String Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended.
This property is only available when Authentication Method is set to "User/Password".
Security Token String Provide a valid Salesforce security token.
This property is only available when Authentication Method is set to "User/Password".
Authentication Select Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide.
This property is only available when Authentication Method is set to "OAuth".
Use Bulk API Select No: write up to 200 rows in real-time. This is the default setting.
Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion.
Connection Options Parameter A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed.
Available parameters are explained in the Data Model.
Value A value for the given parameter.
Project Select Select the BigQuery project. The special value, [Environment Default], is the default setting, and uses the project defined in the Matillion ETL environment.
Dataset Select Select the BigQuery dataset. The special value, [Environment Default], is the default setting, and uses the dataset defined in the Matillion ETL environment.
For more information, see Google's datasets documentation.
Source Table Select Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema.
Target Object Select Select the Salesforce object (table) into which local data will be loaded (input).
Output Operation Select Select the output operation to be performed on the target object. Available operations are Delete, Insert, Update, and Upsert.
Salesforce ID Select Select the unique ID of the row within the Target Object into which the local data will be written.
This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
Column Mappings Source Columns Specify the columns in the source table that will be unloaded (output).
Target Columns Specify the columns in the target object to which the source columns will be written.
On Warnings Dropdown Continue: Loads data despite records that return an error or that are rejected. This is the default setting.
Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job. An error-handling sketch follows this table.
Batch Size Integer The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to Yes; the default value is 10,000.
When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default batch size is 2,000.
Records Per Ingest Job Integer An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently.
If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default.
If a negative value is specified, the job will fail at runtime.
This property is only available when Use Bulk API is set to Yes.
Auto Debug Select On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component.
Off: select this option to override any debugging connection options.
Debug Level Select Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution:
1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors.
2: Also log cache queries and additional information about the request, if applicable.
3: Also log the body of the request and the response.
4: Also log transport-level communication with the data source. This includes SSL negotiation.
5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.
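The On Warnings property decides whether rejected records stop the load. The sketch below mimics that choice by inspecting per-record results; it assumes the list-of-dicts result shape (a success flag plus an errors list) that simple-salesforce bulk calls return, which may differ from what the component sees internally.

```python
# Sketch of the On Warnings behaviour: report rejected records and either keep
# going ("Continue") or abort the load ("Fail"). Assumes each result is a dict
# with "success" and "errors" keys, as simple-salesforce bulk calls return.
def handle_results(results, on_warnings="Continue"):
    failures = [r for r in results if not r.get("success")]
    for failure in failures:
        print("Record rejected:", failure.get("errors"))
    if failures and on_warnings == "Fail":
        # Mirror the Fail setting: stop as soon as any record is rejected.
        raise RuntimeError(f"{len(failures)} record(s) failed; aborting the load")
    return len(results) - len(failures)  # number of records written successfully
```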

Synapse Properties

Property Setting Description
Name String Input a human-readable name for the component.
Authentication Method Select Select the authentication method. Users can choose between a username/password combination or OAuth.
Use Sandbox Select No: connect to a live Salesforce account. This is the default setting.
Yes: connect to a sandbox Salesforce account.
This property is only available when Authentication Method is set to "User/Password".
Username String Provide a valid Salesforce username.
This property is only available when Authentication Method is set to "User/Password".
Password String Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended.
This property is only available when Authentication Method is set to "User/Password".
Security Token String Provide a valid Salesforce security token.
This property is only available when Authentication Method is set to "User/Password".
Authentication Select Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide.
This property is only available when Authentication Method is set to "OAuth".
Use Bulk API Select No: write up to 200 rows in real-time. This is the default setting.
Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion.
Connection Options Parameter A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed.
Available parameters are explained in the Data Model.
Value A value for the given parameter.
Source Schema Select Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment.
Source Table Select Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema.
Target Object Select Select the Salesforce object (table) into which local data will be loaded (input).
Output Operation Select Select the output operation to be performed on the target object. Available operations are Delete, Insert, Update, and Upsert.
Salesforce ID Select Select the unique ID of the row within the Target Object into which the local data will be written.
This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert").
Column Mappings Source Columns Specify the columns in the source table that will be unloaded (output).
Target Columns Specify the columns in the target object to which the source columns will be written.
On Warnings Dropdown Continue: Loads data despite records that return an error or that are rejected. This is the default setting.
Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job.
Batch Size Integer The maximum number of records per batch. Accepts an integer between 0 and 10,000 when Use Bulk API is set to Yes; the default value is 10,000.
When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default batch size is 2,000.
Records Per Ingest Job Integer An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently.
If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default.
If a negative value is specified, the job will fail at runtime.
This property is only available when Use Bulk API is set to Yes.
Auto Debug Select On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component.
Off: select this option to override any debugging connection options.
Debug Level Select Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution:
1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors.
2: Also log cache queries and additional information about the request, if applicable.
3: Also log the body of the request and the response.
4: Also log transport-level communication with the data source. This includes SSL negotiation.
5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands.

