Salesforce Output
The Salesforce Output component uses the Salesforce API to write back the contents of a source table (or view) into a table in Salesforce.
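The write-back described above can be thought of as applying a source-to-target column mapping to each source row, producing the record payloads that would be sent to the Salesforce API. The following sketch is illustrative only (the column and field names are invented, not Matillion code):

```python
# Minimal sketch of the write-back idea: rename each source column to its
# mapped Salesforce field, yielding one record payload per source row.
# The column names here are hypothetical examples.

def map_rows(source_rows, column_mappings):
    """Apply a {source_column: target_field} mapping to every row."""
    return [
        {target: row[source] for source, target in column_mappings.items()}
        for row in source_rows
    ]

source_rows = [
    {"acct_name": "Acme Ltd", "acct_phone": "0161 000 0000"},
]
mappings = {"acct_name": "Name", "acct_phone": "Phone"}  # source -> target
print(map_rows(source_rows, mappings))
# [{'Name': 'Acme Ltd', 'Phone': '0161 000 0000'}]
```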
Properties
The table below describes the Salesforce Output component's setup properties, including any actions required of the user.
Warning: this component is potentially destructive. The output operations performed by this component can delete, overwrite, and truncate target objects within Salesforce, and these operations may be irreversible.
Snowflake Properties | ||
---|---|---|
Property | Setting | Description |
Name | String | Input a human-readable name for the component. |
Authentication Method | Select | Select the authentication method. Users can choose between a username/password combination or OAuth. |
Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password". |
Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password". |
Password | String | Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password". |
Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password". |
Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth". |
Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion. |
Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML. The same applies to each of the other content types. This property is only available when Use Bulk API is set to Yes. |
Connection Options | Parameter | A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model. |
Value | A value for the given parameter. | |
Database | Select | Select a Snowflake database. The special value, [Environment Default], is the default setting, and uses the database defined in the Matillion ETL environment. |
Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment. |
Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema. |
Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input). |
Output Operation | Select | Select the output operation to be performed on the target object. Available operations include Delete, Insert, Update, and Upsert. |
Salesforce ID | Select | Select the unique ID of the row within the Target Object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert"). |
Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output). |
Target Columns | Specify columns in the target object where the source columns will be output to. | |
On Warnings | Dropdown | Continue: Loads data despite records that return an error or that are rejected. This is the default setting. Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job. |
Batch Size | Integer | The maximum number of records per batch. When Use Bulk API is set to Yes, accepts an integer between 0 and 10,000; the default is 10,000. When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default is 2,000. |
Records Per Ingest Job | Integer | An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently. If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to Yes. |
Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account. |
Relationships | The relationship to the parent object's column. For example, OwnerId is a relationship column in Account, but the relationship is named Owner. | |
Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be. | |
Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent target object could be identified by the Email column in User. | |
Capture Rejected Entries | Select | Set this property to On to enable the capture of any rejected/errored records into an exception table, so that they can be flagged for further analysis. This is set to Off by default. This property is only available when Use Bulk API is set to Yes. |
Truncate Rejected Entries | Select | When set to Yes, errored results are replaced on each run. When No, error results are appended as additional entries on each run. This property is only available when Use Bulk API is set to Yes. |
Rejected Entries Database | Select | Select a database to hold the Rejected Entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Table | String | Enter a name for the table that Rejected Entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to On. |
Capture Batch Results | Select | When set to Yes, enables the capture of batch results. This property is only available when Use Bulk API is set to Yes. |
Truncate Batch Results | Select | When set to Yes, enables the truncation of batch results. This property is only available when Use Bulk API is set to Yes. |
Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to On. |
Auto Debug | Select | On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component. Off: select this option to override any debugging connection options. |
Debug Level | Select | Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution: 1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors. 2: Also log cache queries and additional information about the request, if applicable. 3: Also log the body of the request and the response. 4: Also log transport-level communication with the data source. This includes SSL negotiation. 5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands. |
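The interaction between Batch Size and Records Per Ingest Job described in the table above can be sketched as follows. This is an illustrative model of the documented behaviour, not Matillion code: records are grouped into ingest jobs of at most Records Per Ingest Job rows (or a single job when that field is empty), and each job is split into batches of at most Batch Size rows.

```python
# Hedged sketch of how Batch Size and Records Per Ingest Job interact
# when Use Bulk API is set to Yes. Function names are illustrative.

def chunk(seq, size):
    """Split a list into consecutive chunks of at most `size` items."""
    return [seq[i:i + size] for i in range(0, len(seq), size)]

def plan_ingest_jobs(records, batch_size=10_000, records_per_job=None):
    """Return a list of jobs, each a list of batches of records."""
    if records_per_job is not None and records_per_job < 0:
        # Mirrors the documented behaviour: a negative value fails at runtime.
        raise ValueError("Records Per Ingest Job must not be negative")
    # Empty field -> all records go into a single ingest job.
    jobs = [records] if records_per_job is None else chunk(records, records_per_job)
    return [chunk(job, batch_size) for job in jobs]

records = list(range(25_000))
jobs = plan_ingest_jobs(records, batch_size=10_000, records_per_job=12_000)
print(len(jobs))                   # 3 concurrent jobs of up to 12,000 records
print([len(b) for b in jobs[0]])   # [10000, 2000] - batches within the first job
```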
Redshift Properties | ||
---|---|---|
Property | Setting | Description |
Name | String | Input a human-readable name for the component. |
Authentication Method | Select | Select the authentication method. Users can choose between a username/password combination or OAuth. |
Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password". |
Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password". |
Password | String | Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password". |
Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password". |
Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth". |
Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion. |
Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML. The same applies to each of the other content types. This property is only available when Use Bulk API is set to Yes. |
Connection Options | Parameter | A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model. |
Value | A value for the given parameter. | |
Source Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment. |
Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema. |
Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input). |
Output Operation | Select | Select the output operation to be performed on the target object. Available operations include Delete, Insert, Update, and Upsert. |
Salesforce ID | Select | Select the unique ID of the row within the Target Object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert"). |
Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output). |
Target Columns | Specify columns in the target object where the source columns will be output to. | |
On Warnings | Dropdown | Continue: Loads data despite records that return an error or that are rejected. This is the default setting. Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job. |
Batch Size | Integer | The maximum number of records per batch. When Use Bulk API is set to Yes, accepts an integer between 0 and 10,000; the default is 10,000. When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default is 2,000. |
Records Per Ingest Job | Integer | An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently. If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to Yes. |
Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account. |
Relationships | The relationship to the parent object's column. For example, OwnerId is a relationship column in Account, but the relationship is named Owner. | |
Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be. | |
Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent target object could be identified by the Email column in User. | |
Capture Rejected Entries | Select | Set this property to On to enable the capture of any rejected/errored records into an exception table, so that they can be flagged for further analysis. This is set to Off by default. This property is only available when Use Bulk API is set to Yes. |
Truncate Rejected Entries | Select | When set to Yes, errored results are replaced on each run. When No, error results are appended as additional entries on each run. This property is only available when Use Bulk API is set to Yes. |
Rejected Entries Database | Select | Select a database to hold the Rejected Entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Table | String | Enter a name for the table that Rejected Entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to On. |
Capture Batch Results | Select | When set to Yes, enables the capture of batch results. This property is only available when Use Bulk API is set to Yes. |
Truncate Batch Results | Select | When set to Yes, enables the truncation of batch results. This property is only available when Use Bulk API is set to Yes. |
Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to On. |
Auto Debug | Select | On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component. Off: select this option to override any debugging connection options. |
Debug Level | Select | Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution: 1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors. 2: Also log cache queries and additional information about the request, if applicable. 3: Also log the body of the request and the response. 4: Also log transport-level communication with the data source. This includes SSL negotiation. 5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands. |
BigQuery Properties | ||
---|---|---|
Property | Setting | Description |
Name | String | Input a human-readable name for the component. |
Authentication Method | Select | Select the authentication method. Users can choose between a username/password combination or OAuth. |
Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password". |
Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password". |
Password | String | Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password". |
Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password". |
Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth". |
Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion. |
Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML. The same applies to each of the other content types. This property is only available when Use Bulk API is set to Yes. |
Connection Options | Parameter | A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model. |
Value | A value for the given parameter. | |
Project | Select | Select the BigQuery project. The special value, [Environment Default], is the default setting, and uses the project defined in the Matillion ETL environment. |
Dataset | Select | Select the BigQuery dataset. The special value, [Environment Default], is the default setting, and uses the dataset defined in the Matillion ETL environment. For more information, see Google's datasets documentation. |
Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema. |
Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input). |
Output Operation | Select | Select the output operation to be performed on the target object. Available operations include Delete, Insert, Update, and Upsert. |
Salesforce ID | Select | Select the unique ID of the row within the Target Object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert"). |
Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output). |
Target Columns | Specify columns in the target object where the source columns will be output to. | |
On Warnings | Dropdown | Continue: Loads data despite records that return an error or that are rejected. This is the default setting. Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job. |
Batch Size | Integer | The maximum number of records per batch. When Use Bulk API is set to Yes, accepts an integer between 0 and 10,000; the default is 10,000. When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default is 2,000. |
Records Per Ingest Job | Integer | An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently. If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to Yes. |
Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account. |
Relationships | The relationship to the parent object's column. For example, OwnerId is a relationship column in Account, but the relationship is named Owner. | |
Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be. | |
Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent target object could be identified by the Email column in User. | |
Capture Rejected Entries | Select | Set this property to On to enable the capture of any rejected/errored records into an exception table, so that they can be flagged for further analysis. This is set to Off by default. This property is only available when Use Bulk API is set to Yes. |
Truncate Rejected Entries | Select | When set to Yes, errored results are replaced on each run. When No, error results are appended as additional entries on each run. This property is only available when Use Bulk API is set to Yes. |
Rejected Entries Database | Select | Select a database to hold the Rejected Entries table. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Schema | Select | Select a schema from the chosen Rejected Entries Database. The default is [Environment Default]. This property is only available when Capture Rejected Entries is set to On. |
Rejected Entries Table | String | Enter a name for the table that Rejected Entries will be written to. If the table does not already exist, it will be created. This property is only available when Capture Rejected Entries is set to On. |
Capture Batch Results | Select | When set to Yes, enables the capture of batch results. This property is only available when Use Bulk API is set to Yes. |
Truncate Batch Results | Select | When set to Yes, enables the truncation of batch results. This property is only available when Use Bulk API is set to Yes. |
Batch Results Database | Select | Select a database to hold the batch results. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Schema | Select | Select a schema from the chosen Batch Results Database. The default is [Environment Default]. This property is only available when Capture Batch Results is set to On. |
Batch Results Table | String | Enter a name for the table that batch results will be written to. If the table does not already exist, it will be created. This property is only available when Capture Batch Results is set to On. |
Auto Debug | Select | On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component. Off: select this option to override any debugging connection options. |
Debug Level | Select | Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution: 1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors. 2: Also log cache queries and additional information about the request, if applicable. 3: Also log the body of the request and the response. 4: Also log transport-level communication with the data source. This includes SSL negotiation. 5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands. |
Synapse Properties | ||
---|---|---|
Property | Setting | Description |
Name | String | Input a human-readable name for the component. |
Authentication Method | Select | Select the authentication method. Users can choose between a username/password combination or OAuth. |
Use Sandbox | Select | No: connect to a live Salesforce account. This is the default setting. Yes: connect to a sandbox Salesforce account. This property is only available when Authentication Method is set to "User/Password". |
Username | String | Provide a valid Salesforce username. This property is only available when Authentication Method is set to "User/Password". |
Password | String | Provide a valid password corresponding to the Salesforce username. Users can store passwords in the component; however, use of the Password Manager feature is recommended. This property is only available when Authentication Method is set to "User/Password". |
Security Token | String | Provide a valid Salesforce security token. This property is only available when Authentication Method is set to "User/Password". |
Authentication | Select | Select an OAuth entry to authenticate this component. An OAuth entry must be set up in advance. To learn how to create and authorise an OAuth entry, please read our Salesforce Output Authentication Guide. This property is only available when Authentication Method is set to "OAuth". |
Use Bulk API | Select | No: write up to 200 rows in real time. This is the default setting. Yes: write up to 10,000 rows asynchronously in the background—this cannot be cancelled before completion. |
Content Type | Select | Select the content type. When set to ZIP-XML (the default), data is loaded to Salesforce and the Content Type in the Salesforce UI is shown as ZIP-XML. The same applies to each of the other content types. This property is only available when Use Bulk API is set to Yes. |
Connection Options | Parameter | A JDBC parameter supported by the database driver—manual setup is not usually required, since sensible defaults are assumed. Available parameters are explained in the Data Model. |
Value | A value for the given parameter. | |
Source Schema | Select | Select the source schema of the local data to be output to Salesforce. The special value, [Environment Default], is the default setting, and uses the schema defined in the Matillion ETL environment. |
Source Table | Select | Select the source table from which data will be unloaded (output). The tables available in the dropdown selection depend on the source schema. |
Target Object | Select | Select the Salesforce object (table) into which local data will be loaded (input). |
Output Operation | Select | Select the output operation to be performed on the target object. Available operations include Delete, Insert, Update, and Upsert. |
Salesforce ID | Select | Select the unique ID of the row within the Target Object into which the local data will be written. This property is only available when Output Operation is set to "Delete", "Update", or "Upsert" (not "Insert"). |
Column Mappings | Source Columns | Specify the columns in the source table that will be unloaded (output). |
Target Columns | Specify columns in the target object where the source columns will be output to. | |
On Warnings | Dropdown | Continue: Loads data despite records that return an error or that are rejected. This is the default setting. Fail: The load will fail at the point that a single record errors or is rejected, aborting the Salesforce bulk job. |
Batch Size | Integer | The maximum number of records per batch. When Use Bulk API is set to Yes, accepts an integer between 0 and 10,000; the default is 10,000. When Use Bulk API is set to No, accepts an integer between 0 and 2,000; the default is 2,000. |
Records Per Ingest Job | Integer | An integer for the number of records to load per bulk job. The component loads the entered number of rows (to a maximum of 10,000 per batch) into each Salesforce Ingest Job, and these jobs will load concurrently. If this field is empty, Salesforce Output will load all records and batches into Salesforce using a single Salesforce Ingest Job. This field is empty by default. If a negative value is specified, the job will fail at runtime. This property is only available when Use Bulk API is set to Yes. |
Relationship Columns | Parent Object | Drop-down list of available parent target objects. For example, if the "child" is User, the "parent" could be Account. |
Relationships | The relationship to the parent object's column. For example, OwnerId is a relationship column in Account, but the relationship is named Owner. | |
Type | Relationship columns refer to specific target objects. For example, OwnerId in Account refers to User. However, polymorphic columns such as OwnerId in Event can refer to Calendar or User, but only one may be used. In the case of custom fields, this will always be. | |
Index Column | The name of the column that uniquely identifies the parent target object. For example, a User parent target object could be identified by the Email column in User. | |
Auto Debug | Select | On: select this option to automatically log debug information about the load—these logs can be found in the Task History and should be included in support requests concerning the component. Off: select this option to override any debugging connection options. |
Debug Level | Select | Select the desired level of detail of debugging information logged. Beyond 1, huge amounts of data may be logged, resulting in slower execution: 1: Log the query, the number of rows returned by it, the start of execution and the time taken, and any errors. 2: Also log cache queries and additional information about the request, if applicable. 3: Also log the body of the request and the response. 4: Also log transport-level communication with the data source. This includes SSL negotiation. 5: Also log communication with the data source and additional details helpful in troubleshooting problems, including interface commands. |
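The Output Operation and Salesforce ID properties described above can be illustrated with a small model. This is a hedged sketch only: the target object is modelled as a dict keyed on the ID column, and the real component performs these operations through the Salesforce API, not locally.

```python
# Illustrative model of Upsert and Delete semantics against a target
# object, keyed on the column chosen as the Salesforce ID property.

def upsert(target, rows, id_column="Id"):
    """Update rows whose ID already exists in the target; insert the rest."""
    for row in rows:
        key = row[id_column]
        if key in target:
            target[key].update(row)
        else:
            target[key] = dict(row)
    return target

def delete(target, rows, id_column="Id"):
    """Remove rows whose ID exists in the target; unknown IDs are ignored."""
    for row in rows:
        target.pop(row[id_column], None)
    return target

target = {"001": {"Id": "001", "Name": "Acme"}}
upsert(target, [{"Id": "001", "Name": "Acme Ltd"}, {"Id": "002", "Name": "Bolt"}])
delete(target, [{"Id": "002"}])
```

This also shows why the Salesforce ID property is not offered for Insert: only Delete, Update, and Upsert need to match existing rows by ID.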