Azure SQL Bulk Output

This article is specific to the following platforms: Snowflake, Synapse.

Note: This feature is only available for instances hosted on Azure.

Load the contents of a table (or view) into a table in a Microsoft Azure SQL database.

Properties

Name (Text): A human-readable name for the component.
Azure SQL Server Name (Select): The server IP or DNS address of the Azure SQL server.
Database Name (Select): The name of the database within your Azure instance.
Username (Text): Your Azure connection username.
Password (Text): Your Azure connection password. The password is masked, so it can be set but not read. We advise using the Password Manager rather than storing your password directly in the component.
JDBC Options (Parameter, Value): A JDBC parameter supported by the database driver, together with its value. The available parameters are determined automatically from the driver and may change from version to version. They are usually not required, since sensible defaults are assumed.
Schema (Select): Select the table schema. The special value [Environment Default] uses the schema defined in the environment. For more information on using multiple schemas, see this article.
Source Table (Select): The table (or view) on your cluster to copy to Azure.
Target Table (Text): Provide a new table name.
Target Schema (Text): The name of the schema used to create the target table.
Load Columns (Select): Choose the columns to load into Azure. If left empty, all columns are loaded.
Table Maintenance (Select): One of the following (see the sketch below):
  None - assume the Azure database already has the table defined with the correct structure.
  Create if doesn't exist - only create the table if it doesn't already exist.
  Replace - always drop and re-create the table. Use with care!
Primary Key (Select): Select one or more columns to be designated as the table's primary key.
Truncate Target Table (Select): Whether or not to truncate the target table before loading data.
On Warnings (Select): Choose whether to Continue with the load if an error is raised, or to Fail the run.
Additional Copy Options (Text): Any additional options you want to apply to the copy. Some of these may conflict with options the component already sets. In particular, the component escapes the data to ensure it loads into the target database even if it contains row and/or column delimiters, so you should never override the escape or delimiter options.
Batch Size (Integer): Optional. The number of rows to load into the target between each COMMIT. On a very large export, this keeps the Azure log files from growing very large before the data is committed. If left unset, a default batch size of 5000 rows is used.
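
The Table Maintenance and Truncate Target Table settings correspond to ordinary DDL run against the Azure SQL database before any rows are loaded. As a rough illustration only (not the component's actual implementation), the JDBC sketch below shows what "Create if doesn't exist" followed by a truncate amounts to, assuming a hypothetical server, database, credentials, and target table dbo.all_data_types_sf, with the Microsoft JDBC driver (mssql-jdbc) on the classpath:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.Statement;

    public class PrepareTargetTable {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details; encrypt=true is typical for Azure SQL.
            String url = "jdbc:sqlserver://myserver.database.windows.net:1433;"
                       + "databaseName=mydatabase;encrypt=true";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement()) {

                // "Create if doesn't exist": only create the table when it is missing.
                stmt.executeUpdate(
                    "IF OBJECT_ID('dbo.all_data_types_sf', 'U') IS NULL "
                  + "CREATE TABLE dbo.all_data_types_sf (id INT, payload NVARCHAR(MAX))");

                // "Truncate Target Table": empty the table before loading.
                stmt.executeUpdate("TRUNCATE TABLE dbo.all_data_types_sf");
            }
        }
    }

The Replace option would instead drop the table and re-create it from scratch, which is why it should be used with care.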

Strategy

A select query is issued against the source table. The output is formatted in an appropriate way to load into the target database, and the data is streamed into the target, meaning that no temporary files are used and the load into the target table begins almost immediately.
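
In other words, the load behaves like a streamed copy loop: rows are read from the source query and appended to an insert batch, which is flushed and committed every Batch Size rows. A minimal, hypothetical sketch of that pattern in JDBC follows; it is not the component's actual code, and the column names, table names, and the two already-open connections srcConn and tgtConn are assumptions for illustration:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public final class StreamCopySketch {
        // Stream rows from the source query into the target table,
        // committing every batchSize rows so the transaction log stays small.
        static void streamCopy(Connection srcConn, Connection tgtConn, int batchSize) throws Exception {
            tgtConn.setAutoCommit(false);
            try (Statement src = srcConn.createStatement();
                 ResultSet rs = src.executeQuery("SELECT id, payload FROM all_data_types_sf");
                 PreparedStatement insert = tgtConn.prepareStatement(
                     "INSERT INTO dbo.all_data_types_sf (id, payload) VALUES (?, ?)")) {

                int pending = 0;
                while (rs.next()) {
                    insert.setInt(1, rs.getInt("id"));
                    insert.setString(2, rs.getString("payload"));
                    insert.addBatch();
                    if (++pending == batchSize) {
                        insert.executeBatch();   // flush the batch to Azure SQL
                        tgtConn.commit();        // COMMIT between batches (Batch Size)
                        pending = 0;
                    }
                }
                if (pending > 0) {               // flush any remaining rows
                    insert.executeBatch();
                    tgtConn.commit();
                }
            }
        }
    }

Committing in fixed-size batches is what keeps the target's transaction log from growing unbounded during a very large export.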

Example

In this example, we load our Snowflake data, a table simply named "all_data_types_sf", into an Azure SQL database.

We allow the component to create the target table (if it does not already exist), and then load the data into the table in batch sizes of 1,000 rows.
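
Once the job has run, a quick row count on the Azure SQL side is an easy sanity check. A small hypothetical JDBC snippet, reusing the assumed connection details and table name from the sketches above:

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class CheckLoad {
        public static void main(String[] args) throws Exception {
            // Same hypothetical Azure SQL connection details as before.
            String url = "jdbc:sqlserver://myserver.database.windows.net:1433;"
                       + "databaseName=mydatabase;encrypt=true";
            try (Connection conn = DriverManager.getConnection(url, "user", "password");
                 Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM dbo.all_data_types_sf")) {
                rs.next();
                System.out.println("Rows loaded: " + rs.getLong(1));
            }
        }
    }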

