Connect to Delta Lake on Databricks
Overview
Connect to Delta Lake on Databricks via Matillion Data Loader and use it as the destination for a batch-loading pipeline.
Add AWS credentials
Select an existing AWS cloud credential, or click +Add AWS cloud credential to create a new one.
Property | Description |
---|---|
AWS credential label | A unique, descriptive name for your new AWS credential. |
Access Key ID | An AWS access key. Read Understanding and getting your AWS credentials for more information. |
Secret Access Key | An AWS secret access key. Read Understanding and getting your AWS credentials for more information. |
Click Test and save to continue.
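Matillion validates the key pair when you click Test and save. If you want to sanity-check the keys independently first, a minimal sketch using boto3's STS GetCallerIdentity call (which succeeds for any valid credential) might look like the following; the function name is illustrative and not part of Matillion:

```python
# Sketch: pre-check an AWS access key / secret key pair before entering it
# in Matillion Data Loader. GetCallerIdentity succeeds for any valid credential.
import boto3
from botocore.exceptions import ClientError

def verify_aws_credentials(access_key_id: str, secret_access_key: str) -> bool:
    """Return True if the key pair authenticates successfully."""
    sts = boto3.client(
        "sts",
        aws_access_key_id=access_key_id,
        aws_secret_access_key=secret_access_key,
        region_name="us-east-1",  # STS is available via a global endpoint
    )
    try:
        identity = sts.get_caller_identity()
        print(f"Credentials valid for AWS account {identity['Account']}")
        return True
    except ClientError as err:
        print(f"Credential check failed: {err}")
        return False
```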
Connect to Delta Lake on Databricks
Property | Description |
---|---|
Destination label | A unique, descriptive label for your destination. |
Workspace ID | Your Databricks workspace identifier. Read Get workspace, cluster, notebook, folder, model, and job identifiers to learn more. |
Token | A Databricks personal access token. Read https://docs.databricks.com/dev-tools/api/latest/authentication.html to learn more. |
Username | Your Databricks login username. |
Password | A managed entry representing your Databricks login password. Choose an existing password from the dropdown menu, or click Manage and then Add new password to create a new managed password entry: give the password a label (this is the name shown in the dropdown menu) and enter the password value. Read Manage Passwords to learn more. |
Click Test and continue to test your settings and move forward. You can't continue if the test fails for any reason.
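The Test and continue check exercises the details you entered above. If you'd like to verify a personal access token outside Matillion, one option is a direct call to the Databricks REST API's clusters/list endpoint. A minimal sketch follows; the workspace URL and token are placeholders you must replace:

```python
# Sketch: confirm a Databricks personal access token works by listing
# clusters through the Databricks REST API.
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<your-personal-access-token>"  # placeholder

resp = requests.get(
    f"{WORKSPACE_URL}/api/2.0/clusters/list",
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
resp.raise_for_status()  # raises if the token or workspace URL is wrong
print([c["cluster_name"] for c in resp.json().get("clusters", [])])
```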
Configure destination
Property | Description |
---|---|
Endpoint/Cluster | Select a Databricks cluster. The dropdown list is populated based on the Databricks account you logged into. |
Schema | Select a schema. |
S3 bucket | Select an S3 bucket to load the data into. |
Table prefix | An optional string prepended to the names of tables created by the pipeline. |
Click Continue to finish configuring your destination.
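If you want to sanity-check these destination settings outside Matillion, a minimal sketch might confirm the selected S3 bucket is reachable with boto3 and preview how the optional table prefix is applied to table names. The bucket name, prefix, and source table names below are placeholders:

```python
# Sketch: pre-flight check for the destination settings. Uses whatever AWS
# credentials are configured in your default profile/environment.
import boto3
from botocore.exceptions import ClientError

BUCKET = "my-staging-bucket"   # the S3 bucket selected above (placeholder)
TABLE_PREFIX = "mdl_"          # the optional table prefix (placeholder)

s3 = boto3.client("s3")
try:
    s3.head_bucket(Bucket=BUCKET)  # succeeds only if the bucket is accessible
    print(f"Bucket {BUCKET} is reachable")
except ClientError as err:
    print(f"Cannot access bucket {BUCKET}: {err}")

# Preview the effect of the table prefix: a source table named "customers"
# would be created in the target schema as "mdl_customers", for example.
for source_table in ["customers", "orders"]:
    print(f"{source_table} -> {TABLE_PREFIX}{source_table}")
```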