IAM role authentication to Databricks AWS staging buckets
Authenticating with IAM Roles
See Using AWS IAM roles to access S3 buckets for detailed documentation on configuring Polytomic connections with IAM roles.
If you're on AWS and need to authenticate with an IAM role rather than an access key and secret, switch the Authentication Method dropdown to IAM role, as shown in this screenshot:

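As a rough illustration of the AWS side, the sketch below uses boto3 to create an S3 access policy for the staging bucket. The bucket name, policy name, and exact permission set are assumptions for illustration only, not Polytomic's documented requirements; follow the linked IAM role guide and the instructions shown in the Polytomic app for the authoritative role and trust policy setup.

```python
# Hypothetical sketch: an S3 access policy for the staging bucket that could be
# attached to the IAM role Polytomic assumes. Bucket name, policy name, and the
# permission set are placeholders/assumptions -- see the linked IAM role guide.
import json
import boto3

STAGING_BUCKET = "my-databricks-staging-bucket"  # placeholder bucket name

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": f"arn:aws:s3:::{STAGING_BUCKET}",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": f"arn:aws:s3:::{STAGING_BUCKET}/*",
        },
    ],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="polytomic-staging-bucket-access",  # placeholder policy name
    PolicyDocument=json.dumps(policy_document),
)
```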
In addition to configuring Polytomic, you may need to configure a Storage Credential, which Databricks uses when reading data from the staging bucket. See the Databricks documentation for information on creating storage credentials. If your staging bucket is not configured as an External Location in Databricks, you'll need to provide Polytomic with the name of the Storage Credential in the connection configuration.
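If you do want to register the staging bucket as an External Location, a minimal sketch using the databricks-sql-connector is shown below. The workspace hostname, warehouse path, token, credential name, and bucket URL are all placeholders; the storage credential itself is typically created beforehand (for example, in the Databricks UI), as described in the Databricks documentation linked above.

```python
# Hypothetical sketch: registering the staging bucket as an external location
# backed by an existing storage credential. All connection details and names
# below are placeholders.
from databricks import sql

with sql.connect(
    server_hostname="dbc-xxxxxxxx.cloud.databricks.com",  # placeholder workspace
    http_path="/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",     # placeholder warehouse
    access_token="dapiXXXXXXXXXXXXXXXX",                  # placeholder token
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            """
            CREATE EXTERNAL LOCATION IF NOT EXISTS polytomic_staging
            URL 's3://my-databricks-staging-bucket/'
            WITH (STORAGE CREDENTIAL my_storage_credential)
            """
        )
```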
The Databricks user that Polytomic connects as needs the following permissions for the external location (example GRANT statements follow these lists):
READ FILES
WRITE FILES
CREATE EXTERNAL TABLE
CREATE MANAGED STORAGE
And the following permissions for the storage credential:
READ FILES
WRITE FILES
CREATE EXTERNAL TABLE
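As a sketch, the privileges above could be granted with statements like the following, again via the databricks-sql-connector (they can equally be run in a Databricks SQL editor). The external location, storage credential, and principal names are placeholders.

```python
# Hypothetical sketch: granting the privileges listed above to the Databricks
# user (or service principal) that Polytomic connects as. All names and
# connection details are placeholders.
from databricks import sql

GRANTS = [
    "GRANT READ FILES, WRITE FILES, CREATE EXTERNAL TABLE, CREATE MANAGED STORAGE "
    "ON EXTERNAL LOCATION polytomic_staging TO `polytomic@example.com`",
    "GRANT READ FILES, WRITE FILES, CREATE EXTERNAL TABLE "
    "ON STORAGE CREDENTIAL my_storage_credential TO `polytomic@example.com`",
]

with sql.connect(
    server_hostname="dbc-xxxxxxxx.cloud.databricks.com",  # placeholder workspace
    http_path="/sql/1.0/warehouses/xxxxxxxxxxxxxxxx",     # placeholder warehouse
    access_token="dapiXXXXXXXXXXXXXXXX",                  # placeholder token
) as connection:
    with connection.cursor() as cursor:
        for statement in GRANTS:
            cursor.execute(statement)
```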