Set Up Access Management for Databricks SQL¶
This section outlines the steps to configure the Databricks SQL connector in Privacera. Ensure that all prerequisites are met before proceeding.
Best Practice for Enabling the Connector
This connector manages access control policies for Databricks SQL. If there are existing policies in Databricks SQL, they will be overwritten by policies from Privacera. It is strongly recommended to start by managing access for a limited set of resources before enabling the connector for all resources. For example, you can create a test database in Databricks SQL and use Privacera to manage its access control policies. Once you're confident with the setup, you can extend the connector to manage all databases.
Create an Instance of the Databricks SQL Connector¶
For self-managed deployments using Privacera Manager:

1. SSH to the instance where Privacera Manager is installed.
2. Navigate to the `config` directory.
3. Create a new directory for the Databricks SQL connector configuration.

    Note

    In the example below, `instance1` is the name of the connector instance. You can change this name to uniquely identify your installed connector configuration. The connector instance name should consist of only hyphens and alphanumeric characters.

4. Copy the sample connector configuration file to your custom directory.
5. Open the copied `.yml` file for editing.

A combined command sketch for steps 2 through 5 is shown after this list.
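This sketch assumes the default Privacera Manager install path and an illustrative sample file name; verify both against your installation:

```bash
# Assumes the default Privacera Manager install location; adjust paths as needed.
cd ~/privacera/privacera-manager/config

# Create a directory for the connector instance (here: instance1).
mkdir -p custom-vars/connectors/databricks-sql/instance1

# Copy the sample connector configuration file into the custom directory
# (the sample file name is illustrative; use the one shipped under sample-vars).
cp sample-vars/vars.connectors.databricks-sql.yml \
   custom-vars/connectors/databricks-sql/instance1/

# Open the copied .yml file for editing.
vi custom-vars/connectors/databricks-sql/instance1/vars.connectors.databricks-sql.yml
```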
For PrivaceraCloud deployments:

1. In PrivaceraCloud, navigate to Settings → Applications.
2. On the Applications screen, select the Databricks SQL application under Available connections.
3. Enter the application Name and Description, then click Save. The name can be any name of your choice, for example, `Databricks SQL Connector for account 123456`.
4. Open the Databricks SQL application.
5. Enable the Access Management option using the toggle button.
Connection Details¶
- Specify the Databricks workspace URL. For example, `https://dev-environment.cloud.databricks.com`.
- Provide the Databricks personal access token used to authenticate with the Databricks API.
- Enter the JDBC URL, default database, and username for the Databricks connection.

Replace `<workspace-url>`, `<jdbc-url>`, `<db_name>`, and `<databricks-access-token>` with your actual values.
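The exact JDBC URL depends on the JDBC driver bundled with the connector and on your SQL warehouse. As a rough sketch, a Databricks SQL warehouse JDBC URL often looks like the following; the `jdbc:spark://` prefix applies to the legacy Simba driver (newer Databricks drivers use `jdbc:databricks://`), and the hostname and warehouse ID are placeholders. Credentials are supplied separately through the JDBC username (`token`) and password (the personal access token) fields described below:

```
jdbc:spark://<workspace-url>:443/default;transportMode=http;ssl=1;AuthMech=3;httpPath=/sql/1.0/warehouses/<warehouse-id>
```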
Default Repository Behavior
By default, the Databricks SQL connector uses the SQL repository. If you prefer to use the Hive repository instead, see the Hive Repository Support note below for the required configuration.
1. Enter these fields under the BASIC tab:
    - Databricks SQL workspace URL: `https://<workspace-url>.cloud.databricks.com`
    - Databricks SQL JDBC URL: `<jdbc-url>`
    - Databricks SQL default database: `<db_name>`
    - Databricks SQL JDBC username: `token`
    - Databricks SQL JDBC password: `<databricks-access-token>`
2. Update the Advanced tab with the following property:
    - System config: `privacera-databricks_sql_analytics-system-config.json`
Hive Repository Support

To enable the Hive repository, configure the following system property instead:

- System config: `privacera-databricks_sql_analytics-hive-system-config.json`
Managed Databases¶
This property specifies a comma-separated list of database names to which access control policies are applied. Wildcards are supported. Example: `test_db1,test_db2,sales_*`.
Warning

- Values are case-sensitive.
- Replace the example value below with your actual value.
Recommended Best Practice
It is advisable to test the connector with a test database before enabling it for all databases.
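For self-managed deployments, this value is set in the connector's vars `.yml` file. The property key below is a hypothetical illustration of the value format; use the exact property name from the sample connector vars file:

```yaml
# Hypothetical key name; take the real property name from the sample vars file.
# Comma-separated list of databases to manage; wildcards are supported.
DATABRICKS_SQL_CONNECTOR_MANAGED_DATABASE_LIST: "test_db1,test_db2,sales_*"
```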
Under the BASIC tab, enter the value for:

- Databases to set access control policies: `test_database`
Enforcing Grants/Revoke¶
This property enables the enforcement of GRANT and REVOKE operations for managing users, groups, and roles.
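As with the managed databases setting, self-managed deployments set this in the connector's vars `.yml` file. The key below is hypothetical; match it against the sample vars file:

```yaml
# Hypothetical key name; take the real property name from the sample vars file.
# Enables GRANT/REVOKE enforcement and user/group/role management.
DATABRICKS_SQL_CONNECTOR_ENABLE_POLICY_ENFORCEMENT: "true"
```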
Under the BASIC tab, enable the toggle:
- Enable policy enforcements and user/group/role management
Apply the Configuration¶
For self-managed deployments, after all the changes are done, start the connector by running the following steps with Privacera Manager (an illustrative command sequence is shown below):

1. Setup, which generates the Helm charts. This step usually takes a few minutes.
2. Apply the Privacera Manager Helm charts.
3. Post-installation, which generates the plugin tarball, updates Route 53 DNS, and so on.
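The exact script name and subcommands can vary between Privacera Manager versions, so treat the following as a sketch of the three-step flow rather than a definitive command list; it assumes the default install path:

```bash
# Assumes the default Privacera Manager install location; adjust as needed.
cd ~/privacera/privacera-manager

# Step 1: generate the Helm charts from the updated configuration.
./privacera-manager.sh setup

# Step 2: apply the Privacera Manager Helm charts.
./privacera-manager.sh install

# Step 3: post-installation (plugin tarball, Route 53 DNS updates, and so on).
./privacera-manager.sh post-install
```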
For PrivaceraCloud:

1. Once all the required fields are filled, click Save.
2. The configured Databricks SQL connector appears under Connected Applications.
3. Once saved and enabled, the Databricks SQL connector starts. You can then hover over the VIEW LOGS button to check its status, either Running or Stopped.
To restart the Databricks SQL connector application, perform the following steps:

1. Go to Settings → Applications and select the Databricks SQL connector application.
2. Edit the application, disable the Access Management option using the toggle button, and then click Save.
3. Open the same application again, enable the Access Management option using the toggle button, and then click Save.