Prerequisites
Before you begin, make sure you have the following:
- An active Databricks workspace.
- Permissions to create and manage Service Principals within your Databricks account or workspace.
- A Databricks SQL Warehouse set up and running.
 
1. Obtain Databricks Connection Details
You will need the following four pieces of information from your Databricks environment:
- Host: The URL of your Databricks workspace, which typically looks like https://<region>.cloud.databricks.com/. You can find this in your SQL Warehouse connection details.
- HTTP Path: The specific path to your SQL Warehouse, usually starting with /sql/1.0/warehouses/. You can find this in the “Connection Details” tab of your SQL Warehouse.
- Client ID: The Application ID of your Databricks Service Principal.
- Client Secret: The secret key generated for your Databricks Service Principal.
 
How to Find Your Connection Details
Follow these steps to find the necessary credentials.

Host and HTTP Path
- Navigate to your Databricks Workspace.
 - Go to SQL Warehouses (or SQL Endpoints).
 - Select the warehouse you want to connect to.
 - Click on the Connection Details tab. Here you will find the Server Hostname (your Host) and the HTTP Path.
 
Client ID and Client Secret (using a Databricks-managed Service Principal)
You’ll first need to create a Service Principal in Databricks (a scripted alternative using the Databricks SDK for Python is sketched at the end of this section).

Step 1: Create a Service Principal
- Log in to your Databricks environment:
 - A) Account-level permissions: log in to your Databricks account console.
 - B) Workspace administrator permissions: log in to your Databricks workspace as an administrator.
 
 - Navigate to the Service Principals section.
- A) From Account Console: Navigate to User management > Service principals.
 - B) From Workspace: Navigate to Admin Settings (click your username in the top right, then Admin Settings) > Identity and access > Service principals.
 
 - Click Add service principal.
 - Select Databricks managed.
 - Provide a descriptive Name for the service principal (e.g., WisdomAI-ServicePrincipal).
 - Click Add service principal.
 - Once created, you will see the Application ID for this service principal. This is your Client ID. Copy this value.
 
Step 2: Generate a Client Secret
- From the service principal’s page, go to the Secrets tab.
- Under OAuth secrets, click Generate secret.
- Set a lifetime for the secret (up to 730 days) and click Generate. Choose an expiration that balances security and operational convenience.
- Immediately copy the displayed secret. This is your Client Secret.
 
The Client Secret is only shown once and cannot be retrieved later. Please store it in a secure location immediately.
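If you prefer to script this step instead of clicking through the UI, a minimal sketch using the Databricks SDK for Python is shown below. This is an optional alternative, not something WisdomAI requires; the workspace URL, token, and display name are placeholders, and the OAuth secret itself is still generated as described above.

```python
# pip install databricks-sdk
from databricks.sdk import WorkspaceClient

# Placeholders: substitute your own workspace URL and a workspace-admin token.
w = WorkspaceClient(
    host="https://<your-workspace>.cloud.databricks.com",
    token="<workspace-admin-personal-access-token>",
)

# Create the Databricks-managed service principal.
sp = w.service_principals.create(display_name="WisdomAI-ServicePrincipal")

# The Application ID is the Client ID you will enter in WisdomAI.
print(sp.application_id)
```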
2. Assign Permissions in Databricks
The Service Principal needs specific permissions to access the data you want WisdomAI to query.
- SQL Warehouse/Cluster Permissions: Grant CAN USE permission to the Service Principal on the SQL Warehouse you are connecting to. You can do this from the Permissions tab of the SQL Warehouse.
- Data Privileges: The service principal requires the following privileges in Unity Catalog (or Hive Metastore):
 - USE CATALOG on the target catalog.
 - USE SCHEMA on the target schema.
 - SELECT on the tables and views you want WisdomAI to access.
 
You can apply these grants from Catalog Explorer or with SQL, as sketched below. Replace <service_principal_id> with the Application ID of your service principal, and <your_catalog_name>, <your_schema_name>, and <your_table_name> with your specific values.
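A minimal sketch of those grants follows, assuming you run them as a workspace admin through the databricks-sql-connector; you can just as easily paste the same GRANT statements into the Databricks SQL editor. The hostname, warehouse path, and token shown are placeholders.

```python
# pip install databricks-sql-connector
from databricks import sql

# The three privileges listed above, with placeholders to replace.
grants = [
    "GRANT USE CATALOG ON CATALOG <your_catalog_name> TO `<service_principal_id>`",
    "GRANT USE SCHEMA ON SCHEMA <your_catalog_name>.<your_schema_name> TO `<service_principal_id>`",
    "GRANT SELECT ON TABLE <your_catalog_name>.<your_schema_name>.<your_table_name> TO `<service_principal_id>`",
]

# Connect as a user who is allowed to manage these objects (e.g., a workspace admin).
with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<admin-personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        for statement in grants:
            cursor.execute(statement)
```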
3. Connect WisdomAI to Databricks
Once you have gathered the required information, you can configure the connection in WisdomAI:
- In WisdomAI, navigate to the Connections section and click Add Connection.
 - Select Databricks as the data source type.
 - Fill in the connection details in the “Databricks connection details” section.
- Connection Name: Choose a descriptive name for your connection (e.g., “Databricks Prod”).
 - Host: Paste the Server Hostname you obtained from Databricks.
 - HTTP Path: Paste the HTTP Path for your SQL Warehouse.
 - Client ID: Enter your Client ID (Application ID). This field is mandatory.
 - Client Secret: Enter your Client Secret (the value copied immediately after creation). This field is mandatory.
 - Catalog Filters (Optional): Specify any Catalog Filters if you want to further restrict the catalogs/schemas WisdomAI crawls.
 
 - Click Save and Sync metadata. WisdomAI will use the provided credentials to connect to Databricks and scan the metadata of the specified catalogs and tables. If you would like to verify the credentials yourself first, see the sketch below.
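If you want to verify the service principal’s credentials outside of WisdomAI, a minimal sketch using the databricks-sql-connector’s OAuth machine-to-machine flow is shown below. This check is optional; the values are placeholders and should match what you entered in the connection form.

```python
# pip install databricks-sql-connector databricks-sdk
from databricks import sql
from databricks.sdk.core import Config, oauth_service_principal

# Placeholders: reuse the Host, HTTP Path, Client ID, and Client Secret from the steps above.
config = Config(
    host="https://<your-workspace>.cloud.databricks.com",
    client_id="<client-id>",
    client_secret="<client-secret>",
)

def credential_provider():
    return oauth_service_principal(config)

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    credentials_provider=credential_provider,
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("SELECT current_user()")
        print(cursor.fetchone())  # Should return the service principal's Application ID
```

If this query succeeds, the same values should work in the WisdomAI connection form.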
 

Security Considerations and Best Practices
- Least Privilege: Always follow the principle of least privilege. Only grant the necessary permissions to the Service Principal.
 - Secure Key Management: Treat your Client ID and Client Secret like passwords. Avoid sharing them via insecure channels, such as unencrypted email. Use a secure method instead, such as a password manager (e.g., LastPass) or another secure sharing service.
 - Credential Rotation: Regularly rotate your Client Secrets to enhance security, especially before their expiration.
 
Troubleshooting Common Issues
Having trouble? Here are solutions to some frequently encountered problems.

Authentication Failed or Invalid Credentials
- Ensure the Host, HTTP Path, Client ID, and Client Secret are complete and correct, with no extra characters, stray spaces, or truncated values.
 - Verify that the Client Secret has not expired or been revoked in Databricks.
 - Confirm that the Service Principal exists and is enabled in Databricks.
 
Permission Denied Errors
- Double-check that the Service Principal has the necessary CAN USE permission on the SQL Warehouse.
- Verify that the Service Principal has the correct USE CATALOG, USE SCHEMA, and SELECT privileges on the Databricks catalogs, schemas, and tables you are trying to access (see the SHOW GRANTS sketch below).
- Confirm that the SQL Warehouse is running and accessible.
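To see exactly which privileges the service principal currently holds on an object, you can run SHOW GRANTS as an admin. A minimal sketch, reusing the connector pattern from earlier (credentials and object names are placeholders):

```python
from databricks import sql

with sql.connect(
    server_hostname="<your-workspace>.cloud.databricks.com",
    http_path="/sql/1.0/warehouses/<warehouse-id>",
    access_token="<admin-personal-access-token>",
) as connection:
    with connection.cursor() as cursor:
        # Lists every privilege the service principal holds on the target catalog.
        cursor.execute("SHOW GRANTS `<service_principal_id>` ON CATALOG <your_catalog_name>")
        for row in cursor.fetchall():
            print(row)
```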
 
Catalog Not Found or Table Not Found
- Verify the spelling of the catalog or table names.
 - If using catalog filters in WisdomAI, ensure the catalog is included in the filter.
 - Confirm that the Service Principal has permissions on the specific catalog you are trying to access.
 
Connection Timeout or Network Errors
- Verify that WisdomAI’s CIDR blocks (35.238.115.103/32 or 34.82.248.105/32) are allowlisted in your Databricks IP access lists and any corporate firewalls.