Added

Databricks Data Source Support

Connect to Databricks SQL Warehouses as a data source alongside Snowflake and PostgreSQL. This enables organizations using Databricks to leverage Bobsled AI's natural-language-to-SQL capabilities directly against their lakehouse data.

What's New

  • Connect to Databricks SQL Warehouses using Personal Access Tokens (PATs)
  • Run SQL queries against Databricks catalogs and schemas
  • Test connections before saving data source configuration
  • Full support for Databricks 3-level namespace (catalog.schema.table)

New Data Source Type

The type field in data source endpoints now accepts databricks in addition to snowflake, postgres, and sledhouse.
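A minimal Python sketch of the expanded set of accepted type values. The validation helper is illustrative only and is not part of the API:

```python
# The four data source types the API now accepts in the "type" field.
SUPPORTED_TYPES = {"snowflake", "postgres", "sledhouse", "databricks"}

def validate_type(data_source: dict) -> None:
    """Raise if the data source declares an unsupported type (helper is hypothetical)."""
    if data_source.get("type") not in SUPPORTED_TYPES:
        raise ValueError(f"unsupported data source type: {data_source.get('type')!r}")

validate_type({"type": "databricks"})  # accepted as of this release
```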

Modified Endpoints

Test Connection (Stateless)

POST /api/v1/accounts/{accountId}/data-sources/test-connection

Now accepts Databricks connection configuration:

Important: The accessToken field must contain your Personal Access Token (PAT) encoded in base64. Encode your PAT before including it in the request body (e.g., echo -n "dapi..." | base64).

{
  "connectionConfig": {
    "type": "databricks",
    "host": "dbc-xxxxx.cloud.databricks.com",
    "httpPath": "/sql/1.0/warehouses/xxxxx",
    "port": 443,
    "catalog": "main",
    "accessToken": "ZGFwaXh4eHh4eHh4eHh4eA=="
  }
}
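A Python sketch of building the request body above, including the required base64 encoding of the PAT. The helper function and the placeholder token are assumptions for illustration:

```python
import base64

def build_connection_config(host: str, http_path: str, pat: str,
                            catalog: str = "main", port: int = 443) -> dict:
    """Build a Databricks connectionConfig; the accessToken field must be
    the PAT encoded in base64, so we encode it here."""
    return {
        "type": "databricks",
        "host": host,
        "httpPath": http_path,
        "port": port,
        "catalog": catalog,
        "accessToken": base64.b64encode(pat.encode()).decode(),
    }

# "dapiEXAMPLETOKEN" is a placeholder, not a real token.
config = build_connection_config(
    "dbc-xxxxx.cloud.databricks.com", "/sql/1.0/warehouses/xxxxx", "dapiEXAMPLETOKEN")
```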

Test Connection (Existing Data Source)

POST /api/v1/accounts/{accountId}/data-sources/{id}/test-connection

Works with saved Databricks data sources.

Run SQL

POST /api/v1/accounts/{accountId}/workspaces/{workspaceId}/run-sql

Execute queries against workspaces connected to Databricks data sources.
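Since Databricks uses a 3-level namespace, queries should fully qualify tables as catalog.schema.table. A sketch, with illustrative catalog/schema/table names:

```python
def qualify(catalog: str, schema: str, table: str) -> str:
    """Fully qualify a table using the Databricks 3-level namespace."""
    return f"{catalog}.{schema}.{table}"

# "main", "default", and "trips" are illustrative names.
query = f"SELECT * FROM {qualify('main', 'default', 'trips')} LIMIT 10"
```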

Chat Endpoints

POST /api/v1/accounts/{accountId}/workspaces/{workspaceId}/chat
POST /api/v1/accounts/{accountId}/workspaces/{workspaceId}/chat/stream

Chat endpoints now include dataSourceType in the context passed to the AI agent, enabling data-source-aware query generation.
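A sketch of what the enriched agent context might look like; only the dataSourceType key is documented here, the surrounding structure and helper are assumptions:

```python
def build_chat_context(data_source_type: str, message: str) -> dict:
    """Hypothetical context object passed to the AI agent; dataSourceType
    lets the agent generate SQL dialect-appropriate for the source."""
    return {
        "dataSourceType": data_source_type,  # e.g. "databricks"
        "message": message,
    }

context = build_chat_context("databricks", "total revenue by month")
```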

Getting Started

  1. Obtain a Personal Access Token from your Databricks workspace
  2. Navigate to Data Sources → Add New → Databricks
  3. Enter your workspace host, SQL warehouse HTTP path, and PAT
  4. Test the connection and save
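The same flow can be driven via the API. A sketch that builds (but does not send) the stateless test-connection request; the base URL, account ID, and token are placeholders:

```python
import base64
import json
import urllib.request

ACCOUNT_ID = "acct_123"  # placeholder
pat = "dapiEXAMPLE"      # placeholder PAT from your Databricks workspace

# Step 3: workspace host, SQL warehouse HTTP path, and base64-encoded PAT.
body = {
    "connectionConfig": {
        "type": "databricks",
        "host": "dbc-xxxxx.cloud.databricks.com",
        "httpPath": "/sql/1.0/warehouses/xxxxx",
        "port": 443,
        "catalog": "main",
        "accessToken": base64.b64encode(pat.encode()).decode(),
    }
}

# Step 4: test the connection before saving ("api.example.com" is a placeholder base URL).
req = urllib.request.Request(
    f"https://api.example.com/api/v1/accounts/{ACCOUNT_ID}/data-sources/test-connection",
    data=json.dumps(body).encode(),
    headers={"Content-Type": "application/json"},
    method="POST",
)
```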

Response Format

Connection test response for Databricks:

{
  "ok": true,
  "data": {
    "success": true,
    "message": "Successfully connected to Databricks",
    "details": {
      "type": "databricks",
      "currentCatalog": "main",
      "accessibleCatalogs": ["main", "hive_metastore"]
    }
  }
}
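A sketch of consuming the response above: check the top-level ok flag and the nested success flag before reading the catalog details:

```python
import json

# The response body from the example above, verbatim.
response_body = """{
  "ok": true,
  "data": {
    "success": true,
    "message": "Successfully connected to Databricks",
    "details": {
      "type": "databricks",
      "currentCatalog": "main",
      "accessibleCatalogs": ["main", "hive_metastore"]
    }
  }
}"""

resp = json.loads(response_body)
connected = resp["ok"] and resp["data"]["success"]
catalogs = resp["data"]["details"]["accessibleCatalogs"]
```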