
DATABRICKS CLI — COMPLETE COMMAND DOCUMENTATION

A complete operational guide for managing Azure Databricks using the CLI — covering authentication, workspaces, compute, jobs, Unity Catalog, secrets, DBFS, Git repos, and Infrastructure-as-Code via Asset Bundles.

Architectural Overview

The CLI communicates with Azure Databricks via REST APIs using OAuth-based authentication:

Local Machine (CLI)
        |
        v
OAuth Authentication
        |
        v
Azure Databricks Workspace
        |
        v
Clusters / Jobs / Unity Catalog / Warehouses

1. Installation & Setup

Command Description
winget --version Verify Windows Package Manager is installed
winget install Databricks.DatabricksCLI Install the Databricks CLI
databricks -v Confirm installation and check version
where databricks Show CLI binary path on disk

1.1 Check Winget

Confirm the Windows Package Manager is available before installing:

winget --version

1.2 Install Databricks CLI

winget install Databricks.DatabricksCLI

1.3 Verify Installation

databricks -v

Expected output: Databricks CLI v0.2xx.x

1.4 Check CLI Path

where databricks

2. Authentication (OAuth Based)

Command Description
databricks auth login --host <url> --profile DEFAULT Log in with OAuth (opens browser)
databricks auth describe --profile DEFAULT Show active profile details
databricks auth profiles List all configured profiles
Delete .databricks folder Log out / remove cached tokens

2.1 Login

databricks auth login --host <workspace-url> --profile DEFAULT

# Example:

databricks auth login --host https://adb-7405609716325528.8.azuredatabricks.net/

2.2 Check Profile

databricks auth describe --profile DEFAULT

2.3 List Profiles

databricks auth profiles

2.4 Logout

Delete the token cache folder to log out:

C:\Users\<username>\.databricks

3. Current User Information

databricks current-user me

Returns: username, email address, and user ID.

4. Workspace Commands

Command Description
databricks workspace list / List root workspace contents
databricks workspace mkdirs /path Create a folder
databricks workspace delete /path Delete a file
databricks workspace delete /path --recursive Recursively delete a folder
databricks workspace import /dest --file local.py --language PYTHON --format SOURCE Upload a file
databricks workspace export /src --file out.py --format SOURCE Download a file

4.1 List Workspace Root

databricks workspace list /

4.2 Create Folder

databricks workspace mkdirs /Shared/Yash

4.3 Delete File or Folder

# Delete a single file

databricks workspace delete /Shared/Yash/file.py

# Recursive delete

databricks workspace delete /Shared/Yash --recursive

4.4 Import File

databricks workspace import /Shared/Yash/local.py \
  --file local.py --language PYTHON --format SOURCE --overwrite

Verify upload:

databricks workspace list /Shared/Yash

4.5 Export File

databricks workspace export /Shared/Yash/local.py \
  --file exported.py --format SOURCE

Verify locally:

dir exported.py

4.6 List Files

databricks workspace list /Shared/Yash
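
The import/export commands above compose naturally into bulk operations. Below is a minimal backup sketch, assuming the `databricks` CLI and `jq` are installed and that `--output json` returns an array of objects with a `path` field; the folder and file names are illustrative:

```shell
#!/bin/sh
# Sketch: export every item under a workspace folder to a local directory.
# Assumes `databricks` and `jq` are on PATH; paths are illustrative.
export_folder() {
  src="$1"; dest="$2"
  mkdir -p "$dest"
  databricks workspace list "$src" --output json |
    jq -r '.[].path' |
    while read -r path; do
      databricks workspace export "$path" \
        --file "$dest/$(basename "$path").py" --format SOURCE
    done
}
# Usage: export_folder /Shared/Yash ./backup
```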

5. Cluster Management

Command Description
databricks clusters list List all clusters
databricks clusters get <cluster-id> Get full cluster details/config
databricks clusters start <cluster-id> Start a terminated cluster
databricks clusters delete <cluster-id> Terminate (stop) a running cluster
databricks clusters permanent-delete <cluster-id> Permanently remove a cluster

5.1 List Clusters

databricks clusters list

5.2 Get Cluster Details

databricks clusters get 0106-075344-c5jhxigd

5.3 Start Cluster

databricks clusters start 0106-075344-c5jhxigd

5.4 Terminate Cluster

databricks clusters delete 0106-075344-c5jhxigd

5.5 Permanent Delete

databricks clusters permanent-delete <cluster-id>
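
The lifecycle commands above lend themselves to automation, such as a nightly cleanup that terminates every running cluster. A sketch, assuming `databricks` and `jq` are available and that `clusters list --output json` returns objects with `cluster_id` and `state` fields:

```shell
#!/bin/sh
# Sketch: terminate every RUNNING cluster, e.g. from a nightly cron job.
# "clusters delete" terminates the cluster but keeps its configuration.
stop_running_clusters() {
  databricks clusters list --output json |
    jq -r '.[] | select(.state == "RUNNING") | .cluster_id' |
    while read -r id; do
      echo "Terminating $id"
      databricks clusters delete "$id"
    done
}
# Usage: stop_running_clusters
```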

6. Jobs Management

Command Description
databricks jobs list List all jobs
databricks jobs get <job-id> Get job configuration and details
databricks jobs run-now <job-id> Trigger an immediate run
databricks jobs list-runs List recent job runs and statuses
databricks jobs cancel-run <run-id> Cancel an active run
databricks jobs delete <job-id> Delete a job definition

6.1 List Jobs

databricks jobs list

6.2 Get Job Details

databricks jobs get 909703333311409

6.3 Run Job Now

databricks jobs run-now 909703333311409

6.4 List Runs

databricks jobs list-runs

6.5 Cancel Run

databricks jobs cancel-run 909703333311409

6.6 Delete Job

databricks jobs delete 909703333311409
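
run-now combines with run inspection into a simple trigger-and-wait pattern. The sketch below assumes `jq` is installed and that `databricks jobs get-run` (available in recent CLI versions) returns the Jobs API run object; note that newer CLI versions may already block until the run finishes, in which case the loop exits immediately:

```shell
#!/bin/sh
# Sketch: trigger a job and poll until its run reaches a terminal state.
# Job ID and field paths follow the Jobs API 2.1 run object (verify for your version).
run_and_wait() {
  job_id="$1"
  run_id=$(databricks jobs run-now "$job_id" --output json | jq -r '.run_id')
  while :; do
    state=$(databricks jobs get-run "$run_id" --output json |
      jq -r '.state.life_cycle_state')
    case "$state" in TERMINATED|INTERNAL_ERROR|SKIPPED) break ;; esac
    sleep 15
  done
  # Print the final result, e.g. SUCCESS or FAILED
  databricks jobs get-run "$run_id" --output json | jq -r '.state.result_state'
}
# Usage: run_and_wait 909703333311409
```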

7. SQL Warehouses

Command Description
databricks warehouses list List all SQL warehouses
databricks warehouses start <id> Start a stopped warehouse
databricks warehouses stop <id> Stop a running warehouse

7.1 List Warehouses

databricks warehouses list

7.2 Start Warehouse

databricks warehouses start dd2325bf3cbab4f6

7.3 Stop Warehouse

databricks warehouses stop dd2325bf3cbab4f6

8. Unity Catalog Commands

Command Description
databricks catalogs list List all catalogs
databricks catalogs get <catalog> Get catalog metadata
databricks schemas list <catalog> List schemas in a catalog
databricks tables list <catalog> <schema> List tables in a schema
databricks tables get <catalog>.<schema>.<table> Get full table details

8.1 List Catalogs

databricks catalogs list

8.2 Get Catalog

databricks catalogs get fq_dev_catalog

8.3 List Schemas

databricks schemas list fq_dev_catalog

8.4 List Tables

databricks tables list <CATALOG_NAME> <SCHEMA_NAME>

# Example:

databricks tables list fq_dev_catalog gold

8.5 Get Table Details

databricks tables get <CATALOG>.<SCHEMA>.<TABLE>

# Example:

databricks tables get fq_dev_catalog.gold.sales_sample
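
Because the catalog/schema/table commands nest, a full inventory of a catalog is a short loop. A sketch assuming `jq` is installed and that the JSON output exposes `name` (schemas) and `full_name` (tables) fields; the catalog name is illustrative:

```shell
#!/bin/sh
# Sketch: list every table in every schema of a catalog.
# Assumes `databricks` and `jq`; field names per the Unity Catalog API.
list_catalog_tables() {
  catalog="$1"
  databricks schemas list "$catalog" --output json |
    jq -r '.[].name' |
    while read -r schema; do
      databricks tables list "$catalog" "$schema" --output json |
        jq -r '.[].full_name'
    done
}
# Usage: list_catalog_tables fq_dev_catalog
```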

9. Secrets Management

Command Description
databricks secrets create-scope <scope> Create a new secret scope
databricks secrets list-scopes List all secret scopes
databricks secrets put-secret <scope> <key> Add or update a secret (prompted input)
databricks secrets list-secrets <scope> List key names in a scope
databricks secrets delete-secret <scope> <key> Delete a specific secret key
databricks secrets delete-scope <scope> Delete entire scope + all secrets

9.1 Create Secret Scope

databricks secrets create-scope my-scope

9.2 List All Secret Scopes

databricks secrets list-scopes

9.3 Add Secret

databricks secrets put-secret my-scope my-key

# You will be prompted (input is hidden):

Enter secret value: ****

9.4 List Secrets in Scope

databricks secrets list-secrets my-scope

# Output:

Key

my-key

9.5 Delete a Secret

databricks secrets delete-secret my-scope my-key

9.6 Delete Entire Scope

databricks secrets delete-scope my-scope
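
For CI pipelines, the interactive prompt in put-secret can be bypassed — recent CLI versions accept the value via a flag such as --string-value (verify against your installed version). A sketch with illustrative scope and key names:

```shell
#!/bin/sh
# Sketch: set a secret non-interactively, e.g. inside a CI pipeline.
# Assumes a CLI version that supports --string-value on put-secret.
set_secret() {
  scope="$1"; key="$2"; value="$3"
  databricks secrets put-secret "$scope" "$key" --string-value "$value"
}
# Usage: set_secret my-scope my-key "$MY_SECRET_FROM_ENV"
```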

10. Repos (Git Integration)

Command Description
databricks repos list List connected Git repos
databricks repos create --url <git-url> --provider github Connect a new repo
databricks repos delete <repo-id> Disconnect a repo

10.1 List Repos

databricks repos list

10.2 Create Repo

databricks repos create --url https://github.com/repo.git --provider github

10.3 Delete Repo

databricks repos delete <repo-id>

11. Files (DBFS)

Command Description
databricks fs ls dbfs:/ List root DBFS contents
databricks fs cp local.csv dbfs:/tmp/ Copy a file to DBFS
databricks fs ls dbfs:/tmp/ Verify uploaded file
databricks fs rm dbfs:/tmp/local.csv Remove a file from DBFS

11.1 List DBFS

databricks fs ls dbfs:/

11.2 Copy File to DBFS

databricks fs cp local.csv dbfs:/tmp/local.csv

11.3 Verify Upload

databricks fs ls dbfs:/tmp/

11.4 Remove File

databricks fs rm dbfs:/tmp/local.csv

11.5 Verify Deletion

databricks fs ls dbfs:/tmp/
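
Single-file copies scale up to directory trees with the recursive flag — recent CLI versions support a --recursive option on fs cp (worth verifying against your installed version). A sketch with illustrative paths:

```shell
#!/bin/sh
# Sketch: copy a local directory tree into DBFS, then list it to verify.
# Assumes the `databricks` CLI supports `fs cp --recursive`; paths are illustrative.
upload_dir() {
  databricks fs cp --recursive "$1" "$2"
  databricks fs ls "$2"
}
# Usage: upload_dir ./data dbfs:/tmp/data
```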

12. Databricks Asset Bundles (DAB)

Asset Bundles enable Infrastructure-as-Code (IaC) — deploy jobs, pipelines, and resources in a repeatable, version-controlled way.

Command Description
databricks bundle init Scaffold a new bundle project
databricks bundle validate Validate bundle configuration
databricks bundle deploy -t dev Deploy bundle to the dev target
databricks bundle destroy -t dev Tear down a deployed bundle

12.1 Initialize Project

mkdir my_dab_project

cd my_dab_project

databricks bundle init

12.2 Validate Bundle

databricks bundle validate

12.3 Deploy Bundle

databricks bundle deploy -t dev

12.4 Destroy Deployment

databricks bundle destroy -t dev
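
A bundle is driven by a databricks.yml file at the project root. A minimal hand-written sketch — the job name, notebook path, and workspace host below are illustrative placeholders, not values from this guide:

```yaml
# databricks.yml — minimal bundle with one job and a dev target (illustrative values)
bundle:
  name: my_dab_project

resources:
  jobs:
    sample_job:
      name: sample-job
      tasks:
        - task_key: main
          notebook_task:
            notebook_path: ./src/notebook.py

targets:
  dev:
    mode: development
    workspace:
      host: https://adb-0000000000000000.0.azuredatabricks.net
```

Run databricks bundle validate after editing to catch schema errors before deploying.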

13. Debug & Advanced

Enable Debug Logs

databricks --debug clusters list

Check CLI Help

databricks --help

# Command-specific:

databricks clusters --help

14. Profile Management (Multiple Environments)

Use named profiles to switch between Dev, QA, and Prod workspaces without re-authenticating each time.

Command Description
databricks auth login --host <url> --profile PROD Add a named profile
databricks clusters list --profile PROD Run any command against a profile
databricks auth profiles List all saved profiles

Login with New Profile

databricks auth login --host <url> --profile PROD

Use a Specific Profile

databricks clusters list --profile PROD
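
Because --profile is accepted by every command, the same operation can fan out across environments. A sketch assuming profiles named DEV, QA, and PROD have been configured (the names are illustrative):

```shell
#!/bin/sh
# Sketch: run the same command against each configured environment.
# Assumes profiles DEV, QA, and PROD exist (illustrative names).
list_clusters_everywhere() {
  for profile in DEV QA PROD; do
    echo "== $profile =="
    databricks clusters list --profile "$profile"
  done
}
# Usage: list_clusters_everywhere
```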

Real Use Cases

  • Automating cluster lifecycle — start/stop/terminate on a schedule via cron or CI/CD
  • Deploying jobs via CI/CD pipelines (GitHub Actions, Azure DevOps, Jenkins)
  • Managing Unity Catalog — create schemas, manage table permissions at scale
  • Bulk workspace operations — batch import/export of notebooks across projects
  • Infrastructure as Code — version-controlled deployments with Asset Bundles

Security & Best Practices

  • Prefer OAuth authentication — it supports automatic token refresh
  • Use profiles to isolate Dev / QA / Prod environments
  • Never hardcode cluster IDs, workspace URLs, or secrets in scripts
  • Use Asset Bundles for repeatable, peer-reviewed deployments
  • Store credentials in Databricks Secrets — never in plain text or source repos
  • Delete the .databricks cache folder when decommissioning a machine

🚀 Ready to Simplify Your Databricks Operations?

Whether you’re managing clusters, automating jobs, or implementing Infrastructure-as-Code with Databricks Asset Bundles — having the right strategy makes all the difference.

At TGH Software Solutions, we help businesses streamline data workflows, optimize cloud infrastructure, and implement scalable Databricks solutions tailored to your needs.

💡 Let’s help you unlock the full potential of Databricks for your organization.

📞 +91 88106 10395
🌐 www.techygeekhub.com
🔗 https://techygeekhub.com/contact-us/

👉 Get in touch today for expert consultation and end-to-end implementation support.

Author

Yash Roopam
