  1. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
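
    If the goal is to read the value outside a notebook, the SDK call returns the payload base64-encoded. A minimal sketch, assuming the databricks-sdk package is installed and authentication is configured via environment variables; the scope and key names are hypothetical:

    ```python
    import base64

    from databricks.sdk import WorkspaceClient

    # Picks up credentials from the environment (DATABRICKS_HOST / DATABRICKS_TOKEN).
    w = WorkspaceClient()

    # get_secret returns the secret payload as a base64-encoded string.
    resp = w.secrets.get_secret(scope="my-scope", key="my-key")  # hypothetical names
    secret = base64.b64decode(resp.value).decode("utf-8")
    print(secret)  # outside a notebook there is no output redaction
    ```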

  2. Databricks shows REDACTED on a hardcoded value - Stack Overflow

    Mar 16, 2023 · It's not possible; Databricks just scans the entire output for occurrences of secret values and replaces them with "[REDACTED]". It is helpless if you transform the value. For example, like you …
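
    The redaction is a literal string match on notebook output, so any transformation slips past it. A minimal notebook sketch (dbutils is only available inside a Databricks notebook; the scope and key names are hypothetical):

    ```python
    secret = dbutils.secrets.get(scope="my-scope", key="my-key")  # hypothetical names

    print(secret)            # the literal value is matched and shown as [REDACTED]
    print(" ".join(secret))  # spacing out the characters defeats the literal match
    ```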

  3. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · There is a lot of confusion with regard to the use of parameters in SQL, but I see Databricks has started harmonizing heavily (for example, 3 months back, IDENTIFIER() didn't work with catalog, …
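
    One of the harmonized pieces is combining named parameter markers with IDENTIFIER() to parameterize object names. A minimal PySpark sketch, assuming a recent runtime where this combination is supported; the table name is hypothetical:

    ```python
    # spark is the ambient SparkSession in a Databricks notebook.
    df = spark.sql(
        "SELECT * FROM IDENTIFIER(:tbl) WHERE id = :id",
        args={"tbl": "main.default.events", "id": 42},  # hypothetical table
    )
    df.show()
    ```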

  4. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: `%scala dbutils.notebook.getContext.notebookPath`, which prints `res1: ...`
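
    The same context is reachable from Python; a sketch that should work in a notebook cell on recent runtimes (the printed path is illustrative):

    ```python
    # dbutils is ambient in a Databricks notebook.
    path = (
        dbutils.notebook.entry_point.getDbutils()
        .notebook()
        .getContext()
        .notebookPath()
        .get()
    )
    print(path)  # e.g. /Users/someone@example.com/my-notebook
    ```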

  5. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the "Can Manage" permission. This permission basically lets you handle everything related to clusters, like making new …

  6. Installing multiple libraries 'permanently' on Databricks' cluster ...

    Feb 28, 2024 · Asked 1 year, 11 months ago · Modified 1 year, 11 months ago · Viewed 5k times
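
    One common way to make installs survive restarts is to attach the libraries to the cluster itself, which can be scripted. A sketch with the Databricks Python SDK, assuming databricks-sdk is installed; the cluster ID and packages are placeholders:

    ```python
    from databricks.sdk import WorkspaceClient
    from databricks.sdk.service.compute import Library, PythonPyPiLibrary

    w = WorkspaceClient()

    # Cluster-attached libraries are reinstalled automatically on every restart.
    w.libraries.install(
        cluster_id="0123-456789-abcdef12",  # placeholder cluster ID
        libraries=[
            Library(pypi=PythonPyPiLibrary(package="openpyxl")),
            Library(pypi=PythonPyPiLibrary(package="great-expectations")),
        ],
    )
    ```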

  7. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …
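
    If the table should stay in Databricks, the SQL Statement Execution API can serve query results over REST against a SQL warehouse. A hedged sketch with requests; the host, token, warehouse ID, and table are placeholders:

    ```python
    import requests

    host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
    token = "<personal-access-token>"                            # placeholder

    resp = requests.post(
        f"{host}/api/2.0/sql/statements",
        headers={"Authorization": f"Bearer {token}"},
        json={
            "warehouse_id": "abcdef1234567890",                # placeholder
            "statement": "SELECT * FROM gold.sales LIMIT 10",  # placeholder table
            "wait_timeout": "30s",
        },
    )
    resp.raise_for_status()
    print(resp.json()["result"]["data_array"])
    ```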

  8. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and …
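
    Another option is the DBFS REST API, which returns file contents base64-encoded in chunks of at most 1 MB per call. A minimal sketch; the host, token, and path are placeholders:

    ```python
    import base64
    import requests

    host = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
    token = "<personal-access-token>"                            # placeholder

    # Reads at most 1 MB; loop over `offset` for larger files.
    resp = requests.get(
        f"{host}/api/2.0/dbfs/read",
        headers={"Authorization": f"Bearer {token}"},
        params={"path": "/FileStore/my-file.csv", "offset": 0, "length": 1024 * 1024},
    )
    resp.raise_for_status()

    with open("my-file.csv", "wb") as f:
        f.write(base64.b64decode(resp.json()["data"]))
    ```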

  9. databricks - Using Service Principal in Azure Devops Pipeline to run ...

    Jan 22, 2024 · Go to the target Databricks job -> Job details -> Edit permissions -> add "Can Manage Run" for the service principal. In your Azure pipeline YAML, you can get the access token for service …
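
    The token exchange itself is a standard client-credentials request against Entra ID, scoped to the fixed application ID of the Azure Databricks resource. A sketch with placeholder tenant and credentials:

    ```python
    import requests

    tenant_id = "<tenant-id>"      # placeholder
    client_id = "<sp-client-id>"   # placeholder
    client_secret = "<sp-secret>"  # placeholder

    resp = requests.post(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            # Fixed application ID of the Azure Databricks resource.
            "scope": "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d/.default",
        },
    )
    resp.raise_for_status()
    access_token = resp.json()["access_token"]  # use as a Bearer token on the workspace
    ```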

  10. What is the correct way to access a workspace file in databricks

    Nov 16, 2023 · According to the documentation (1, 2), workspace files or assets are available for Databricks Runtime 11.2 and above. With Databricks Runtime 11.2 and above, you can create and …
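
    On those runtimes a workspace file can be read with ordinary file I/O through its /Workspace path; a sketch with a hypothetical path:

    ```python
    # Requires Databricks Runtime 11.2+; the path below is hypothetical.
    with open("/Workspace/Users/someone@example.com/config/settings.json") as f:
        settings = f.read()

    print(settings)
    ```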