What’s new in Databricks for June 2023


Delta Lake (DBR 13.2 required)

  • Delta Lake Universal Format (UniForm) allows you to read Delta tables with Iceberg clients. For more information, see https://docs.databricks.com/delta/uniform.html
  • Delta Lake liquid clustering replaces table partitioning and ZORDER to simplify data layout decisions and optimize query performance. For more information, see https://docs.databricks.com/delta/clustering.html
  • Archival support for Delta Lake lets you apply cloud lifecycle policies to the object storage backing Delta tables so that files can be moved to archival storage tiers. For more information, see https://docs.databricks.com/optimizations/archive-delta.html
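The three features above can be sketched in SQL. This is a minimal sketch, not a full recipe: the table names, columns, and the 180-day interval are placeholders, and the table properties follow the linked documentation.

```sql
-- Liquid clustering: declare clustering keys instead of PARTITIONED BY / ZORDER
CREATE TABLE events (
  event_id   BIGINT,
  event_date DATE,
  payload    STRING
)
CLUSTER BY (event_date);

-- UniForm: let Iceberg clients read the Delta table by enabling the
-- universal-format table property
CREATE TABLE events_uniform (
  event_id BIGINT,
  payload  STRING
)
TBLPROPERTIES ('delta.universalFormat.enabledFormats' = 'iceberg');

-- Archival support: tell Delta how long your cloud lifecycle policy waits
-- before moving files to an archival tier, so queries can avoid touching
-- archived files (the interval here is an arbitrary example)
ALTER TABLE events
  SET TBLPROPERTIES ('delta.timeUntilArchived' = '180 days');
```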


Unity Catalog

  • You can now upgrade MLflow Model Registry workflows to govern models through Unity Catalog. Unity Catalog provides centralized access control, auditing, lineage, model sharing across workspaces, and better MLOps deployment workflows. Databricks recommends using Models in Unity Catalog instead of the existing workspace model registry, which will be deprecated in the future.
  • System tables are a Databricks-hosted analytical store of an account’s operational data. System tables provide easily accessed, account-wide observability data. For more information, see https://docs.databricks.com/administration-guide/system-tables/index.html
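As a sketch, system tables can be queried like any other table. The `system.billing.usage` table and its columns below follow the linked documentation (system tables may need to be enabled by an account admin), and the date filter is an arbitrary example:

```sql
-- Summarize account-wide billable usage per day and SKU from a system table
SELECT usage_date,
       sku_name,
       SUM(usage_quantity) AS total_usage
FROM system.billing.usage
WHERE usage_date >= DATE'2023-06-01'
GROUP BY usage_date, sku_name
ORDER BY usage_date;
```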

Data Management

  • You can now see data quality metrics for pipelines running in continuous mode when you view dataset details in the Delta Live Tables UI. Previously, data quality metrics were displayed only for triggered pipelines. For more information, see https://docs.databricks.com/delta-live-tables/observability.html#dataset-details
  • Pipelines that use Unity Catalog can now write to catalogs that have a custom storage location. When a storage location is specified for the catalog, data is persisted there; otherwise, data is persisted in the metastore root location. For more information, see https://docs.databricks.com/delta-live-tables/unity-catalog.html
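A minimal sketch of a catalog with a custom storage location; the catalog name and bucket path are placeholders. A Unity Catalog pipeline that targets a schema in this catalog would then persist its data under this location rather than the metastore root:

```sql
-- Create a catalog whose managed tables are stored in a custom location
CREATE CATALOG pipelines_catalog
  MANAGED LOCATION 's3://my-bucket/pipelines_catalog';

-- A schema in the catalog for a pipeline to target
CREATE SCHEMA pipelines_catalog.etl;
```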


Databricks SQL

  • SQL tasks in Workflows are now generally available. You can orchestrate Queries, Dashboards, and Alerts from the Workflows page.
  • A new schema browser is now in Public Preview, featuring an updated UX, a For You tab, and improved filters. The schema browser is available in Databricks SQL, Data Explorer, and notebooks. 
  • New SQL built-in functions have been added, including array_prepend(array, elem), try_aes_decrypt(expr, key [, mode [, padding]]), and sql_keywords().
  • You can now use shallow clone to create new Unity Catalog managed tables from existing Unity Catalog managed tables. You can also use CLONE and CONVERT TO DELTA with Iceberg tables that have partitions defined on truncated columns of types int, long, and string. Truncated columns of type decimal are not supported.
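A few short examples of the new functions and of shallow clone, as a sketch: the decryption key, ciphertext, and three-level table names are placeholders.

```sql
-- array_prepend adds an element at the front of an array
SELECT array_prepend(array(1, 2, 3), 0) AS arr;  -- [0, 1, 2, 3]

-- try_aes_decrypt returns NULL instead of raising an error when
-- decryption fails (placeholder key and garbage ciphertext)
SELECT try_aes_decrypt(unbase64('aW52YWxpZA=='), 'abcdefghijklmnop') AS plaintext;

-- sql_keywords() is a table-valued function listing SQL keywords
SELECT * FROM sql_keywords() LIMIT 5;

-- Shallow clone one Unity Catalog managed table from another
CREATE TABLE main.sandbox.orders_clone
  SHALLOW CLONE main.prod.orders;
```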
