Explore tweets tagged as #UnityCatalog
@JoshCaplan1984
Josh Caplan
4 days
The #OneLake release train keeps rolling. #Databricks #UnityCatalog mirroring via shortcuts is now generally available! #MicrosoftFabric
0
11
33
@AMdatalakehouse
Alex Merced | Open Data Lakehouse Advocate
2 months
BLOG: The Case for Apache Polaris as the Community Standard for Lakehouse Catalogs. READ HERE: #ApacheIceberg #DeltaLake #UnityCatalog #ApachePolaris #DataLakehouse #DataEngineering
1
1
7
@unitycatalog_io
Unity Catalog
2 months
A data catalog might look like a thin layer, but as Sida Shen from @CelerData explains, it’s so much more. In this clip, Sida shares how catalogs go beyond simply storing metadata. They’re essential for managing authentication, authorization, and compliance across all your data
0
5
9
@DeltaLakeOSS
Delta Lake
2 days
🚀 From Delta Lake to downstream ML pipelines — without Spark? In this clip, Daniel Beach shares a real production use case where he used Polars inside an Apache Airflow worker to read from a @unitycatalog_io–backed #DeltaLake table, filter job-specific records, and dynamically
0
1
10
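A minimal sketch of that pattern in Polars, assuming the table's storage location and credentials have already been resolved from Unity Catalog (the path, credentials, and job_id column below are placeholders):

import polars as pl

# Placeholder location and credentials; in practice these would be resolved
# from Unity Catalog before reading.
TABLE_PATH = "s3://my-bucket/path/to/delta-table"
STORAGE_OPTIONS = {
    "aws_access_key_id": "PLACEHOLDER",
    "aws_secret_access_key": "PLACEHOLDER",
    "aws_region": "us-east-1",
}

def filter_job_records(job_id: str) -> pl.DataFrame:
    # Lazily scan the Delta table and push the job-specific filter down,
    # so only the needed records are materialized on the Airflow worker.
    return (
        pl.scan_delta(TABLE_PATH, storage_options=STORAGE_OPTIONS)
        .filter(pl.col("job_id") == job_id)
        .collect()
    )

df = filter_job_records("job-42")
print(df.head())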
@ankurkumarz
Ankur Kumar
9 months
Unity Catalog is open source but has limited functionality - wish we could have easily created a table using the DuckDB interface. #opensource #unitycatalog
1
0
1
@starburstdata
Starburst
2 days
🚀 Big news from #DataAISummit: Starburst now supports read + write with Unity Catalog—unlocking governed, cross-engine interoperability. Thanks to Databricks, our speakers & amazing customers! 📖 Recap: #StarburstData #UnityCatalog #AI #DataAccess
0
1
3
@unitycatalog_io
Unity Catalog
29 days
Unity Catalog ➕ OpenAI = 🤝. If you’re building AI applications with @OpenAI, you know the challenge: data, models, and functions scattered everywhere, each with their own access rules. 🤯 #UnityCatalog gives you a single place to manage your data and AI assets, making it easier
0
3
6
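The rough shape of that pattern, sketched by hand: expose a Unity Catalog-governed function to OpenAI as a tool. The function name and JSON schema here are hypothetical and written out manually; the unitycatalog-ai integrations are what automate this wiring in practice.

from openai import OpenAI

client = OpenAI()

# Hypothetical Unity Catalog function, e.g. main.default.lookup_customer.
# Its signature and access rules would normally be read from the catalog
# rather than hand-written like this.
uc_function_tool = {
    "type": "function",
    "function": {
        "name": "main__default__lookup_customer",
        "description": "Look up a customer record via a governed UC function.",
        "parameters": {
            "type": "object",
            "properties": {"customer_id": {"type": "string"}},
            "required": ["customer_id"],
        },
    },
}

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Find customer 1234"}],
    tools=[uc_function_tool],
)
print(response.choices[0].message.tool_calls)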
@kanuazut
konozama
23 hours
The official Polaris and UnityCatalog docs don't seem to mention MCP at all. Databricks' managed UnityCatalog, on the other hand, apparently does offer an MCP server.
1
0
2
@dataenggdude
DataEngDude
24 days
Everyone is talking about "Agentic AI" and how data engineering will be dead very shortly. Meanwhile, here I am trying to use these "amazing" #databricks tools like #unitycatalog and #mlflow to register some #spark ML models.
1
0
2
@daftengine
Daft
2 months
Daft was one of the first to natively integrate with @unitycatalog_io! But how do you use the two together for modern multimodal workloads? Kevin & Sammy dive into how Daft’s catalog API enables you to query Unity tables using both SQL and the Python DataFrame API.
1
1
2
@taka_aki
Taka YAYOI/弥生 隆明
3 months
Pandas UDFs, which let you batch-process large volumes of data at high speed in Spark, can now be managed in Unity Catalog and used as batch Python UDFs. This enables centralized governance and sharing! Databricks Unity Catalog batch Python user-defined functions #UnityCatalog - Qiita
0
1
10
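A rough sketch of what registering such a batch Python UDF could look like from a Databricks notebook (where spark is predefined). The catalog and schema names are placeholders, and the exact DDL and handler contract are assumptions based on the article's topic, so check the official docs for the current syntax.

# Sketch only: the PARAMETER STYLE PANDAS / HANDLER keywords and the
# iterator-of-Series handler contract are assumptions; names are placeholders.
spark.sql("""
CREATE OR REPLACE FUNCTION main.default.plus_one(x INT)
RETURNS INT
LANGUAGE PYTHON
PARAMETER STYLE PANDAS
HANDLER 'handler'
AS $$
import pandas as pd
from typing import Iterator

def handler(batches: Iterator[pd.Series]) -> Iterator[pd.Series]:
    # Each batch arrives as a pandas Series; yield the transformed batch.
    for s in batches:
        yield s + 1
$$
""")

# Once registered, the function is a governed, shareable Unity Catalog object.
spark.sql("SELECT main.default.plus_one(id) AS y FROM range(5)").show()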
@jaceklaskowski
Jacek Laskowski
10 months
What a day! 🥰 Just found this issue in #UnityCatalog OSS repo 🤯. ➡️ (I immediately liked #PuppyGraph 😉)
0
0
3
@daftengine
Daft
17 days
Daft was one of the first to natively integrate with @unitycatalog_io. You can use Daft’s catalog API to query Unity tables using both SQL and the Python DataFrame API, such as filtering with a WHERE clause in a .sql() expression. Try it yourself and get started: pip install daft[unity]
1
2
2
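A minimal sketch of that flow with Daft's Unity Catalog reader; the endpoint, token, table name, and the clicks column are placeholders:

import daft
from daft.unity_catalog import UnityCatalog

# Placeholder endpoint/token for a Unity Catalog server.
unity = UnityCatalog(endpoint="http://127.0.0.1:8080", token="not-used")

# Resolve the table through the catalog, then read it as a Daft DataFrame.
unity_table = unity.load_table("unity.default.marketing")
df = daft.read_deltalake(unity_table)

# Filter with the Python DataFrame API ...
df.where(df["clicks"] > 100).show()

# ... or with the same predicate through Daft's SQL interface.
daft.sql("SELECT * FROM df WHERE clicks > 100").show()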
@unitycatalog_io
Unity Catalog
25 days
Machine learning and AI workflows are full of experimentation—multiple training runs, fine-tuning, and constant iteration. But as models multiply, so does the need for robust governance. 🔒 With tools like @MLflow and Unity Catalog, you can:
🔹 Track and manage every model
0
4
5
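A short sketch of that MLflow-plus-Unity-Catalog flow on Databricks; the three-level model name and the toy sklearn model are placeholders:

import mlflow
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

# Point the MLflow model registry at Unity Catalog instead of the
# legacy workspace registry.
mlflow.set_registry_uri("databricks-uc")

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=200).fit(X, y)

with mlflow.start_run():
    # registered_model_name uses Unity Catalog's three-level namespace;
    # the input_example lets MLflow infer a model signature.
    mlflow.sklearn.log_model(
        model,
        artifact_path="model",
        registered_model_name="main.default.iris_classifier",
        input_example=X[:5],
    )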
@AdiPolak
Adi Polak
1 year
Exciting to learn about the cross-format compatibility approach. It means that with the recently open-sourced UnityCatalog, a plethora of interesting challenges will be solved across companies. It’s a great way to build industry-level best practices. Suppose you are
1
2
10
@databricks
Databricks
5 months
Updating existing code for #UnityCatalog compatibility just got easier! UCX is a toolkit for enabling Unity Catalog in your Databricks workspace. In this quick demo, learn how to:
- Execute essential UCX commands for code migration
- Adapt existing SQL and Python code to Unity
0
5
13
@taroleo
Taro L. Saito
1 year
Surprised that you can already use duckdb to access unitycatalog and delta lake.
0
0
5
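For context, a hedged sketch of that DuckDB access, following the Unity Catalog OSS quickstart as I recall it (extension names, secret fields, the local endpoint, and the sample table are assumptions and may vary by version):

import duckdb

con = duckdb.connect()

# Extensions for Unity Catalog and Delta Lake access.
con.execute("INSTALL uc_catalog FROM core_nightly")
con.execute("INSTALL delta")
con.execute("LOAD uc_catalog")
con.execute("LOAD delta")

# Register a locally running Unity Catalog server and attach its catalog.
con.execute("""
    CREATE SECRET (
        TYPE UC,
        TOKEN 'not-used',
        ENDPOINT 'http://127.0.0.1:8080',
        AWS_REGION 'us-east-2'
    )
""")
con.execute("ATTACH 'unity' AS unity (TYPE UC_CATALOG)")

# Read a Delta table governed by the catalog (quickstart sample table).
con.sql("SELECT * FROM unity.default.numbers").show()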
@databricks
Databricks
9 months
Introducing batch inference on Mosaic AI Model Serving: a simpler, faster and more scalable way to apply LLMs to large documents. With this release, run batch inference directly within your workflows – with full governance through #UnityCatalog.
1
6
31
@unitycatalog_io
Unity Catalog
19 days
Looking to build LLM-powered apps like chatbots, document summarizers, or data explorers without compromising on data governance? 🔐 Our latest blog shows you how to use #UnityCatalog with @llama_index to organize, secure, and efficiently query your data for AI use cases. Unity
0
0
1
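As a rough sketch of the pattern (not the blog's actual code): documents live in a Unity Catalog-governed location, and LlamaIndex handles indexing and querying. The volume path is a placeholder, and an embedding/LLM provider (OpenAI by default) is assumed to be configured.

from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Placeholder: a Unity Catalog volume path this job has READ VOLUME on,
# so access control stays in the catalog rather than in the app.
DOCS_PATH = "/Volumes/main/default/docs"

# Load the governed documents and build a vector index over them.
documents = SimpleDirectoryReader(DOCS_PATH).load_data()
index = VectorStoreIndex.from_documents(documents)

# Query the index, e.g. as the backend of a chatbot or summarizer.
query_engine = index.as_query_engine()
print(query_engine.query("Summarize the onboarding guide."))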