Advancements in Lakehouse AI and MLflow 2.5 for Generative AI
New data-centric approach to AI makes it easier to build, deploy and manage LLM applications, enabling customers to accelerate their generative AI journey.
Databricks has introduced new Lakehouse AI innovations designed to simplify the development of generative AI applications, including large language models (LLMs), directly within the Databricks Lakehouse Platform. These innovations provide a data-centric approach to AI, offering integrated capabilities for the entire AI lifecycle, monitoring, and governance.
The newly unveiled capabilities include Vector Search, a collection of curated open-source models, LLM-optimized Model Serving, MLflow 2.5 with LLM capabilities such as AI Gateway and Prompt Tools, and Lakehouse Monitoring.
Demand for generative AI is driving change across industries, prompting technical teams to build generative AI models and LLMs on their own data to differentiate their offerings. However, AI success depends on data quality, and separating the data platform from the AI platform makes it harder to keep data clean and consistent. Moving models from experimentation to production and operating them reliably adds further complexity.
Databricks addresses these challenges through Lakehouse AI, which unifies the data and AI platforms. This enables customers to develop generative AI solutions more efficiently, whether by using foundation models or by securely training custom models on their enterprise data. The integration brings together data, LLM operations (LLMOps), AI models, monitoring, and governance on the Databricks Lakehouse Platform, accelerating the generative AI journey for organizations.
Vector Search improves the accuracy of generative AI responses through embeddings search, while fine-tuning in AutoML offers a low-code way to adapt LLMs with a customer’s own data while retaining ownership of the resulting model. Curated open-source models, backed by optimized Model Serving, make it easy to get started with generative AI.
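To make the embeddings-search idea concrete, the sketch below shows the retrieval pattern a vector index automates: embed documents, find the nearest neighbors to a query embedding, and use the retrieved text to ground an LLM prompt. The documents, embeddings, and prompt wording are illustrative placeholders, not the Databricks Vector Search API.

```python
# Illustrative sketch of embedding-based retrieval; the embeddings here are
# random placeholders standing in for vectors from a real embedding model.
import numpy as np

documents = [
    "Reset a forgotten password from the account settings page.",
    "Invoices are generated on the first business day of each month.",
    "GPU-backed model serving endpoints support autoscaling.",
]

rng = np.random.default_rng(0)
doc_embeddings = rng.normal(size=(len(documents), 384))   # placeholder document vectors
query_embedding = doc_embeddings[2] + rng.normal(scale=0.1, size=384)  # query "near" doc 2

def top_k_by_cosine(query: np.ndarray, matrix: np.ndarray, k: int = 2) -> list[int]:
    """Return indices of the k rows of `matrix` most similar to `query`."""
    sims = matrix @ query / (np.linalg.norm(matrix, axis=1) * np.linalg.norm(query))
    return list(np.argsort(-sims)[:k])

hits = top_k_by_cosine(query_embedding, doc_embeddings)
context = "\n".join(documents[i] for i in hits)

# The retrieved passages are prepended to the user's question before it is
# sent to the LLM, grounding the response in the customer's own data.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How do serving endpoints scale?"
print(prompt)
```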
Databricks has also introduced innovations in LLMOps with MLflow 2.5. The updates include:
- MLflow AI Gateway: This feature lets organizations centrally manage credentials for SaaS models or model APIs and expose them as access-controlled routes for querying. These routes can then be shared with different teams to integrate into their projects or workflows. Developers can swap the backing model as needed to balance cost and quality, and can switch among LLM providers without changing application code (see the sketch after this list). MLflow AI Gateway also supports prediction caching to track repeated prompts and rate limiting to manage costs.
- MLflow Prompt Tools: New no-code visual tools let users compare outputs from various models against a set of prompts, with those prompts automatically logged in MLflow. Through integration with Databricks Model Serving, users can then deploy the chosen model to production.
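As a rough sketch of the gateway workflow, the snippet below queries a route from application code. It assumes a gateway server is already running with a route named "completions" defined in its YAML configuration and pointed at an LLM provider; the module path and payload keys reflect the MLflow 2.5-era API and may differ in other versions.

```python
# Minimal sketch of querying an MLflow AI Gateway route from application code.
# Assumes a gateway server is running locally with a "completions" route
# configured; exact module paths and payload keys may vary by MLflow version.
from mlflow.gateway import set_gateway_uri, query

set_gateway_uri("http://localhost:5000")  # where the gateway server is listening

# Application code only knows the route name; credentials for the backing
# provider live in the centrally managed gateway config, not here.
response = query(
    route="completions",
    data={"prompt": "Summarize last quarter's support tickets in two sentences."},
)
print(response)
```

Because the provider and its credentials are bound to the route on the gateway side, switching the backend model becomes a configuration change rather than a code change in every consuming application.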
Moreover, Databricks Model Serving has been optimized for LLM inference, delivering lower latency and support for GPU-based inference. Databricks Lakehouse Monitoring extends monitoring across data and AI assets, offering end-to-end visibility into data pipelines.
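A hedged example of how an application might call a served model over REST is shown below; the workspace URL, endpoint name, and request payload are placeholders, since the exact schema depends on the served model's signature.

```python
# Sketch of calling a Databricks Model Serving endpoint over REST.
# Workspace URL, endpoint name, and payload shape are placeholders.
import os
import requests

WORKSPACE_URL = "https://<your-workspace>.cloud.databricks.com"  # placeholder workspace
ENDPOINT_NAME = "llm-chat-endpoint"                               # hypothetical endpoint name

resp = requests.post(
    f"{WORKSPACE_URL}/serving-endpoints/{ENDPOINT_NAME}/invocations",
    headers={"Authorization": f"Bearer {os.environ['DATABRICKS_TOKEN']}"},
    json={"inputs": ["What were the top three support issues last month?"]},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```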
“We’ve reached an inflection point for organizations: leveraging AI is no longer aspirational — it is imperative for organizations to remain competitive,” said Ali Ghodsi, Co-Founder and CEO at Databricks. “Databricks has been on a mission to democratize data and AI for more than a decade and we’re continuing to innovate as we make the lakehouse the best place for building, owning and securing generative AI models.”
The MLflow 2.5 features will be available in the July release, while new Databricks capabilities including Vector Search and Lakehouse Monitoring are currently in preview.
About Utility Analytics Institute (UAI)
UAI Enables Utility Transformation Through Analytics
UAI is a utility-led membership organization that supports the industry by advancing the analytics profession, serving utility organizations of all types, sizes, and analytics maturity levels, as well as analytics professionals at every stage of their careers.
Transforming into a data-driven, decision-based company is one of the most difficult transitions a utility must make to thrive in the new energy economy. It involves more than managing massive amounts of data, implementing the right tools and technology, and managing people and processes. It also requires proper change management to address cultural challenges, data management and governance plans, and compliant, best-practice security strategies. It means implementing the best organizational structure for your utility and hiring and retaining talented staff, plus so much more. UAI brings together leading utilities that are serious about tackling these challenges, and together we concentrate on utility analytics.
What’s UAI Membership all about? UAI serves multiple audiences, providing a different membership package for each audience type. Learn more about how UAI unifies our community, helps each audience meet its goals and address its challenges, and how these audiences collaborate to better serve the utility industry.
Contact Kevin Praet, Membership Relations, at kpraet@utilityanalytics.com to learn about the benefits of becoming a member of Utility Analytics Institute (UAI).