New Tools, New Mindsets: What the Databricks Data + AI Summit Taught Us

This June, we sent two of our senior data leaders, Jan Barták, Data Engineering Lead, and Martin Nytra, Data Services Team Lead, to the Databricks Data + AI Summit in San Francisco. As a Databricks partner, we’re committed to staying close to the platform’s evolution — and investing in our people to do so. What follows is their take on the biggest signals from the Summit, what impressed them most, and how we’re already putting those ideas into action.

Why We Flew Across the World for Databricks

We travelled to San Francisco with a clear goal: to understand where Databricks is heading and what that means for the way we build data and AI solutions. As a Databricks partner in Europe, we believe in staying close to the source. When a platform evolves as fast as this one, you don’t watch from a distance. You go see it for yourself.

This wasn’t a box-ticking trip. We came to validate the direction we’re already moving in, to challenge what we thought we knew, and to learn from the teams building the next generation of data infrastructure. Every session, demo, and conversation helped us sharpen our view on what’s real, what’s ready, and what’s worth investing in. We sat in on more than 40 expert sessions, so the dive was deep. And we came back with clarity — plus a few ideas we’re already testing.

From Platform to Decision Layer

What stood out this year wasn’t just the pace of innovation, but the direction of travel. Databricks is no longer just a platform for data engineers. It’s becoming the place where business decisions are made.

The summit signalled a clear ambition: bring AI, BI, and analytics together into a unified environment that’s accessible, governed, and ready for production. This means fewer silos, faster feedback loops, and a radically different relationship between data teams and the rest of the business.

In practical terms, it’s a move from building tools to enabling outcomes. The focus has shifted toward helping teams make better decisions. For us, that’s the key takeaway. Databricks is evolving into the decision layer for modern organisations, and that opens up a whole new way of working.

What Impressed Us Most: Three Signals That Matter

Across four days, we saw more ideas, prototypes, and feature launches than we could count. But a few moments stood out, because they felt ready to use. These are the three that stuck with us.

1. Databricks One: Data Access for Everyone

One of the clearest signals of change was the introduction of Databricks One, a new environment designed not for engineers, but for business users. That alone is a mindset shift.

With tools like AI/BI Dashboards and Genie, users can explore governed data in real time, ask questions in natural language, and get meaningful answers without waiting on reports or writing a single line of SQL. For teams still juggling ad hoc Excel sheets and disconnected dashboards, this is a major step forward.

We see real potential here. If adoption is guided well, it can reduce shadow IT, lighten the load on analytics teams, and build a stronger data culture across entire organisations.

What We’ll Be Watching

Data quality and semantic layer design become even more critical. If the foundation is solid, this approach works. If not, it only creates friction. But the intent and usability are exactly where they need to be.

What Genie Can Actually Do
  • Ask: “How did Q2 sales perform in the Czech market?” and get a governed, real-time response with visual context.

  • Use metrics defined in Unity Catalog without duplication or guessing.

  • Work across all major clouds.
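The "metrics without duplication" point is easiest to see with a toy semantic layer: one governed definition per metric, resolved by every consumer. This is a minimal sketch in plain Python to illustrate the idea; the names and structure are ours, not the Unity Catalog or Genie API.

```python
# A toy "semantic layer": one governed definition per metric, shared by
# every consumer. Names here are illustrative, not the Unity Catalog API.

METRICS = {
    "q2_sales_cz": {
        "description": "Q2 sales, Czech market",
        "sql": "SELECT SUM(amount) FROM sales "
               "WHERE market = 'CZ' AND quarter = 'Q2'",
    },
}

def resolve_metric(name: str) -> dict:
    """Dashboards and natural-language tools resolve the same entry,
    so there is exactly one definition to govern and audit."""
    if name not in METRICS:
        raise KeyError(f"Metric '{name}' is not governed; refusing to guess.")
    return METRICS[name]

# A dashboard and a chat interface asking for the same metric get
# byte-identical SQL -- no duplicated, slowly drifting definitions.
dashboard_sql = resolve_metric("q2_sales_cz")["sql"]
chat_sql = resolve_metric("q2_sales_cz")["sql"]
assert dashboard_sql == chat_sql
```

The design point is the refusal path: an ungoverned metric raises an error instead of letting the tool guess, which is exactly why semantic layer quality decides whether natural-language access helps or hurts.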

2. Agent Bricks: GenAI That Plays by the Rules

We’ve seen a lot of AI demos this year. Most fall into two categories: clever but shallow, or powerful but unpredictable. Agent Bricks stood out because it’s designed for the real world, where quality, trust, and performance actually matter.

This low-code framework lets teams build domain-specific agents with a clear purpose, embedded evaluation datasets, and automated benchmarks. It even includes “judges”, models that review LLM output and reduce hallucinations. That’s a major leap from traditional chatbot builders.

The result is more than a tech experiment. It’s a structured way to bring AI into production — with the controls, auditability, and iteration cycles that enterprises actually need.
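The "judge" idea is simple to illustrate. Below is our own simplified mock of the pattern, not the Agent Bricks API: a second model scores each draft answer, and drafts below a quality threshold are rejected rather than returned.

```python
# Minimal mock of the LLM-judge pattern: an "agent" drafts answers and a
# "judge" scores them; low-scoring drafts are rejected instead of shipped.
# Both models are stubbed -- this shows the control flow, not real LLMs.

def agent_answer(question: str) -> str:
    # Stub for the domain agent's draft answer.
    return f"Draft answer to: {question}"

def judge_score(question: str, answer: str) -> float:
    # Stub for a judge model: a 0.0-1.0 quality/groundedness score.
    # Here we only check the draft actually references the question.
    return 1.0 if question in answer else 0.0

def guarded_answer(question: str, threshold: float = 0.8):
    draft = agent_answer(question)
    score = judge_score(question, draft)
    # Reject rather than hallucinate: a None is auditable; a confident
    # wrong answer is not.
    return draft if score >= threshold else None

assert guarded_answer("How did Q2 sales perform?") is not None
```

In a production setup the judge would be a model evaluated against an embedded benchmark dataset, but the control flow stays the same: nothing reaches the user without passing review.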

Where We See the Value

This could finally break the GenAI pilot loop. We’ll be testing it internally to see how quickly we can build something useful and trustworthy without starting from scratch.

How Agent Bricks Compares

Feature                    | Traditional Chatbot | Agent Bricks
Output review              | Manual (if at all)  | Built-in LLM judges
Evaluation datasets        | Custom, external    | Auto-generated
Governance integration     | None                | Unity Catalog native
Task-specific optimisation | Minimal             | Continuous learning
3. Lakeflow & Lakebridge: Modernisation Without the Pain

Data platform migrations are rarely fun. They’re long, complex, and risky. But Databricks is clearly working to change that, and Lakeflow plus Lakebridge feel like real progress.

Lakeflow offers a unified, cloud-native approach to building and orchestrating ETL pipelines. Whether you're working in code or dragging and dropping with the Designer, the experience is smoother than anything we’ve seen from Databricks before. Add Lakebridge (which promises to automate up to 80% of legacy DWH migrations), and the path to modernisation starts looking far more realistic.
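The appeal of the declarative style is that you describe what each table needs, and the engine works out the run order. A minimal sketch in plain Python of that idea follows; the decorator and registry are our illustration, not Lakeflow's actual API.

```python
# Toy declarative pipeline: each table declares what it reads from, and
# a tiny "engine" derives the execution order -- the core idea behind
# declarative ETL. The decorator and registry are ours, not Lakeflow's.

PIPELINE = {}  # table name -> (dependencies, build function)

def table(name, depends_on=()):
    def register(fn):
        PIPELINE[name] = (tuple(depends_on), fn)
        return fn
    return register

@table("raw_orders")
def raw_orders():
    # Stand-in for a source read (files, a queue, a legacy DWH, ...).
    return [{"id": 1, "amount": 120}, {"id": 2, "amount": 80}]

@table("daily_revenue", depends_on=["raw_orders"])
def daily_revenue(raw_orders):
    return sum(row["amount"] for row in raw_orders)

def run_pipeline():
    built = {}
    def build(name):
        if name in built:
            return built[name]
        deps, fn = PIPELINE[name]
        built[name] = fn(*(build(d) for d in deps))  # build deps first
        return built[name]
    for name in PIPELINE:
        build(name)
    return built

results = run_pipeline()
assert results["daily_revenue"] == 200
```

Because dependencies are declared rather than scheduled by hand, adding a table never means re-wiring an orchestration graph, which is a large part of why this style feels smoother on cross-functional projects.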

We’ve already started exploring both tools internally. For cross-functional projects, Lakeflow’s simplicity is a win. And we’re especially interested in how Lakebridge can help clients move away from outdated stacks without years of heavy lifting.

What Surprised Us

The maturity of the design. These aren’t alpha-stage features — they’re clearly built from real feedback, and they solve actual friction points we've encountered in real projects.

Why This Matters for Clients
  • Migration speed-up: Up to 2× faster project delivery.

  • Lower overhead: Serverless scaling and no need for five overlapping tools.

  • Better UX: Even non-technical users can understand the flow.

Beyond the Tech: What We Took Away

What inspired us most at the Summit was the momentum. You could feel it in every session, hallway conversation, and live demo: a shared commitment to making data and AI more accessible, more reliable, and more valuable across organisations.

We’re already shaping ideas for a hands-on demo that puts Genie to work in a real-world scenario. The use case is designed to show how natural language access to governed data can empower teams beyond IT, and we’re excited to bring it to life as a way to explore what’s truly possible.

That’s the kind of environment we want to be part of and contribute to. As consultants, our job is to help teams build what works. To connect strategy with tools, and people with outcomes. Events like this remind us why that matters.

We left San Francisco with new ideas, deeper confidence in the platform, and a clear sense of where we want to go next, both inside FLO and with the partners we work alongside.

There’s a lot to explore. And we’re ready for it.

This article was written by Martin Nytra, Data Services Team Lead, and Jan Barták, Data Engineering Lead at FLO.

Let’s Build What’s Next

If you’re exploring Databricks or scaling your AI and data capabilities, we’re here to help turn possibilities into progress. We work hands-on with the platform — and with the people who need to get real value from it.


