SAP Business Data Cloud explained: how Snowflake and Databricks fit into SAP’s multi-platform data strategy

Insights from SAP TechEd Berlin 2025: why data became SAP’s strategic core

Reflecting on SAP TechEd Berlin 2025, one theme stood out above all others: data has become the foundation of SAP’s future strategy. 

While AI demonstrations and development tooling attracted attention, the most meaningful announcements focused on how enterprise data is accessed, shared, and activated across platforms. 

At the center of this shift is SAP Business Data Cloud (BDC), a strategic layer designed for organizations operating in increasingly complex, multi-platform data landscapes.

SAP Business Data Cloud: embracing reality, not fighting it

Enterprise data is no longer confined to a single platform. SAP systems, cloud data warehouses, and analytics tools already coexist in most organizations. What SAP Business Data Cloud does differently is acknowledge this reality instead of trying to replace it. 

BDC is designed as an open data access layer, connecting SAP data natively with platforms such as:

  • Snowflake
  • Databricks
  • Google Cloud
  • Microsoft Fabric

The emphasis on zero-copy data sharing is particularly important: data can be consumed and analyzed without constant replication. This reduces cost, latency, and architectural complexity. 

From Etlia’s perspective, this is a welcome and pragmatic shift. Many organizations already run hybrid data architectures, and BDC provides a way to integrate SAP data into those environments without forcing disruptive migrations or one-size-fits-all solutions. 
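The zero-copy idea is easiest to see in miniature: a share grants consumers a governed view over the provider’s data rather than a duplicate dataset that must be kept in sync. A minimal, platform-independent Python sketch of the concept (the `Share` class and table structure are illustrative, not BDC’s actual API):

```python
# Zero-copy sharing sketch: the consumer reads the provider's table through a
# reference, so there is no second copy of the data to replicate or reconcile.
provider_table = [{"order": 1, "amount": 100}]

class Share:
    def __init__(self, table):
        self._table = table          # reference to the provider's data, not a copy
    def read(self):
        return list(self._table)     # read-only snapshot handed to the consumer

share = Share(provider_table)
provider_table.append({"order": 2, "amount": 250})  # provider adds new data

# The consumer sees the new row immediately, without any replication job.
print(len(share.read()))
```

Replicated pipelines, by contrast, would need a scheduled copy job before the second row became visible downstream, which is exactly the cost and latency zero-copy sharing avoids.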

How SAP Business Data Cloud works with Snowflake, Databricks, and Microsoft Fabric

SAP’s message at TechEd was clear: there is no single “correct” data platform. Instead, SAP is enabling a coordinated, multi-platform data strategy where each platform plays a distinct role.

Platform roles in SAP’s multi-platform data architecture

  • SAP Business Data Cloud (BDC)
    • Open data access layer and governance backbone, connecting SAP data with external platforms securely and consistently
  • Snowflake
    • Scalable analytics and data sharing platform, positioned as an SAP-certified solution extension
  • Databricks
    • Unified platform for advanced analytics, machine learning, and data science workloads
  • Microsoft Fabric
    • Tight integration with Microsoft 365, Power BI, and the Power Platform, especially in Microsoft-centric ecosystems

This model reflects how enterprises already work in practice: SAP remains the system of record, while analytics, AI, and innovation happen across multiple specialized platforms.

Snowflake as a Strategic Extension, Not Just an Integration 

One of the most notable announcements at TechEd was SAP’s positioning of Snowflake as a Solution Extension to SAP Business Data Cloud. 

This is more than a technical integration or partnership label. As a solution extension, Snowflake becomes: 

  • Fully certified, packaged, and sold by SAP 
  • Supported by SAP throughout the full lifecycle 
  • Aligned with SAP’s roadmap and enterprise support model 

Crucially, the Snowflake solution extension includes the full Snowflake feature set while complementing it with SAP-specific strengths, particularly in planning. 

From Etlia’s point of view, this confirms something we already see in customer projects: Snowflake, Databricks, and Fabric are not competing replacements for SAP, but complementary platforms. The strategic shift from SAP is clear: value is created by choosing the right tool for the right workload and ensuring that data flows cleanly, securely, and governably between platforms.

One Data Strategy, Multiple Platforms

A key takeaway from SAP TechEd Berlin 2025 is that SAP is no longer pushing a single “correct” data platform. Instead, it is enabling a multi-platform data strategy, where: 

  • SAP remains the system of record for core business processes 
  • Snowflake excels in scalable analytics and data sharing 
  • Databricks supports advanced analytics and data science 
  • Microsoft Fabric fits naturally into Microsoft-centric ecosystems 

For customers, the challenge is no longer choosing one platform over another. The real challenge is designing an architecture where these platforms work together, with clear ownership, strong governance, and measurable business value.

Data as the Foundation for Enterprise AI 

AI discussions at TechEd consistently came back to one point: AI only works if the data foundation is solid. 

SAP’s introduction of SAP-RPT-1, an AI foundation model for structured business data, reflects this mindset. Rather than focusing on language alone, SAP is investing in models that understand tables, relationships, and enterprise semantics, the kind of data that actually runs businesses. 

With Model Context Protocol (MCP), SAP is also addressing a growing need: enabling Large Language Models to access business data in a governed, contextual way. This opens the door to AI use cases that go beyond chatbots and into real decision support, analytics, and automation. 

From Etlia’s perspective, this reinforces a critical message to customers: AI success is not about tools, but about data readiness. Clean models, shared semantics, and well-designed data architectures matter more than the choice of any single AI platform. 

What This Means in Practice

SAP TechEd Berlin 2025 showed a SAP that is more open, more realistic, and more aligned with how enterprises actually operate today.

SAP Business Data Cloud, deeper partnerships with Snowflake, Databricks, and Microsoft Fabric, and native support for multi-platform architectures all point in the same direction. 

For organizations, the question is no longer whether SAP data will live alongside Snowflake, Databricks, or Fabric, but how well those environments are integrated. 

And that is where the work for Etlia begins.

– Juuso Maijala, CEO


Etlia is a data consultancy specializing in SAP, Snowflake, Databricks, and Microsoft Fabric architectures for enterprise environments.

Etlia works with organizations designing SAP-centric data architectures where SAP Business Data Cloud acts as the governance and access layer, while analytics, AI, and business intelligence are delivered across multiple best-of-breed platforms.

A Data Journey to Stockholm – Databricks Data + AI World Tour

Last week, the three of us (Raaju, Jaakko, and Petri) traveled to Stockholm to attend the Databricks Data + AI World Tour Stockholm. The trip was combined with a Databricks Finland user group meetup, which offered a chance to network with fellow bricksters from the Finnish Databricks community in a relaxed atmosphere and to dive deep into current topics around data and artificial intelligence at the main event.

Afternoon Train to Turku and a Databricks Meetup at Sea

Our journey began in the afternoon with a train ride from Helsinki/Tampere to Turku, giving us a chance to step away from daily routines and get into the right mindset for the event. From Turku, we continued by ferry to Stockholm, and the conference atmosphere started right away.

Onboard the ferry, a Databricks Meetup was organized, providing an excellent opportunity to meet other Databricks customers, partners, and Databricks representatives in a relaxed setting. Conversations ranged from hands-on experiences to future data and AI solutions. The meetup was a great way to kick off the trip and set the tone for the main event the following day.

Databricks Data + AI World Tour Stockholm

The Databricks Data + AI World Tour in Stockholm brought together a wide audience of Databricks customers and partners. The event is an excellent fit for anyone who wants to understand the broader platform vision, Databricks’ direction, and how other organizations are leveraging data and AI in practice.

Key themes from the Databricks Data + AI World Tour

The agenda highlighted several key themes: 

  • Data intelligence and the Lakehouse architecture as the foundation of modern analytics
  • Generative AI and agent-based solutions, bringing AI closer to real business processes
  • Governance, security, and scalability in data and AI solutions
  • Customer and partner sessions showcasing real-world use cases across industries

The content provided a strong, high-level view of Databricks’ capabilities and current focus areas. By coupling customer stories with Databricks’ own execution, the keynote offered fresh insight into how everyday data problems are solved with Databricks. For highly experienced Databricks experts, the event could benefit from a dedicated, more technical track that dives deeper into new features, architectural patterns, and advanced implementations.

This was well complemented by Databricks’ own exhibition booth, where deep technical expertise was readily available. Discussions at the booth often went beyond the level of regular sessions, allowing for detailed conversations around specific features, best practices, and real-world challenges.

Conversations and Insights

One of the biggest takeaways from the trip was the quality of conversations. Discussions with customers and partners offered valuable perspectives on how Databricks is being used in different organizations: from data integration and analytics to production-grade AI solutions.

Returning Directly from Stockholm to Helsinki

After the conference day, we returned by direct flight from Stockholm to Helsinki. The short journey provided a good moment to reflect on the key insights and lessons learned before returning to everyday work.

Key Takeaways from the Databricks Data + AI World Tour

We returned with: 

  • new perspectives on Databricks’ technological evolution
  • a clearer understanding of the platform roadmap and use cases
  • strengthened relationships with customers and partners
  • inspiration for upcoming data and AI initiatives

Finally, a big thank you to the Databricks Finland team for a well-organized event, great company, and an excellent overall experience from start to finish.

– Raaju, Jaakko & Petri

Snowflake World Tour 2025 – What’s New in the World of Data, AI and Applications 

The Snowflake World Tour 2025 took place this year on October 14th in Stockholm, once again at the 3 Arena. The event gathered over 1,500 participants and featured 33 sessions, 57 speakers, and 11 business areas. With a wide variety of sessions covering data, artificial intelligence and application development, every attendee was able to build their own agenda for the day — including me.

The day was complemented by numerous partner stands, offering opportunities to explore new technological solutions and exchange ideas with familiar partners. It also provided an excellent chance to network and share thoughts with other participants.

Snowflake World Tour 2025 Keynote Sessions 

The keynote sessions provided a comprehensive overview of how Snowflake sees the interplay of data, AI and applications transforming business. The presentations emphasized that the role of data in business is no longer limited to reporting — it’s becoming a true driver of growth. The synergy between AI, applications and analytics determines who leads the way.

The keynotes also showcased how Snowflake supports the entire data lifecycle — from collection to analytics and applications — in a seamless, scalable, and secure way.

Several global enterprises shared how they leverage Snowflake in their operations:

  • Siemens highlighted ease of use and the simplification of data utilization across hundreds of teams to accelerate innovation globally. 
  • Fiserv / PayPal emphasized enabling secure and real-time data collaboration across the global payments ecosystem. 
  • AstraZeneca discussed secure and controlled data sharing for global research. 

A few notable announcements were also made in Stockholm, such as:

  • Snowflake will be available in AWS European Sovereign Cloud
  • Snowflake is now live in Microsoft Azure’s Sweden Central Region
  • Snowpark Connect for Apache Spark allows Spark code execution directly in the Snowflake warehouse. 

Diverse Sessions on Data, AI and Application Development 

Throughout the day, there were multiple presentations and talks across various domains, including:

  • Technical hands-on sessions
  • Customer and partner solution showcases
  • Deep dives into Snowflake’s capabilities
  • Panel discussions on data-related topics

Although AI and its possibilities were on everyone’s lips, it was great to see several sessions focusing on the practical requirements for leveraging AI effectively.

Below are some highlights from the sessions I personally attended:

Snowflake OpenFlow and AI SQL: Practical Possibilities for 2025

Snowflake OpenFlow – Data Integration Made Simple 

This session showcased OpenFlow’s capabilities for data integrations. The vision behind OpenFlow is to enable loading data from all sources to all destinations.

Key takeaways from Snowflake OpenFlow:

  • Multiple deployment options, including the ability to run loads in your own cloud. 
  • Extensive pre-built connectors, with the ability to define custom ones. 
  • Easy to manage and secure, featuring advanced authentication and access control. 
  • Enables Snowpipe Streaming for near real-time data streaming. 
  • Real-time pipeline monitoring and alerts. 
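OpenFlow’s “all sources to all destinations” vision is essentially a connector pattern: sources and destinations register handlers, and a pipeline wires any pair together. A toy Python sketch of that pattern (the registry, connector names, and row format are invented for illustration, not OpenFlow’s API):

```python
# Minimal connector-registry sketch: any registered source can feed any
# registered destination through one generic pipeline function.
sources = {
    "csv": lambda: [{"id": 1}, {"id": 2}],   # extract step: yields rows
}
destinations = {
    "memory": [],                            # load target: an in-memory sink
}

def run_pipeline(source: str, destination: str) -> int:
    rows = sources[source]()                 # extract from the chosen source
    destinations[destination].extend(rows)   # load into the chosen destination
    return len(rows)                         # rows moved, useful for monitoring

print(run_pipeline("csv", "memory"))
```

Adding a new source or destination means registering one more handler, which is why a pre-built connector catalog plus custom connector support covers so many integration scenarios.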

AI Features for Data Utilization 

This session introduced Snowflake’s AI capabilities that help organizations get more out of their data. It also highlighted partner-provided services that can easily be integrated.

Key takeaways from Snowflake AI features:

  • Cortex Search enables fast, AI-powered text search using a hybrid keyword + vector search model, including automatic indexing and natural language queries for RAG applications. 
  • AI SQL allows natural language queries, automatic code completions, and ML model usage directly within SQL. 
  • Semantic Views enable modeling of business concepts — metrics, dimensions, and facts — directly within the database’s logical layer. 
  • Cortex Analyst lets users query structured data in natural language without writing SQL. 
  • Snowflake Intelligence combines these into an agent-driven AI system for interacting with structured and unstructured data, delivering instant answers, visualizations, and actionable insights. 
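The hybrid keyword + vector model behind Cortex Search blends lexical matching with semantic similarity. A rough, platform-independent Python sketch of hybrid scoring (toy two-dimensional embeddings and an invented blending weight, not Snowflake’s actual implementation):

```python
import math

def keyword_score(query: str, doc: str) -> float:
    # Lexical relevance: fraction of query terms present in the document.
    q, d = set(query.lower().split()), set(doc.lower().split())
    return len(q & d) / len(q) if q else 0.0

def cosine(u, v) -> float:
    # Semantic relevance: cosine similarity between embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.5) -> float:
    # Blend both signals; alpha weights the keyword component.
    return alpha * keyword_score(query, doc) + (1 - alpha) * cosine(q_vec, d_vec)

docs = [
    ("invoice payment terms", [0.9, 0.1]),     # (text, toy embedding)
    ("employee onboarding guide", [0.1, 0.9]),
]
query, q_vec = "payment terms", [0.8, 0.2]
ranked = sorted(docs, key=lambda d: hybrid_score(query, d[0], q_vec, d[1]),
                reverse=True)
print(ranked[0][0])  # the invoice document ranks first
```

The blend is what makes hybrid search robust: exact terms still win when they match, while embeddings catch paraphrased queries that share no keywords with the document.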

Data Management and Optimization at Snowflake World Tour 2025

Data Security and the Use of Sensitive Information in Snowflake

This session explored Snowflake’s capabilities for building sustainable solutions from a data sensitivity and security perspective — ensuring correct handling of data while enabling business value.

Key takeaways:

  • Simple methods like omitting or aggregating data to necessary levels. 
  • Restricting access via Private Listings in the Snowflake Marketplace. 
  • Sharing only selected data through Secure Views or UDFs. 
  • Protecting personal data with Dynamic Masking, or using Projection Policies to combine data without exposing it. 
  • Multi-party analysis using Data Clean Room, enabling joint data analysis without exposing raw data. 
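Dynamic Masking applies a policy at query time based on the querying role, so the same column returns different results to different users. A minimal Python sketch of the idea (the role names and masking rule are hypothetical, and this is plain Python rather than Snowflake masking-policy syntax):

```python
def mask_email(value: str, role: str) -> str:
    # Privileged roles see the raw value; everyone else gets a masked form.
    if role in {"PII_ADMIN", "COMPLIANCE"}:
        return value
    local, _, domain = value.partition("@")
    return local[0] + "***@" + domain

print(mask_email("alice@example.com", "ANALYST"))    # masked for analysts
print(mask_email("alice@example.com", "PII_ADMIN"))  # raw for privileged roles
```

Because the policy is attached to the data rather than to each report or view, every consumer, from dashboards to ad-hoc queries, gets consistent protection without duplicating the logic.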

Optimizing Snowflake Usage and Costs

This session covered ways to monitor and optimize Snowflake usage and costs, divided into three key areas: visibility, control, and optimization.

Snowflake provides built-in solutions for all these areas:

Visibility – Understanding Costs and Performance

  • Account- and organization-level cost monitoring. 
  • Detecting anomalies that increase costs unexpectedly. 
  • Real-time and historical query performance tracking. 
  • Grouping related queries (recurring, scheduled, or multi-step) to identify bottlenecks. 

Control – Manage and Limit Consumption 

  • Setting budget limits at account, resource, and tag levels. 
  • Defining resource sizes and scaling through warehouse configuration. 
  • Using auto-suspend to automatically shut down idle warehouses. 
  • Monitoring resources through resource monitors. 
  • Adaptive warehouses that self-adjust based on task requirements. 
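Of these controls, auto-suspend is the simplest lever: billing stops once a warehouse sits idle past a configured timeout, and the next query resumes it. The mechanism can be sketched in plain Python (the class, timeout value, and method names are illustrative, not the Snowflake API):

```python
class Warehouse:
    """Toy model of a compute warehouse with an auto-suspend timer."""

    def __init__(self, auto_suspend_secs: float = 60.0):
        self.auto_suspend_secs = auto_suspend_secs
        self.running = False
        self.last_activity = 0.0

    def run_query(self, now: float) -> None:
        # Any query resumes the warehouse and resets the idle clock.
        self.running = True
        self.last_activity = now

    def tick(self, now: float) -> None:
        # Suspend once the idle period reaches the configured timeout.
        if self.running and now - self.last_activity >= self.auto_suspend_secs:
            self.running = False

wh = Warehouse(auto_suspend_secs=60.0)
wh.run_query(now=0.0)
wh.tick(now=30.0)    # still inside the idle window: keeps running
print(wh.running)    # True
wh.tick(now=61.0)    # idle longer than 60 s: suspended, no credits burned
print(wh.running)    # False
```

Tuning the timeout is a trade-off: a short window saves credits on bursty workloads, while a longer one avoids repeated cold starts for steady interactive use.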

Optimization – Cost and Performance Efficiency 

  • Track warehouse utilization metrics to understand capacity use. 
  • Review pruning history for efficient micro-partition filtering. 
  • Query insights that automatically detect performance issues and provide recommendations. 
  • Cost insights that identify potential credit or storage savings. 

Why Snowflake World Tour 2025 Was Worth Attending

Once again, the Snowflake World Tour proved to be a must-attend event — offering something for everyone and easily tailored to each attendee’s interests. Whether you’re a customer, partner, or considering adopting Snowflake, the event provided a comprehensive view of Snowflake and its potential. Definitely worth attending again next year.


Asko Ovaska 

Partner, Senior Consultant, Etlia Oy 
