
The numbers paint a picture of remarkable transformation in enterprise data management. Snowflake is operating at a $3.8 billion annualized revenue run rate. Databricks hit $2.6 billion in revenue, growing 57% year-over-year. Industry analysts project that the cloud data warehouse market will reach $58 billion by 2034.
By most measures, the “modern data stack” has fundamentally changed how companies collect, store and process information. But there’s a catch that many organizations are starting to recognize: While the backend infrastructure has evolved dramatically, the way business users actually consume and act on data hasn’t kept pace.
Diagnosing the Dashboard Dilemma
The reality for most business users today looks remarkably similar to what it did a decade ago. Power BI and Tableau continue to dominate the analytics landscape, consistently ranking as the most-deployed BI tools according to Gartner research. These platforms have certainly improved over the years, but they’re still fundamentally built around the same core premise: analysts build the dashboards, and business users consume whatever those dashboards show.
The problem becomes apparent when you look at user behavior. Research shows that 90% of users abandon dashboards they find too complex. Static dashboards force users to “know the question before they look,” which fundamentally limits both adoption and insight discovery.
This pattern is consistent across industries. Companies invest millions in sophisticated data infrastructure, then present the results through interfaces that haven’t evolved much since the early 2000s. Meanwhile, Excel remains as entrenched as ever among business users who find it more intuitive than purpose-built analytics tools.
The root cause of this disconnect runs deeper than interface design. The entire modern data stack—from ingestion pipelines to transformation layers to machine learning models—remains largely invisible to business users. What they see is the final artifact: a dashboard. But here’s the fundamental issue: These BI tools were built for data analysts, not business users. Analysts use Power BI and Tableau to conduct analysis and answer business questions, then share their findings through dashboards.
The modern data stack essentially ends at these analyst-focused tools. The final layer—true self-service for business users—was never actually built. Instead, we’ve created an elaborate system where sophisticated pipelines process petabytes in real time, analysts translate that output into static visualizations, and business users are left to manually interpret pre-built dashboards that may or may not address their specific questions.
Assessing the AI Opportunity
Artificial intelligence is beginning to change how this final mile operates, and early adopters are seeing measurable results. Vendors whose tools let users ask questions in plain English and return narrative answers alongside visuals are experiencing significant growth.
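To make that concrete, below is a minimal sketch of what such a plain-English query layer can look like under the hood, assuming a text-to-SQL translation step. The `ask_llm` function is a hypothetical stand-in for whatever model endpoint a given vendor calls; it is stubbed with a canned translation here so the example runs end to end, and the schema and data are invented.

```python
# Minimal sketch of a plain-English question-answering layer over a database.
# ask_llm is a hypothetical placeholder for a vendor's model endpoint, stubbed
# with a canned translation so the example runs end to end.
import sqlite3

SCHEMA = "sales(region TEXT, month TEXT, revenue REAL)"

def ask_llm(prompt: str) -> str:
    # A real system would call an LLM here to translate the question into SQL
    # against the published schema; the response below is hard-coded.
    return "SELECT region, SUM(revenue) FROM sales GROUP BY region ORDER BY 2 DESC"

def answer(question: str, conn: sqlite3.Connection) -> str:
    sql = ask_llm(f"Schema: {SCHEMA}\nQuestion: {question}\nReturn one SQL query.")
    rows = conn.execute(sql).fetchall()
    top_region, top_revenue = rows[0]
    # Return a narrative sentence, not a chart the user must interpret.
    return f"{top_region} leads with ${top_revenue:,.0f} in revenue; full ranking: {rows}"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("AMER", "2024-01", 180000.0), ("EMEA", "2024-01", 120000.0), ("APAC", "2024-01", 95000.0),
])
print(answer("Which region generated the most revenue?", conn))
```

The design point is that the query, the result and the narrative summary live in one flow, so the user never has to know their question became SQL.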
Analytics vendors are taking different approaches. Some platforms automatically flag statistically significant drivers behind metric changes without requiring users to build queries. Others go further, acting as autonomous analysts that provide prescriptive suggestions and can even trigger downstream actions.
These AI-native applications represent a shift from asking users to interpret charts to providing automatic anomaly detection, natural language explanations of business patterns and specific action recommendations based on data insights.
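As a simple illustration of that shift, the sketch below uses an invented metric series and a basic z-score test: instead of rendering a chart, the layer decides whether the latest value is anomalous and explains it in a sentence. Production systems use far more sophisticated detection; the shape of the output is the point.

```python
# Sketch of automatic anomaly detection with a plain-language explanation.
# The metric series and the z-score threshold are invented for illustration.
from statistics import mean, stdev

def explain_latest(metric: str, series: list[float], z_threshold: float = 2.0) -> str:
    history, latest = series[:-1], series[-1]
    mu, sigma = mean(history), stdev(history)
    z = (latest - mu) / sigma
    if abs(z) < z_threshold:
        return f"{metric} is within its normal range at {latest:,.0f}."
    direction = "above" if z > 0 else "below"
    pct_change = abs(latest - mu) / mu * 100
    # Explain the pattern in words and point toward a next action.
    return (f"{metric} is {pct_change:.0f}% {direction} its recent average "
            f"({latest:,.0f} vs. ~{mu:,.0f}); worth investigating what drove it.")

daily_orders = [1020.0, 980.0, 1005.0, 995.0, 1010.0, 990.0, 1340.0]  # final value spikes
print(explain_latest("Daily orders", daily_orders))
```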
The challenge is a big one: 74% of companies report they still struggle to scale AI impact beyond pilot projects. But generative AI puts the tantalizing possibility of changing these dynamics for good within reach, with capabilities that simply did not exist before.
The Platform Play Problem
The modern enterprise data landscape faces three critical challenges that make AI adoption particularly difficult. First, data sprawl has reached unprecedented levels, with organizations maintaining data across dozens of systems, from legacy warehouses and cloud lakes to SaaS applications and operational databases. This sprawl means that even sophisticated AI tools can see only fragments of the complete data picture.
Second, most companies remain trapped in long, partial transformation efforts. Years into cloud migrations, many organizations still operate hybrid environments where critical data remains locked in on-premises systems while newer analytics workloads run in the cloud. These incomplete transformations create significant blind spots for AI systems that need comprehensive data access to generate meaningful insights.
Third, traditional data platform cost models bear little relationship to business value creation. Organizations pay based on compute usage, storage volume, or query complexity rather than the actual business outcomes their data generates. This disconnect makes it difficult to justify AI investments when the underlying infrastructure costs don’t align with measurable business results.
Major data platform vendors have recognized this opportunity and are making moves. Databricks launched “Lakehouse IQ,” an AI engine trained specifically on Databricks artifacts and usage patterns. Snowflake introduced “Cortex Intelligence,” which embeds generative AI directly into SQL queries and workflows.
These solutions offer deep integration with their respective platforms, but they also reveal a fundamental limitation. Lakehouse IQ has extensive knowledge of Databricks lineage but limited visibility outside that ecosystem. Cortex Intelligence is optimized for data resident in Snowflake, accessing external sources primarily through connectors.
This platform-centric approach runs counter to how most enterprises actually operate. Recent surveys indicate that 53% of IT executives rank hybrid and multi-cloud data warehouses as “increasingly important” to their strategies.
The reality is that enterprise data lives everywhere—in Snowflake, Databricks, traditional databases, SaaS applications like SharePoint, cloud storage services like Amazon S3 and other file storage repositories. Any solution that ties organizations to a single platform simply recreates the silos they’ve been trying to eliminate.
The Final Mile
Organizations that successfully modernize their data consumption layer share several characteristics. They prioritize AI-native solutions built from the ground up rather than traditional BI tools with AI features bolted on. They demand cross-platform capabilities that work with any data source. They focus on business outcomes rather than data exploration. And they meet end users where they are—becoming enablers of day-to-day work rather than requiring new technical skills. A financial analyst’s role is to deliver better financial outcomes, which happens more effectively when their expertise is powered by data rather than constrained by technical barriers.
But technology alone doesn’t determine success. Two additional factors separate successful deployments from expensive pilot programs.
First is governance and trust. AI insight layers must respect existing data access controls and audit trails; a minimal sketch of one such guardrail follows after the second point. Even the most intelligent recommendations fail if users can’t trust the underlying data or if they create compliance risks.
Second is organizational change management. Perfect AI recommendations are worthless if workflows and incentives don’t adapt accordingly. Companies must rethink how they make decisions, not just how they access data.
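To ground the governance point, here is a simplified sketch of an insight layer that checks a user’s entitlements and writes an audit record before dispatching any AI-generated SQL. The permission map, the regex-based table extraction and the in-memory audit log are all stand-ins for a real policy engine and logging pipeline.

```python
# Sketch: enforce existing entitlements and leave an audit trail before any
# AI-generated SQL runs. The permission map, regex table extraction and the
# in-memory audit log are simplified stand-ins for a real policy engine.
import re
from datetime import datetime, timezone

PERMITTED_TABLES = {"analyst_amy": {"sales", "inventory"}, "intern_ivan": {"inventory"}}
AUDIT_LOG: list[dict] = []

def guarded_execute(user: str, generated_sql: str) -> str:
    # Naive extraction of referenced tables; a real system would parse the SQL.
    pairs = re.findall(r"\bFROM\s+(\w+)|\bJOIN\s+(\w+)", generated_sql, re.IGNORECASE)
    tables = {name for pair in pairs for name in pair if name}
    denied = tables - PERMITTED_TABLES.get(user, set())
    AUDIT_LOG.append({
        "user": user,
        "sql": generated_sql,
        "denied": sorted(denied),
        "at": datetime.now(timezone.utc).isoformat(),
    })
    if denied:
        raise PermissionError(f"{user} lacks access to: {', '.join(sorted(denied))}")
    return "query dispatched"  # hand off to the warehouse only after the check

print(guarded_execute("analyst_amy", "SELECT region, SUM(revenue) FROM sales GROUP BY region"))
```

Calling the same function as `intern_ivan` would raise a PermissionError and still leave an audit record, which is exactly the behavior compliance teams need to see.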
The modern data stack revolution has transformed the backend infrastructure that powers business intelligence. The next phase will focus on completing that transformation by reimagining how business users actually consume and act on insights. That shift requires recognizing that the last mile isn’t fundamentally a visualization problem—it’s an application problem that demands both technological innovation and organizational evolution.
Central to this evolution is moving away from pricing models based on data volumes, compute hours, or storage capacity toward models tied directly to business outcomes. Instead of paying for terabytes processed or queries executed, organizations should pay based on the measurable business value their data generates: revenue influenced, costs reduced, or decisions improved. This alignment between cost and value creation would fundamentally change how companies think about data investments, making AI-driven insights financially sustainable at scale rather than just technically impressive in pilot projects.
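A toy comparison, with entirely invented numbers, makes the contrast concrete: a usage-based bill tracks terabytes processed whether or not anyone acts on the output, while an outcome-linked bill scales with the value actually attributed to insights.

```python
# Toy comparison of usage-based vs. outcome-linked pricing. Every number here
# is invented; a real contract would define how value is attributed.
tb_processed = 500                               # terabytes scanned this month
price_per_tb = 40.0                              # usage-based rate
usage_bill = tb_processed * price_per_tb         # $20,000 whether or not insights land

value_attributed = 1_200_000.0                   # revenue/cost impact traced to insights
outcome_share = 0.02                             # vendor keeps 2% of attributed value
outcome_bill = value_attributed * outcome_share  # $24,000, and only if value materializes

print(f"usage-based: ${usage_bill:,.0f} | outcome-linked: ${outcome_bill:,.0f}")
```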
The question facing most organizations isn’t whether this transformation will happen, but how quickly they can adapt to capitalize on it.