Why Power BI Semantic Models Fall Short in an AI-Driven Data Stack
Quick Answer
Power BI’s semantic model is technically capable and widely deployed, but tying your enterprise business logic to a single vendor limits how that logic can be reused as the stack grows. In a world where AI agents, voice interfaces, and multi‑tool environments are becoming the dominant data‑consumption layer, organizations need a portable semantic layer that scales with the business. A universal semantic layer provides a single, governed source of truth for Power BI, Tableau, AI agents, and any future interface. Strategy Mosaic is built for exactly this purpose.
Chris Webb, a Microsoft Fabric CAT team engineer, published a detailed post this week explaining why Power BI won’t work properly with third-party semantic models.
The author is right about the architecture: Power BI’s import mode, DAX query generation, and star‑schema assumptions create real incompatibilities when you try to layer one semantic model on top of another.
The conclusion, however, is strategically incomplete.
“If Power BI is how you want your end users to consume data,” he writes, “then Power BI semantic models… are the right place to store your metrics definitions and your business logic.”
Here’s the issue: for data leaders building semantic infrastructure for AI and BI consumption, that verdict simply does not work.
The Assumption Buried in the Architecture Argument
The blog post’s argument is essentially circular: Power BI works best with Power BI semantic models, so if you want to use Power BI, you should use Power BI semantic models.
It’s technically true, but strategically insufficient.
For organizations using Power BI as their permanent, exclusive data interface, this recommendation is solid. But virtually no enterprise operates this way. Most run:
- Power BI alongside Tableau
- Excel alongside AI agents
- Databricks ML pipelines alongside operational dashboards
In most environments, the data stack isn’t built around a single tool. If the semantic layer only lives inside Power BI, it’s boxed in and limited to answering only Power BI questions.
Everything else must build its own copy, connect through workarounds, or define the same metric twice and produce two different numbers. The result is logical inconsistency that causes delays, misalignment, and loss of trust in enterprise data.
The real question isn’t “what works best with Power BI?”
It’s: “where should enterprise business logic actually live in a multi-tool, AI-driven environment?”
That distinction matters because it separates tool-level optimization from architectural design decisions.
AI Is Already the Consumption Layer, and It Doesn’t Use XMLA
A glaring omission in the blog is the effect AI has on enterprise analytics.
AI agents are no longer experiments. They are actively driving real‑time decisions across financial services, retail, healthcare, and manufacturing.
- Executives are querying data through voice commands
- Sales teams are getting pipeline answers from conversational bots in Slack
- Customer-facing AI tools are answering questions about inventory, pricing, and delivery status from the same governed data that analysts query in dashboards
None of these consumption surfaces use XMLA or render Power BI visuals, and none benefit from DAX optimization or Import mode caching.
When an AI agent answers a question about revenue by region, it needs authoritative metric definitions, correct aggregation logic, and enterprise governance enforcement.
Power BI’s semantic model meets those requirements, but only for Power BI consumers.
For AI consumers, it provides nothing.
The Model Context Protocol (MCP) makes this concrete: it’s an open standard that lets AI agents discover, query, and reason over governed data sources. Organizations that have invested in a semantic layer exposing metrics, relationships, and business logic via MCP are now serving that same governed data to ChatGPT, Claude, Copilot, or Gemini — without rebuilding logic for each tool.
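To make the mechanism concrete: MCP is built on JSON-RPC 2.0, and an agent invokes a server-side capability with a `tools/call` request. The sketch below builds such a request in Python; the tool name `query_metric` and its arguments are hypothetical examples of what a semantic-layer MCP server might expose (real servers advertise their actual tools via `tools/list`).

```python
import json

# Minimal sketch of the JSON-RPC 2.0 message an MCP client sends to invoke
# a tool on a semantic-layer MCP server. "query_metric" and its arguments
# are hypothetical; a real server declares its tools via "tools/list".
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_metric",        # hypothetical tool name
        "arguments": {
            "metric": "revenue",       # governed metric, defined once in the layer
            "dimensions": ["region"],  # grouping requested by the agent
        },
    },
}

print(json.dumps(request, indent=2))
```

The key point is what is absent: no DAX, no SQL guessing, no tool-specific rebuild. The agent asks for a governed metric by name, and the layer resolves the definition.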
Organizations that have stored their business logic inside Power BI’s semantic model are rebuilding from scratch.
Simply put, Power BI’s semantic model doesn’t serve AI consumption. The author doesn’t address this because his architecture ends at the BI layer.
In modern enterprise analytics, the boundary between BI and AI is dissolving, and architectures that stop at the BI layer are increasingly insufficient.
The Lock-In You Don’t See Until It’s Too Late
The post argues that Power BI semantic models are not “closed” because Tableau and Excel can query them. That’s technically true, but it doesn’t matter.
The governance, security, and metric definitions inside a Power BI semantic model are expressed in DAX and enforced by the Power BI engine. They are not portable. If an organization migrates to a different BI platform or routes queries through a new AI interface, those definitions must be rebuilt from scratch in the target system.
The “openness” the author describes is essentially read‑only connectivity to a Power BI artifact, not portability of the business logic itself.
A universal semantic layer works differently. Business logic, metric definitions, and security policies are expressed in a form every consumer can interpret consistently.
- Power BI connects to it and gets the right answer
- Tableau connects to it and gets an identical answer
- An AI agent connects to it via MCP and gets the same answer
Business logic travels because the semantic layer owns it, not the frontend.
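A toy sketch of that principle (illustrative only, not any product’s real API): the metric is defined once as portable data, and each consumer gets a rendering in its own dialect, so every tool computes the same number from the same definition.

```python
# Illustrative sketch: one portable metric definition, rendered for two
# different consumers. The definition lives in the layer, not in any one
# tool's dialect. All names here are hypothetical.
metric = {
    "name": "total_revenue",
    "aggregation": "SUM",
    "column": "revenue",
    "table": "sales",
}

def to_dax(m):
    # Power BI consumer: render the definition as a DAX measure
    return f"{m['name']} = {m['aggregation']}({m['table'].title()}[{m['column']}])"

def to_sql(m):
    # SQL consumer (Tableau, notebooks, AI agents): render as a query
    return f"SELECT {m['aggregation']}({m['column']}) AS {m['name']} FROM {m['table']}"

print(to_dax(metric))   # total_revenue = SUM(Sales[revenue])
print(to_sql(metric))   # SELECT SUM(revenue) AS total_revenue FROM sales
```

Invert the ownership, storing the DAX string as the source of truth, and the translation only runs one way: every non-Power BI consumer has to re-derive the definition by hand.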
Open Standards Are an Engineering Decision, Not a Philosophical One
The reason the modern data stack standardized on SQL, Parquet, Arrow, and Delta is not philosophical. The driver is economic.
Open formats reduce integration cost, survive vendor transitions, and prevent organizations from rebuilding the same logic every time the tool ecosystem shifts.
The semantic layer is following the same arc, and the TCO math is concrete. Strategy Software joined the OSI initiative for exactly this reason, a commitment detailed in this post on what open semantic interoperability actually means for enterprise data teams.
When business logic is expressed in open, portable formats, each new consumption surface costs nothing to connect. When it is locked in DAX inside a Power BI semantic model, each new surface requires re‑expressing that logic from scratch.
At enterprise scale, a DAX‑first semantic layer serving a hundred metric definitions across three new AI interfaces represents hundreds of engineering hours per interface — before accounting for the ongoing cost of keeping separate definitions in sync as underlying data changes.
That cost is not a one-time migration. It compounds with every new tool, interface, or consumption layer introduced into the stack.
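The back-of-envelope arithmetic behind that claim can be sketched directly. All inputs below are illustrative assumptions (metric count, hours per metric, number of new surfaces), not benchmarks:

```python
# Back-of-envelope sketch of the compounding rebuild cost described above.
# Every number here is an illustrative assumption, not a measured figure.
metrics = 100          # governed metric definitions in the semantic layer
hours_per_metric = 3   # assumed effort to re-express one metric in a new tool
new_interfaces = 3     # new AI/BI consumption surfaces added to the stack

# Locked-in logic: every new surface pays the full rebuild cost again.
rebuild_hours = metrics * hours_per_metric * new_interfaces

# Portable logic: each surface connects to the existing layer instead of
# re-implementing the definitions (connection effort assumed negligible).
connect_hours = 0

print(f"rebuild: {rebuild_hours} hours")  # rebuild: 900 hours
print(f"connect: {connect_hours} hours")  # connect: 0 hours
```

Under these assumptions, three new interfaces cost 900 engineering hours of pure re-implementation, roughly 300 per interface, before any ongoing synchronization work.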
Open protocol support is how organizations avoid paying that bill on a loop.
What Future-Proof Semantic Layer Architecture Actually Looks Like
A semantic layer built for the next generation of enterprise data has four properties that Power BI’s semantic model doesn’t fully provide:
- Frontend-agnostic by design: The same metric definitions, relationships, and security policies apply whether the query comes from a BI tool, a Python notebook, a spreadsheet, or an AI agent. No duplicate logic. No reconciliation.
This is the architectural shift behind Strategy Mosaic: a universal semantic layer connecting to 200+ data sources and exposing governed business logic to Power BI, Tableau, Excel, and AI agents from a single source.
- AI-native by protocol: Business logic is exposed through open protocols so AI agents can discover and query governed metrics without natural‑language‑to‑SQL guesswork. MCP is the current implementation: semantic definitions can be queried by any MCP‑compatible interface, with no per‑tool rebuild.
- Governance enforced at the layer, not replicated per tool: Row‑level security, column‑level security, and access control live in one place and apply automatically to every consumer. One policy set; every connected tool inherits it.
- Durable across tool cycles: Business logic stored in open, portable formats survives tool migrations, vendor changes, and consumption surfaces that don’t exist yet. The semantic layer outlasts any individual frontend.
The unifying principle across all four is separation: the semantic layer owns the logic; the frontends do not.
This is why organizations running Strategy Mosaic today use Power BI and AI agents on top of it simultaneously. Not because Mosaic replaces Power BI, but because it serves both without belonging to either.
What Data Leaders Should Actually Do
For organizations truly committed to Power BI as their permanent, exclusive analytics interface, centralizing logic inside Power BI’s semantic model is a defensible choice.
For everyone else, the priority is independence and scalability: deliver governed business logic into every system your consumers actually use.
That is exactly what a universal semantic layer like Strategy Mosaic supports.
It’s built for organizations that need a portable, governed, AI‑native semantic layer, and delivers a consistent, reliable source of truth across every tool and workflow.
Your business logic should outlast every tool you use. Learn how Strategy Mosaic unifies BI and AI consumption without locking that logic into a single vendor.
Frequently Asked Questions
Q: Is Power BI’s semantic model a universal semantic layer?
No. Power BI’s semantic model is a capable, mature semantic layer for Power BI consumers. Strategy Software defines a universal semantic layer as one that serves any analytics consumer, including BI tools, AI agents, spreadsheets, and operational applications, from a single governed source. Power BI’s semantic model does not meet this definition because its business logic is expressed in DAX and enforced by the Power BI engine, making it incompatible with non-Power BI consumers without duplication.
Q: Can AI agents query Power BI semantic models?
Not natively through open protocols. Power BI semantic models are queryable via XMLA, which BI tools like Tableau can implement, but AI agents using the Model Context Protocol cannot query Power BI semantic models directly. Organizations that need AI agents to query governed business logic require a semantic layer that exposes MCP endpoints, such as Strategy Mosaic’s MCP integration, not a Power BI semantic model.
Q: What is the risk of storing business logic in a Power BI semantic model?
According to Strategy Software, the primary risk is concentration: when business logic lives inside a vendor-specific format like Power BI’s DAX-based semantic model, it cannot be accessed by non-Power BI consumers without rebuilding the logic in each new tool. As AI agents, voice interfaces, and multi-tool environments grow, organizations that have stored business logic exclusively in Power BI face significant rebuilding costs.
Q: Does a universal semantic layer replace Power BI?
No. Strategy Software’s position is that a universal semantic layer works alongside Power BI as its data layer, not instead of it. Strategy Mosaic connects to Power BI via native XMLA/DAX connectivity, so Power BI reports continue to function while the same semantic definitions also serve Tableau, AI agents, and other consumers. The semantic layer sits beneath the frontend, not in competition with it.