The AI-Powered Leap of Microsoft Fabric

Microsoft Fabric is more than a data platform: it's a unified ecosystem where analytics, machine learning, real-time event processing, and generative AI converge. Built atop OneLake, Spark, Power BI, and unified compute engines, Fabric integrates seamlessly with Azure AI services and Copilot—both natively and via “bring your own key (BYOK).” This integration empowers you to use OpenAI models, text analytics, translation, and more, all within the Fabric environment.
This blog breaks down Fabric’s AI stack:
- Architecture & integration layers
- Prebuilt AI models & Fabric AI functions
- Token consumption and CU billing rates
- Copilot & conversational agents
- SynapseML & programmable AI pipelines
- Monitoring, governance, and cost control
- Sample applications and real-world scenarios
Strap in—this is a deep journey into Fabric‑powered AI.
1. Fabric AI Architecture: Layers That Power Intelligence
1.1 OneLake: Your Single Fabric Data Lake
Every AI experience in Fabric begins with OneLake—the centralized data storage layer built on ADLS‑Gen2. It provides a unified namespace across all workloads: lakehouses, warehouses, eventhouses, and semantic models. Whether you’re doing batch scoring or real-time processing, your inputs and outputs can live side‑by‑side.
1.2 Compute Engines
Fabric offers diverse compute options:
- Spark (Synapse‑backed) for large-scale ETL, training, inference
- Eventhouses + Eventstreams for streaming ingestion and near-real-time querying
- SQL/PBI warehouse & semantic models for OLAP
- Notebooks that blend pandas, PySpark, and AI functions
AI capabilities slot directly into this compute layer—so you write one‑line Fabric SQL or Python and get generative AI under the covers.
2. Prebuilt AI Services in Fabric (Preview)
Fabric provides two consumption paradigms:
- Prebuilt AI models (currently in preview, Fabric-native)
- Bring‑Your‑Own‑Key (BYOK) for broader Azure AI services
2.1 Prebuilt AI Models in Fabric
These are baked into Fabric with built-in authentication and are charged against your Fabric Capacity. Available services include:
- Azure OpenAI (GPT‑4o / GPT‑4o‑mini, text‑embedding‑ada‑002)
- Text Analytics: sentiment, PII, key phrases, entities, entity linking, summarization
- Azure AI Translator: translation + transliteration
These are delivered through ready‑to‑use Python, REST, or SynapseML bindings: no extra keys to manage, simple code, and billing in CUs against your Fabric capacity.
2.2 BYOK via Azure AI Services
If you need custom models, Vision, Speech, Document Intelligence, or extra OpenAI capabilities, you can provide your own Azure key. SynapseML facilitates that integration, with full flexibility.
3. Token & CU Consumption in Fabric AI
Since November 1, 2024, AI usage has been billed via Fabric Capacity Units (CUs), not tokens. Here’s a breakdown:
3.1 OpenAI Language Models
| Model | Context Limit | Input (1K Tokens) | Output (1K Tokens) |
|---|---|---|---|
| gpt‑4o‑2024‑05‑13 | 128K tokens | 84.03 CU‑sec | 336.13 CU‑sec |
| gpt‑4o‑mini‑0718 | 128K tokens | 5.04 CU‑sec | 20.17 CU‑sec |
3.2 OpenAI Embeddings
- text‑embedding‑ada‑002: 3.36 CU‑sec per 1K tokens
3.3 Text Analytics
Applies a flat rate per 1K text records:
- Language detection, sentiment, key phrases, PII, entity recognition, entity linking: 33,613.45 CU‑sec / 1K records
- Summarization: 67,226.89 CU‑sec / 1K records
3.4 Translator
- Translate or transliterate: 336,134.45 CU‑sec / 1M characters
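The CU rates above translate directly into a back-of-the-envelope cost estimator. A minimal sketch in Python, hard-coding the preview rates quoted in this section (these are preview figures and may change, so treat the constants as assumptions to verify against current pricing):

```python
# Preview CU rates quoted in this section (CU-seconds).
LLM_RATES = {  # per 1K tokens: (input rate, output rate)
    "gpt-4o-2024-05-13": (84.03, 336.13),
    "gpt-4o-mini-0718": (5.04, 20.17),
}
EMBEDDING_RATE = 3.36            # text-embedding-ada-002, per 1K tokens
TEXT_ANALYTICS_RATE = 33_613.45  # per 1K text records
SUMMARIZATION_RATE = 67_226.89   # per 1K text records
TRANSLATOR_RATE = 336_134.45     # per 1M characters

def llm_cu_seconds(model: str, input_tokens: int, output_tokens: int) -> float:
    """Estimate CU-seconds for one chat/completion call."""
    in_rate, out_rate = LLM_RATES[model]
    return input_tokens / 1000 * in_rate + output_tokens / 1000 * out_rate

def translator_cu_seconds(characters: int) -> float:
    """Estimate CU-seconds for a translation request."""
    return characters / 1_000_000 * TRANSLATOR_RATE

# Example: a 2K-token prompt with a 500-token answer on gpt-4o-mini.
cost = llm_cu_seconds("gpt-4o-mini-0718", 2000, 500)
```

Running the same prompt through gpt‑4o instead of gpt‑4o‑mini multiplies the estimate by roughly 16x, which is why model choice is the first lever for CU budgeting.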
4. One-Line AI with Fabric AI Functions (Preview)
Fabric's AI Functions bring GenAI directly into SQL or Python Spark:
- ai.similarity
- ai.classify
- ai.analyze_sentiment
- ai.extract
- ai.fix_grammar
- ai.summarize
- ai.translate
- ai.generate_response
You write:
In SQL:
SELECT ai.summarize(text) AS summary FROM reviews;
or in Python:
df = df.withColumn('summary', aifunc.summarize(df.text))
Under the hood, Fabric uses the gpt‑4o‑mini model by default and handles authentication, scaling, and cost charging. Customers just need an F2 or P SKU and the Copilot/Azure OpenAI tenant setting enabled.
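You can prototype the same one-column-per-call pattern outside Fabric by swapping in a stub for the hosted model. A conceptual sketch only: `fake_summarize` is a hypothetical stand-in for the gpt‑4o‑mini call, not a real Fabric API.

```python
import pandas as pd

def fake_summarize(text: str) -> str:
    # Stand-in for the hosted model: keep the first sentence only.
    return text.split(".")[0].strip() + "."

reviews = pd.DataFrame({"text": [
    "Great battery life. Shipping was slow though.",
    "The screen cracked on day two. Support never replied.",
]})

# Same shape as the Fabric one-liner: one new column per AI call.
reviews["summary"] = reviews["text"].map(fake_summarize)
```

Replacing the stub with the real AI function is then a one-line change, which keeps notebooks testable without burning CUs during development.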
5. Copilot & Generative AI Agents Across Workloads
5.1 Copilot in Notebooks & Data Engineering/Science
Copilot accelerates workflows by:
- Generating or completing code
- Adding comments and debugging
- Explaining analyses or ML pipelines
Access it via the chat panel or cell magics. It is powered by Azure OpenAI and billed against your Fabric capacity.
5.2 Copilot in SQL Database & Warehouse
Cleanup, autocomplete, natural-language-to-SQL: write “Show top 10 customers…” and get valid queries, explanations, or fixes.
5.3 Copilot in Power BI & Real-Time Intelligence
Craft visuals, summarize Power BI reports, and generate KQL queries within Real-Time Intelligence dashboards.
These experiences consume Fabric Capacity Units and must be managed to prevent throttling.
6. Fabric Data Agent: Build Your Own Conversational AI
Fabric Data Agent is a preview tool to build Q&A agents on your data sources—lakehouse, warehouse, Power BI, KQL—with simple configuration:
- Max 5 sources, with table-level control
- Add domain-specific instructions and examples
- Uses Azure OpenAI Assistant APIs, converting natural questions into SQL/DAX/KQL
- Validates, runs, then returns structured answers, while honoring row-level security
Current limitations include: read-only access, support for simple queries only, English-only questions, and a maximum of 25 tables.
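The validate-run-answer loop described above can be illustrated end to end with a toy translator and an in-memory SQLite database. This is a conceptual sketch only: `toy_nl_to_sql` is a hypothetical stand-in for the Azure OpenAI Assistant call that Data Agent makes, and the read-only check mimics its validation step.

```python
import sqlite3

def toy_nl_to_sql(question: str) -> str:
    # Hypothetical stand-in for the LLM natural-language-to-SQL step.
    if "lost customers" in question.lower():
        return "SELECT name FROM customers WHERE churned = 1"
    raise ValueError("question not understood")

def answer(question: str, conn: sqlite3.Connection) -> list:
    sql = toy_nl_to_sql(question)
    # Validate before running: reject anything that is not read-only.
    if not sql.lstrip().upper().startswith("SELECT"):
        raise PermissionError("agent is read-only")
    return conn.execute(sql).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, churned INT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)",
                 [("Contoso", 1), ("Fabrikam", 0)])

rows = answer("What were last month's lost customers?", conn)
```

The real service adds what the toy cannot: schema grounding from your configured sources, DAX/KQL targets, and row-level security enforcement.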
7. SynapseML: Massively Scalable AI Pipelines
SynapseML provides a Spark-native plug-and-play toolkit:
- Wrappers for Azure Cognitive services like vision, document translation, anomaly detection
- LLM integrations (OpenAI) mapped to DataFrame transformers
- Works equally well in Fabric prebuilt or with BYOK Azure keys
This stems from the same Fabric engine used for AI Functions, just with added flexibility and custom Azure services.
8. Monitoring & Managing AI Usage in Fabric
8.1 Billing & Capacity Tracking
Usage shows up under:
- Spark workload meter, for notebook and pipeline executions
- Fabric Capacity Metrics app for drill-down reporting
8.2 Quotas & Throttling
Preview quotas apply, e.g. 20 requests per minute per user for OpenAI models; heavy use can trigger throttling.
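With a quota of roughly 20 requests per minute per user, a small client-side limiter in notebook code can keep you under the ceiling instead of hitting throttling errors. A minimal sliding-window sketch (the quota value is the preview figure quoted above and may change):

```python
import time
from collections import deque

class RateLimiter:
    """Block until a request slot is free within the sliding window."""
    def __init__(self, max_calls: int = 20, window_s: float = 60.0):
        self.max_calls = max_calls
        self.window_s = window_s
        self.calls = deque()

    def acquire(self) -> None:
        now = time.monotonic()
        # Drop timestamps that have fallen out of the window.
        while self.calls and now - self.calls[0] >= self.window_s:
            self.calls.popleft()
        if len(self.calls) >= self.max_calls:
            # Sleep until the oldest call expires, then retry.
            time.sleep(self.window_s - (now - self.calls[0]))
            return self.acquire()
        self.calls.append(now)

limiter = RateLimiter(max_calls=20, window_s=60.0)
# Call limiter.acquire() before each OpenAI request in the notebook.
```

A deliberate design choice here: blocking instead of raising keeps batch notebook cells simple, at the cost of longer wall-clock time under load.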
8.3 CU Budgeting
Cap and monitor CU consumption to balance AI, compute, and BI needs. Automatic alerts are your ally.
9. Key Scenarios: Use Cases in the Real World
9.1 Intelligent ETL & Data Transformation
Use ai.summarize() to extract clean narrative from messy text, or ai.classify() to auto-categorize support tickets.
9.2 Semantically Enriched Models
Generate embeddings via ADA, cluster customer feedback, and feed summaries into Power BI narrative visuals.
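The embed-then-group step can be prototyped without calling the service by swapping in toy vectors. A sketch with hand-made 2-D "embeddings" and cosine similarity; in Fabric you would substitute real 1,536-dimension vectors from text‑embedding‑ada‑002.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy stand-ins for ada-002 vectors: similar feedback gets nearby vectors.
feedback = {
    "shipping was late":     [0.9, 0.1],
    "delivery took forever": [0.8, 0.2],
    "love the new UI":       [0.1, 0.9],
}

def nearest(query_vec, corpus):
    """Return the corpus item most similar to the query vector."""
    return max(corpus, key=lambda k: cosine(query_vec, corpus[k]))

match = nearest([0.85, 0.15], feedback)
```

The same nearest-neighbor logic scales to real embeddings, which is the basis for clustering feedback before feeding summaries into Power BI narrative visuals.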
9.3 Conversational BI
Deploy Data Agent over warehouse and lakehouse: “What were last month’s lost customers?” → SQL → answer.
9.4 Generative Documentation & Dashboards
Copilot annotates notebooks, explains DAX measures, or builds dashboards on demand.
9.5 Multilingual Insights
Translate inventory descriptions or sentiment analytics into local languages—without writing complex pipelines.
10. Future Outlook & Best Practices
10.1 Feature Evolution
Expect new AI functions and OpenAI models, richer Data Agent scenarios, tighter Copilot integration, and broader region availability.
10.2 Governance & Compliance
Ensure tenant settings are tuned for data residency, COPPA/HIPAA compliance, and responsible AI. Monitor region mappings—some services process outside home region.
10.3 Cost Optimization
Balance gpt‑4o against gpt‑4o‑mini, combine AI function calls where possible, and move summarization or embedding into upstream pipelines to reduce CU spend.
Fabric = Data + AI, Ready to Scale
Microsoft Fabric brings AI to your data stack in an integrated, scalable, and governed way. Whether you're using prebuilt models, AI Functions, Copilot, Data Agents, SynapseML or BYOK, the platform empowers you to:
- Add generative intelligence to notebooks, ETL, and analytics
- Build conversational agents and report helpers
- Scale to large data volumes with unified CU billing
- Keep control through quotas, monitoring, and Azure governance
By understanding costs, capacities, and capabilities, you can architect AI‑driven solutions that are both powerful and sustainable.
Reach out to us to discuss how you or your customer can gain AI-driven capabilities.
Write once. Compute forever. AI on Fabric is here.
Further Reading & Resources
- This post draws on the Microsoft Learn documentation; read further here: https://learn.microsoft.com/en-us/fabric/data-science/ai-services/ai-services-overview