
Today’s world is awash with data—ever-streaming from the devices we use, the applications we build, and the interactions we have. Organizations across every industry have harnessed this data to digitally transform and gain competitive advantages. And now, as we enter a new era defined by AI, this data is becoming even more important.
Generative AI and language model services, such as Azure OpenAI Service, are enabling customers to use and create everyday AI experiences that are reinventing how employees spend their time. Powering organization-specific AI experiences requires a constant supply of clean data from a well-managed and highly integrated analytics system. But most organizations’ analytics systems are a labyrinth of specialized and disconnected services.
And it’s no wonder given the massively fragmented data and AI technology market with hundreds of vendors and thousands of services. Customers must stitch together a complex set of disconnected services from multiple vendors themselves and incur the costs and burdens of making these services function together.
Introducing Microsoft Fabric
Today we are unveiling Microsoft Fabric—an end-to-end, unified analytics platform that brings together all the data and analytics tools that organizations need. Fabric integrates technologies like Azure Data Factory, Azure Synapse Analytics, and Power BI into a single unified product, empowering data and business professionals alike to unlock the potential of their data and lay the foundation for the era of AI.

What sets Microsoft Fabric apart?
Fabric is an end-to-end analytics product that addresses every aspect of an organization’s analytics needs. But there are five areas that really set Fabric apart from the rest of the market:
1. Fabric is a complete analytics platform
Every analytics project has multiple subsystems. Every subsystem needs a different array of capabilities, often requiring products from multiple vendors. Integrating these products can be a complex, fragile, and expensive endeavor.
With Fabric, customers can use a single product with a unified experience and architecture that provides all the capabilities required for a developer to extract insights from data and present it to the business user. And by delivering the experience as software as a service (SaaS), everything is automatically integrated and optimized, and users can sign up within seconds and get real business value within minutes.
Fabric empowers every team in the analytics process with the role-specific experiences they need, so data engineers, data warehousing professionals, data scientists, data analysts, and business users feel right at home.

Fabric comes with seven core workloads:
- Data Factory (preview) provides more than 150 connectors to cloud and on-premises data sources, drag-and-drop experiences for data transformation, and the ability to orchestrate data pipelines.
- Synapse Data Engineering (preview) enables great authoring experiences for Spark, instant start with live pools, and the ability to collaborate.
- Synapse Data Science (preview) provides an end-to-end workflow for data scientists to build sophisticated AI models, collaborate easily, and train, deploy, and manage machine learning models.
- Synapse Data Warehousing (preview) provides a converged lake house and data warehouse experience with industry-leading SQL performance on open data formats.
- Synapse Real-Time Analytics (preview) enables developers to work with data streaming in from Internet of Things (IoT) devices, telemetry, logs, and more, and to analyze massive volumes of semi-structured data with high performance and low latency.
- Power BI in Fabric provides industry-leading visualization and AI-driven analytics that enable business analysts and business users to gain insights from data. The Power BI experience is also deeply integrated into Microsoft 365, providing relevant insights where business users already work.
- Data Activator (coming soon) provides real-time detection and monitoring of data and can trigger notifications and actions when it finds specified patterns in data—all in a no-code experience.
You can try these experiences today by signing up for the Microsoft Fabric free trial.
2. Fabric is lake-centric and open
Today’s data lakes can be messy and complicated, making it hard for customers to create, integrate, manage, and operate data lakes. And once they are operational, multiple data products using different proprietary data formats on the same data lake can cause significant data duplication and concerns about vendor lock-in.
OneLake—The OneDrive for data
Fabric comes with a SaaS, multi-cloud data lake called OneLake that is built-in and automatically available to every Fabric tenant. All Fabric workloads are automatically wired into OneLake, just like all Microsoft 365 applications are wired into OneDrive. Data is organized in an intuitive data hub, and automatically indexed for discovery, sharing, governance, and compliance.
OneLake serves developers, business analysts, and business users alike, helping eliminate pervasive and chaotic data silos created by different developers provisioning and configuring their own isolated storage accounts. Instead, OneLake provides a single, unified storage system for all developers, where discovery and sharing of data are easy with policy and security settings enforced centrally. At the API layer, OneLake is built on and fully compatible with Azure Data Lake Storage Gen2 (ADLSg2), instantly tapping into ADLSg2’s vast ecosystem of applications, tools, and developers.
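Because OneLake exposes the same API surface as ADLS Gen2, existing ADLS tooling can usually be pointed at it with little more than a different endpoint. Below is a minimal sketch, assuming the Python azure-storage-file-datalake and azure-identity packages and placeholder workspace and lakehouse names; the exact OneLake URL and path layout for your tenant may differ.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# OneLake speaks the ADLS Gen2 protocol; the endpoint below is the commonly
# documented OneLake DFS endpoint, and all names here are placeholders.
service = DataLakeServiceClient(
    account_url="https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake, the Fabric workspace plays the role of the ADLS Gen2 container.
workspace = service.get_file_system_client("MyWorkspace")

# List files under a lakehouse's Files area.
for item in workspace.get_paths(path="MyLakehouse.Lakehouse/Files"):
    print(item.name)
```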
A key capability of OneLake is “Shortcuts.” OneLake allows easy sharing of data between users and applications without having to move and duplicate information unnecessarily. Shortcuts allow OneLake to virtualize data lake storage in ADLSg2, Amazon Simple Storage Service (Amazon S3), and Google Storage (coming soon), enabling developers to compose and analyze data across clouds.
Open data formats across analytics offerings
Fabric is deeply committed to open data formats across all its workloads and tiers. Fabric treats Delta on top of Parquet files as a native data format that is the default for all workloads. This deep commitment to a common open data format means that customers need to load the data into the lake only once and all the workloads can operate on the same data, without having to separately ingest it. It also means that OneLake supports structured data of any format and unstructured data, giving customers total flexibility.
By adopting OneLake as our store and Delta and Parquet as the common format for all workloads, we offer customers a data stack that’s unified at the most fundamental level. Customers do not need to maintain different copies of data for databases, data lakes, data warehousing, business intelligence, or real-time analytics. Instead, a single copy of the data in OneLake can directly power all the workloads.
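To make the "one copy" idea concrete, here is a minimal PySpark sketch of the pattern, assuming it runs in a Fabric or Synapse notebook where a `spark` session and a default lakehouse are already attached; the file path, table name, and column names are illustrative.

```python
# Minimal sketch: land raw data once as a Delta table, then let any
# Delta/Parquet-aware engine query that same copy. Assumes a notebook
# environment that provides the `spark` session and a default lakehouse.
df = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("Files/raw/sales.csv")
)

# Write a single Delta copy into the lakehouse Tables area.
df.write.format("delta").mode("overwrite").saveAsTable("sales")

# Downstream workloads (SQL, BI, data science) can read the same table.
spark.sql("SELECT region, SUM(amount) AS total FROM sales GROUP BY region").show()
```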
Managing data security (table, column, and row levels) across different data engines can be a persistent nightmare for customers. Fabric will provide a universal security model that is managed in OneLake, and all engines will enforce it uniformly as they process queries and jobs. This model is coming soon.
3. Fabric is powered by AI
We are infusing Fabric with Azure OpenAI Service at every layer to help customers unlock the full potential of their data, enabling developers to leverage the power of generative AI against their data and assisting business users to find insights in their data. With Copilot in Microsoft Fabric in every data experience, users can use conversational language to create dataflows and data pipelines, generate code and entire functions, build machine learning models, or visualize results. Customers can even create their own conversational language experiences that combine Azure OpenAI Service models and their data and publish them as plug-ins.
Copilot in Microsoft Fabric builds on our existing commitments to data security and privacy in the enterprise. Copilot inherits an organization’s security, compliance, and privacy policies. Microsoft does not use organizations’ tenant data to train the base language models that power Copilot.
Copilot in Microsoft Fabric will be coming soon. Stay tuned to the Microsoft Fabric blog for the latest updates and public release date for Copilot in Microsoft Fabric.
4. Fabric empowers every business user
Customers aspire to drive a data culture where everyone in their organization is making better decisions based on data. To help our customers foster this culture, Fabric deeply integrates with the Microsoft 365 applications people use every day.
Power BI is a core part of Fabric and is already infused across Microsoft 365. Through Power BI’s deep integrations with popular applications such as Excel, Microsoft Teams, PowerPoint, and SharePoint, relevant data from OneLake is easily discoverable and accessible to users right from Microsoft 365—helping customers drive more value from their data.
With Fabric, you can turn your Microsoft 365 apps into hubs for uncovering and applying insights. For example, users in Microsoft Excel can directly discover and analyze data in OneLake and generate a Power BI report with a click of a button. In Teams, users can infuse data into their everyday work with embedded channels, chat, and meeting experiences. Business users can bring data into their presentations by embedding live Power BI reports directly in Microsoft PowerPoint. Power BI is also natively integrated with SharePoint, enabling easy sharing and dissemination of insights. And with Microsoft Graph Data Connect (preview), Microsoft 365 data is natively integrated into OneLake so customers can unlock insights on their customer relationships, business processes, security and compliance, and people productivity.
5. Fabric reduces costs through unified capacities
Today’s analytics systems typically combine products from multiple vendors in a single project. This results in computing capacity being provisioned in multiple systems, such as data integration, data engineering, data warehousing, and business intelligence. When one of the systems sits idle, its capacity cannot be used by another system, causing significant waste.
Purchasing and managing resources is massively simplified with Fabric. Customers can purchase a single pool of computing that powers all Fabric workloads. With this all-inclusive approach, customers can create solutions that leverage all workloads freely without any friction in their experience or commerce. The universal compute capacities significantly reduce costs, as any unused compute capacity in one workload can be utilized by any of the workloads.
Explore how our customers are already using Microsoft Fabric
Ferguson
Ferguson is a leading distributor of plumbing, HVAC, and waterworks supplies, operating across North America. And by using Fabric to consolidate their analytics stack into a unified solution, they are hoping to reduce their delivery time and improve efficiency.
“Microsoft Fabric reduces the delivery time by removing the overhead of using multiple disparate services. By consolidating the necessary data provisioning, transformation, modeling, and analysis services into one UI, the time from raw data to business intelligence is significantly reduced. Fabric meaningfully impacts Ferguson’s data storage, engineering, and analytics groups since all these workloads can now be done in the same UI for faster delivery of insights.”
—George Rasco, Principal Database Architect, Ferguson
T-Mobile
T-Mobile, one of the largest providers of wireless communications services in the United States, is focused on driving disruption that creates innovation and better customer experiences in wireless and beyond. With Fabric, T-Mobile hopes they can take their platform and data-driven decision-making to the next level.
“T-Mobile loves our customers and providing them with new Un-Carrier benefits! We think that Fabric’s upcoming abilities will help us eliminate data silos, making it easier for us to unlock new insights into how we show our customers even more love. Querying across the lakehouse and warehouse from a single engine—that’s a game changer. Spark compute on-demand, rather than waiting for clusters to spin up, is a huge improvement for both standard data engineering and advanced analytics. It saves three minutes on every job, and when you’re running thousands of jobs an hour, that really adds up. And being able to easily share datasets across the company is going to eliminate so much data duplication. We’re really looking forward to these new features.”
—Geoffrey Freeman, MTS, Data Solutions and Analytics, T-Mobile
Aon
Aon provides professional services and management consulting services to a vast global network of customers. With the help of Fabric, Aon hopes that they can consolidate more of their current technology stack and focus on adding more value to their clients.
“What’s most exciting to me about Fabric is simplifying our existing analytics stack. Currently, there are so many different PaaS services across the board that when it comes to modernization efforts for many developers, Fabric helps simplify that. We can now spend less time building infrastructure and more time adding value to our business.”
—Boby Azarbod, Data Services Lead, Aon
What happens to current Microsoft analytics solutions?
Existing Microsoft products such as Azure Synapse Analytics, Azure Data Factory, and Azure Data Explorer will continue to provide a robust, enterprise-grade platform as a service (PaaS) solution for data analytics. Fabric represents an evolution of those offerings in the form of a simplified SaaS solution that can connect to existing PaaS offerings. Customers will be able to upgrade from their current products into Fabric at their own pace.
Get started with Microsoft Fabric
Microsoft Fabric is currently in preview. Try out everything Fabric has to offer by signing up for the free trial—no credit card information is required. Everyone who signs up gets a fixed Fabric trial capacity, which may be used for any feature or capability from integrating data to creating machine learning models. Existing Power BI Premium customers can simply turn on Fabric through the Power BI admin portal. After July 1, 2023, Fabric will be enabled for all Power BI tenants.

Microsoft Fabric resources
If you want to learn more about Microsoft Fabric, consider:
- Signing up for the Microsoft Fabric free trial.
- Visiting the Microsoft Fabric website.
- Reading the more in-depth Fabric experience announcement blogs.
- Exploring the Fabric technical documentation.
- Exploring Fabric through the Guided Tour.
- Joining the Fabric community to post your questions, share your feedback, and learn from others.
FAQs
What is Microsoft Fabric?
Microsoft Fabric was announced in public preview on May 23, 2023, at the Microsoft Build conference. It is a brand-new unified data and analytics platform that brings together, and improves upon, Microsoft's existing suite of data products.
Which Azure tool can help you build AI applications?
Use familiar tools like Jupyter and Visual Studio Code, alongside frameworks like PyTorch on Azure, TensorFlow, and scikit-learn.
What is the Microsoft Azure Machine Learning AI platform?
Azure Machine Learning is an end-to-end platform for building, training, and deploying machine learning models at scale, while helping organizations understand, protect, and control data, models, and processes.
Which Azure service provides an analytics service that can be used for machine learning?
Azure Kubernetes Service Edge Essentials is an on-premises Kubernetes implementation of Azure Kubernetes Service (AKS) that automates running containerized applications at scale.
What is Azure Service Fabric used for?
Azure Service Fabric is a distributed systems platform that makes it easy to package, deploy, and manage scalable and reliable microservices and containers. Service Fabric also addresses the significant challenges in developing and managing cloud-native applications.
What is the purpose of data fabric?
A data fabric ultimately helps your organization unleash the power of data to meet business demands and gain a competitive edge. It allows your IT organization to better harness the power of hybrid cloud, build a hybrid multicloud experience, and modernize storage through data management.
Which software is best for making AI?
- #1) Google Cloud Machine Learning Engine.
- #2) Azure Machine Learning Studio.
- #3) TensorFlow.
- #4) H2O.AI.
- #5) Cortana.
- #6) IBM Watson.
- #7) Salesforce Einstein.
- Register the model.
- Prepare an entry script.
- Prepare an inference configuration.
- Deploy the model locally to ensure everything works.
- Choose a compute target.
- Deploy the model to the cloud.
- Test the resulting web service (a minimal code sketch of these steps follows below).
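The steps above map onto the Azure Machine Learning Python SDK. The following is a hedged, minimal sketch using the v1 SDK (azureml-core) and Azure Container Instances; all names, files, and paths are placeholders, and the newer v2 SDK and CLI express the same workflow differently.

```python
from azureml.core import Workspace, Model, Environment
from azureml.core.model import InferenceConfig
from azureml.core.webservice import AciWebservice

ws = Workspace.from_config()  # reads config.json for an existing workspace

# 1) Register the trained model file (placeholder name and path).
model = Model.register(workspace=ws, model_name="demo-model", model_path="outputs/model.pkl")

# 2-3) Entry script (score.py with init()/run()) plus an environment form the inference config.
env = Environment.from_conda_specification(name="demo-env", file_path="environment.yml")
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# 5) Compute target: a small Azure Container Instance for testing.
deployment_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

# 6) Deploy to the cloud and wait for the endpoint.
service = Model.deploy(ws, "demo-service", [model], inference_config, deployment_config)
service.wait_for_deployment(show_output=True)

# 7) Test the resulting web service with a sample payload.
print(service.run('{"data": [[1, 2, 3, 4]]}'))
```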
Modernize your business processes faster with Azure Applied AI Services. Applied AI Services bring together Azure Cognitive Services, task-specific AI, and business logic to offer you turnkey AI services for common business processes.
Is Microsoft Azure AI good?
Microsoft Azure Machine Learning provides high availability and is cost-effective for companies of any size. Its intelligent bot service supports fast, responsive customer interactions.
What are the four processes of Azure automated machine learning?
Using Azure Machine Learning, you can design and run automated ML training experiments. The first step is to identify the ML problem to be solved: classification, forecasting, regression, computer vision, or NLP.
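For reference, here is a hedged sketch of what submitting such an automated ML experiment looks like with the v1 Azure ML Python SDK (azureml-train-automl); the workspace, dataset, column, and compute names are placeholders, and the newer v2 SDK and CLI use different syntax.

```python
from azureml.core import Workspace, Experiment, Dataset
from azureml.train.automl import AutoMLConfig

ws = Workspace.from_config()
training_data = Dataset.get_by_name(ws, name="churn-training")  # registered tabular dataset

# Describe the automated ML experiment: task type, metric, data, and compute.
automl_config = AutoMLConfig(
    task="classification",            # the ML problem to solve
    primary_metric="AUC_weighted",
    training_data=training_data,
    label_column_name="churned",
    n_cross_validations=5,
    compute_target="cpu-cluster",
)

# Submit the experiment and retrieve the best model it finds.
run = Experiment(ws, "automl-churn").submit(automl_config, show_output=True)
best_run, fitted_model = run.get_output()
```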
Who uses Azure Machine Learning?
Azure Machine Learning is a cloud service for accelerating and managing the machine learning project lifecycle. Machine learning professionals, data scientists, and engineers can use it in their day-to-day workflows to train and deploy models and manage MLOps.
Which types of services are provided by Azure?
Azure offers four forms of cloud computing: infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS), and serverless functions.
Which are the two services in Azure that can be used to process data?
Two services that are especially important are Azure SQL Database and Azure Cosmos DB. Azure SQL Database is a managed service for hosting SQL Server databases (although it's not 100% compatible with SQL Server).
Why use Azure for machine learning?
If you have a lot of data to train your ML model, Microsoft Azure's Machine Learning API can help you accelerate the process. It provides powerful tools that help you build intelligent applications and gain insights from your data.
What types of services are supported by Azure Service Fabric?
Service Fabric is an open-source project, and it powers core Azure infrastructure as well as other Microsoft services such as Skype for Business, Intune, Azure Event Hubs, Azure Data Factory, Azure Cosmos DB, Azure SQL Database, Dynamics 365, and Cortana.
What is the difference between Azure Functions and Service Fabric?
Azure Functions are tiny: an Azure Function is really just a method call, whereas Service Fabric and App Services have focused on deploying complete services. As a result, a complete microservice may actually be made up of a collection of Azure Functions.
Is Azure Service Fabric still relevant?
Service Fabric is definitely not dying; in fact, it's evolving. Its evolution is not as public as that of other alternatives, but there are reasons for that: Service Fabric started out as an internal Microsoft product.
What problem does data fabric solve?
A data fabric is used to reduce the amount of data management required, and to provide a single point of control for managing resources and settings across multiple physical and virtual resources.
What is data fabric in simple words?
Data fabric is an architecture that facilitates the end-to-end integration of various data pipelines and cloud environments through the use of intelligent and automated systems.
What is the impact of data fabric?
A data fabric architecture eliminates the negative consequences of data silos. Remember, it's a tool that has an interconnected approach, integrating data across silos and creating a smoother data exchange and analysis flow within the same organization.
What is the smartest AI in 2023?
The best overall AI chatbot is the new Bing due to its exceptional performance, versatility, and free availability. It uses OpenAI's cutting-edge GPT-4 language model, making it highly proficient in various language tasks, including writing, summarization, translation, and conversation.
What is the easiest AI programming language?
Python is one of the most popular go-to choices for AI programmers. It has a variety of features that make it well suited for AI programming; the language is easy to learn and read.
How do you implement an AI model?
- Step 1: Identify the problem the AI solution should address.
- Step 2: Gather the right data and clean it.
- Step 3: Create the algorithms.
- Step 4: Train the algorithms.
- Step 5: Opt for the right platform.
- Step 6: Choose a programming language.
- Step 7: Deploy and monitor (a generic training sketch follows this list).
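To ground steps 2 through 4 (and the evaluation side of step 7), here is a minimal, framework-agnostic sketch using scikit-learn on a built-in toy dataset; it is illustrative only and not tied to any particular Azure service.

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Step 2: get (already clean) data and split it for honest evaluation.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Steps 3-4: choose an algorithm and train it.
model = RandomForestClassifier(n_estimators=100, random_state=42)
model.fit(X_train, y_train)

# Step 7 (monitoring, simplified): measure quality before and after deployment.
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))
```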
Azure OpenAI Service runs on the Azure global infrastructure to meet your production needs, such as critical enterprise security, compliance, and regional availability. Make your deployment more secure and trusted with role-based authentication and private network connectivity.
Does Azure AI require coding?
You do not need coding skills to use Microsoft Azure.
The Microsoft Azure web portal provides all the functionality you need to manage your cloud infrastructure without previous coding experience.
These services include Azure Cognitive Services, which covers a variety of services related to language and language processing (speech recognition, speech synthesis, translation), text recognition, and image and character recognition. The services can be used, for example, in various bot-based solutions.
What are the roles of an Azure AI engineer?
Azure AI engineers have experience developing solutions that use languages such as Python or C# and should be able to use REST-based APIs and software development kits (SDKs) to build secure image processing, video processing, natural language processing (NLP), knowledge mining, and conversational AI solutions on Azure ...
How difficult is Azure AI Fundamentals?
The AI-900 exam required by this certification is relatively easy and will also help prepare you for more specific certifications such as the Azure AI Engineer or Azure Data Scientist certifications.
Some questions are tricky, though, so make sure you understand the differences between the terms and can choose the best solution for a real environment; there is no straightforward rule for acing the exam.
What is Microsoft's AI called?
Microsoft recently announced its artificial intelligence (AI)-powered digital assistant named 'Copilot'. According to Microsoft CEO, Satya Nadella, the AI software has the potential to "Revolutionize how people operate fundamentally and unleash a new wave of productivity increase."
What are the 4 stages of an AI workflow?
- Digitalise and collect data. Collecting and storing data is one of the most important steps of the AI workflow. ...
- Transform and build model. ...
- Build and train. ...
- Execute. ...
- Translate the action.
There are three types of machine learning: Supervised Learning, Unsupervised Learning and Reinforcement Learning.
Which automation tool is used in Azure?
Terraform is an automation tool that allows you to define and create an entire Azure infrastructure with a single template format language: the HashiCorp Configuration Language (HCL).
What languages does Azure support?
Azure supports the most popular programming languages in use today, including Python, JavaScript, Java, .NET, and Go.
Who uses Azure?
Interestingly, 80 percent of the Fortune 500 companies use Azure services for their cloud computing needs. Azure supports multiple programming languages, including Java, Node.js, and C#. Another benefit of Azure is the number of data centers it has around the world.
What are the main types of Azure storage?
- Azure Blob Storage. Blob is one of the most common Azure storage types. ...
- Azure Files. Azure Files is Microsoft's managed file storage in the cloud. ...
- Azure Queue Storage. ...
- Azure Table. ...
- Azure Managed Disks.
- Limitless analytics with unmatched time to insight.
- Design AI with Apache Spark™-based analytics.
- Microsoft Purview. ...
- Hybrid data integration at enterprise scale, made easy.
- Provision cloud Hadoop, Spark, R Server, HBase, and Storm clusters.
- Azure Stream Analytics. ...
- Azure Machine Learning.
- Azure Blob Storage.
- Azure AD (Active Directory)
- Azure Cosmos DB (Database)
- Logic Apps.
- Azure Data Factory.
- Azure CDN (Content Delivery Network)
- Azure Backup.
- Azure API (Application Programming Interfaces) Management.
The Azure cloud platform is more than 200 products and cloud services designed to help you bring new solutions to life—to solve today's challenges and create the future. Build, run, and manage applications across multiple clouds, on-premises, and at the edge, with the tools and frameworks of your choice.
What are the main components of the Azure platform?
The Windows Azure runtime environment provides a scalable compute and storage hosting environment along with management capabilities. It has three major components: Compute, Storage and the Fabric Controller.
What data types are supported in Azure?
- bigint, int, smallint, tinyint, and all string types (ntext, nvarchar, char, …)
- float, real, decimal, numeric, and all string types (ntext, nvarchar, char, …)
- datetime, datetime2, datetimeoffset, and all string types (ntext, nvarchar, char, …)
- bigint, int, smallint, tinyint, bit, and all string types (ntext, nvarchar, char, …)
What is the benefit of using Azure?
Scalability: Azure allows organizations to scale their resources up or down as needed, which can help to reduce costs and ensure that they have the resources they need to meet demand.
Process Automation in Azure Automation allows you to automate frequent, time-consuming, and error-prone management tasks. This service helps you focus on work that adds business value. By reducing errors and boosting efficiency, it also helps to lower your operational costs.
What are the benefits and usage of Azure virtual machines?
An Azure virtual machine gives you the flexibility of virtualization without having to buy and maintain the physical hardware that runs it. However, you still need to maintain the virtual machine by performing tasks such as configuring, patching, and installing the software that runs on it.
What is the difference between a data catalog and a data fabric?
A data catalog is the foundation of a data fabric structure: it is the first layer. It supports the identification, collection, and analysis of all data sources and all types of metadata. The data catalog is a starting point for a data fabric.
What is a data fabric, for beginners?
Data fabric is an architecture that facilitates the end-to-end integration of various data pipelines and cloud environments through the use of intelligent and automated systems.
What is the difference between a data hub and a data fabric?
In my opinion, a data hub is the only concept that is a true architecture, as it defines a topology. The data fabric idea builds on many concepts introduced by the data hub, but really defines a stack of technologies with which to augment your data architectures.
Is data fabric the same as data virtualization?
Data fabric is used to simplify data discovery, governance, and active metadata management. Data virtualization is used when there is a need to integrate data quickly. Data fabric should be used when an organization requires a centralized platform to access, manage, and govern all data.
Who needs data fabric?
Businesses can employ a data fabric to harness data from customer activities and understand how interacting with customers can offer more value. This could include consolidating real-time data on different sales activities, the time it takes to onboard a customer, and customer satisfaction KPIs.
What is data fabric software?
Data fabric is an end-to-end data integration and management solution, consisting of architecture, data management and integration software, and shared data that helps organizations manage their data.
What are the components of a data fabric?
A data fabric connects to any data source via pre-packaged connectors and components, eliminating the need for coding. It provides data ingestion and integration capabilities, between and among data sources as well as applications, and it supports batch, real-time, and big data use cases.
What are the disadvantages of data fabric?
Cons include complexity, integration challenges, data security concerns, potential lack of vendor support, and limited integration options. These pros and cons are not exhaustive but provide a good starting point for organizations evaluating their options.
How do you implement a data fabric?
- Use a DataOps Process Model.
- Build A Data Fabric, Not A Data Lake.
- Understand Data Compliance Requirements.
- Use Graph-Based Analytics Instead Of Relational Databases.
- Use Open-Source Solutions.
The centralized data hub is a critical element of the data fabric. It's the place where all your company's data is accessed in a single location. The data hub stores and manages structured and unstructured data. It also lets you process data and run analytics from one location.
Is data fabric a technology?
Data fabric combines key data management technologies, such as data catalog, data governance, data integration, data pipelining, and data orchestration.
How does the data fabric enable technology transformation?
Data fabric is essential for powering digital transformation because it enables organizations to seamlessly weave together disparate sources of data and deliver integrated views. These views help with business intelligence, machine learning, advanced analytics, and other methodologies that drive business innovation.
What is an agile data fabric?
A data fabric is a modern distributed data architecture that includes shared data assets and optimized data fabric pipelines that you can use to address today's data challenges in a unified way.
Data becomes more transparent, from its lineage to its usage, which means it is readily apparent when something goes wrong. This builds trust because users have an efficient and reliable data system that allows them to increase the velocity at which they can leverage data to make decisions.