
Most Recent
ruicunha
Microsoft Employee

Unlock the full potential of teamwork in Microsoft Fabric! Discover how a simple, effective Git branching strategy—paired with smart workspace design—can transform collaboration, streamline development, and empower your teams to deliver high-quality analytics solutions faster. Whether you’re a data engineer, developer, or business user, this article reveals practical steps to minimize conflicts, boost agility, and bring software engineering best practices to your data projects in Microsoft Fabric.

Read more...

bhanu_gautam
Super User

Over the last few months, I’ve been diving deep into Microsoft Fabric — especially the analytics side of things. After passing the DP-700, I set my sights on its counterpart: the DP-600: Fabric Analytics Engineer Associate certification.

I’m excited to share that I’ve officially cleared the DP-600 exam — and in this article, I’ll walk you through my preparation strategy, what the exam covers, and some tips to help you succeed if you’re planning to take it next.

Read more...

Olufemi7
Frequent Visitor

Celebrating 10 years of Power BI and a future powered by Microsoft Fabric.

In finance, tools like OneLake and Data Activator are helping us unify data and automate insights faster than ever.

Here’s how Power BI and Fabric are shaping the next decade of finance analytics. 

What’s New: Microsoft Fabric, OneLake & Data Activator

With the launch of Microsoft Fabric, we’re stepping into a new era of finance analytics:
Finance Analytics with Microsoft Fabric

  • OneLake unifies your data across sources—no more scattered silos.
  • Data Activator adds automation to your insights—get alerts before problems escalate.
  • Direct Lake mode improves performance for large financial datasets.

This shift makes it easier than ever for finance teams to own their data pipelines without relying solely on IT.

 

bhanu_gautam
Super User

If you’re diving into Microsoft Fabric, you’ve likely heard about its powerful capabilities for unified analytics. One of its standout features? Notebooks. Whether you’re coming from a data science, engineering, or BI background, notebooks provide a flexible space to explore, transform, and visualize data — all within the Fabric ecosystem.

 

In this guide, we’ll walk through how to get started with Notebooks in Microsoft Fabric — from creating your first notebook to running code and visualizing results. No fluff, just a practical walkthrough to help you hit the ground running.

Read more...

pankaja_ms
Microsoft Employee

This post is the second in a series that helps Partners and Customers plan their Fabric configuration for new and existing deployments. You can find the first post in this series here:

https://community.fabric.microsoft.com/t5/Fabric-platform-Community-Blog/Microsoft-Fabric-CAF-Config...

 

Read more...

Rufyda
Kudo Kingpin

Working with Data in Fabric — Tables, Lakehouses & Warehouses
As organizations move toward unified analytics platforms, Microsoft Fabric offers two powerful data storage models: Lakehouses and Data Warehouses. Each has unique advantages, and understanding when and how to use them is essential for effective data engineering and analytics.

This article explains the difference between Lakehouses and Warehouses in Fabric, shows how to create simple tables and load data, and includes links to practical exercises to get started.

 

Lakehouse vs. Data Warehouse in Microsoft Fabric
🔹 What Is a Lakehouse?
A Lakehouse combines the scalability of a data lake with the structure of a data warehouse. It stores structured, semi-structured, and unstructured data in open formats such as Parquet and Delta.

Key Features:

• Supports files and tables in the same environment (see the sketch after this list).
• Ideal for big data analytics, data science, and machine learning.
• Works seamlessly with Notebooks, Spark, and PySpark.
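A minimal sketch of that first point, assuming a Fabric notebook with a default Lakehouse attached (the mount path is the standard default-Lakehouse mount, and spark is pre-defined in the notebook session):

import os

# Files side: browse the raw files stored in the default Lakehouse mount
for name in os.listdir("/lakehouse/default/Files"):
    print(name)

# Tables side: list the managed tables of the same Lakehouse via Spark SQL
spark.sql("SHOW TABLES").show()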

 

🔹 What Is a Data Warehouse?
A Data Warehouse in Fabric is optimized for structured data and BI reporting. It follows traditional SQL-based modeling and is best suited for OLAP workloads, business dashboards, and high-performance queries.

 

Creating Tables and Uploading Data
Let’s walk through how to create tables in both Lakehouses and Warehouses inside Microsoft Fabric.

🔹 In a Lakehouse:
1. Open your Lakehouse from the Fabric workspace.
2. Upload a CSV or Excel file.
3. Use the Data Engineering interface or a Notebook to create a Delta table:


# Read the uploaded CSV from the Lakehouse Files area, inferring column types
df = spark.read.csv("/Files/sales.csv", header=True, inferSchema=True)
# Save the DataFrame as a managed Delta table named SalesData
df.write.format("delta").saveAsTable("SalesData")
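Once saveAsTable completes, a quick sanity check (a minimal follow-up sketch, run in the same notebook session) is to preview the new table and inspect its Delta metadata:

# Preview a few rows of the new Delta table
spark.sql("SELECT * FROM SalesData LIMIT 5").show()

# DESCRIBE DETAIL is Delta-specific and reports the table's format, location, and size
spark.sql("DESCRIBE DETAIL SalesData").show(truncate=False)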


🔹 In a Warehouse:
1. Go to your Warehouse in Fabric.
2. Use the SQL Editor to create a table:

-- Fabric Warehouse uses varchar (nvarchar is not supported) and only unenforced primary keys
CREATE TABLE Products (
    ProductID INT NOT NULL,
    Name VARCHAR(100),
    Price DECIMAL(10,2),
    CONSTRAINT PK_Products PRIMARY KEY NONCLUSTERED (ProductID) NOT ENFORCED
);

3. Insert data manually or use the Load Data tool to import from OneLake, Excel, or an external source.
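For the manual route, here is a minimal T-SQL sketch (the rows are made-up sample values, purely for illustration):

-- Insert a few illustrative rows into the Products table
INSERT INTO Products (ProductID, Name, Price)
VALUES
    (1, 'Keyboard', 29.99),
    (2, 'Monitor', 189.50),
    (3, 'USB-C Cable', 9.75);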

 

Practice Labs and Learning Resources
Here are some official and community resources where you can practice working with Lakehouses and Warehouses:

🔗Implement a Lakehouse with Microsoft Fabric - Training | Microsoft Learn
🔗Implement a data warehouse with Microsoft Fabric DP-602T00 - Training | Microsoft Learn


Choosing between a Lakehouse and a Warehouse depends on your workload. For data science and big data exploration, Lakehouse is ideal. For business intelligence and structured data analytics, the Data Warehouse provides better performance and SQL capabilities.

 

 

Let’s connect on LinkedIn: https://www.linkedin.com/in/rufyda-abdelhadirahma/

suparnababu8
Super User

Thinking of taking the DP-700 exam? Here’s how I passed it—and why it could change your career in data engineering with Microsoft Fabric.


Read more...

suparnababu8
Super User

Are you looking to move your SQL Server on-premises data into the Microsoft Fabric Lakehouse for advanced analytics and scalability? You're in the right place!

 


Read more...

Rufyda
Kudo Kingpin

DP-600 is a Microsoft certification called:
Implementing Analytics Solutions Using Microsoft Fabric

In short: If you want to prove that you can work with modern data tools and build end-to-end data solutions, this exam is for you.
Learning Microsoft Fabric is exactly what you need to pass it.


 What can you do with Fabric?

Here are just a few examples:
• Combine messy data from different places (Excel, SQL, APIs)
• Clean and prepare it with easy tools
• Visualize it in Power BI (right inside Fabric!)
• Run machine learning models
• Share reports with your team instantly
 Let’s break down the core building blocks of Fabric

Tool: What it does

• Lakehouse: A place to store raw + processed data (like a folder in the cloud, but smarter)
• Warehouse: For structured, SQL-style analytics (think: tables and rows)
• Notebook: A coding environment for data science and AI
• Power BI: Build interactive dashboards and reports
• Pipelines: Automate data flows from source to report
Don’t worry if these sound new — we’ll explain each of them one by one in the next articles.

Microsoft Fabric is:
• A one-stop platform for all your data tasks
• Perfect for beginners who want to level up
• The core tool to learn if you're aiming for the DP-600 exam
• Easier than it sounds... and kinda fun once you get into it

 

Let’s connect on LinkedIn: https://www.linkedin.com/in/rufyda-abdelhadirahma/

jbarry15
Microsoft Employee

Overview

Over the past year, I’ve had the incredible opportunity to help Microsoft customers unlock the potential of Microsoft Fabric capacities (FSKUs), whether they are transitioning from Power BI Premium capacities (PSKUs) or just embarking on their Fabric journey. During these engagements, we’ve tackled not only technical topics such as features, architecture design, deployment patterns, and capacity planning, but also business considerations such as reservation purchasing strategies and scoping options that demystify how reservations work. The questions I’ve encountered most often are how to confidently navigate the reservation purchasing process, understand scoping flexibility, and calculate consumption units effectively to maximize both value and efficiency.

My hope is that this article will make your voyage smoother, offer clarity on these decisions, and empower you to make the most of your reservations when stepping into the world of Microsoft Fabric capacity planning!

Read more...

cpatra
Microsoft Employee


 

Logic App Solution to Pause Fabric Capacity:

"Take control of your Microsoft Fabric costs with a smart, automated Logic App that pauses unused capacity—seamlessly, securely, and on your schedule. Whether it's nights, weekends, or idle hours, this solution ensures you're only paying for what you use—no more, no less."

 

Read more...

jennratten
Super User

Calling all data enthusiasts!

 

Are you ready to level up your impact in the Microsoft Fabric Community? The new Fabric Super User landing page is where you can get all the details. Become the hero our community deserves!

Read more...

burakkaragoz
Community Champion

Facing issues with data integration, performance, or governance in Microsoft Fabric? This guide breaks down the most common challenges and offers practical, real-world solutions to help you get the most out of your Fabric environment.

Read more...

jensheerin
Microsoft Employee

In today's digital landscape, securing connections between services is more important than ever. This article will guide you through the process of connecting Microsoft Fabric to an Azure Database for PostgreSQL using Managed Private Endpoints (MPE). By following these steps, you can ensure that all outbound traffic remains on the private network, enhancing both security and performance. Whether you're an admin with Fabric workspace access or have sufficient rights in Azure, this comprehensive guide will provide you with the necessary prerequisites and step-by-step instructions to achieve a secure connection. Dive in to learn how to set up, approve, and validate your private endpoint, ensuring your Fabric environment can securely communicate with the Azure PostgreSQL database without exposing it to the public internet. 

Read more...

pankaja_ms
Microsoft Employee

This blog is the first in a series intended to help readers with the most commonly used configuration options available when working with Microsoft Fabric. The guidance in this article is aligned to the Microsoft Cloud Adoption Framework's 8 design areas.

Read more...

djurecicK2
Super User

Interested in learning the key differences between the new org apps (preview) and traditional workspace apps? Click here for an overview of the differences and a guide to creating your first org app.

Read more...

jenbeiser
Microsoft Employee

Managing tenant settings in a Fabric environment can be a complex task, especially when it comes to tracking changes, updates, and new features that are sometimes enabled by default. To simplify this process, I’ve outlined two options to ensure you stay up to date on modified, new, and upcoming Fabric settings.
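The two options themselves are in the full post. As one illustrative programmatic angle (my own sketch, not necessarily one of the author's options), the Fabric admin REST API exposes tenant settings, so a small script can snapshot them for later comparison; token acquisition is assumed to happen elsewhere:

import json
import requests

# Assumes an Entra ID access token with Fabric admin API permissions has already been acquired
# (for example via MSAL or the Azure CLI); the value below is a placeholder.
access_token = "<admin-api-access-token>"

# Snapshot the current tenant settings so future runs can be diffed against this file
resp = requests.get(
    "https://api.fabric.microsoft.com/v1/admin/tenantsettings",
    headers={"Authorization": f"Bearer {access_token}"},
)
resp.raise_for_status()

with open("tenant-settings-snapshot.json", "w") as f:
    json.dump(resp.json(), f, indent=2)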

Read more...

suparnababu8
Super User

In this article, I will demonstrate how to dynamically load a CSV file with space-delimited headers into a Lakehouse as Delta tables.
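The full walkthrough is behind the link; as a rough sketch of the general idea (my own minimal version, not the author's code, and reading "space-delimited headers" as column names that contain spaces): Delta tables reject column names with spaces by default, so the headers can be sanitized before saving. The file path and table name below are placeholders.

# Read a CSV whose column headers contain spaces (path is a placeholder)
df = spark.read.csv("Files/raw/space_headers.csv", header=True, inferSchema=True)

# Replace spaces in column names with underscores so the Delta write succeeds
renamed = df.toDF(*[c.replace(" ", "_") for c in df.columns])

# Save as a managed Delta table in the Lakehouse (table name is a placeholder)
renamed.write.format("delta").mode("overwrite").saveAsTable("space_header_demo")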

Read more...

SachinNandanwar
Super User

Let's say you have one or more CSV files that you want to convert to Parquet format and upload to a Lakehouse table. The available options in the Fabric environment are a notebook or a data pipeline; there aren't any pre-built, out-of-the-box solutions.

 

For instance, you might have an application that generates CSV files, and you want to upload the CSV data directly to the Lakehouse at that moment. One approach could be to have the application store the CSV files in ADLS Gen2 storage and use an event-based pipeline, triggered by storage events, to upload the data to the Lakehouse. However, what if storing the files in cloud storage isn't an option and the files will always be stored on on-premises storage?
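The author's full solution is in the linked post; for reference, the local half of the problem (converting a CSV to Parquet on an on-premises machine) can be sketched with pandas, assuming pandas and pyarrow are installed and using placeholder file names:

import pandas as pd

# Read the locally generated CSV (path is a placeholder)
df = pd.read_csv("exports/sales_2024.csv")

# Write it back out as Parquet; pandas delegates to pyarrow under the hood
df.to_parquet("exports/sales_2024.parquet", index=False)

Getting that Parquet file from local storage into a Lakehouse table is the part the article addresses.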

This article was originally published here.

Read more...

KevinChant
Super User

One Microsoft Fabric announcement during Microsoft Ignite that I think should get more attention is the fact that Sustainability data solutions in Fabric is now Generally Available (GA).

 

For those who are not aware, Sustainability data solutions in Fabric is one of the solutions offered as part of the Industry Solutions workload, which is now also GA and officially recognized as a Microsoft Fabric workload.

 

Microsoft Fabric architecture diagram with Industry Solutions highlighted

In this post, I intend to cover why I think this is significant. Along the way, I share some links.

Read more...

rsaprano
Most Valuable Professional

It’s official – the Fabric Analytics certification is the fastest growing in Microsoft history! But the associated DP-600 exam has a very broad syllabus, covering everything from data ingestion/transformation using T-SQL and KQL (for Real-Time Intelligence) to semantic modelling and managing a Fabric environment using CI/CD.

 

In this series of articles, I cover some of the key topics under each of the three syllabus areas (Maintaining Data Analytics Solutions, Prepare Data and Building Semantic models) with examples and scenarios. These are based on a DP-600 exam preparation guide prepared by Abu Bakar, which is available for free along with over 200 practice questions at MS Fabric Training (www.msfabrictraining.com).

 

This article focuses on Module 3 - Implement and manage semantic models - which comprises 25-30% of the overall exam syllabus.

Read more...

rsaprano
Most Valuable Professional

It’s official – the Fabric Analytics certification is the fastest growing in Microsoft history! But the associated DP-600 exam has a very broad syllabus, covering everything from data ingestion/transformation using T-SQL and KQL (for Real-Time Intelligence) to semantic modelling and managing a Fabric environment using CI/CD.

 

In this series of articles, I cover some of the key topics under each of the three syllabus areas (Maintaining Data Analytics Solutions, Prepare Data and Building Semantic models) with examples and scenarios. These are based on a DP-600 exam preparation guide prepared by Abu Bakar, which is available for free along with over 200 practice questions at MS Fabric Training (www.msfabrictraining.com).

 

This article focuses on Module 2 - Prepare Data - which comprises 40-45% of the overall exam syllabus.

Read more...

rsaprano
Most Valuable Professional

It’s official – the Fabric Analytics certification is the fastest growing in Microsoft history! But the associated DP-600 exam has a very broad syllabus, covering everything from data ingestion/transformation using T-SQL and KQL (for Real-Time Intelligence) to semantic modelling and managing a Fabric environment using CI/CD.

 

In this three-part series of articles, I cover some of the key topics under each of the three syllabus areas (Maintaining Data Analytics Solutions, Prepare Data and Building Semantic models) with examples and scenarios. These are based on a DP-600 exam preparation guide prepared by Abu Bakar and me, which is available for free along with over 200 practice questions at MS Fabric Training (www.msfabrictraining.com).

 

This article covers Module 1 - Maintaining Data Analytics solutions - which comprises 25-30% of the overall syllabus.

Read more...

Helpful resources

Join Blog
Interested in blogging for the community? Let us know.