
Most Recent
rajendraongole1
Super User

Unleash the full potential of your Lakehouse! We show you the critical steps to deploy AI Data Agents in Microsoft Fabric—the breakthrough solution that allows anyone to talk to their data, not code it.

Read more...

vojtechsima
Super User

tl;dr The Fabric Lakehouse has a SQL Analytics Endpoint (SAE) that lets you query your data with SQL. It reads from your Lakehouse Delta tables, which live as Parquet files plus Delta logs. SAE sits on top of that storage, uses the metadata, and exposes a SQL-friendly layer. That layer can lag a bit behind the Lakehouse, so you may want an extra sync step to keep data ready for Power BI, for example. For that, you can call the Fabric REST API's Refresh SQL Endpoint Metadata operation.
 
Disclaimer: This article talks about the Microsoft Fabric Lakehouse item and its SQL Analytics Endpoint. I also don't use the Lakehouse SQL analytics endpoint's built-in semantic model in this case.
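
As a rough illustration (not from the post itself), the refresh can be triggered with a plain HTTP call. The sketch below assumes you already have an Entra ID bearer token with Fabric API permissions plus the workspace and SQL analytics endpoint IDs (all placeholders); check the current Refresh SQL Endpoint Metadata documentation for the exact route and any preview query parameter.

import requests

WORKSPACE_ID = "<workspace-id>"        # placeholder
SQL_ENDPOINT_ID = "<sql-endpoint-id>"  # placeholder
TOKEN = "<bearer-token>"               # placeholder Entra ID token for the Fabric API

# The operation lives under the workspace's sqlEndpoints collection; verify the path
# (and whether a preview flag is still required) against the official docs.
url = (
    "https://api.fabric.microsoft.com/v1/"
    f"workspaces/{WORKSPACE_ID}/sqlEndpoints/{SQL_ENDPOINT_ID}/refreshMetadata"
)
response = requests.post(url, headers={"Authorization": f"Bearer {TOKEN}"})
print(response.status_code)  # a 202 means the sync is running as a long-running operation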
Read more...

rajendraongole1
Super User

I recently got a chance to explore GraphQL, and it was an absolute revelation. I realized I could finally have a more elegant way to expose the clean, curated data from our new Microsoft Fabric environment to modern applications. This blog post is a summary of that journey, explaining exactly what GraphQL is, why you need it, and how it transforms data access.
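
As a quick taste (hypothetical, not taken from the post): a GraphQL API is just an HTTP endpoint you POST a query document to, and it returns only the fields you asked for. The endpoint URL and field names below are made up for illustration; a Fabric API for GraphQL item exposes its own endpoint, and the available fields depend on which tables you expose.

import requests

GRAPHQL_URL = "<your-graphql-api-endpoint>"  # placeholder
TOKEN = "<bearer-token>"                     # placeholder Entra ID token

# Ask for exactly the fields the application needs, nothing more.
query = """
query {
  products(first: 10) {
    items { ProductID Name Price }
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query},
    headers={"Authorization": f"Bearer {TOKEN}"},
)
print(response.json())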

Read more...

KevinChant
Super User

In this post I want to cover studying for Fabric certifications in the age of AI, in order to highlight some AI-related aspects to others, including the Ask Learn functionality.

Read more...

bhanu_gautam
Super User

Over the last few months, I’ve been diving deep into Microsoft Fabric — especially the analytics side of things. After passing the DP-700, I set my sights on its counterpart: the DP-600 (Fabric Analytics Engineer Associate) certification.

I’m excited to share that I’ve officially cleared the DP-600 exam — and in this article, I’ll walk you through my preparation strategy, what the exam covers, and some tips to help you succeed if you’re planning to take it next.

Read more...

bhanu_gautam
Super User

If you’re diving into Microsoft Fabric, you’ve likely heard about its powerful capabilities for unified analytics. One of its standout features? Notebooks. Whether you’re coming from a data science, engineering, or BI background, notebooks provide a flexible space to explore, transform, and visualize data — all within the Fabric ecosystem.

 

In this guide, we’ll walk through how to get started with Notebooks in Microsoft Fabric — from creating your first notebook to running code and visualizing results. No fluff, just a practical walkthrough to help you hit the ground running.
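
As a tiny, illustrative first cell (not from the guide itself): Fabric notebooks hand you a ready-made Spark session as spark and a display() helper, so you can build and inspect a DataFrame in just a few lines.

# Build a small DataFrame with the notebook's built-in Spark session
data = [("Contoso", 120), ("Fabrikam", 95)]
df = spark.createDataFrame(data, ["Customer", "Orders"])

# display() renders the DataFrame as an interactive table in the notebook output
display(df)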

Read more...

Rufyda
Memorable Member

Working with Data in Fabric: Tables, Lakehouses & Warehouses
As organizations move toward unified analytics platforms, Microsoft Fabric offers two powerful data storage models: Lakehouses and Data Warehouses. Each has unique advantages, and understanding when and how to use them is essential for effective data engineering and analytics.

This article explains the difference between Lakehouses and Warehouses in Fabric, shows how to create simple tables and load data, and includes links to practical exercises to get started.

 

Lakehouse vs. Data Warehouse in Microsoft Fabric
🔹 What Is a Lakehouse?
A Lakehouse combines the scalability of a data lake with the structure of a data warehouse. It stores structured, semi-structured, and unstructured data in open formats such as Parquet and Delta.

Key Features:

Supports files and tables in the same environment.
Ideal for big data analytics, data science, and machine learning.
Works seamlessly with Notebooks, Spark, and PySpark.

 

🔹 What Is a Data Warehouse?
A Data Warehouse in Fabric is optimized for structured data and BI reporting. It follows traditional SQL-based modeling and is best suited for OLAP workloads, business dashboards, and high-performance queries.

 

Creating Tables and Uploading Data
Let’s walk through how to create tables in both Lakehouses and Warehouses inside Microsoft Fabric.

🔹 In a Lakehouse:
1. Open your Lakehouse from the Fabric workspace.
2. Upload a CSV or Excel file.
3. Use the Data Engineering interface or a Notebook to create a Delta table:


# Read the uploaded CSV from the Lakehouse Files area, using the first row as headers
df = spark.read.csv("/Files/sales.csv", header=True, inferSchema=True)
# Save it as a managed Delta table in the Lakehouse Tables area
df.write.format("delta").saveAsTable("SalesData")
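
Once the write finishes, you can confirm the Delta table landed by querying it back in the same notebook; this quick check is illustrative rather than part of the original steps:

# Query the new Delta table through Spark SQL and print a few rows
spark.sql("SELECT * FROM SalesData LIMIT 10").show()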


🔹 In a Warehouse:
1. Go to your Warehouse in Fabric.
2. Use the SQL Editor to create a table:

CREATE TABLE Products (
    ProductID INT PRIMARY KEY,
    Name NVARCHAR(100),
    Price DECIMAL(10,2)
);

3. Insert data manually or use the Load Data tool to import from OneLake, Excel, or an external source.

 

Practice Labs and Learning Resources
Here are some official and community resources where you can practice working with Lakehouses and Warehouses:

🔗Implement a Lakehouse with Microsoft Fabric - Training | Microsoft Learn
🔗Implement a data warehouse with Microsoft Fabric DP-602T00 - Training | Microsoft Learn


Choosing between a Lakehouse and a Warehouse depends on your workload. For data science and big data exploration, Lakehouse is ideal. For business intelligence and structured data analytics, the Data Warehouse provides better performance and SQL capabilities.

 

 

Let’s connect on LinkedIn: https://www.linkedin.com/in/rufyda-abdelhadirahma/

suparnababu8
Super User

Thinking of taking the DP-700 exam? Here’s how I passed it—and why it could change your career in data engineering with Microsoft Fabric.


Read more...

Rufyda
Memorable Member

DP-600 is a Microsoft certification called:
Implementing Analytics Solutions Using Microsoft Fabric

In short: If you want to prove that you can work with modern data tools and build end-to-end data solutions, this exam is for you.
Learning Microsoft Fabric is exactly what you need to pass it.


 What can you do with Fabric?

Here are just a few examples:
• Combine messy data from different places (Excel, SQL, APIs)
• Clean and prepare it with easy tools
• Visualize it in Power BI (right inside Fabric!)
• Run machine learning models
• Share reports with your team instantly
Let’s break down the core building blocks of Fabric:

• Lakehouse: A place to store raw + processed data (like a folder in the cloud, but smarter)
• Warehouse: For structured, SQL-style analytics (think: tables and rows)
• Notebook: A coding environment for data science and AI
• Power BI: Build interactive dashboards and reports
• Pipelines: Automate data flows from source to report
Don’t worry if these sound new — we’ll explain each of them one by one in the next articles.

Microsoft Fabric is:
• A one-stop platform for all your data tasks
• Perfect for beginners who want to level up
• The core tool to learn if you're aiming for the DP-600 exam
• Easier than it sounds... and kinda fun once you get into it

 

Let’s connect on LinkedIn: https://www.linkedin.com/in/rufyda-abdelhadirahma/

burakkaragoz
Community Champion

Facing issues with data integration, performance, or governance in Microsoft Fabric? This guide breaks down the most common challenges and offers practical, real-world solutions to help you get the most out of your Fabric environment.

Read more...
