
Dhairya
Solution Supplier

Does Dataflow Gen2 Store Data?

In Fabric, if we create a Dataflow Gen2, does it store data with or without specifying a destination? When importing data into Power BI Desktop using the Dataflow Gen2 as a source, it displays the data, even though I haven't specified a destination while creating the dataflow.

1 ACCEPTED SOLUTION
v-nuoc-msft
Community Support

Hi @Dhairya 

 

When you create a Dataflow Gen 2, you have the option to specify a data destination for each query in your dataflow.

 

This means you can choose where to store the output of your data transformations and use different destinations for different queries within the same dataflow.

 

However, specifying a data destination is not mandatory. If you do not specify a data destination, the data will be stored in the dataflow’s internal storage by default.

 

Dataflows gen 2 data destinations and managed settings | Microsoft Fabric Blog | Microsoft Fabric

 

You can also configure your dataflow storage to use your own Azure Data Lake Storage Gen2 account instead of the default internal storage.

 

In that case, the ADLS Gen2 account acts as the storage location for your dataflow output. The integration of Power BI dataflows with ADLS Gen2 enables enhanced data management, storage, and security features.

 

Configuring dataflow storage to use Azure Data Lake Gen 2 - Power BI | Microsoft Learn

 

Once your dataflow is configured to store data in ADLS Gen2, you can import this data into Power BI Desktop using the Azure Data Lake Storage Gen2 connector.

 

This process involves connecting to your ADLS Gen2 account from Power BI Desktop and navigating to the data you wish to import.
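
For illustration, here is a minimal Python sketch of reading a dataflow's output directly from an ADLS Gen2 account with the azure-storage-file-datalake SDK. The account, container, folder, and file names below are placeholders, and the layout of your dataflow's output folder may differ:

```python
# Minimal sketch: list and download a dataflow's output files from ADLS Gen2.
# The storage account, container, folder, and file names are placeholders for
# values from your own workspace configuration.
import io

import pandas as pd
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # placeholder
FILE_SYSTEM = "<container>"                                     # placeholder
FOLDER = "<workspace>/<dataflow>"                               # placeholder

# Authenticate with whatever credential you normally use (here: Azure AD via azure-identity).
service = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())
fs = service.get_file_system_client(FILE_SYSTEM)

# List the files the dataflow wrote to its folder.
for path in fs.get_paths(path=FOLDER):
    print(path.name)

# Download one output file and load it into pandas for a quick look.
file_client = fs.get_file_client(f"{FOLDER}/<entity>.csv")  # placeholder file name
data = file_client.download_file().readall()
df = pd.read_csv(io.BytesIO(data))
print(df.head())
```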

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


5 REPLIES

Hi @v-nuoc-msft 

Thank you for your valuable answer.
"If you do not specify a data destination, the data will be stored in the dataflow’s internal storage by default."
Does that mean that if we specify a destination, the data will not be stored internally?

Hi @Angith_Nair 

 

It depends on the type of data destination you specify.

 

Some data destinations, such as Azure Data Lake Storage or Azure SQL Database, allow you to copy the data from the dataflow’s internal storage to the destination, while others, such as Power BI datasets, allow you to move the data from the internal storage to the destination.

 

If you copy the data, it will still be stored internally, but if you move the data, it will not be stored internally.

 

Regards,

Nono Chen

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Got it. Thanks a lot.

3CloudThomas
Super User

Yes, there is an Azure storage account plus container hidden behind the scenes that stores the data in parquet format.
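
For illustration only: if you also land that data in an ADLS Gen2 destination you control, the parquet files can be read with standard tooling. Below is a minimal Python sketch using pandas with the adlfs filesystem; the path and account name are placeholders, and the internal staging storage that Fabric manages is not meant to be accessed directly:

```python
# Minimal sketch: read a parquet file from ADLS Gen2 into pandas.
# Requires pandas, pyarrow, and adlfs. The path below is a placeholder for a
# destination you configured yourself; Fabric's internal staging storage is
# managed by the service and is not intended for direct access.
import pandas as pd

df = pd.read_parquet(
    "abfss://<container>@<storage-account>.dfs.core.windows.net/<folder>/<file>.parquet",
    storage_options={"account_name": "<storage-account>", "anon": False},
)
print(df.dtypes)
print(df.head())
```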
