Daryl-Lynch-Bzy
Resident Rockstar

Dataflow Incremental Refresh Update Specific Partitions

I understand how a specific partition can be refreshed in a Power BI Dataset using SSMS or Tabular Editor via the XMLA Endpoints.  Thanks @GuyInACube 

My question is: how can I refresh a specific partition for a Power BI Dataflow? The XMLA Endpoint connection will only show the Datasets, because these are SSAS Tabular Models. The Dataflows are not SSAS; they are objects in the Data Lake.
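
For context, the Dataset-side partition refresh I mean can also be scripted rather than done interactively in SSMS or Tabular Editor. This is only a rough sketch, assuming a Premium workspace and an Azure AD token acquired separately; the workspace and dataset IDs, table and partition names are placeholders, and it uses the REST enhanced-refresh endpoint (which accepts an `objects` list) rather than the XMLA connection itself:

```python
# Hedged sketch: refresh a single dataset partition via the Power BI REST API.
# All IDs and names below are placeholders for illustration only.
import requests

ACCESS_TOKEN = "<AAD token with Dataset.ReadWrite.All>"  # obtained separately, e.g. via MSAL
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")

body = {
    "type": "full",
    "commitMode": "transactional",
    # Restrict the refresh to one historical partition instead of the whole table.
    "objects": [
        {"table": "FactSales", "partition": "FactSales-2021"}
    ],
}

resp = requests.post(url, json=body,
                     headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()  # 202 Accepted means the asynchronous refresh was queued
```

The point is that for a Dataset the request can name a single table/partition pair; that is exactly the knob I cannot find for a Dataflow.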

 

How can I refresh the partitions that fall between the 5-year store window and the 14-day refresh window without deactivating and re-activating the Incremental Load?

DarylLynchBzy_0-1642589961197.png

 

19 REPLIES
fragnn
New Member

The possibility to refresh a single partition in incremental dataflows is indeed a necessity for many.
I haven't found a way to do it yet. 

The product team should definitely add the feature.

I just added my vote to the corresponding idea: Microsoft Idea (powerbi.com)

 

v-janeyg-msft
Community Support

Hi, @Daryl-Lynch-Bzy 

 

Power BI Dataflows support incremental refresh; it doesn't matter that they are not SSAS. 

 

As long as these two conditions are met, incremental refresh of a dataflow can be performed:

  • Using incremental refresh in dataflows created in Power BI requires that the dataflow reside in a workspace in Premium capacity. Incremental refresh in Power Apps requires Power Apps Plan 2.
  • In either Power BI or Power Apps, using incremental refresh requires that source data ingested into the dataflow have a DateTime field on which incremental refresh can filter.

 

More details in doc:

Using incremental refresh with dataflows - Power Query | Microsoft Docs

 

Have you tried it? Any errors?

 

Did I answer your question? Please mark my reply as solution. Thank you very much.
If not, please feel free to ask me.

Best Regards,
Community Support Team _ Janey

@v-janeyg-msft  - Yes, the Incremental Refresh is up and running, but you have not understood the question.  I want to refresh some of the older partitions.  In my situation, the refresh only updates records within the last 14 calendar days.  I would like to update the partitions for 2021 to include additional data.  If my incremental load went into a Power BI Dataset, I could use the XMLA endpoint to refresh the older partitions.  However, I need to update the partitions in a Power BI Dataflow, and these do not show up in the XMLA Endpoint.

 

In theory, I should be able to overwrite the old partitions with my update.
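
To illustrate why this seems stuck: as far as I can tell, the only programmatic trigger for a dataflow refresh is the REST refresh call, and its body carries nothing that names a table or partition, so it can only run whatever the incremental refresh policy dictates. A rough sketch, with placeholder IDs:

```python
# Hedged sketch: trigger a dataflow refresh via the Power BI REST API.
# Note the body has no table/partition options, only a notification setting,
# so it cannot target an individual historical partition. IDs are placeholders.
import requests

ACCESS_TOKEN = "<AAD token with Dataflow.ReadWrite.All>"
GROUP_ID = "<workspace-guid>"
DATAFLOW_ID = "<dataflow-guid>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/dataflows/{DATAFLOW_ID}/refreshes")

resp = requests.post(url,
                     json={"notifyOption": "MailOnFailure"},
                     headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()
```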

Hi, @Daryl-Lynch-Bzy 

 

You can check it:

vjaneygmsft_0-1643074540193.png

vjaneygmsft_1-1643074946119.png

Can't you see the partition when you turn it on?

 

Did I answer your question? Please mark my reply as solution. Thank you very much.
If not, please feel free to ask me.

Best Regards,
Community Support Team _ Janey

 

@v-janeyg-msft 

This is the error message when using the SQL Database connection:

DarylLynchBzy_1-1643102049679.png

And there are only two visible options for the Analysis Services connector:

DarylLynchBzy_2-1643102138697.png

 

Hi, @Daryl-Lynch-Bzy 

 

In this doc:

incremental-refresh-implementation-details

More details:

vjaneygmsft_0-1643103765510.png

What's the data source of your dataflow? 

@v-janeyg-msft 
It is another Dataflow in the same workspace.  The JSON file is loaded into a Standard Dataflow, and the Incremental Dataflow reads from this initial Dataflow.  Are both the Standard and Incremental Dataflows not considered "Azure Data Lake Storage"?  Would the Workspace need to be configured to use Gen 2 storage? 

DarylLynchBzy_0-1643188885249.png

 

Or do both dataflows need to be switched to the "Enhanced Compute Engine"? (I thought this would be on by default because the Dataflows are linked.)

DarylLynchBzy_1-1643188954745.png

 



@Daryl-Lynch-Bzy 

 

The incrementally refreshed partitions are not visible in the service; you need to use external tools like SSMS. Your data source may not support this.

Incremental refresh for datasets and real-time data in Power BI - Power BI | Microsoft Docs

 

Best Regards,
Community Support Team _ Janey

 

Thanks for your help @v-janeyg-msft - I think I will need to close this and leave it as unsolved.  Perhaps the documentation on the following page needs updating:
Using incremental refresh with dataflows - Power Query | Microsoft Docs

DarylLynchBzy_0-1643284936663.png

I believe the sentence "After XMLA-endpoints for Power BI Premium are available, the partition become visible." is true for Datasets (because they are Analysis Services Tabular Databases), but it is not true for Dataflows.

 

Dataflows are JSON and CSV files in Power BI's Azure Data Lake storage.  These can't be accessed using the XMLA Endpoints because they are not Analysis Services objects.

Perhaps the documentation needs to be updated by @bensack, @DCtheGeek and @steindav to say that this is not possible, or to explain how it is possible.

Hi, @Daryl-Lynch-Bzy  @Anonymous 

 

Update: the XMLA Endpoint can only be used to connect to Power BI Premium Datasets for now, and Dataflows cannot use it to refresh a specific partition. The customer needs to either do a full refresh or re-configure the Incremental Refresh policy to include the year of data to be refreshed. 

 

Best Regards,
Community Support Team _ Janey

Hi, @Daryl-Lynch-Bzy 

 

vjaneygmsft_1-1643337378583.png

 

vjaneygmsft_0-1643337329389.png

I only have a SQL database data source. After I set up incremental refresh on the dataflow, I checked the refresh history and downloaded the results, and I can see the partition names in Excel.  As for what can be done with the partitions, I didn't find any settings in the service.

 

Best Regards,
Community Support Team _ Janey

Thanks for the link, @v-janeyg-msft 
The Incremental Dataflow sits between an initial Dataflow and the final Datasets.  The initial Dataflow imports a JSON file and transforms it into a couple of tables; these initial dataflow tables are limited to 30 days of activity.  The Incremental Dataflow is working as designed: each day it reads the last 30 days of activity, but only updates the last 14 days.   My incremental load contains data back through Oct 2021.  I was hoping to have the flexibility to update some of the old partitions, i.e. to add, remove or replace them.

@Daryl-Lynch-Bzy 

 

What's your refresh history? Does your JSON file support query folding?

@v-janeyg-msft 

Could I have a link to the documentation page please?  


No, I can't see the Dataflow in the XMLA Endpoint.  I only see the Datasets. 

 

Here is a screenshot of the capacities settings:

DarylLynchBzy_0-1643101959666.png

 

Are the Dataflows hidden from the list, and can I still connect with the right "Database" or "Initial Catalog" name?


Do I connect using Analysis Services, SQL Database, or a different connector?

Anonymous
Not applicable

Hello,

 

I'm also very interested in this topic.

 

Given a dataflow which has incremental refresh set up (keeping the last 12 months of data, with an incremental refresh window of 2 months), meaning every execution refreshes only the current month and the previous one, I would like to have a way to refresh a single partition from 4 months ago.
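
To make the window concrete, here is a tiny sketch (with hypothetical partition names and a simplified monthly granularity, not the service's actual naming) of which retained partitions a scheduled run would and would not touch under that 12-month / 2-month policy:

```python
# Hedged sketch of the partition-window arithmetic: 12 monthly partitions are kept,
# but only the 2 most recent are touched on each scheduled run, so a partition from
# 4 months ago is never reprocessed automatically.
from datetime import date

def month_partitions(today: date, store_months: int = 12, refresh_months: int = 2):
    """Return (partition_name, refreshed_each_run) for each retained monthly partition."""
    partitions = []
    y, m = today.year, today.month
    for offset in range(store_months):
        partitions.append((f"{y}-{m:02d}", offset < refresh_months))
        m -= 1
        if m == 0:
            y, m = y - 1, 12
    return partitions

for name, refreshed in month_partitions(date(2022, 1, 25)):
    print(name, "refreshed on each run" if refreshed else "static after initial load")
```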

 

For Datasets this is easy:

- Datasets are stored underneath in Analysis Services. This Analysis Services instance is accessible via SSMS (SQL Server Management Studio) using the workspace XMLA endpoint.

- From SSMS, a refresh of a single partition can be executed, so you are in full control of which partition you want to refresh. In my example, I could manually refresh my partition from 4 months ago.
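
For anyone who has not seen it, what SSMS sends in that scenario is essentially a TMSL refresh command naming one partition. The sketch below builds such a command with made-up database, table and partition names; the JSON it prints is the kind of thing you would paste into an XMLA query window in SSMS:

```python
# Hedged sketch: build a TMSL "refresh" command for a single dataset partition.
# Database, table and partition names are hypothetical placeholders.
import json

tmsl_refresh = {
    "refresh": {
        "type": "full",
        "objects": [
            {
                "database": "SalesDataset",        # the dataset as seen through the XMLA endpoint
                "table": "FactSales",
                "partition": "FactSales-2021-09",  # the single historical partition to reprocess
            }
        ],
    }
}

# Print the JSON to paste into an XMLA query window in SSMS (or run via a TMSL-capable tool).
print(json.dumps(tmsl_refresh, indent=2))
```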

 

For Dataflows:

- This is a completely different story. Dataflows are stored in ADLS, not in Analysis Services, so obviously they cannot be reached from SSMS via an Analysis Services connection. The same approach used for datasets does not apply to dataflows.

- The question is: is there any way to refresh one specific dataflow partition? (I know how to do a full refresh, and how to run an incremental refresh for the last n partitions. The question is how to process only one partition which is not in the range of the incremental refresh's last n partitions.)

 

Many thanks

I believe this is a required feature for many.
In my case, I have a huge table loaded into my dataflow and would like to process the partitions separately when I need to reprocess all of them.
The only option now is loading all of the partitions sequentially and then incrementally after that.

Can someone confirm that there is no way of processing dataflow partitions separately?

Hey Solvisig,

  • Can you please elaborate on the only option available? I am stuck in the same situation. I have a dataflow with 0.8 million rows. After setting up incremental refresh, when I refresh the dataflow it times out after 3 hours. I am thinking of doing the partitioning in SQL Server and then bringing the table into the dataflow. Will there be any way then to refresh the partitions one by one? If yes, please do tell. Also, if there are any other ways, it would be really helpful. Thanks in advance.

The only option is using the refresh button, which refreshes all the partitions sequentially for the initial load and then incrementally after that. There is currently no way to have control over your partitions when using dataflows. That's why I am not using dataflows for many of my workloads; I use standard datasets instead and can easily manage the partitions in Tabular Editor and other tools.

However, with Fabric now being the new hot thing, this must be on the development backlog.

This is definitely a required feature. I need to move a dataset currently working under incremental refresh and managed through the XMLA endpoint to a dataflow, so we can improve refresh performance by doing all of the transformation within dataflows. I cannot currently see a way to manage the partitioning beyond the initial setup of the incremental refresh.
