Anonymous
Not applicable

Really need help with dataflow setup

Hello Community:  

 

We have a P1 SKU.  

 

We use dataflows to connect to our ERP system and also to bring in SF data.   The SF dataflows have no issues.  

 

Where I am running into trouble is with a few of our ERP dataflows, and specifically ones that we have broken up into multiple separate dataflows.   For example: 

 

We have 2021 Orders, 2022 Orders, and 2023 Orders. We then have a "salesordersall" dataflow which contains all three of those. Same for shipments. I refresh all of them, including the singular dataflows and the ones that contain all of them.

 

Of course all of these then feed into a dataset.  

 

I'm having issues such as the refreshes causing blank rows to come into the dataset (likely because of some sort of overlap), or getting a "cannot acquire lock for model because it is in use" error.

I do have the singular dataflows "chained" in the "all" dataflows. I even tried turning off the refreshes for the singular dataflows, thinking the aggregated "chained" dataflows would refresh all of the ones inside them (they didn't).

Any guidance is very appreciated!

4 REPLIES
Anonymous
Not applicable

@collinq    Let me digest your answer a bit, but first, thank you for the response!  

So, yes, the 2021 dataflows are not being refreshed, but the 2022 sales orders dataflow is, because we have orders in ERP that have not shipped yet, and when they do, the data in that dataflow will change (for example, a status flag going from "open" to "shipped"). And I can turn off the 2021 and 2022 shipment dataflows as well.

As for incremental refresh, I did not think I could use it, based on similar logic. If I decided to only refresh the last 3 months of data on the sales orders dataflow, and we have a backlog order from 4 months ago that ships tomorrow, I don't think it would be properly updated to show the new status. But certainly with data that is 100% static, I could use that. Perhaps add a condition to the logic that says if Order Status = Complete then don't refresh.
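
Something like this is what I have in mind for that condition - just a sketch, and the entity and column names are placeholders rather than our real ones:

    // Re-pull only the rows that can still change; the SalesOrders2022
    // entity name and [Order Status] column are hypothetical.
    let
        Source = SalesOrders2022,
        OpenOnly = Table.SelectRows(Source, each [Order Status] <> "Complete")
    in
        OpenOnly

Although, thinking about it, a filter like that would drop the completed rows from the entity entirely, so I would also need to keep a static entity of completed orders and append it back in.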

I think the key is your last sentence, which is the crux of my question. I have three objects that essentially need updating, and proper staging. In simple terms, putting aside 2022: I have a 2023 dataflow, an "all" dataflow, and the dataset. Imagine I want our company to have fresh data (I use import mode for various reasons) at least once an hour. If the 2 dataflows have to refresh independently before the dataset can, then essentially they have to be scheduled 30 minutes apart (filling the hour), and then the dataset on the next half-hour slot. So 1.5 hours transpire from the first dataflow refresh to the dataset being current.

The aggregated dataflows feed into a dataset that is composed of many other ERP and SF related dataflows.



Hey @Anonymous ,

 

A quick clarification for 2022 or 2023 - you should use a date field like "last updated" and not a range of dates (like only getting the last 3 months of data). That way, if a 12-month-old or a 1-day-old record is updated, it will be in the refresh.
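
For example (just a sketch - the server, table, and column names are placeholders), the filter in the query would look something like this. In a dataset you define the RangeStart and RangeEnd DateTime parameters yourself; a Premium dataflow fills in the window for you once you pick the DateTime column in the incremental refresh settings, but the shape is the same:

    let
        Source = Sql.Database("erp-server", "ERP"),   // assumed ERP connection
        Orders = Source{[Schema = "dbo", Item = "SalesOrders"]}[Data],
        // Keep only rows whose "last updated" stamp falls inside the
        // refresh window, so any changed record comes back regardless of age.
        Filtered = Table.SelectRows(Orders,
            each [Last Updated] >= RangeStart and [Last Updated] < RangeEnd)
    in
        Filtered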

 

You are right about the last issue - it will take 1.5 hours, not every 30 minutes. BUT, maybe you can get the time down if you are using incremental refresh??? I understand about import and I strongly prefer import for most things most of the time. If you can't change your connection method then I think you are stuck with the 1.5-hour problem.

 

One way to get that 1.5-hour problem down is to go with Premium. You may be able to use PPU rather than full Premium, and that will give you faster dataflow times. Note - everybody that looks at that report would then need a PPU or Premium license. This comes down to "what is 1.5-hour data worth vs. .5- or .75-hour data" to your organization - that is the question for going with Premium.

As a test, you might get one user ID on PPU, run the same queries, and see how much faster that is. That will tell you how much time you can save and thereby help determine whether it is worth going with Premium licensing.

 

 








Anonymous
Not applicable

Yes, we have a P1 SKU.   So pretty fast.  

 

I attempted once to use Power Automate to automate the refreshing of the dataset immediately following a successful dataflow refresh, but it appears our Power Automate license wouldn't allow it. Great tip about the "last updated" field - that makes sense and I will incorporate it. My understanding is that any modification to the dataflow will require me to redo the incremental refresh. Not sure if that is true or not.

collinq
Super User

Hi @Anonymous ,

 

My first question is - Why are you refreshing the 2021 and 2022 orders?  Is that data changing?  Shouldn't it be static now?  If so, I would not refresh those two at all.

Then, that leaves the 2023 data. Are you using incremental refresh? That would trim the length of time down dramatically. You don't have to use incremental refresh, but not doing so means that refresh times will just get longer and longer.

Are you appending into a Dataflow that you are using as a dataset or are you using an actual dataset where you are appending the three outputs?  

 

In a nutshell, I would break this into three separate Dataflows - 2021 Orders and Shipments, 2022 Orders and Shipments, 2023 Orders and Shipments - and once I have a successful 2021 and 2022 refresh, I would never refresh those again. Then, I would just refresh the 2023 Dataflow (with or without incremental refresh). Then, I would have Dataflow 4, which takes the other 3 dataflows and appends the data, and that would be my result set to hit in my Power BI reports. That means that I only have to worry about Dataflow 3's refresh time and make sure that Dataflow 4's refresh runs after Dataflow 3 is totally done.
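
As a rough sketch of Dataflow 4 (assuming the three yearly dataflows are brought in as linked entities named Orders2021, Orders2022 and Orders2023 - the names are just placeholders), the computed entity can be as simple as an append:

    let
        // Append the three yearly linked entities into one result set.
        Combined = Table.Combine({Orders2021, Orders2022, Orders2023})
    in
        Combined

Because it is a computed entity over linked entities, it only re-reads the stored dataflow output rather than hitting the ERP again.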







