Hi Everyone!
Due to some refresh problems with a dataflow, I have a question for which I couldn't find an answer in the documentation or on the internet.
Are there any limitations for dataflows in Power BI Pro using an on-premises data gateway (enterprise mode)?
I'm using the predefined ADLS storage for Power BI, so I haven't found any options to check or modify these settings (I also checked the Azure portal), as is possible with Premium licences.
I know that in Premium capacity there are options to change, for example, the container size
(e.g. https://blog.crossjoin.co.uk/2019/04/21/power-bi-dataflow-container-size/)
I'm wondering whether my failures refreshing data tables in the dataflow are caused by the default limits of the storage settings, or perhaps by the local server.
Some more information (maybe relevant):
- I'm using the standard, default dataflow storage,
- the data source is SQL Server,
- the data table is quite big, but not enormous (index space 160 MB, row count >600k),
- I've tried refreshing a smaller table in a separate dataflow and it went OK,
- I've tried refreshing the last 1k rows of that table in another separate dataflow and it also went OK,
- my next try was to prepare a new Power BI Desktop file (.pbix) with this table as the data source; I published it and the dataset also refreshed correctly,
- searching for the failure ID turns up nothing on the internet, and its text (taken from the gateway side) says something about bad credentials, which is quite weird, as I didn't change anything (see the connectivity sketch after this list),
- the local SQL Server runs on quite old and "weak" hardware, so it's possible that the problems are caused by that 🙂
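Since the gateway-side error mentions bad credentials, one quick way to separate a credential problem from a dataflow or storage limit is to test the same SQL Server login directly from the gateway machine. Below is a minimal sketch in Python using pyodbc; the server, database, table, login, and ODBC driver version are all placeholders, not values from this thread.

```python
# Minimal connectivity check, assuming pyodbc and the
# "ODBC Driver 17 for SQL Server" ODBC driver are installed on the
# gateway machine. All names below are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=my-sql-server;"    # hypothetical server name
    "DATABASE=MyDatabase;"     # hypothetical database
    "UID=gateway_user;"        # the same login the gateway data source uses
    "PWD=secret;"
    "Connection Timeout=30;"
)

try:
    conn = pyodbc.connect(conn_str)
    cursor = conn.cursor()
    # Count rows in the large table to confirm both the credentials and
    # the read path work outside of the dataflow refresh.
    cursor.execute("SELECT COUNT(*) FROM dbo.MyBigTable")
    print("Row count:", cursor.fetchone()[0])
    conn.close()
except pyodbc.Error as err:
    # A login error here points at the credentials stored in the gateway,
    # not at dataflow storage limits.
    print("Connection failed:", err)
```

If this succeeds with the same login the gateway uses, the "bad credentials" text in the gateway log may be a secondary symptom of the refresh failing for another reason rather than the root cause.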
Looking forward to hearing from you, thank you in advance!
Regards,
Piotr
Hi @PWasniewski ,
I think you ran into the refresh timeout (2 hours).
You could break a big model up into entities across 5-6 different dataflows in the same workspace, then stagger the refresh times.
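If you do split the model into several dataflows, the refreshes can also be triggered one after another through the Power BI REST API instead of relying on overlapping schedules. The sketch below is only an illustration: the workspace and dataflow IDs are placeholders, the access token is assumed to be acquired separately (e.g. via Azure AD), and the 15-minute pause is arbitrary.

```python
# Hedged sketch: request refreshes for several dataflows in sequence via
# the Power BI REST API endpoint
# POST /v1.0/myorg/groups/{groupId}/dataflows/{dataflowId}/refreshes.
# GROUP_ID, DATAFLOW_IDS, and ACCESS_TOKEN are placeholders.
import time
import requests

GROUP_ID = "00000000-0000-0000-0000-000000000000"   # workspace (group) ID
DATAFLOW_IDS = [                                     # the split-out dataflows
    "11111111-1111-1111-1111-111111111111",
    "22222222-2222-2222-2222-222222222222",
]
ACCESS_TOKEN = "<Azure AD access token>"             # acquired separately

headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

for dataflow_id in DATAFLOW_IDS:
    url = (
        f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
        f"/dataflows/{dataflow_id}/refreshes"
    )
    # Kick off the refresh without e-mail notifications.
    resp = requests.post(url, headers=headers,
                         json={"notifyOption": "NoNotification"})
    resp.raise_for_status()
    print(f"Refresh requested for dataflow {dataflow_id}")
    # Stagger the refreshes so they don't compete for gateway resources.
    time.sleep(15 * 60)
```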
Thank you for your response!
The refresh failed after ~30 min, so it's far from the 2-hour limit 🙂
The second tip is more convenient, but I used another solution - I reduced the queries on that biggest table, and after that the refresh finally finished correctly.
Thank you once again!
Regards,
Piotr
Hi @PWasniewski ,
I am glad that you could find a solution. You could accept your own reply as the solution so that other users can find it more quickly.