Hi Everyone!
Due to some refresh problems with a dataflow, I have a question that I couldn't find an answer to in the manuals or on the open internet.
Are there any limitations for dataflows in Power BI Pro using on-premises data gateway (enterprise mode)?
I'm using the predefined ADLS storage for Power BI, so I haven't found any options to check or modify these settings (I also checked the Azure portal), as is possible with Premium licences.
I know that in Premium capacity there are options to change, for example, the container size.
(e.g. https://blog.crossjoin.co.uk/2019/04/21/power-bi-dataflow-container-size/)
I'm wondering whether my failures refreshing data tables in the dataflow are caused by the default limitations of the storage settings, or perhaps by the local server.
Some more information (maybe useful):
- I'm using the standard, default dataflow storage,
- data source is SQL Server,
- the data table is quite big, but not enormous (index space 160 MB, row count >600k),
- I've tried refreshing the smaller table in separate dataflow and it went OK,
- I've tried refreshing the last 1k rows of that table in another separate dataflow and it also went OK,
- my next try was to prepare a new Power BI Desktop (.pbix) file with this table as the data source; I published it, and the dataset also refreshed correctly,
- the failure ID turns up nothing on the internet, and its text (taken from the gateway side) says something about bad credentials, which is quite weird, as I didn't change anything,
- the local SQL Server runs on quite old and "weak" hardware, so it is possible that the problems are caused by that 🙂
Looking forward to hearing from You, Thank You in advance!
Regards,
Piotr
Hi @PWasniewski ,
I think you ran into the refresh timeout (2 hours).
You could break up a big model into entities across 5-6 different dataflows in the same workspace, then stagger the refresh times.
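As a rough illustration (not part of the thread), staggering the refreshes of several dataflows just means giving each one a scheduled slot far enough apart that they don't compete for the gateway at the same time. The sketch below assumes hypothetical dataflow names and simply computes such a schedule; in practice you would set these times in the Power BI service (or trigger refreshes via the Power BI REST API).

```python
from datetime import datetime, timedelta

# Hypothetical dataflow names -- replace with your own workspace's dataflows.
DATAFLOW_IDS = ["df-sales", "df-customers", "df-products", "df-orders", "df-inventory"]

def staggered_schedule(start, gap_minutes, dataflow_ids):
    """Assign each dataflow a refresh slot `gap_minutes` apart,
    so no two refreshes run against the gateway concurrently."""
    return {
        df_id: start + timedelta(minutes=i * gap_minutes)
        for i, df_id in enumerate(dataflow_ids)
    }

# Example: nightly refreshes starting at 02:00, 30 minutes apart.
schedule = staggered_schedule(datetime(2024, 1, 1, 2, 0), 30, DATAFLOW_IDS)
for df_id, when in schedule.items():
    print(f"{df_id}: refresh at {when:%H:%M}")
```

With a 30-minute gap, even a slow refresh on one dataflow is unlikely to overlap with the next one, which also keeps each refresh comfortably inside the timeout window.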
Thank You for Your response!
The refresh failed after ~30 min, so it's far from the 2h limit 🙂
The second tip is more convenient, but I used another solution - I reduced the queries on that biggest table, and after that the refresh finally finished correctly.
Thank You once again!
Regards,
Piotr
Hi @PWasniewski ,
I am glad that you could find a solution. You could accept your own reply so other users can find it more quickly.