Hi Community,
I am receiving an error when trying to refresh a dataset on the Power BI service:
When refreshing, there is a memory error, and each time I try, the reported database size (before command execution) is different. The data source is not particularly large: a parquet file in Azure Blob Storage with a few thousand rows. Another report in the same workspace refreshes its dataset fine (same-size parquet file), and I run refreshes on larger files with no issue.
Where can I start to look for a root cause? Increasing the memory of the Premium capacity is not an option at this point.
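For reference, this is how I have been pulling the failure details: a minimal Python sketch against the Power BI REST API refresh-history endpoint (the token acquisition, workspace ID, and dataset ID are placeholders for my tenant's values):

```python
import requests

# Hypothetical placeholders: supply a real AAD access token plus the
# workspace (group) and dataset IDs from the dataset's URL in the service.
ACCESS_TOKEN = "<aad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

# List the most recent refresh attempts; failed attempts carry the error
# payload in the serviceExceptionJson field.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=5"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

for refresh in resp.json()["value"]:
    print(refresh["startTime"], refresh["status"],
          refresh.get("serviceExceptionJson"))
```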
@aj1973 is there a reason that this particular dataset exceeds its memory allocation while larger datasets do not (all on the same capacity)? I also do not understand why the dataset size (before command execution) varies between attempts.
This might help
Regards
Amine Jerbi
If I answered your question, please mark this thread as accepted
and you can follow me on
My Website, LinkedIn and Facebook
It is clear that the dataset is located in a shared-capacity workspace instead of a Premium one. You have no choice but to increase the capacity of the workspace or to optimise your dataset.
Regards
Amine Jerbi
What size dataset would you consider small? Ours is 9 MB, yet we seem to exceed the 3,000 MB capacity during a refresh, which suggests something is not right. We also have very limited operations in Power Query for this dataset.
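One thing worth noting: parquet is compressed columnar storage, so the 9 MB on-disk size says little about how large the data becomes once expanded in memory during refresh. A quick local check with pandas (the file path is a placeholder; reading parquet requires pyarrow or fastparquet):

```python
import pandas as pd

# Hypothetical path: point this at the same parquet file the dataset loads.
df = pd.read_parquet("file.parquet")

print(f"rows: {len(df)}")
# deep=True counts the actual bytes held by object/string columns, which is
# where a small compressed parquet file can expand dramatically in memory.
print(f"in-memory size: {df.memory_usage(deep=True).sum() / 1_000_000:.1f} MB")
```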
Can you please show 'Manage Storage' from the workspace where the dataset related to your report is hosted?
Regards
Amine Jerbi
@aj1973 in my dataset settings in the workspace, I can see this:
The dataset in question, the one that is not refreshing, is the second one (the first, a very similar dataset, works fine).
OK, as I mentioned in the screenshot and the URL I sent you, it is not the size of the dataset that matters in your case; it is the memory limit of the command that is triggering the error shown in your initial screenshot.
My guess is you will need to revise and optimise your model. Know that a single unoptimised DAX formula can take a lot of memory, and some relationships, bidirectional ones in particular, can cause a problem as well.
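If it helps, here is a rough sketch of how you might inspect per-column memory over the workspace XMLA endpoint. This assumes the third-party pyadomd package (a Python wrapper around the ADOMD.NET client, Windows only) and XMLA read access, which requires a Premium or PPU workspace; the workspace and dataset names are placeholders:

```python
import sys
# pyadomd wraps the .NET ADOMD client, so its install path must be importable.
sys.path.append(r"C:\Program Files\Microsoft.NET\ADOMD.NET\160")
from pyadomd import Pyadomd

# Hypothetical connection string: replace workspace and dataset names.
CONN = ("Provider=MSOLAP;"
        "Data Source=powerbi://api.powerbi.com/v1.0/myorg/YourWorkspace;"
        "Initial Catalog=YourDataset;")

# DMV listing column dictionary sizes: a first stop when hunting
# memory-heavy columns (high-cardinality text columns are common offenders).
# Note that DMVs do not support ORDER BY, so we sort client-side.
QUERY = """
SELECT DIMENSION_NAME, COLUMN_ID, DICTIONARY_SIZE
FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
"""

with Pyadomd(CONN) as conn:
    with conn.cursor().execute(QUERY) as cur:
        rows = cur.fetchall()

# Show the ten largest column dictionaries by size in bytes.
for row in sorted(rows, key=lambda r: r[2] or 0, reverse=True)[:10]:
    print(row)
```

The same DMV can be run interactively in DAX Studio if you prefer not to script it.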
Regards
Amine Jerbi
Your dataset is really 9 MB? That is a very small size.
Something is going badly wrong if a 9 MB dataset is blowing up to the 3,000 MB limit.
Sounds crazy; I would like to see this process.
--------------------------------------------------
But your question is valid. What counts as a small dataset depends on, for example, your licence model: a Pro (shared capacity) workspace caps a dataset at 1 GB, while Premium capacities allow far larger models.