One of the sources of my Power BI file is a mammoth view. This view aggregates data from three huge tables in a SQL database, each with more than 10 million rows. The view filters the output down to transactions from 1 July 2018 onward; even so, there are well in excess of 7 million rows.
Because of the size of the Power BI file, 650 MB and badly in need of optimization, the automatic data refresh sometimes fails, and its visuals do not always display the expected content. The file does not function properly.
My question for you is: do you know of a method to efficiently import data coming from such a view? My understanding is that incremental refresh works only on tables without any filtering, not on views.
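For reference, here is a minimal Power Query (M) sketch of what an incremental-refresh-ready query over such a view could look like. Incremental refresh is driven by the reserved RangeStart/RangeEnd DateTime parameters and by query folding, rather than by whether the source object is a table or a view, so a view can usually participate as long as the date filter folds back to the source. All server, database, view, and column names below (MyServer, MyDatabase, dbo.vw_Transactions, TransactionDate) are hypothetical placeholders.

```
let
    // Connect to the SQL database (server and database names are placeholders)
    Source = Sql.Database("MyServer", "MyDatabase"),

    // Navigate to the view exactly as you would to a table (names are placeholders)
    TransactionsView = Source{[Schema = "dbo", Item = "vw_Transactions"]}[Data],

    // Filter on the reserved RangeStart/RangeEnd parameters; Power BI uses this
    // filter to partition the data for incremental refresh, and it should fold
    // to a WHERE clause in the SQL sent to the source
    Filtered = Table.SelectRows(
        TransactionsView,
        each [TransactionDate] >= RangeStart and [TransactionDate] < RangeEnd
    )
in
    Filtered
```

RangeStart and RangeEnd have to be created as DateTime parameters in Power Query first; the incremental refresh policy itself is then configured on the table in Power BI Desktop.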
Maybe you should try removing unnecessary fields and then converting specific fields that have many decimals (to 2 decimals, or to whole numbers whenever possible).
This way I got a good reduction in my data, from 200 MB to 100 MB.
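To make that concrete, here is a minimal Power Query (M) sketch of both steps, removing unused columns and reducing decimal precision. The column names (Amount, Quantity, InternalNotes) are purely illustrative, and the inline #table only stands in for whatever your real source step returns.

```
let
    // Small inline table standing in for the real source (values are made up)
    Source = #table(
        type table [Amount = number, Quantity = number, InternalNotes = text],
        {{123.456789, 2.0, "n/a"}, {98.765432, 5.0, "n/a"}}
    ),

    // Drop fields the report never uses
    RemovedColumns = Table.RemoveColumns(Source, {"InternalNotes"}),

    // Round the amount to 2 decimals and store it as fixed decimal
    ReducedPrecision = Table.TransformColumns(
        RemovedColumns,
        {{"Amount", each Number.Round(_, 2), Currency.Type}}
    ),

    // Cast the quantity to whole number
    ChangedTypes = Table.TransformColumnTypes(
        ReducedPrecision,
        {{"Quantity", Int64.Type}}
    )
in
    ChangedTypes
```

Rounding reduces the number of distinct values in a column, which is what actually shrinks the compressed size in the VertiPaq engine; removing columns the report never references helps in the same way.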
Hi @amirabedhiafi ,
Check the blog below:
https://www.sqlbi.com/articles/data-import-best-practices-in-power-bi/
Best Regards,
Kelly