Hi,
I'm evaluating Power BI and trying to understand the best setup, and I ran into several problems that I don't really understand.
"For best performance use the import instead of direct query."
DirectQuery
DirectQuery will translate the query language in a very "simple/bad" way.
Import
Import/publish has a size limit of 1,024 MB for a Desktop file, but the Pro account gives me 10 GB?
We have one file with a table that contains 22 million rows, covering 2 years of transactions. Why is that too big for Power BI?
Best Regards
Robert
@Robbisweden A couple things I'll mention:
1) DirectQuery against a database relies on how quickly the query can be executed and returned. This can be slow in some cases, so it should be evaluated to determine whether it is the best option.
2) Import is faster because all the data is in memory, in your model, within the Desktop. I can't really comment on the size limitations, as those are most likely determined by performance, load, etc.
3) Another option is a "Live Connection" to an SSAS cube or model. I use a lot of Tabular models, and the live connection works extremely well and fast.
As for the size of your file, it all depends on how wide the data is and how compressible it is; that is what dictates the size limits.
In order of preference on my end, I personally use Import the most, use SSAS Tabular models with live connections for all business use cases, and use DirectQuery only on rare occasions.
Hi @Robbisweden,
The import/publish have a size limit of 1 024mb for a desktop file but the PRO account gives me 10 GB????
No, the 10 GB is the overall storage for a Pro user. For a single PBIX file, the limit is 1 GB when publishing to the service.
A dataset of 22 million rows can be very large or quite small; it depends on your data structure and data types. In your scenario, if your file size exceeds 1 GB, you need to remove unnecessary columns to reduce the file size.
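As a sketch of that advice (the file path, sheet name, and column names here are hypothetical), unneeded columns can be dropped in Power Query before the data is loaded into the model, which is usually the quickest way to shrink a PBIX file:

```
let
    // Load the source table (hypothetical path and sheet)
    Source = Excel.Workbook(File.Contents("\\yourpath\Transactions.xlsx"), null, true),
    Data = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Data, [PromoteAllScalars=true]),
    // Drop wide columns that are not needed for reporting (hypothetical names)
    Trimmed = Table.RemoveColumns(Promoted, {"Comment", "RawPayload"})
in
    Trimmed
```

High-cardinality text columns compress poorly in the VertiPaq engine, so removing them tends to reduce the model size disproportionately to their row count.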
Regards,
Charlie Liao
Hi
Power BI Import or DirectQuery: is Import quicker? Well, if you have a lot of data, downloading all the rows can be painful and slow, especially while you are developing queries.
The real issue with Desktop is that if you join an XLS file to a DB table, it forces "import mode", so you can't use the native DB smarts.
The initial data retrieval is very slow.
One solution is to run a "push-down" query in the native DB; this way you get the best of both worlds, as you only bring back the rows you need. For this you need a list of keys or some other criteria. See this post for more.
This example is from a HANA DB, but I think it would apply to other DBMSs:
let
    // Read the list of keys from an Excel workbook
    Source2 = Excel.Workbook(File.Contents("\\yourpath\KeyList.xlsx"), null, true),
    Sheet1_Sheet = Source2{[Item="Sheet1",Kind="Sheet"]}[Data],
    #"Promoted Headers1" = Table.PromoteHeaders(Sheet1_Sheet, [PromoteAllScalars=true]),
    Source1 = #"Promoted Headers1",
    // Collapse the key column into a single quoted, comma-separated list of text
    KeyList = Text.Combine(Source1[KEYfield], "','"),
    // Build a native SQL statement that filters on those keys
    sel2 = "select * from ""ZXXX1.ZXXX_REP.bp::Query"" where ""KeyField"" IN ('" & KeyList & "')",
    // Run the query directly against the HANA database
    Source = SapHana.Database("server:portnumber", [Query=sel2])
in
    Source