I have a table in Azure Data Warehouse with a billion records. The table structure is very simple, with just two columns:
id int
text nvarchar(512)
My requirement is not complicated: search for a keyword in the text field (using the Power BI advanced filter), return the unique ids, and then join with another table (about 1-2 million records).
My question is: which mode is best for my scenario, Import or DirectQuery?
My other question is about the mechanics of these two modes. For example, if I choose Import, when I publish and share the report, is the data published together with it, and does the query execute online? And if the data is updated, how can I refresh the report with the latest data? Do I need to re-import the data into Power BI Desktop and re-publish?
If I choose DirectQuery, does that mean that when filtering, the shared report will create a new connection to Azure Data Warehouse for each user who accesses the report?
My last question is: are there any restrictions on these two modes?
Thanks!
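The filter-then-join pattern described above can be sketched as a single SQL statement: restrict the big table to rows matching the keyword, deduplicate the ids, and join the result to the smaller table. Below is a minimal runnable sketch using SQLite purely as a tiny stand-in for the warehouse; the table names (`big_table`, `small_table`) and the `label` column are made up for illustration, and only `id` and `text` come from the scenario.

```python
import sqlite3

# In-memory SQLite database standing in for the Azure warehouse.
# Table/column names beyond id and text are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE big_table (id INTEGER, text TEXT)")
cur.execute("CREATE TABLE small_table (id INTEGER, label TEXT)")
cur.executemany("INSERT INTO big_table VALUES (?, ?)",
                [(1, "alpha keyword"), (2, "beta"),
                 (3, "keyword gamma"), (3, "keyword again")])
cur.executemany("INSERT INTO small_table VALUES (?, ?)",
                [(1, "one"), (3, "three")])

# Keyword search on text, deduplicated ids, then join with the small table:
rows = cur.execute("""
    SELECT s.id, s.label
    FROM (SELECT DISTINCT id FROM big_table WHERE text LIKE ?) AS f
    JOIN small_table AS s ON s.id = f.id
    ORDER BY s.id
""", ("%keyword%",)).fetchall()
print(rows)  # [(1, 'one'), (3, 'three')]
```

With DirectQuery, a query of roughly this shape is pushed down to the warehouse each time the filter changes; with Import, the billion-row table would first have to be loaded into the model and the filtering done there.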
1. DirectQuery may be faster, but it has some limitations. Even with a billion records you should be able to use the Import method; however, as stated, there is a limit on how much data can be imported.
2. Once you publish the .pbix file to the Power BI service, you can set up a gateway and schedule an automatic refresh for a time when no one is on the server.
3. No. DirectQuery will not create new connections for each user of the report; that is all handled on the back end. Users merely interact with the front-end visuals.
4. I would refer to this article in the Power BI Desktop documentation to make your decision: https://docs.microsoft.com/en-us/power-bi/desktop-use-directquery
Hi @changhai ,
For your first question, we think Import mode might hit dataset limits (for example, the 1,999,999,997 limit on the number of distinct values that can be stored in a column, and the 1-GB limit for datasets stored in shared capacities). And if your first table is updated frequently, the scheduled refresh might take a long time because of the large dataset. Most importantly, although DirectQuery mode has some limitations in transforming data and in DAX, we can easily change a dataset from DirectQuery to Import. Considering your large dataset, we suggest you try DirectQuery mode first; if it cannot meet your requirements, you can change it to Import mode.
For the second question: when using Import mode, the data is refreshed and stored in the model. We can execute an on-demand refresh or set up a scheduled refresh to keep it up to date in the service, and we do not need to re-publish.
For the last question: using DirectQuery does have potentially negative implications, and some of those limitations differ slightly depending on the exact source being used. Please refer to this document about the implications of using DirectQuery. The implications of Import mode are mostly the limit on dataset size and the refresh time if your dataset is large; refresh will also consume much more memory.
Best regards,
@v-lid-msft Thanks for your reply.
My data will not exceed 1 billion rows, so I think Import mode will not hit the limit.
In my test I imported almost 100 million rows, used the advanced filter on the big table, and found it too slow; most of the time it times out.
I want to know how to set the timeout to a longer value, and also how to speed up the query.
Thanks!
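The slowness described above is typical of leading-wildcard searches on any SQL backend: a predicate like `LIKE '%keyword%'` cannot use an ordinary index, so every row must be scanned, while a prefix match (`LIKE 'keyword%'`) can be turned into an index range seek. A minimal sketch of the difference, again using SQLite only as a stand-in (the `PRAGMA` and index are SQLite-specific; Azure SQL Data Warehouse has its own indexing options, but the leading-wildcard principle is the same):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("PRAGMA case_sensitive_like = ON")  # lets SQLite's LIKE use a plain index
cur.execute("CREATE TABLE big_table (id INTEGER, text TEXT)")
cur.executemany("INSERT INTO big_table VALUES (?, ?)",
                ((i, f"row {i} keyword") for i in range(1000)))
cur.execute("CREATE INDEX ix_text ON big_table (text)")

# Leading wildcard: no index can satisfy '%keyword%', so the plan is a full scan.
plan_scan = cur.execute(
    "EXPLAIN QUERY PLAN SELECT DISTINCT id FROM big_table WHERE text LIKE '%keyword%'"
).fetchall()

# Prefix match: the optimizer rewrites it into an index range search.
plan_seek = cur.execute(
    "EXPLAIN QUERY PLAN SELECT DISTINCT id FROM big_table WHERE text LIKE 'row 42 %'"
).fetchall()

for _, _, _, detail in plan_scan + plan_seek:
    print(detail)  # SCAN for the leading wildcard, SEARCH ... USING INDEX for the prefix
```

So one way to speed this up, regardless of Import or DirectQuery, is to avoid leading wildcards where the requirement allows, or to push the keyword filter down to the warehouse so only the matching distinct ids ever leave the source.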