Hello, I have experience with Power BI, but I am currently importing huge data sets from SQL for the first time. My question: what is the best practice for working with multiple huge sets of data (transactional data with hundreds of millions of rows)? I haven't found a resource that explicitly covers importing this much data.
1) Write a long SQL statement with many joins on these tables, then load using that statement?
2) Bring in the tables, filter in Power Query as best I can, and map in the Modeling section of Desktop?
3) Bring in the tables, filter in Power Query as best I can, and merge in Power Query?
Thanks!
You'll still likely want to load in your fact and dimension tables into your model separately as a star schema, rather than merging into a big monster table. One option when working with large tables is to use DirectQuery rather than Import to load your tables.
From your three listed options, #2 is probably the closest to best practice, assuming you don't have to do lots of data manipulation other than some filtering. If you do need lots of data manipulation, then you may want to do that in SQL first and have clean tables (or views) to load into Power BI.
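If the heavy manipulation does move into SQL, a view is a convenient way to hand Power BI a pre-joined, pre-filtered result while keeping the base tables untouched. A minimal T-SQL sketch of that idea; the table and column names (dbo.FactSales, dbo.DimDate, etc.) are hypothetical placeholders, not from the thread:

```sql
-- Sketch: do the joins and filtering in the database once,
-- then point Power BI at the view instead of the raw tables.
-- All object names here are assumptions for illustration.
CREATE VIEW dbo.vw_SalesForPowerBI AS
SELECT
    f.SaleID,
    f.DateKey,
    f.ProductKey,
    f.Amount
FROM dbo.FactSales AS f
INNER JOIN dbo.DimDate AS d
    ON d.DateKey = f.DateKey
WHERE d.CalendarYear >= 2020;  -- keep only the rows the report needs
```

In Power BI Desktop you would then connect to the database and select `dbo.vw_SalesForPowerBI` like any other table, which also keeps the Power Query side simple enough for query folding to push work back to SQL Server.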
Thanks for your response! There is a good amount of manipulation to be done. I tried loading with a long SQL query and the refresh was awful. I also tried loading multiple tables and filtering as best I could, and that still took a long time even before any manipulation.
So if I just created a new table or view in SSMS based on the query I was trying to run, I could then access that in Power BI? That makes sense, since the query took about 6 minutes to run in SSMS compared to 4+ hours in Power BI (I canceled it).
Yes. Power Query is powerful, but it's often better to push the SQL manipulations upstream before loading the data into Power BI.