rocky09
Solution Sage

Working with Large Data

Just want to know: how do you all work with large data?

1. Do you import all the tables from a database such as SQL Server or Oracle into Power BI?

2. Or do you create a new table with a SQL query (combining different tables with different criteria) and import that into Power BI?

Please share your thoughts.

 

Thank you,

 

 

6 REPLIES
vanessafvg
Super User

Define large? I've been struggling with DirectQuery being very slow, so much so that I was pulling my hair out. It could be our network, though, so I've taken to reducing my model to only what I need (which is best practice anyway; I wanted to do that via SQL code but was unable to). Importing has worked better.

 

The issue I was having with not being able to write SQL queries is that I log onto a dev domain from my local machine; when I try to write SQL queries, as opposed to bringing in the whole table, I get "user not authorised". I still haven't worked out why that is happening. I did then create views, but I still found DirectQuery slow, and it was using a lot of my memory too, even though I expected that work to happen on the server side.






v-sihou-msft
Microsoft Employee

@rocky09

 

I suggest you separate the data into multiple views to import, or use DirectQuery instead of Import mode.
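To illustrate the idea of splitting a large table into smaller views to import, here is a minimal sketch using Python's sqlite3 as a stand-in for the source database; the table and view names are hypothetical.

```python
import sqlite3

# In-memory database standing in for the real source; table and view
# names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, year INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [(i, 2015 + i % 2, 10.0) for i in range(10)],
)

# Per-year views: each one becomes a separate, smaller import in Power BI,
# instead of one monolithic table.
conn.execute("CREATE VIEW sales_2015 AS SELECT * FROM sales WHERE year = 2015")
conn.execute("CREATE VIEW sales_2016 AS SELECT * FROM sales WHERE year = 2016")

rows_2015 = conn.execute("SELECT COUNT(*) FROM sales_2015").fetchone()[0]
print(rows_2015)  # each view holds only its slice of the table
```

Each view is then imported as its own query, so a single refresh or model only carries the slice it actually needs.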

 

Reference:
Data Import Best Practices in Power BI

 

Regards,

@v-sihou-msft

 

Thanks for the link. Most of the content helped me.

 

I am using individual queries to get the data from the server. But sometimes a common column is missing between tables, so I am not able to create relationships between them. Is there a way to create a custom common column?

@rocky09

 

I don't know what your common column looks like, but you can add custom columns, such as an index column.
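One common approach when two tables share no single key column is to build the same composite key on both sides and relate on that. Here is a minimal sketch in Python/pandas with hypothetical table and column names; in Power Query the equivalent would be a custom column such as `Region & "-" & OrderNo` added to each table.

```python
import pandas as pd

# Hypothetical tables that lack a single shared key: the identifying
# information is split across two columns on each side.
orders = pd.DataFrame({
    "region": ["EU", "US"],
    "order_no": [101, 102],
    "amount": [250.0, 400.0],
})
shipments = pd.DataFrame({
    "ship_region": ["EU", "US"],
    "ship_order": [101, 102],
    "carrier": ["DHL", "UPS"],
})

# Build the same composite key on both sides, analogous to a custom
# column in Power Query.
orders["key"] = orders["region"] + "-" + orders["order_no"].astype(str)
shipments["key"] = (
    shipments["ship_region"] + "-" + shipments["ship_order"].astype(str)
)

# The relationship (here, a merge) now works on the shared "key" column.
merged = orders.merge(shipments[["key", "carrier"]], on="key")
print(merged[["key", "amount", "carrier"]])
```

The same composite column, created in both queries, then serves as the relationship column in the Power BI model.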

 

Regards,

dkay84_PowerBI
Microsoft Employee

It depends.

 

If you have a very large dataset but only need to work with a subset of it, the filters you set up in the Query Editor mean only that subset is imported. Keep in mind that there are file size limits and workspace storage limits in the Power BI service.

 

If you need to work with the entire dataset, I would recommend DirectQuery (if the underlying source supports it). There are some limitations with regard to DAX, but DirectQuery does not import any data into the data model, so model size is not an issue.

I'd love to know how the import filtering works and how to optimize the query to minimize the import. I have a table with 60M records, which I need to filter on two different criteria; the filters select only a few thousand rows that actually end up in the final imported table. However, whenever I refresh the query, Power BI imports about 16M records over ODBC from our CRM, which literally takes hours. It's obviously doing some filtering, because it doesn't grab the whole database, but it's pulling in far more rows than make it through the filters and into the table.
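The behaviour described above is consistent with the filters not folding back to the source, so many rows cross the connection and are only discarded afterwards. A minimal sketch of the difference, using sqlite3 as a stand-in for the ODBC source (the schema and row counts are made up):

```python
import sqlite3

# In-memory database standing in for the CRM source; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE activities (id INTEGER, status TEXT, owner TEXT)")
conn.executemany(
    "INSERT INTO activities VALUES (?, ?, ?)",
    [(i, "open" if i % 1000 == 0 else "closed", "team_a")
     for i in range(100_000)],
)

# Client-side filtering: every row is transferred first, then filtered
# locally -- the slow pattern when a query step does not fold.
all_rows = conn.execute("SELECT * FROM activities").fetchall()
subset_client = [r for r in all_rows if r[1] == "open" and r[2] == "team_a"]

# Source-side filtering: only matching rows cross the connection -- the
# equivalent of a folded filter or a native query with a WHERE clause.
subset_server = conn.execute(
    "SELECT * FROM activities WHERE status = 'open' AND owner = 'team_a'"
).fetchall()

print(len(all_rows), len(subset_server))
```

Both approaches end with the same rows, but the second transfers a tiny fraction of the data; in Power BI terms, checking whether the filter steps fold (or supplying a native query with the WHERE clause baked in) is the usual way to get the second behaviour.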
