
rocky09
Solution Sage

Working with Large Data

Just want to know: how do you all work with large data?

 

1. Do you import all the tables from a database such as SQL Server or Oracle into Power BI?

2. Do you create a new table with a SQL query (combining different tables with different criteria) and import that into Power BI?

 

Please share your thoughts.

 

Thank you,

 

 

6 REPLIES
vanessafvg
Community Champion

Define large? I've been struggling a bit with DirectQuery being very slow, so much so that I was pulling my hair out. It could be our network, though, so I've taken to reducing my model to only what I need (which is best practice anyway; I wanted to do that via SQL code but was unable to). Importing has worked better.

 

The issue I was having with writing SQL queries is that I log onto a dev domain from my local machine; when I try to write SQL queries instead of bringing in the whole table, I get a "user not authorised" error. I still haven't worked out why that happens. I did then create views, but I still found DirectQuery slow, and it was using a lot of my memory too, even though I expected that work to happen on the server side.
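For anyone hitting the same wall: a server-side view that keeps only the columns and rows the report needs is often enough to make Import mode manageable. A minimal sketch, assuming a hypothetical dbo.Sales table (the table and column names are illustrative, not from the thread):

```sql
-- Hypothetical example: expose only the columns and rows the report needs,
-- so Power BI imports a trimmed slice instead of the full table.
CREATE VIEW dbo.vw_SalesForReport AS
SELECT
    s.OrderID,
    s.OrderDate,
    s.CustomerID,
    s.Amount
FROM dbo.Sales AS s
WHERE s.OrderDate >= '2015-01-01';  -- limit history to what the report shows
```

In Power BI you would then connect to dbo.vw_SalesForReport instead of dbo.Sales; of course, whether you can create the view at all depends on the permissions issue described above.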





If I took the time to answer your question and I came up with a solution, please mark my post as a solution and/or give kudos freely for the effort 🙂 Thank you!

Proud to be a Super User!




v-sihou-msft
Microsoft Employee

@rocky09

 

I suggest you separate the data into multiple views to import, or use DirectQuery instead of Import mode.

 

Reference:
Data Import Best Practices in Power BI

 

Regards,

@v-sihou-msft

 

Thanks for the link. Most of the content helped me.

 

I am using individual queries to get the data from the server. But sometimes a common column is missing between tables, so I am not able to create relationships. Is there a way to create a custom common column?

@rocky09

 

I don't know what your common column looks like. You can add custom columns, such as an index column.
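If the tables share no single column but do share a combination of fields, one common workaround is to derive the same composite key in each source query, so the imported tables have a column to relate on. A sketch with made-up table and column names (adjust to your schema; a plain index column only numbers rows within one table and will not match rows across tables):

```sql
-- Hypothetical example: build the same derived key in both source queries
-- so Power BI has a common column to create the relationship on.
SELECT
    CONCAT(o.RegionCode, '-', o.StoreID) AS StoreKey,  -- derived common column
    o.OrderID,
    o.Amount
FROM dbo.Orders AS o;

SELECT
    CONCAT(t.RegionCode, '-', t.StoreID) AS StoreKey,  -- same formula, same key
    t.TargetAmount
FROM dbo.Targets AS t;
```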

 

Regards,

dkay84_PowerBI
Microsoft Employee

It depends.

 

If you have a very large data set but only need to work with a subset of it, then when you set up your filters in the Query Editor it will import only that subset. Keep in mind that there are file size limits and workspace storage limits in the Power BI service.

 

If you need to work with the entire data set, I would recommend DirectQuery (if the underlying source supports it). There are some limitations with regard to DAX, but DirectQuery does not import any data into the data model, so size is not an issue.

Anonymous
Not applicable

I'd love to know how the import filtering works and how to optimize the query to minimize the import. I have a table with 60M records, which I need to filter on two different criteria, which then selects the few thousand rows that actually end up in the final imported table. However, whenever I refresh the query, Power BI imports about 16M records over ODBC from our CRM, which literally takes hours. It's obviously doing some filtering, because it doesn't grab the whole database, but it's pulling in far more than makes it through the filters and into the table.
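One thing worth checking in a case like this is whether the filter steps are actually being pushed down to the source. Over ODBC, some transformations don't fold into the generated SQL, so the source still streams millions of rows and Power BI filters them locally. Writing the criteria into the SQL statement you hand to the connector guarantees they run server-side. A sketch with hypothetical table and criteria (the names are placeholders, not the poster's actual CRM schema):

```sql
-- Hypothetical example: apply both criteria in the source query itself,
-- so only the few thousand matching rows cross the ODBC connection.
SELECT
    c.RecordID,
    c.AccountName,
    c.Status,
    c.ModifiedDate
FROM dbo.CrmRecords AS c
WHERE c.Status = 'Open'                -- first criterion
  AND c.ModifiedDate >= '2016-01-01';  -- second criterion
```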
