

RE: Larger Datasets from Azure Databricks

The dataset that I have is around 30 GB in the source (Azure Databricks), and I have tried both methods (Import & DirectQuery).

DirectQuery: The visualization takes too long to load (around 20 minutes), even for a single field in a table visual.


Import: I am facing the error below 30-40 minutes after applying the data.

Error Message: There isn't enough memory to complete this operation. Please try again later when there may be more memory available.

1. What is the largest dataset that we can Import into Power BI?

2. Is there a solution to the above challenge?


Super User

Hey @Anonymous ,


1. In Power BI Desktop, the limit is the memory on your computer. In the Power BI Service, the limit is 1 GB per dataset for a Pro account and 100 GB for a Premium Per User account. For a Premium capacity the limit is 400 GB.

2. Possible solutions: move to a Premium Per User account or a Premium capacity; use DirectQuery and optimize your data source; use Azure Analysis Services and import the data there; or analyze what makes your dataset so big. Maybe you can optimize things, or there is a column that makes the size so big and that you don't even use (a rough sketch of trimming the source table in Databricks is below).
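For the "optimize your data source" part, here is a minimal PySpark sketch of what trimming and pre-aggregating the table on the Databricks side could look like before Power BI imports it. The table name sales_raw and the column names are made up for illustration; swap in your own.

```python
from pyspark.sql import SparkSession, functions as F

# In a Databricks notebook `spark` already exists; getOrCreate() is a no-op there.
spark = SparkSession.builder.getOrCreate()

# Keep only the columns the report actually uses; dropping wide, unused columns
# (free text, GUIDs, audit fields) is often what shrinks the model the most.
slim = spark.table("sales_raw").select("order_date", "region", "product_id", "amount")

# Pre-aggregate to the grain the visuals need instead of importing row-level data.
agg = (
    slim.groupBy("order_date", "region", "product_id")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"),
        )
)

# Persist the smaller table and point Power BI at this instead of the raw 30 GB table.
agg.write.mode("overwrite").saveAsTable("sales_for_powerbi")
```

In Power BI you would then connect to the smaller sales_for_powerbi table, which has a much better chance of fitting within the Import limits listed above.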


If you need any help, please let me know.
If I answered your question, I would be happy if you could mark my post as a solution ✔️ and give it a thumbs up 👍
Best regards
