Anonymous
Not applicable

Importing 6 billion rows from Snowflake

Hi friends,

 

I am trying to import 6 billion rows of data from Snowflake into Power BI using Power BI Desktop. My machine has 8 GB of RAM, and the import takes about 40 minutes, which is a very long wait for us, especially since the same query in Snowflake returns all 6 billion records in 23 seconds. I am using the ODBC connector for the Snowflake connection.

 

Could you please suggest what needs to be done in my case to improve the report's data refresh time?

 

The same thing happens (40-minute data refresh time) in the Power BI service as well, after I publish the report to a workspace.

 

Thanks,

Prabhat Omker  

1 ACCEPTED SOLUTION
JarroVGIT
Resident Rockstar

Hi @Anonymous ,

Regardless of the execution speed of Snowflake itself, the data still has to be transferred to your local machine. A table with 6 billion rows is a massive amount of data to move from Snowflake onto your laptop or into the Power BI service. To give an example: if every row holds only 1 KB of data (which is not a lot, to be honest), the data size would amount to 6,000,000,000 KB ÷ 1,024 = 5,859,375 MB ÷ 1,024 ≈ 5,722 GB. There is some smart compression going on, but this illustrates that the time to get results from an external source is not just the query execution time; it is also the transfer time, especially in cases like this.
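A quick back-of-envelope sketch of that arithmetic (the 1 KB-per-row figure, and the 1 Gbps link speed used for the time estimate, are illustrative assumptions, not measurements):

```python
# Rough transfer estimate for 6 billion rows at an assumed 1 KB per row.
rows = 6_000_000_000
kb_per_row = 1                                  # illustrative assumption

total_gb = rows * kb_per_row / 1024 / 1024      # KB -> MB -> GB
print(f"Uncompressed size: ~{total_gb:,.0f} GB")

# At a hypothetical 1 Gbps link (~125 MB/s), ignoring compression:
hours = (total_gb * 1024) / 125 / 3600
print(f"Raw transfer time: ~{hours:.0f} hours")
```

Compression and columnar encoding shrink this substantially in practice, which is how the real refresh finishes in 40 minutes rather than hours, but the transfer cost never disappears.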

If this is too long for you, you might want to look into incremental refresh (which requires Power BI Premium):

https://docs.microsoft.com/en-us/power-bi/service-premium-incremental-refresh 
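The idea behind incremental refresh can be sketched in a few lines: keep a watermark from the last successful refresh and only query rows newer than it, instead of re-importing the full table. The table and column names below (`SALES`, `load_date`) are hypothetical stand-ins, not from the original post; `run_query` would be your actual Snowflake ODBC call.

```python
from datetime import date

def build_incremental_query(table: str, watermark: date) -> str:
    # Filter on a date column so the source only returns rows added since
    # the last refresh; Power BI's RangeStart/RangeEnd parameters play a
    # similar role when incremental refresh is configured in the service.
    return (f"SELECT * FROM {table} "
            f"WHERE load_date > '{watermark.isoformat()}'")

print(build_incremental_query("SALES", date(2020, 1, 31)))
# SELECT * FROM SALES WHERE load_date > '2020-01-31'
```

Each refresh then moves only the new partition of data instead of all 6 billion rows.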

Hope this gives you some clarification as to why this takes longer than you expected 🙂

 

Kind regards

Djerro123

-------------------------------

If this answered your question, please mark it as the Solution. This also helps others to find what they are looking for.

Keep those thumbs up coming! 🙂





Did I answer your question? Mark my post as a solution!

Proud to be a Super User!



