tecumseh
Resolver II

Export Dataset to Excel File On Schedule

Hi all,

How can I export a dataset to Excel? The dataset currently has 88,600 rows.
I added a Python script as the last step in Power Query, but after publishing I cannot figure out how to set up the gateway so that the Snowflake refresh works and the Python script executes.

If that process won't work, any other ideas? I tried Power Automate, but I am exceeding some of its limits on the number of values, number of rows, etc.

Thanks,

w


10 REPLIES
KonradZawel
Helper I

Power BI doesn't support scheduled export of a dataset directly to Excel. However, you can achieve this using Power Automate by creating a flow that queries the dataset on a schedule and writes the data to an Excel file stored in OneDrive or SharePoint. If you're using Power BI Premium, you can also consider exporting data via a paginated report.

v-tsaipranay
Community Support

Hi @tecumseh ,

 

May I ask if you have resolved this issue? If so, please mark the helpful reply and accept it as the solution. This will help other community members with similar problems solve them faster.

 

Thank you.


v-tsaipranay
Community Support

Hi @tecumseh ,

Thank you for reaching out to the Microsoft Fabric community forum, and thank you @ibarrau for the helpful suggestion about using the Power BI REST API.

 

To answer your question: Power BI does not support using both a Personal Gateway and an Enterprise Gateway on the same dataset. A dataset can only be connected to one gateway type at a time, so combining Snowflake (via Enterprise Gateway) with Python scripts (which require a Personal Gateway) in the same model isn't supported.

 

As a workaround, consider offloading the Python logic outside Power BI, either by calling the REST API externally or by exporting Snowflake data directly via Python and automating that export.

 

If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you.

Hi @tecumseh ,

I wanted to check if you had the opportunity to review the information provided. Please feel free to contact us if you have any further questions. If my response has addressed your query, please accept it as a solution and give a 'Kudos' so other members can easily find it.


Thank you.

@v-tsaipranay 

I need to spend some time on it; I was pulled away by more pressing projects. On the REST API approach, isn't that going to be limited in the size of data it returns? I tried using Power Automate, but that has many limitations on data size, so I only received 9.2K rows out of an expected 86K rows.

 

Thanks,

w

Hi @tecumseh ,

 

You are correct: the Power BI REST API does impose limits on the size of each query result (on the number of rows and values returned, as well as the overall response payload), which can lead to truncated results for large or wide datasets, like what you experienced with Power Automate. You can find the official documentation for the API here: Execute Queries in Group – Power BI REST API.

 

To work around this, consider reducing the number of columns returned and implementing pagination logic in your DAX queries (e.g., filtering on row numbers or date ranges) to retrieve the data in smaller chunks. However, for greater control and scalability, the most reliable method would be extracting the data directly from Snowflake using Python (via the Snowflake Python Connector) and then exporting it to Excel using libraries like pandas or openpyxl. This approach avoids API and gateway constraints and can be fully automated using Windows Task Scheduler or Azure Automation.
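As an illustration of that last suggestion, here is a minimal sketch of a standalone script that pulls the table from Snowflake and writes it to an .xlsx file. The connection details, query, and output path are placeholders for your own values, and it assumes the Snowflake Python Connector (with its pandas extra) and openpyxl are installed.

# Minimal sketch (placeholders throughout): export a Snowflake table to Excel.
# Requires: pip install "snowflake-connector-python[pandas]" openpyxl
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder; key-pair or SSO auth also possible
    warehouse="my_warehouse",
    database="my_database",
    schema="my_schema",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT * FROM MY_TABLE")   # placeholder query
    df = cur.fetch_pandas_all()             # full result set as a pandas DataFrame
finally:
    conn.close()

# ~88K rows fits comfortably in a single Excel sheet (limit is 1,048,576 rows)
df.to_excel(r"C:\exports\dataset.xlsx", index=False, engine="openpyxl")

Saved as, say, export_dataset.py, the script can then be run on a schedule by Windows Task Scheduler or an Azure Automation runbook.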

 

If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you.

 

ibarrau
Super User

Hi. In order to use Python for transforming data you need a Personal Gateway; that is the only way Python steps can refresh in the service. As an alternative you can use the Power BI REST API: execute a DAX query against the table and drop unnecessary columns to stay within the limits.

This is the request: https://learn.microsoft.com/en-us/rest/api/power-bi/datasets/execute-queries-in-group

You can make this easier with the SimplePBI library in Python, or use Power Automate.
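For reference, a bare-bones version of that REST call with the requests library might look like the sketch below. The workspace and dataset IDs, the DAX query, and the token are placeholders; in practice you would acquire the access token with MSAL (service principal or user sign-in), and SimplePBI wraps these details for you.

# Minimal sketch (placeholders throughout) of the Execute Queries REST call.
# Requires: pip install requests pandas openpyxl
import requests
import pandas as pd

ACCESS_TOKEN = "<Azure AD access token>"   # placeholder; obtain via MSAL
GROUP_ID = "<workspace-id>"                # placeholder
DATASET_ID = "<dataset-id>"                # placeholder

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/executeQueries")

# Return only the columns you actually need to stay under the result-size limits
dax = ("EVALUATE SELECTCOLUMNS('MyTable', "
       "\"Col1\", 'MyTable'[Col1], \"Col2\", 'MyTable'[Col2])")

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"queries": [{"query": dax}],
          "serializerSettings": {"includeNulls": True}},
    timeout=120,
)
resp.raise_for_status()

# Rows come back as a list of dicts keyed like 'MyTable[Col1]'
rows = resp.json()["results"][0]["tables"][0]["rows"]
pd.DataFrame(rows).to_excel(r"C:\exports\dataset.xlsx", index=False)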

I hope that helps


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog

Thanks @ibarrau 

Can I use a Personal Gateway for the Python bit in conjunction with the gateway I am currently using for the Snowflake refresh? How do I set it up so I can use the two different gateways together?

 

Thanks,

w

Like v-tsaipranay said, you can't use both of them. Just configure the Personal Gateway for the whole semantic model; it should work. It's not the best approach for Snowflake, since the data gets downloaded to the gateway machine and uploaded again, but it will work.

I hope that makes sense.


If this post helps, then please consider accepting it as the solution to help other members find it more quickly.

Happy to help!

LaDataWeb Blog
