I’m working on a report in Microsoft Fabric, and I need to export around 1 million rows and 600 columns to Excel.
What’s the best way to achieve this, considering the current platform limitations? Are there any workarounds or recommended practices for exporting large datasets?
Thank you in advance for your help!
Hi @gloulakis ,
Thanks for reaching out to the Microsoft Fabric community forum.
Please try the options below:
1. Use a Paginated Report: Paginated reports can handle large tables. You can publish the paginated report to Fabric and export directly to Excel or CSV.
How to: Use Power BI Report Builder to create the paginated report. Connect to your Fabric dataset or a shared semantic model. Export the full table.
Note: Use CSV if Excel runs into memory issues.
2. Use Dataflows or Notebooks to Export to OneLake or Azure: If your data model is backed by a Lakehouse or Warehouse in Fabric: Use a Notebook (Spark or SQL) to extract the data into CSV or Parquet files in OneLake. From there, download the file or import into Excel.
Advantages: No export limit, flexible output, better for automation.
3. Use DAX Studio or Tabular Editor for Direct Export: Connect to your Fabric semantic model using DAX Studio. Run a DAX query to pull your table and export to CSV. Much faster and bypasses Power BI export limitations.
4. Break It Down into Chunks: Create filters or slicers in your report to break the export into manageable pieces (e.g., by month or business unit). Export the chunks manually, or automate the process with Power Automate (if feasible).
Note: Excel is not ideal for handling 1M rows × 600 columns. Consider using Parquet or CSV format and tools like Python, R, or Power Query for downstream analysis. Try Power BI Dataflow Gen2 to export the data to a Lakehouse, which integrates easily with Fabric and has no export limits. Consider using Azure Data Factory or Logic Apps if you need to automate large data movements out of Fabric.
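The chunked-export idea in option 4 can also be scripted instead of done by hand. Below is a minimal stdlib Python sketch that splits rows into one CSV per month; the function name `export_in_chunks` and the injected `open_file` callable are illustrative, not part of any Fabric API. In a Fabric notebook you would typically operate on a Spark DataFrame and write partitioned files to OneLake instead.

```python
import csv
import io

def export_in_chunks(rows, chunk_key, fieldnames, open_file):
    """Write rows to one CSV file per chunk-key value (e.g. per month).

    rows       : iterable of dicts
    chunk_key  : function mapping a row to a chunk label
    fieldnames : column order for the CSV header
    open_file  : callable(label) -> writable text file; injected so the
                 sketch can target disk, OneLake, or in-memory buffers
    """
    writers = {}
    for row in rows:
        label = chunk_key(row)
        if label not in writers:
            f = open_file(label)
            w = csv.DictWriter(f, fieldnames=fieldnames)
            w.writeheader()          # one header per chunk file
            writers[label] = w
        writers[label].writerow(row)

# Example: split a tiny sample by the "YYYY-MM" prefix of the date column.
sample = [
    {"date": "2024-01-05", "amount": "10"},
    {"date": "2024-01-20", "amount": "7"},
    {"date": "2024-02-02", "amount": "3"},
]
buffers = {}  # in-memory stand-in for per-chunk files
export_in_chunks(
    sample,
    chunk_key=lambda r: r["date"][:7],
    fieldnames=["date", "amount"],
    open_file=lambda label: buffers.setdefault(label, io.StringIO()),
)
# buffers now holds one CSV per month ("2024-01" and "2024-02")
```

The same pattern scales to the real export: point `open_file` at files under your Lakehouse `Files/` area, and each chunk stays small enough for Excel to open.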
If you find this post helpful, please mark it as an "Accept as Solution" and consider giving a KUDOS. Feel free to reach out if you need further assistance.
Thanks and Regards
Hi @gloulakis There are limitations on exporting large datasets from Fabric to Excel. To work around this, consider exporting to CSV, which handles larger datasets, or use paginated reports for better export flexibility. You could also export the data to Azure SQL Database or Blob Storage for further processing. For Excel-specific needs, split the dataset into multiple files or connect directly via "Analyze in Excel."