I am trying to copy data from a small Azure SQL Database table into an Excel file, but I don't see an Excel type in the "Destination" section of the Copy activity.
Currently I am using the "Delimited text" format and creating a CSV file. Is there a way to write the data directly into Excel file tabs?
# Fabric Copy Activity and Excel File Output
Currently, in Microsoft Fabric's Data Pipeline Copy Activity, there isn't a native Excel file format option for destinations.
You're correct that the available formats include Delimited-text (CSV), Parquet, JSON, and others, but not Excel (.xlsx) directly.
## Workarounds for Excel Output
1. CSV to Excel conversion:
   - Continue using the delimited-text format for the initial copy.
   - Add a subsequent step to convert the CSV to Excel, using either a Notebook activity with Python (pandas) or a Dataflow activity with a transformation (see the example below).
2. Power BI dataset:
   - Load your data into a Power BI dataset.
   - Use the "Analyze in Excel" feature from Power BI.
3. Logic Apps or Azure Functions:
   - Create an automated flow that takes your CSV and converts it to Excel.
4. Direct query from Excel:
   - If users just need to analyze the data, consider creating a direct-query connection from Excel to your source.
## Python Example for Conversion (in Notebook activity):
import pandas as pd

# Read the CSV created by your copy activity
df = pd.read_csv('/path/to/your/output.csv')

# Write to Excel (writing .xlsx requires the openpyxl package)
df.to_excel('/path/to/output.xlsx', sheet_name='Data', index=False)
Thanks for the response! There are 3 tables: one has 20 rows, the second has 300 rows, and the third has 2000+ rows. Since multiple people keep updating them, sometimes via the pipeline using Excel files and sometimes via queries, I want to build a pipeline that creates an Excel file with a current snapshot of the table data.
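One way to do that with the approach above: have the copy activity land each table as a CSV, then let a Notebook activity combine the current snapshots into a single workbook with one tab per table. This is only a minimal sketch; the file paths and sheet names below are placeholders, and writing .xlsx requires the openpyxl package.

```python
import pandas as pd

# Placeholder paths: point these at the CSV files your copy activity produces
tables = {
    'SmallTable': '/path/to/small_table.csv',    # ~20 rows
    'MediumTable': '/path/to/medium_table.csv',  # ~300 rows
    'LargeTable': '/path/to/large_table.csv',    # 2000+ rows
}

# Write every table snapshot into one workbook, one sheet per table
with pd.ExcelWriter('/path/to/snapshot.xlsx', engine='openpyxl') as writer:
    for sheet_name, csv_path in tables.items():
        df = pd.read_csv(csv_path)
        df.to_excel(writer, sheet_name=sheet_name, index=False)
```

Running this notebook as the step after the copy activity gives you a fresh workbook on every pipeline run.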
How small are the tables? You might have better luck with Power Automate. Worst case, you can use the Microsoft Graph batch API to avoid RBAR (one API call per row).
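For reference, a Graph JSON batch request lets you update several worksheet ranges in one round trip instead of making one call per row. The sketch below is only illustrative: the access token, drive item ID, worksheet names, range addresses, and values are placeholders you would replace with your own.

```python
import requests

# Placeholders: supply a valid OAuth access token and the workbook's drive item ID
ACCESS_TOKEN = '<access-token>'
ITEM_ID = '<drive-item-id>'

# One sub-request per worksheet range, bundled into a single $batch call
batch_body = {
    "requests": [
        {
            "id": "1",
            "method": "PATCH",
            "url": f"/me/drive/items/{ITEM_ID}/workbook/worksheets/Sheet1/range(address='A1:B2')",
            "headers": {"Content-Type": "application/json"},
            "body": {"values": [["Id", "Name"], [1, "Alpha"]]},
        },
        {
            "id": "2",
            "method": "PATCH",
            "url": f"/me/drive/items/{ITEM_ID}/workbook/worksheets/Sheet2/range(address='A1:B2')",
            "headers": {"Content-Type": "application/json"},
            "body": {"values": [["Id", "Name"], [2, "Beta"]]},
        },
    ]
}

# Send all range updates in one request to the Graph batch endpoint
response = requests.post(
    "https://graph.microsoft.com/v1.0/$batch",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=batch_body,
)
response.raise_for_status()
print(response.json())
```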