I am trying to copy data from a small Azure SQL database table into an Excel file, but I don't see an Excel type in the "Destination" section.
Currently I am using the "Delimited text" format and creating a CSV file, but is there a way to put the data directly into Excel file tabs?
# Fabric Copy Activity and Excel File Output
Currently, in Microsoft Fabric's Data Pipeline Copy Activity, there isn't a native Excel file format option for destinations.
You're correct that the available formats include Delimited-text (CSV), Parquet, JSON, and others, but not Excel (.xlsx) directly.
## Workarounds for Excel Output

1. **CSV to Excel conversion:**
   - Continue using the delimited-text format for the initial copy
   - Add a subsequent step to convert the CSV to Excel using:
     - a Notebook activity with Python (using pandas), or
     - a Dataflow activity with a transformation
2. **Power BI dataset:**
   - Load your data into a Power BI dataset
   - Use the "Analyze in Excel" feature from Power BI
3. **Logic Apps or Azure Functions:**
   - Create an automated flow that takes your CSV and converts it to Excel
4. **Direct Query:**
   - If users just need to analyze the data, consider creating a Direct Query connection from Excel to your source
## Python Example for Conversion (in a Notebook activity)

```python
import pandas as pd

# Read the CSV created by your copy activity
df = pd.read_csv('/path/to/your/output.csv')

# Write to Excel (writing .xlsx requires the openpyxl package)
df.to_excel('/path/to/output.xlsx', sheet_name='Data', index=False)
```
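If you need several tables in one workbook (one per tab, as the question mentions), pandas `ExcelWriter` can collect multiple CSVs into separate sheets. A minimal sketch, using in-memory CSV text as a stand-in for the real file paths your copy activity would produce; the sheet names and columns are illustrative:

```python
import io
import pandas as pd

# Sample CSV content standing in for the files your copy activity produced;
# in Fabric you would pass real paths such as '/lakehouse/default/Files/table1.csv'.
csv_sources = {
    "Customers": io.StringIO("id,name\n1,Ann\n2,Ben"),
    "Orders": io.StringIO("order_id,total\n10,99.50"),
}

# Write each table to its own tab of one workbook (needs the openpyxl package).
with pd.ExcelWriter("snapshot.xlsx", engine="openpyxl") as writer:
    for sheet, src in csv_sources.items():
        pd.read_csv(src).to_excel(writer, sheet_name=sheet, index=False)
```

Each call to `to_excel` against the same open `ExcelWriter` appends a new sheet, so the loop scales to any number of source CSVs.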
Thanks for the response! There are 3 tables: one has 20 rows, the second has 300 rows, and the third has 2000+ rows. Since multiple people keep updating them (sometimes via the pipeline using Excel files and sometimes via queries), I want to build a pipeline that creates an Excel file with a current snapshot of the table data.
How small? You might have better luck with Power Automate. Worst case, you can use the Graph batch API to avoid RBAR (row-by-agonizing-row processing).