I am ingesting JSON data from an API as a source in Microsoft Fabric and handling pagination within a loop. The data is successfully parsed and written to a CSV file using the Copy Data activity. However, each time the loop runs, the previous data in the CSV file gets replaced instead of being appended.
How can I configure the Copy Data activity to append new data to the existing CSV file instead of overwriting it? Any guidance or best practices would be greatly appreciated.
Hello @bhavya5903
@NandanHegde is absolutely right. Here are some more details.
Appending data directly to an existing CSV file using the Copy Data activity in Microsoft Fabric is not natively supported. The Copy Data activity in Fabric writes to a destination in either overwrite or create mode, and it does not support appending to an existing file.
Why Appending Is Not Supported
The Copy Data activity overwrites the file in the sink by default because it treats each execution as a new write operation. CSV files are not structured like databases, so they lack native support for incremental updates or appends within the same file.
Workarounds to Achieve Appending
If you need to append data to a CSV file, you can use one of the following approaches:
1. Use a Staging File and Merge
• Write each iteration of data into separate files (e.g., `file1.csv`, `file2.csv`).
• At the end of the loop or pipeline execution, use a script or another process (e.g., a notebook or Dataflow) to merge all these files into a single CSV file (a minimal notebook sketch follows below).
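For the merge step itself, a small Fabric notebook is usually enough. Here is a minimal PySpark sketch, assuming the loop drops its page files under a `Files/staging/` folder of the attached Lakehouse; the folder names and the header option are placeholders, so adjust them to your pipeline:

```python
# Minimal merge sketch for a Fabric notebook (PySpark).
# Assumes the pagination loop wrote its pages as CSVs under Files/staging/
# in the default Lakehouse; paths and options are illustrative.
staged = (
    spark.read
    .option("header", "true")
    .csv("Files/staging/*.csv")        # read all staged page files at once
)

# coalesce(1) produces a single part file; Spark still writes it inside a
# folder, so rename or move it afterwards if you need one flat CSV file.
(
    staged.coalesce(1)
    .write
    .mode("overwrite")
    .option("header", "true")
    .csv("Files/merged")
)
```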
A better approach would be to use Dataflow Gen2: ingest the JSON, parse or flatten it, and configure a Delta table as the destination with the "Append" update method.
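If you prefer to stay in a notebook rather than Dataflow Gen2, the same "Append" behaviour can be sketched roughly like this; the table name `api_pages` and the staging path are assumptions for illustration, not something taken from your pipeline:

```python
# Rough notebook equivalent of a Dataflow Gen2 destination in Append mode.
# One page of parsed JSON is loaded and appended to a Delta table; the
# file path and table name are illustrative only.
page_df = (
    spark.read
    .option("multiline", "true")
    .json("Files/staging/page_001.json")
)

# Each pagination iteration adds its rows to the same Delta table instead
# of overwriting a CSV file in the sink.
page_df.write.format("delta").mode("append").saveAsTable("api_pages")
```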
Hope this helps.
If this was helpful, please accept the answer and give kudos.
Files can be merged using the Copy activity as well (for example, with a wildcard source path and the "Merge files" copy behavior on the sink):
https://www.sqlservercentral.com/articles/merge-multiple-files-in-azure-data-factory
So there is no need for Dataflow Gen2 or custom code specifically.
Hi @bhavya5903,
We wanted to check in as we haven't heard back from you. Did our solution work for you? If you need any more help, please don't hesitate to ask. Your feedback is very important to us. We hope to hear from you soon.
Thank you.
Hi @bhavya5903,
Thank you for reaching out to the Microsoft Fabric Community with your question. The solution provided by @NandanHegde is correct. Unfortunately, the Copy Data activity does not support directly appending data to an existing CSV file. However, the workarounds suggested by @NandanHegde might help address your issue.
A special thank you to @NandanHegde for their valuable contribution.
If any post helps, please consider accepting it as the solution so that other members can find it more quickly.
Yes, it is 🙂
It is not possible to append data directly.
You need to parameterize your destination file name to make it dynamic, so that each pagination loop iteration generates a new file.
Then you can merge all the files into a single file and get rid of the individual ones (a rough cleanup sketch follows below).
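If that cleanup runs in a notebook, something along these lines can work. This is only a sketch assuming the individual page files sit under `Files/staging/` in the attached Lakehouse; verify the paths and the utilities available in your environment before deleting anything:

```python
# Cleanup sketch for a Fabric notebook: remove the per-page CSV files once
# they have been merged. The folder name and .csv filter are assumptions.
from notebookutils import mssparkutils

staging_folder = "Files/staging"
for f in mssparkutils.fs.ls(staging_folder):   # list the staged page files
    if f.name.endswith(".csv"):
        mssparkutils.fs.rm(f.path)             # delete each individual file
```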