Ka13
Frequent Visitor

Copy Activity Fabric

How can I extract the rowsCopied and rowsRead values from the Copy activity output below and store them in Fabric Warehouse, so I can maintain a log table of rowsCopied and rowsRead there?


{
	"dataRead": 5480,
	"dataWritten": 5680,
	"filesWritten": 1,
	"sourcePeakConnections": 1,
	"sinkPeakConnections": 1,
	"rowsRead": 30,
	"rowsCopied": 30,
	"copyDuration": 24
}
1 ACCEPTED SOLUTION

tayloramy
Community Champion

Hi @Ka13

 

Let's go through this step by step. 

 

First, I created a metadata table in my warehouse: 

CREATE TABLE metadata (
    runid UNIQUEIDENTIFIER,
    rowsRead int,
    rowsCopied int,
    CopyDuration int
)

 

Next, I add a Script activity after the Copy activity:

[Screenshot: Script activity added after the Copy activity in the pipeline]

 

Next, in the expression builder, I will build the expression: 

 

@concat('INSERT INTO dbo.metadata (runid, rowsRead, rowsCopied, CopyDuration) VALUES
(''',pipeline().RunId,''', '''
,activity('Copy data1').output.rowsRead, ''','''
,activity('Copy data1').output.rowsCopied, ''','''
,activity('Copy data1').output.copyDuration, ''');' )

 

This runs successfully:

[Screenshot: Script activity run completed successfully]

 

 

And when I look at the warehouse, I see my run recorded:

[Screenshot: metadata table in the warehouse showing the logged run]
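One note on the expression above: if the expression builder complains that concat expects string parameters (behavior can vary between environments), a variant sketch that casts the numeric outputs with string() and drops the quotes around the int columns (same 'Copy data1' activity name assumed) is:

@concat('INSERT INTO dbo.metadata (runid, rowsRead, rowsCopied, CopyDuration) VALUES (''',
    pipeline().RunId, ''', ',
    string(activity('Copy data1').output.rowsRead), ', ',
    string(activity('Copy data1').output.rowsCopied), ', ',
    string(activity('Copy data1').output.copyDuration), ');')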

 

 

Hope this helps. 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.


@tayloramy Thanks for the response.

 

I tested the above; taking the output from the Copy activity and reading it in the Script activity works.

 

But I also have a Lookup activity that queries the Fabric Warehouse and returns a TableName, and I want to add that TableName to the log table as well. I tried adding the table name as below but get the error shown. Can you please suggest a fix?


Error - concat is primitive and does not support nested properties 

 

[Screenshot: expression including the table name and the resulting error]

 

tayloramy
Community Champion

Hi @Ka13

 


You will need to reference the Lookup activity the same way you reference the Copy data activity: activity('Lookup Query').output.XXXX. When you run the pipeline, after the Lookup succeeds, you can click on its output to see exactly what the lookup query returns.
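For example, with the lookup output shown elsewhere in this thread (a value array containing a single row with a TableName column), a sketch of the two usual reference forms, assuming the Lookup activity is named 'Lookup Query':

activity('Lookup Query').output.value[0].TableName
activity('Lookup Query').output.firstRow.TableName

The first form applies when 'First row only' is unchecked (the output exposes a value array); the second applies when it is checked (the output exposes a firstRow object).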

 

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.


@tayloramy Thanks, the above worked.

tayloramy
Community Champion

Hi @Ka13

 

Glad I was able to help. If I solved your problem, kindly mark my post as the solution to help others find it in the future.  

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

@tayloramy

Can you please suggest:


1. Below is the output of the Lookup activity. How do I reference the table name in the Script activity?

{
	"count": 1,
	"value": [
		{
			"id": 1,
			"TableName": "emp"
		}
	]
}


2. In the Script activity (which runs after the Copy activity), how do I reference the table name?

The INSERT below works fine, but if I pass the TableName I get an error:

-- Working INSERT query
@concat('INSERT INTO dbo.metadata (runid, rowsRead, rowsCopied, CopyDuration) VALUES (''',pipeline().RunId,''', ''' ,activity('Copy data1').output.rowsRead, ''',''' ,activity('Copy data1').output.rowsCopied, ''',''' ,activity('Copy data1').output.copyDuration, ''');' )


-- Failing INSERT query (adding the table name as below produces the error:
--    concat is primitive and does not support nested properties)
   
    @concat('INSERT INTO [dbo].[log] (RowsRead,TableName) VALUES (''' ,activity('Copy data1').output.rowsRead, ''',
  ''', item().TableName, '''');' )  

 

anilgavhane
Responsive Resident

@Ka13 

1. Create a Log Table in Fabric Warehouse

 

CREATE TABLE CopyActivityLog (
    ActivityDate DATETIME,
    RowsRead INT,
    RowsCopied INT
);

 

2. Use a Data Pipeline with Script Activity

  • After the Copy activity, add a Script activity in the pipeline.
  • Use dynamic content to pass rowsRead and rowsCopied from the Copy activity output.

3. Insert Values Using SQL Script

Example SQL script using dynamic content to reference the Copy activity output:

 

INSERT INTO CopyActivityLog (ActivityDate, RowsRead, RowsCopied) VALUES (GETDATE(), @{activity('CopyData').output.rowsRead}, @{activity('CopyData').output.rowsCopied});


 

 

Replace 'CopyData' with the actual name of your Copy activity.

 

4. Validate Pipeline Execution

  • Run the pipeline and check the log table.
  • Ensure the Script activity executes after the Copy activity.

@anilgavhane Thanks for the response.

 

I tried adding the Script activity after the Copy activity as below, but I am getting an error. It seems the Script dynamic content is not correct. Can you please suggest a fix?

 

[Screenshots: Script activity dynamic content and the resulting error]

 

 

 

 

 

tayloramy
Community Champion

Hi @Ka13

 

Here's my setup for this:

 

I have a pipeline with a copy data activity and an email activity. 

[Screenshot: pipeline with a Copy data activity followed by an email activity]

 

In the body of the email activity, I click "View in expression builder" to open up the expression builder: 

[Screenshot: email activity body with the 'View in expression builder' option]

And then for the expression, I put 

@concat('Rows Read ', activity('Copy data1').output.rowsRead)

 

This results in the following email: 

[Screenshot: resulting email showing the rowsRead value]

 

As you can see, the rowsRead output of 6 is added to the email. 

 

Here is the output from my copy data activity: 

{
	"dataRead": 6608,
	"dataWritten": 4572,
	"filesRead": 1,
	"filesWritten": 1,
	"sourcePeakConnections": 4,
	"sinkPeakConnections": 1,
	"rowsRead": 6,
	"rowsCopied": 6,
	"copyDuration": 14,
	"throughput": 1.101,
	"errors": [],
	"usedDataIntegrationUnits": 4,
	"usedParallelCopies": 1,
	"executionDetails": [
		{
			"source": {
				"type": "Lakehouse"
			},
			"sink": {
				"type": "Lakehouse"
			},
			"status": "Succeeded",
			"start": "11/4/2025, 12:10:42 PM",
			"duration": 14,
			"usedDataIntegrationUnits": 4,
			"usedParallelCopies": 1,
			"profile": {
				"queue": {
					"status": "Completed",
					"duration": 6
				},
				"transfer": {
					"status": "Completed",
					"duration": 6,
					"details": {
						"listingSource": {
							"type": "Lakehouse",
							"workingDuration": 0
						},
						"readingFromSource": {
							"type": "Lakehouse",
							"workingDuration": 0
						},
						"writingToSink": {
							"type": "Lakehouse",
							"workingDuration": 0
						}
					}
				}
			},
			"detailedDurations": {
				"queuingDuration": 6,
				"transferDuration": 6
			}
		}
	],
	"dataConsistencyVerification": {
		"VerificationResult": "NotVerified"
	}
}

 

Hope this helps. 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution.

tayloramy
Community Champion

Hi @Ka13

 

As mentioned in the Fabric Discord, you can add a script task that references the output of a previous step. 

Something along the lines of @activity('Copy-activity').output.rowsRead should work for you.

 

If you found this helpful, consider giving some Kudos. If I answered your question or solved your problem, mark this post as the solution. 


@tayloramy

Thanks for the response.

I tried adding the Script activity after the Copy activity as below, but I am getting an error. It seems the Script dynamic content is not correct. Can you please suggest a fix?

[Screenshots: Script activity dynamic content and the resulting error]

 

 


Vinodh247
Solution Sage

Use a dataflow or notebook after the Copy activity to extract the metrics from the output JSON. I have copied an example below (from the web, hope it shows you the path) of a notebook that reads the Copy activity output and stores the rowsRead and rowsCopied values into your Fabric warehouse log table.

 

import json
from notebookutils import mssparkutils

# Get Copy activity output
output = mssparkutils.pipeline.getOutput("CopyActivityName")
data = json.loads(output)

rows_read = data["rowsRead"]
rows_copied = data["rowsCopied"]

# Write to fabric warehouse tbl
spark.sql(f"""
INSERT INTO log_table (activity_name, rows_read, rows_copied, load_time)
VALUES ('CopyActivityName', {rows_read}, {rows_copied}, CURRENT_TIMESTAMP)
""")

Make sure to customize/modify it to suit your requirements.
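If reading the pipeline output from inside the notebook proves awkward, another sketch is to pass the metrics into the notebook as base parameters on the Notebook activity (hypothetical parameter names rows_read and rows_copied, set with dynamic content such as @activity('Copy data1').output.rowsRead) and write them out with Spark SQL, assuming the notebook can write to a log_table like the one above:

# Parameters cell: these defaults are overridden by the Notebook activity's base parameters
rows_read = 0
rows_copied = 0
activity_name = "Copy data1"

# Append one log row; the table and column names mirror the example above and are assumptions
spark.sql(f"""
INSERT INTO log_table (activity_name, rows_read, rows_copied, load_time)
VALUES ('{activity_name}', {int(rows_read)}, {int(rows_copied)}, CURRENT_TIMESTAMP)
""")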

 

 

 

Please 'Kudos' and 'Accept as Solution' if this answered your query.

Regards,
Vinodh
Microsoft MVP [Fabric]
