How do you read Activity Statistics?
For example, here is the before and after of a Dataflow Gen2 refresh.
Between V1 and V2, it:
- dropped 2 rows: [{doc_1, 2, 3000}, {doc_2, 112211, 4778}]
- updated 1 row: [{doc_1, 1, 1000}] -> [{doc_1, 1, -1000}]
- inserted 2 rows: [{doc_2, 478, 900}, {doc_2, 584, 65}]
and the log now shows me this.
How do you read this? Thank you in advance.
Hi @smpa01
Thanks for joining the Microsoft Fabric Community conversation. Sorry for the late response.
Please check the points below to resolve your issue:
- Ensure that your dataflow has the correct transformation logic that commits the changes (inserts, updates, or upserts) back to the target system.
- Run the dataflow in debug mode to check whether the changes are applied correctly during execution, and look for any issues or warnings related to writing data back.
- Confirm that the service or account running the dataflow has the required permissions to write data to the destination database/storage.
- Ensure that the data is actually being written to the database in the DEV environment. The log shows that 0 rows were written, which indicates the data isn't being committed. Double-check your dataflow's output destination configuration; one quick way to verify is to compare destination row counts before and after the refresh, as sketched below.
- If needed, manually trigger the write operation for your dataflow and confirm that the statistics show the changes being committed.
- Ensure that the changes in the DEV environment are correctly propagated to PROD. Verify that the same dataflow logic is applied in both environments and that the necessary steps are followed to move data from DEV to PROD.
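Here is a minimal sketch of that before/after row-count check, assuming the destination exposes a SQL endpoint reachable via pyodbc; the server, database, and dbo.target_table names are placeholders you would replace with your own:

```python
import pyodbc

# Hypothetical connection details: replace the server, database, and table
# names with your own. Requires the Microsoft ODBC Driver 18 for SQL Server.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=your-sql-endpoint.example.com;"
    "Database=your_database;"
    "Authentication=ActiveDirectoryInteractive;"
)

def row_count(table: str) -> int:
    """Return the current row count of the given destination table."""
    cursor = conn.cursor()
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

before = row_count("dbo.target_table")   # run before the refresh
# ... trigger the Dataflow Gen2 refresh here ...
after = row_count("dbo.target_table")    # run again after the refresh

# Note: with 2 rows dropped, 1 updated, and 2 inserted, the net count change
# is 0, so also spot-check the updated row's value, not just the count.
print(f"rows before: {before}, after: {after}, net change: {after - before}")
```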
If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
Hi @smpa01
I wanted to follow up since I haven't heard from you in a while. Have you had a chance to try the suggested solutions?
If your issue is resolved, please consider marking the post as solved. However, if you're still facing challenges, feel free to share the details, and we'll be happy to assist you further.
Looking forward to your response!
Best Regards,
Community Support Team _ C Srikanth.
Hi @smpa01
We haven't heard from you since your last response and just wanted to check whether the provided solution worked for you. If it did, please accept it as the solution so that others in the community can benefit.
Thank you.
Best Regards,
Community Support Team _ C Srikanth.
Hi @smpa01
It's been a while since I heard back from you and I wanted to follow up. Have you had a chance to try the solutions that have been offered?
If the issue has been resolved, can you mark the post as resolved? If you're still experiencing challenges, please feel free to let us know and we'll be happy to continue to help!
Looking forward to your reply!
Best Regards,
Community Support Team _ C Srikanth.
Hi @smpa01
How to read Activity Statistics:
- Status: shows whether the activity succeeded or failed.
- Start time / End time / Duration: tells you when the job started, when it ended, and how long it took.
- Activity Statistics table: breaks down the data movement for each endpoint (Lakehouse, SharePoint, SQL):
| Endpoint   | Bytes read | Rows read | Bytes written | Rows written |
| ---------- | ---------- | --------- | ------------- | ------------ |
| Lakehouse  | 83,658     | 0         | 39,538        | 0            |
| SharePoint | 18,638,242 | 0         | 0             | 0            |
| SQL        | 0          | 2         | 0             | 0            |
- Bytes read: how much data (in bytes) was read from this endpoint.
- Rows read: how many rows were read from this endpoint.
- Bytes written: how much data (in bytes) was written to this endpoint.
- Rows written: how many rows were written to this endpoint.
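To make these numbers easier to scan, you can flag endpoints that read data but wrote nothing back. Here is a minimal sketch, assuming you transcribe the values out of the refresh log by hand; the stats dictionary below simply mirrors the table above and is not a real API response:

```python
# Activity statistics transcribed by hand from the refresh log above;
# this mirrors the table and is not pulled from any real API.
stats = {
    "Lakehouse":  {"bytes_read": 83_658,     "rows_read": 0, "bytes_written": 39_538, "rows_written": 0},
    "SharePoint": {"bytes_read": 18_638_242, "rows_read": 0, "bytes_written": 0,      "rows_written": 0},
    "SQL":        {"bytes_read": 0,          "rows_read": 2, "bytes_written": 0,      "rows_written": 0},
}

for endpoint, s in stats.items():
    # An endpoint that read bytes but reports 0 rows written is worth a look:
    # it may mean the refresh staged data without committing it downstream.
    if s["bytes_read"] > 0 and s["rows_written"] == 0:
        print(f"{endpoint}: read {s['bytes_read']:,} bytes but wrote 0 rows")
```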
In your example, the V1 and V2 screenshots show how the data changed as a result of the refresh.
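For reference, here is one way to compute that V1-to-V2 delta yourself and cross-check it against the counts you expect. This is a minimal pandas sketch; the column names (doc, id, value) are assumptions invented to match the [{doc_1, 2, 3000}, ...] tuples quoted in the original post:

```python
import pandas as pd

# Hypothetical reconstruction of V1 and V2 from the rows quoted in the
# question; column names (doc, id, value) are assumed, since the post
# only shows bare tuples.
v1 = pd.DataFrame(
    [("doc_1", 1, 1000), ("doc_1", 2, 3000), ("doc_2", 112211, 4778)],
    columns=["doc", "id", "value"],
)
v2 = pd.DataFrame(
    [("doc_1", 1, -1000), ("doc_2", 478, 900), ("doc_2", 584, 65)],
    columns=["doc", "id", "value"],
)

key = ["doc", "id"]
merged = v1.merge(v2, on=key, how="outer", suffixes=("_v1", "_v2"), indicator=True)

dropped  = merged[merged["_merge"] == "left_only"]               # in V1 only
inserted = merged[merged["_merge"] == "right_only"]              # in V2 only
updated  = merged[(merged["_merge"] == "both")
                  & (merged["value_v1"] != merged["value_v2"])]  # key kept, value changed

# Matches the post: 2 dropped, 1 updated, 2 inserted.
print(len(dropped), len(updated), len(inserted))
```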
If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
Is there a connection between the following two? If yes, I am unable to make the connection (I thought Activity Statistics would provide a visual verification of the UPSERT on DEV before putting the elements on PROD).
This:
- dropped 2 rows (-2)
- updated 1 row
- inserted 2 rows (+2)
and that:
SQL: