smpa01
Super User

Activity Statistics

How do you read Activity Statistics?

 

E.g., this is the before and after of a Dataflow Gen2 refresh:

 

[screenshot: smpa01_0-1746034713226.png]

 

Between V1 and V2, it:

dropped 2 rows: [{doc_1, 2, 3000}, {doc_2, 112211, 4778}]

updated 1 row: [{doc_1, 1, 1000}] -> [{doc_1, 1, -1000}]

inserted 2 rows: [{doc_2, 478, 900}, {doc_2, 584, 65}]

 

and now, the log shows me this:

[screenshot: smpa01_1-1746034998696.png]

How do you read this? Thank you in advance.
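(Editorial aside: the row-level diff described above can be reproduced with a small Python sketch. The keying by (doc, id) with a single value column is inferred from the example tuples, and the extra unchanged row is invented purely for illustration.)

```python
# Hypothetical sketch reproducing the V1 -> V2 diff described above.
# Rows are keyed by (doc, id) with one value column; the unchanged row
# ("doc_2", 999) is invented here purely for illustration.
v1 = {("doc_1", 1): 1000, ("doc_1", 2): 3000,
      ("doc_2", 112211): 4778, ("doc_2", 999): 50}
v2 = {("doc_1", 1): -1000,                       # updated
      ("doc_2", 999): 50,                        # unchanged
      ("doc_2", 478): 900, ("doc_2", 584): 65}   # inserted

dropped  = sorted(k for k in v1 if k not in v2)
inserted = sorted(k for k in v2 if k not in v1)
updated  = sorted(k for k in v1.keys() & v2.keys() if v1[k] != v2[k])

print("dropped:", dropped)    # 2 rows
print("updated:", updated)    # 1 row
print("inserted:", inserted)  # 2 rows
```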


Edited by admin: Removing tagged users

Did I answer your question? Mark my post as a solution!
Proud to be a Super User!
My custom visualization projects
Plotting Live Sound: Viz1
Beautiful News:Viz1, Viz2, Viz3
Visual Capitalist: Working Hrs
1 ACCEPTED SOLUTION
v-csrikanth
Community Support

Hi @smpa01 
Thanks for joining the Microsoft Fabric Community conversation. Sorry for the late response.
Please check the points below to resolve your issue.

  • Ensure that your dataflow has the correct transformation logic that commits the changes (inserts, updates, or upserts) back to the target system.

  • Run the dataflow in debug mode to check if the changes are applied correctly during execution and look for any issues or warnings related to writing data back.

  • Confirm that the service or account running the dataflow has the required permissions to write data to the destination database/storage.

  • Ensure that the data is being written to the database in the DEV environment. The log shows that 0 rows were written, indicating that data isn't being committed. Double-check your dataflow’s output destination configuration.

  • If needed, manually trigger the write operation for your dataflow and ensure that the statistics show the changes being committed.

  • Ensure that the changes in the DEV environment are correctly propagated to PROD. Verify that the same dataflow logic is applied in both environments, and the necessary steps are followed to move data from DEV to PROD.
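As a minimal illustration of these checks, here is a sketch that scans activity statistics for endpoints that read data but wrote no rows; the dictionary shape mirrors the numbers in the log screenshot and is an assumption, not the actual Fabric log format:

```python
# Hypothetical activity-statistics check: flag endpoints that read data
# but wrote zero rows, suggesting the changes were never committed.
# The dict shape is an assumption, not the real Fabric log format.
stats = {
    "Lakehouse":  {"bytes_read": 83_658,     "rows_read": 0, "rows_written": 0},
    "SharePoint": {"bytes_read": 18_638_242, "rows_read": 0, "rows_written": 0},
    "SQL":        {"bytes_read": 0,          "rows_read": 2, "rows_written": 0},
}

def endpoints_without_writes(stats):
    """Endpoints that read something yet wrote no rows."""
    return [name for name, s in stats.items()
            if (s["bytes_read"] > 0 or s["rows_read"] > 0)
            and s["rows_written"] == 0]

print(endpoints_without_writes(stats))  # all three endpoints are flagged
```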

If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.

 


6 REPLIES
v-csrikanth
Community Support

Hi @smpa01 

I wanted to follow up since I haven't heard from you in a while. Have you had a chance to try the suggested solutions?
If your issue is resolved, please consider marking the post as solved. However, if you're still facing challenges, feel free to share the details, and we'll be happy to assist you further.
Looking forward to your response!

Best Regards,
Community Support Team _ C Srikanth.

v-csrikanth
Community Support

Hi @smpa01 

We haven't heard from you since the last response and just wanted to check whether the provided solution worked for you. If it did, please mark it as Accepted Solution to help others in the community.
Thank you.

If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.

v-csrikanth
Community Support

Hi @smpa01 

It's been a while since I heard back from you and I wanted to follow up. Have you had a chance to try the solutions that have been offered?
If the issue has been resolved, can you mark the post as resolved? If you're still experiencing challenges, please feel free to let us know and we'll be happy to continue to help!
Looking forward to your reply!

Best Regards,
Community Support Team _ C Srikanth.

v-csrikanth
Community Support

Hi @smpa01  
How to read Activity Statistics:

Status: shows whether the activity succeeded or failed.

Start time / End time / Duration: tell you when the job started, when it ended, and how long it took.

  1. Start time: 4/30/2025, 1:32:53 PM
  2. End time: 4/30/2025, 1:32:59 PM
  3. Duration: 00:00:05 (5 seconds)
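The duration is simply the end time minus the start time; parsing the displayed timestamps with the Python standard library illustrates this (the displayed timestamps are second-granular, so the result can differ slightly from the log's rounded duration):

```python
# Duration = end time - start time, parsed with the stdlib.
# Note: the displayed timestamps are second-granular, so the computed
# 6 s can differ slightly from the log's rounded 00:00:05.
from datetime import datetime

fmt = "%m/%d/%Y, %I:%M:%S %p"
start = datetime.strptime("4/30/2025, 1:32:53 PM", fmt)
end   = datetime.strptime("4/30/2025, 1:32:59 PM", fmt)

duration = end - start
print(duration)  # 0:00:06
```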

Activity Statistics Table:

This table breaks down the data movement for each endpoint (Lakehouse, SharePoint, SQL):

Endpoint     Bytes read   Rows read   Bytes written   Rows written
Lakehouse        83,658           0          39,538              0
SharePoint   18,638,242           0               0              0
SQL                   0           2               0              0

Bytes read: How much data (in bytes) was read from this endpoint.

Rows read: How many rows were read from this endpoint.

Bytes written: How much data was written to this endpoint.

Rows written: How many rows were written to this endpoint.

In your example:

  1. Lakehouse:
    • 83,658 bytes read, 0 rows read
    • 39,538 bytes written, 0 rows written
      (Suggests data was processed at the file or batch level, not row-by-row.)
  2. SharePoint:
    • 18,638,242 bytes read, 0 rows read
    • 0 bytes/rows written
      (A large file was read from SharePoint, but not written anywhere in this step.)
  3. SQL:
    • 0 bytes read, 2 rows read
    • 0 bytes/rows written
      (Only 2 rows were read from SQL, nothing was written.)

Your V1 and V2 screenshots show how the data changed as a result of the refresh.

  • Dropped rows: Rows that existed before but were removed after refresh.
  • Updated rows: Rows where some values changed.
  • Inserted rows: New rows that appeared after refresh.


If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.

 

@v-csrikanth 

Is there a connection between the following? If yes, I am unable to make it (I thought Activity Statistics would provide visual verification of the UPSERT in DEV before putting elements on PROD).

this

dropped 2 rows (-2)

updated 1 row

inserted 2 rows (+2)

and

that

SQL:

  • 0 bytes read, 2 rows read
  • 0 bytes/rows written
    (Only 2 rows were read from SQL, nothing was written.)
