db042190
Post Prodigy

running pipeline in fabric only works if deactivated

Hi, I just got started on pipelines and Fabric. It seems that if I run the pipeline while it's activated, I get an error about fixing the activity, but running it deactivated works. I can tell by looking at a count of records in my lakehouse target. If I run what seems to be the by-product generated job, I think it works regardless of the activation status. Why doesn't it run when activated?

1 ACCEPTED SOLUTION

Hi @db042190 ,

 

When an activity is "deactivated," Fabric simply skips it. That means your pipeline appears to "run," but no Copy activity is actually executed. The reason it "runs successfully" is that the job that actually copied your data was NOT the pipeline; it was the stand-alone Copy Job that the Copy Data Assistant produced in the background. Your lakehouse got rows because the Copy Job runs independently, not because the pipeline ran the activity.
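You can see this in the pipeline's JSON definition: a deactivated activity carries a `state` of `Inactive`, plus an `onInactiveMarkAs` property that controls which status the skipped activity reports, which is why the pipeline run can still show as succeeded while copying nothing. A minimal sketch of such an activity entry (the activity name is illustrative, and the source/sink settings are omitted):

```json
{
  "name": "Copy data1",
  "type": "Copy",
  "state": "Inactive",
  "onInactiveMarkAs": "Succeeded",
  "typeProperties": { }
}
```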


Please try the following to fix the issue.

 

1. Build the pipeline manually. Create a new blank pipeline and add a Copy Data activity manually; not the Copy Job activity, but the regular Copy activity. Source: SQL DB / SQL Server. Destination: Lakehouse table.

 

Note: This works every time and requires no Copy Job definition, no extra connection, and no Job Definition object.

 

2. Delete everything produced by the assistant: the "Copy job" activity inside the pipeline, and the "Copy Job Definition" object under
Data Factory --> Jobs. Then create a normal Copy Data activity manually.


Note: When you run the Copy Job directly, it loads the data. When you run the pipeline, it tries to orchestrate the Copy Job, fails, and refuses to run unless the activity is deactivated.
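One way to confirm which item actually loads the data is to trigger the pipeline itself (not the Copy Job) on demand through the Fabric REST API and then check whether rows land. A minimal Python sketch, assuming the Job Scheduler "run on-demand item job" endpoint, a valid Microsoft Entra bearer token, and the workspace/item GUIDs (all placeholder values here are hypothetical):

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def run_job_url(workspace_id: str, item_id: str, job_type: str = "Pipeline") -> str:
    """Build the Fabric 'run on-demand item job' endpoint URL for an item."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}/items/{item_id}"
            f"/jobs/instances?jobType={job_type}")

def trigger_pipeline(workspace_id: str, item_id: str, token: str) -> int:
    """POST the run request; returns the HTTP status code."""
    req = urllib.request.Request(
        run_job_url(workspace_id, item_id),
        method="POST",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

A 202 Accepted response means the run was queued; the run's status can then be polled via the URL returned in the response's Location header.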

 

I hope this information helps. Please do let us know if you have any further queries.

 

Regards,

Dinesh


10 REPLIES
db042190
Post Prodigy

It's interesting that when I start with a blank canvas, I have none of these issues. The pipeline runs without deactivation. I suspect what I reported is a bug that occurs when one starts with the Copy Assistant.

I don't suggest using the Copy Assistant.
Build pipelines/copy jobs on your own.


--
Riccardo Perico
BI Architect @ Lucient Italia | Microsoft MVP

Blog | GitHub

If this post helps, then please consider accepting it as the solution to help other members find it more quickly.


db042190
Post Prodigy

itsreallyacopyjob.png

db042190
Post Prodigy

These are the steps I generally followed. In this last attempt, when I clicked the error, it appeared that the copy activity completely disappeared from the pipeline, or at least I was being presented with options as if I had just hit New. In previous attempts, when I came back into the service, at least the activity was still in the previous pipelines. In this iteration, though, the source and destination seem to have remained in the "copy job" that was generated, and the copy job ran successfully. It's as if the copy job isn't tethered to any pipeline but is rather just a job with copy behavior. I'll post 2 images separately. This isn't a flow as was previously suggested.

 

New item -> under Get Data, Pipeline -> Start with guidance, Copy Data Assistant ->
SQL Server as data source (entered server, db, and already had a Windows conn from prior pipelines) ->
chose my cust table after hitting Next ->
Next -> new Fabric item, Lakehouse -> workspace already populated, LH4 as name -> Create and Connect ->
full copy -> Next ->
map-to-destination specs were 1) table, 2) append, 3) source and dest table names were the same -> Next ->
Review and Save screen shows source and dest conns graphically and by name -> Save.
You see the activity (copy job 837) in the pipeline; hitting Save (diskette icon) for posterity once
more, GET SAVING ERROR: "connection is required". Copy job 1 did show up; in this iteration it appeared the activity disappeared from the pipeline.

 

copyjobran.png

db042190
Post Prodigy

This is what I saw after hitting Validate. Feels like a wild goose chase. MS's instructions were vague in this exercise. I don't even see how to go in and modify the source and destination, originally set to a SQL view and lakehouse table respectively.

errorwhenactivated3.png

v-dineshya
Community Support

Hi  @db042190,

Thank you for reaching out to the Microsoft Community Forum.

 

As you mentioned, your pipeline only runs successfully when it’s deactivated, but fails when it’s activated.


When you deactivate a pipeline in Fabric, it essentially stops enforcing certain validation rules and scheduling constraints. Running the pipeline manually in this state often bypasses checks related to triggers, dependencies, or incomplete configurations.

 

The “by-product generated job” you mentioned is likely the Dataflow or Notebook job created by the pipeline. These jobs can run independently because they don’t require pipeline-level validation, they just execute the underlying transformation logic.

 

Please try the following to fix the issue.

 

1. In Fabric, go to your pipeline and click Validate. This will show which activities are incomplete.

2. Verify linked services and datasets: ensure all sources and targets have proper connections and credentials.

3. If your pipeline uses parameters or scheduled triggers, make sure they are correctly set.

4. Check the error details; the error message should point to the specific activity that needs fixing. If possible, please share the error details; that will help us fix the issue.

 

I hope this information helps. Please do let us know if you have any further queries.

 

Regards,

Dinesh

 

 

 

 

 

Thank you, I'll post 3 images in 3 posts. At this point the copy job (whatever that was) is already deleted and is the subject of a different post. I'm following MS's instructions on this, but they seem vague on the first pipeline hands-on exercise. I can't imagine what that connection represents in an activity that has a source and dest.

errorwhenactivated1.png

 

 

errorwhenactivated2.png
