Swapnil_CR
New Member

Fabric to Dataverse - Dataflow Alternative

Hi,

I have been using a Dataflow in Power Apps to move data from a Fabric Lakehouse to a Dataverse table. This is also referred to in other posts as Reverse Integration.

I am fully aware that Dataflow does the job, but I am facing long runs, which usually take ~3 minutes to complete.


Keeping that in mind, I need to find a more reliable and consistent approach than Dataflow.

 

My requirements:

1. Scheduling 4 to 6 times a day.

2. Merge capability based on ID

3. Lookup columns should also be mappable.

 

I have tested:

1. Dataflow (inconsistent run times).

2. Copy job from Fabric (does not allow mapping for lookup columns, even when I convert the mappings to the lookup's underlying value).

 

Please let me know if I am missing anything. Thanks in advance.

 

2 ACCEPTED SOLUTIONS
AntoineW
Responsive Resident

Hello @Swapnil_CR,

 

To my understanding, Dataflow is currently the only native option that fully supports mapping to lookup columns. You can try to optimize it by:

  • Using incremental refresh (if possible) to reduce the number of rows processed each run.

  • Splitting transformations so that heavy shaping happens in Fabric (e.g., using notebooks and storing gold data in a Lakehouse), while the Dataflow only handles the last step (merge + lookup mapping).
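The final "merge + lookup mapping" step the Dataflow is left with can be pictured as a plain upsert keyed on ID. A minimal illustrative sketch in Python (the `id` key and record fields are assumptions, not the real schema; the actual merge runs inside the Dataflow engine):

```python
# Minimal sketch of a "merge on ID" (upsert): rows from the gold layer
# update existing target records with the same ID, or are inserted as new.
def upsert_by_id(target: dict, gold_rows: list) -> dict:
    """Merge gold-layer rows into the target, keyed by 'id'."""
    for row in gold_rows:
        key = row["id"]
        if key in target:
            target[key].update(row)   # update the existing record
        else:
            target[key] = dict(row)   # insert a new record
    return target

# Example: id=1 gets updated, id=3 gets inserted, id=2 is untouched
target = {1: {"id": 1, "name": "old"}, 2: {"id": 2, "name": "keep"}}
gold = [{"id": 1, "name": "new"}, {"id": 3, "name": "added"}]
result = upsert_by_id(target, gold)
```

The point of the split is that `gold_rows` is already fully shaped by the notebook, so the Dataflow only does this cheap keyed merge instead of heavy transformations.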

 

Hope this helps!

Best regards,

Antoine

 


v-tsaipranay
Community Support

Hi @Swapnil_CR ,

Thank you for reaching out to the Microsoft Fabric community forum and sharing your scenario in detail.

 

Based on your requirements (running 4 to 6 scheduled refreshes per day, supporting merge/upsert based on ID, and mapping lookup columns), the only native option that fully meets these needs today is Dataflows, as they are the only mechanism that supports lookup column mapping into Dataverse, as mentioned by @AntoineW. While performance can be inconsistent, you can optimize it by enabling incremental refresh to reduce the number of rows processed each run, and by handling heavy transformations in Fabric (e.g., preparing a clean Gold layer in your Lakehouse or using notebooks), leaving the Dataflow responsible only for the final merge and lookup mappings. This approach helps improve reliability and run-time consistency.

Alternatives like Fabric Copy Jobs or Power Automate may help for simple insert/update scenarios, but they do not provide full support for lookup mapping, which makes Dataflow the most viable option for your case at present.
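For context on why lookup mapping is the sticking point for the alternatives: when writing to Dataverse through its Web API, lookup columns are set by binding a single-valued navigation property with `@odata.bind` rather than by writing a plain column value, and an upsert is a PATCH against an alternate key. A minimal sketch that only builds such a request, without sending it (the `accounts` table, `accountnumber` key, and `primarycontactid` lookup are hypothetical examples, not your schema):

```python
import json

def build_upsert(base_url: str, row: dict, contact_guid: str):
    """Build a Dataverse Web API upsert request for one row.

    A PATCH against an alternate key updates the record if it exists,
    or creates it if it does not (an upsert).
    """
    url = f"{base_url}/api/data/v9.2/accounts(accountnumber='{row['id']}')"
    payload = {
        "name": row["name"],
        # Lookup column: bound to the related record's GUID via @odata.bind,
        # instead of being written as an ordinary column value.
        "primarycontactid@odata.bind": f"/contacts({contact_guid})",
    }
    return url, json.dumps(payload)

url, body = build_upsert(
    "https://org.crm.dynamics.com",
    {"id": "A-001", "name": "Contoso"},
    "00000000-0000-0000-0000-000000000001",
)
```

Driving this yourself (e.g., from a notebook or a Power Automate HTTP action) is more work than a Dataflow, which is why the Dataflow remains the practical choice here.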

Hope this helps. Please feel free to reach out with any further questions.

Thank you.


