Hello everyone
I'm trying to map a Dataverse lookup column in a pipeline Copy data activity, similar to how it is explained in this YouTube video:
https://www.youtube.com/watch?v=m24yu-dwK8Q
Except that I'm trying to use a pipeline Copy data activity instead of a Power Platform dataflow. I've tried it in a dataflow and that seems to work, but dataflows can't be integrated with Fabric pipelines, and I need it in a pipeline because pipelines are supported in Fabric deployment pipelines.
So in my Fabric pipeline activity it looks like:
But then I get this error:
Error
Operation on target Copy_oml_eenheid failed: ErrorCode=DynamicsOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Dynamics operation failed with error code: -2147217149, error message: 'oml_eenheid' entity doesn't contain attribute with Name = 'oml_complex.oml_complexnummer' and NameMapping = 'Logical'..,Source=Microsoft.DataTransfer.ClientLibrary.DynamicsPlugin,''Type=System.ServiceModel.FaultException`1[[Microsoft.Xrm.Sdk.OrganizationServiceFault, Microsoft.Xrm.Sdk, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35]],Message=The creator of this fault did not specify a Reason.,Source=Microsoft.DataTransfer.ClientLibrary.DynamicsPlugin,'
And in my Power Platform dataflow it looks like this:
And that just works. So I'm guessing I need to enter something other than 'oml_complex.oml_complexnummer', but I'm not sure what.
I could use some help so thank you kindly in advance.
Hi @dibeau
Thanks for using Fabric Community.
At this time, we are reaching out to the internal team to get some help on this. We will update you once we hear back from them.
Thanks
Hi @dibeau
The internal team replied as follows:
For the pipeline copy, in addition to mapping the related record's GUID to the lookup field, also add a second value to the source query that holds the lookup field's associated entity name (e.g. 'contact', 'incident', or 'my_customentity'). Map this new entity string by adding a custom destination field with the name of the lookup field + '@EntityReference'.
In this example I'm passing the parentcustomerid GUID to parentcustomerid in the destination, AND I'm also passing 'account' over as parentcustomerid@EntityReference to identify the related record type.
Hope this helps. Please let me know if you have any further questions.
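To make that mapping more concrete, here is a minimal PySpark sketch of how the source table could be prepared in a Fabric notebook, using the parentcustomerid/account example from the reply above. The table and column names are hypothetical, and the actual link to 'parentcustomerid@EntityReference' is still configured in the copy activity's column mapping, not in the code:

```python
from pyspark.sql import functions as F

# 'spark' is the SparkSession provided by the Fabric notebook runtime.
# Hypothetical source table that already contains the related record's GUID
# in a column to be mapped to the lookup field (names are illustrative only).
src = spark.read.table("staging_contacts")

# Add a constant column holding the related entity's logical name ('account').
# In the copy activity's column mapping, this column is mapped to a custom
# destination field named 'parentcustomerid@EntityReference'.
src_with_ref = src.withColumn("related_entity_name", F.lit("account"))

# Write the result back as the table the pipeline copy activity reads from.
src_with_ref.write.mode("overwrite").saveAsTable("staging_contacts_for_copy")
```

In the copy activity you would then map the GUID column to the lookup field itself and the constant column to the custom destination field ending in '@EntityReference'.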
Hi @Anonymous
I'm not quite sure what this means. My source (a Lakehouse delta table) does not have a GUID. Does that mean I need to add a column with GUIDs? If I just try to replicate what the internal team showed, I get the following error:
Operation on target Copy_oml_eenheid failed: ErrorCode=ParquetColumnNotFound,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column complexnummer@EntityReference does not exist in Parquet file.,Source=Microsoft.DataTransfer.Richfile.ParquetTransferPlugin,'
Kind regards
Hi @dibeau
Yes - you need to supply the GUID of the related record if you're trying to link to it. You would add that into the source query.
Hi @dibeau
You can refer to this video: Fabric Pipelines for Dataverse Part 5: Populate the Lookups (youtube.com)
We haven't heard from you since the last response and were just checking back to see if you have a resolution yet. Otherwise, please reply with more details and we will try to help.
Thanks
Hi @Anonymous
That did the trick! I'm super happy. It is working now. Thank you very much!
Little sidenote: in the example by Scott Sewell the source is an Azure SQL table. I'm using a Lakehouse delta table, so there is no way to add a SQL query because there is no "Use query" option. Instead, I created a new table with GUIDs and joined it in the ETL pipeline with PySpark in a notebook. It took me a bit of work, but it looks like it's working.
Kind regards
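For anyone wanting a rough outline of that notebook step, here is a sketch under the assumption that a separate mapping table with the related records' GUIDs already exists in the Lakehouse; all table and column names are illustrative, not taken from the original pipeline:

```python
from pyspark.sql import functions as F

# 'spark' is the SparkSession provided by the Fabric notebook runtime.
# Hypothetical tables: 'eenheid_source' holds the rows to load, keyed by a
# business column 'complexnummer'; 'complex_guids' maps each complexnummer to
# the GUID of the existing related record ('oml_complexid').
eenheid = spark.read.table("eenheid_source")
complex_guids = spark.read.table("complex_guids")

# Join so every row carries the related record's GUID, and add the constant
# entity name used for the '@EntityReference' destination mapping.
staged = (
    eenheid
    .join(complex_guids, on="complexnummer", how="left")
    .withColumn("related_entity_name", F.lit("oml_complex"))
)

# Persist as a Delta table for the copy activity to use as its source; in the
# copy activity, map the GUID column to the lookup field and
# related_entity_name to the custom destination field '<lookupfield>@EntityReference'.
staged.write.mode("overwrite").saveAsTable("eenheid_staged_for_copy")
```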
Hi @dibeau
Glad that your query got resolved. Please continue using Fabric Community for any help regarding your queries.
Hello everyone!
I want to know if there is a way to do this without the related GUID. I need to use the alternate key to sync with an external source.
We can do it with a dataflow, but we don't know how to do it with pipelines.
Thanks in advance
Anibal
Hi
Did you ever solve this? I've got the same issue and was wondering if you'd found a workaround.
Nope, I created a new GUID field in a staging area for all my entities. It allowed me to fill all the lookup fields properly.
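A hedged sketch of that staging-area idea for the alternate-key scenario, assuming a copy of the target Dataverse table (with its primary key GUID and the alternate key column) is available in the Lakehouse; names are again illustrative:

```python
from pyspark.sql import functions as F

# 'spark' is the SparkSession provided by the Fabric notebook runtime.
# Hypothetical: 'accounts_mirror' is a copy of the target Dataverse table in
# the Lakehouse with its primary key GUID ('accountid') and the alternate key
# column ('accountnumber') used by the external source.
source_rows = spark.read.table("external_source_rows")
accounts = spark.read.table("accounts_mirror").select("accountid", "accountnumber")

# Resolve the alternate key to the record GUID, then add the entity name for
# the '@EntityReference' destination field in the copy activity mapping.
resolved = (
    source_rows
    .join(accounts, on="accountnumber", how="left")
    .withColumn("related_entity_name", F.lit("account"))
)

resolved.write.mode("overwrite").saveAsTable("staging_with_guids")
```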