Hi Community,
I’m encountering an error in a Microsoft Fabric Data Factory pipeline that copies data from the Azure Databricks metastore to a Lakehouse:
ErrorCode=InvalidParameter,
Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,
Message=The value of the property 'uriString' is invalid: 'Value cannot be null. Parameter name: uriString'.,
Source='',
Type=System.ArgumentNullException,
Message=Value cannot be null. Parameter name: uriString,
Source=System,
Note: The connection to Databricks is successful, and I’m able to preview the data without any issues. Staging is also enabled in the pipeline.
Has anyone faced a similar issue? It seems like a required URI parameter is missing or incorrectly passed, but I couldn’t pinpoint exactly where this is happening.
Any guidance on possible causes or how to trace and resolve this would be highly appreciated.
@LituRout, please try the following action items and let me know the results.
The error means the pipeline is missing a required URI string, most likely the table path or link in the Lakehouse sink. This often happens when the copy activity attempts to auto-create the table but lacks the necessary metadata. Pre-defining the target table and validating the parameter values usually resolves the issue.
Hi @LituRout ,
Thanks for reaching out to the Microsoft Fabric community forum.
Could you please confirm if the copy activity writes to a Lakehouse table or a file in the Lakehouse?
Also, is the target table pre-created, or are you allowing the pipeline to auto-create it during the run?
Once we have that, we will be able to sort out the issue.
Best Regards,
Community Support Team
Hi @v-menakakota ,
It writes to Lakehouse Table.
The target table is not pre-created; I am allowing the pipeline to auto-create it during the run.
Hi @LituRout ,
The error message suggests that something is missing or not properly generated when the pipeline tries to create the table on its own. The Lakehouse sink expects certain details, such as the table name or folder path, and if those aren’t clearly defined it can lead to this kind of issue.
To help us narrow it down, could you try manually creating the target table in the Lakehouse before running the pipeline? This will help us check whether the problem is specifically related to the table being auto-created by the pipeline.
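For reference, a minimal sketch of pre-creating a Delta table from a Fabric notebook attached to the target Lakehouse; the table and column names below are placeholders, so match them to your actual source schema:

```sql
-- Run in a Fabric notebook %%sql cell attached to the target Lakehouse.
-- Table and column names are placeholders; replace them with your schema.
CREATE TABLE IF NOT EXISTS my_target_table (
    id BIGINT,
    amount DOUBLE,
    updated_at TIMESTAMP
) USING DELTA;
```

Once the table exists, select it explicitly as the table in the copy activity's Lakehouse sink rather than typing a new name, so the pipeline does not need to generate the table path itself.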
Best Regards,
Community Support Team
Hi @v-menakakota,
I tried manually creating the target table in both the Lakehouse and the Warehouse before running the pipeline. The pipeline still failed with the same error.
What does your copy job write to, i.e. is the source copied to a table or a file?
Have you switched off staging and tried the run without it, just to isolate where the issue is happening?
If you are copying into a table, how do you create the table?
Proud to be a Super User!