YoussefBOUKARAM
New Member

Not able to use a parameter for a Lakehouse in a Fabric Pipeline

Details

I am using Azure Data Factory to configure a pipeline with a Copy data activity. When I select the lakehouse directly, it works fine. However, when I try to set a pipeline parameter and use it for the lakehouse in the Copy data activity, it doesn't work. I am putting the correct name in the parameter's value, but I am getting an error that says:

ErrorCode=AdlsGen2OperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ADLS Gen2 operation failed for: Operation returned an invalid status code 'BadRequest'. Account: ''. FileSystem: '3fb7192f-f276-49d1-a0c7-01047a2d313d'. Path: 'DemoAcquireEmployeeDataLakehouse/Tables/EmployeesWorldWide/_delta_log'. ErrorCode: 'BadRequest'. Message: 'Bad Request'. TimeStamp: 'Mon, 30 Oct 2023 10:21:56 GMT'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'BadRequest',Source=Microsoft.DataTransfer.ClientLibrary,'

How can I use a parameter for a lakehouse in a Fabric Pipeline without getting this error?

1 ACCEPTED SOLUTION

You'll need to use the GUID of the Lakehouse instead of its name

 

[Screenshot: AndyDDC_0-1698675155176.png]

 

You can get that from the URL when you browse the lakehouse itself 
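For reference, the lakehouse URL typically looks something like https://app.fabric.microsoft.com/groups/<workspace-guid>/lakehouses/<lakehouse-guid>, and it is that final GUID segment that should go into the pipeline parameter (the exact URL shape can vary slightly depending on tenant and experience).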

[Screenshot: AndyDDC_1-1698675194782.png]
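As a side note, if you would rather look the GUID up programmatically than copy it out of the browser, a minimal sketch against the Fabric REST API's List Lakehouses endpoint might look like the following. The workspace GUID, token, and response handling here are assumptions to adapt, not part of the original answer:

# Hypothetical sketch: resolve a lakehouse's GUID from its display name via the
# Fabric REST API (List Lakehouses). WORKSPACE_ID and TOKEN are placeholders you
# must supply; the response is assumed to be a JSON object with a "value" array.
import requests

WORKSPACE_ID = "<your-workspace-guid>"      # assumption: the workspace hosting the lakehouse
TOKEN = "<your-fabric-api-bearer-token>"    # assumption: a valid Azure AD token for the Fabric API
LAKEHOUSE_NAME = "DemoAcquireEmployeeDataLakehouse"

url = f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}/lakehouses"
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# Print the GUID of the matching lakehouse; this is the value the pipeline
# parameter should carry instead of the display name.
for item in resp.json().get("value", []):
    if item.get("displayName") == LAKEHOUSE_NAME:
        print(item["id"])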

 


4 REPLIES
YoussefBOUKARAM
New Member

Hello Andy,

 

Thank you for the reply.

- I am using Data Factory / Data Pipelines within Fabric itself.

- All the other parameters in the pipeline are working fine; only the lakehouse one is not working.

I was copying into the lakehouse using the following configuration, and it works fine:

[Screenshot: YoussefBOUKARAM_0-1698673905968.png]

 

When I switch to using parameters, it does not work:

[Screenshot: YoussefBOUKARAM_1-1698674215191.png]

[Screenshot: YoussefBOUKARAM_2-1698674229553.png]

All the other parameters are working fine; even the EmployeesWorldWideTableName parameter works, just not the lakehouse one. This is the name of my lakehouse: DemoAcquireEmployeeDataLakehouse
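(For context on the setup: the destination lakehouse in the Copy data activity was bound to a pipeline parameter through dynamic content, i.e. an expression along the lines of @pipeline().parameters.DataLakeName. The parameter name here is only illustrative; the actual names are in the screenshots above.)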

 

 


 

Thank you Andy, it worked

AndyDDC
Solution Sage

Hi, a couple of questions:

  • Are you using Data Factory/Data Pipelines within Fabric itself, or the separate Azure Data Factory service?
  • How are you configuring the parameterized connection to the lakehouse? Do you have screenshots?
