ljml61
New Member

Feed Lakehouse from external app

Hello,

I want to feed our Fabric Lakehouse from an on-premises ETL.

We are able to feed Azure Storage using a Shared Access Signature (SAS), but I can't find a way to build a SAS on a Fabric Lakehouse.

Is it possible?

 

More generally, what do you think is the best way to feed a lakehouse from an external app?

 

Thank you in advance.

1 ACCEPTED SOLUTION
Srisakthi
Super User

Hi @ljml61,

If you want to push the files to Azure Storage as well as to Fabric, you can leverage a shortcut. By using a shortcut, you avoid duplicating the files.
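If you want to automate the shortcut creation, there is also a REST endpoint for it. A rough sketch in Python, assuming the OneLake Shortcuts "Create Shortcut" API and a pre-created Fabric connection to the storage account (all GUIDs and names below are placeholders):

```python
# Rough sketch (assumptions: the Fabric "OneLake Shortcuts - Create Shortcut"
# REST endpoint, and an existing Fabric connection to the storage account).
# All GUIDs and names below are placeholders.
import requests

WORKSPACE_ID = "<workspace-guid>"
LAKEHOUSE_ID = "<lakehouse-item-guid>"
TOKEN = "<entra-id-access-token>"  # a valid Fabric API bearer token

resp = requests.post(
    f"https://api.fabric.microsoft.com/v1/workspaces/{WORKSPACE_ID}"
    f"/items/{LAKEHOUSE_ID}/shortcuts",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "Files",            # lakehouse folder that holds the shortcut
        "name": "ExternalLanding",  # shortcut name (placeholder)
        "target": {
            "adlsGen2": {
                "location": "https://mystorageacct.dfs.core.windows.net",
                "subpath": "/mycontainer/landing",
                "connectionId": "<fabric-connection-guid>",
            }
        },
    },
    timeout=30,
)
resp.raise_for_status()  # expect 201 Created on success
```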

 

If your source data is in an on-premises SQL Server, you can use a data pipeline with an on-premises data gateway.

 

Regards,

Srisakthi


4 REPLIES
Anonymous
Not applicable

Hi @ljml61,

 

We would like to follow up to see whether the solution provided by the super user resolved your issue. Please let us know if you need any further assistance.
If the super user's response resolved your issue, please mark it as "Accept as solution" and click "Yes" if you found it helpful.

 

Regards,
Vinay Pabbu


ljml61
New Member

Hi,

 

Thank you for your response.

Our ETL uses azcopy.exe to push the files into Azure Storage.

Do you think using an API is better?

 

Actually, we must keep pushing files from on-premises to Fabric, so using Spark is not in scope.
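For reference, here is roughly what our azcopy step would look like pointed at OneLake instead of the storage account. This is only a sketch: the workspace and lakehouse names are placeholders, and the OneLake endpoint plus the --trusted-microsoft-suffixes flag come from the OneLake-with-AzCopy guidance, which we have not verified yet.

```python
# Sketch: driving azcopy.exe against OneLake from the ETL (Python wrapper).
# Assumes a prior `azcopy login` with an identity that has write access to
# the Fabric workspace. Endpoint layout and the trusted-suffixes value
# should be double-checked against current OneLake documentation.
import subprocess

# OneLake Blob-compatible endpoint; workspace/lakehouse names are placeholders.
dest = ("https://onelake.blob.fabric.microsoft.com/"
        "MyWorkspace/MyLakehouse.Lakehouse/Files/landing/")

subprocess.run(
    [
        "azcopy.exe", "copy",
        r"C:\etl\output\*.csv",  # files produced by the on-prem ETL
        dest,
        # Allow azcopy to present its Entra ID token to the OneLake domain.
        "--trusted-microsoft-suffixes=onelake.blob.fabric.microsoft.com",
    ],
    check=True,
)
```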

 

Regards

Srisakthi
Super User

Hi @ljml61,

 

Are you pushing data from the external application to Azure Storage via APIs? If so, you can push the files to OneLake in the same way, but there is no SAS key here; OneLake authenticates with Microsoft Entra ID instead.
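A minimal sketch of that API route, assuming the azure-identity and azure-storage-file-datalake Python packages (workspace and lakehouse names are placeholders):

```python
# Minimal sketch: upload a local file to a Lakehouse through OneLake's
# ADLS Gen2-compatible endpoint. Requires the azure-identity and
# azure-storage-file-datalake packages. Note: Entra ID auth, no SAS or
# account key.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    "https://onelake.dfs.fabric.microsoft.com",
    credential=DefaultAzureCredential(),
)

# In OneLake, the workspace takes the place of the ADLS "file system".
fs = service.get_file_system_client("MyWorkspace")

file_client = fs.get_file_client("MyLakehouse.Lakehouse/Files/landing/2024-01.csv")
with open("2024-01.csv", "rb") as data:
    file_client.upload_data(data, overwrite=True)
```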

Another way is to use a notebook with Spark libraries to connect to your external source, fetch the data, and load it into OneLake.

 

Regards,

Srisakthi
