
Write Data Factory Web activity output to Lakehouse File or Table

Please provide the simple ability to write data from the Data Factory "Web" activity output to Lakehouse File or Table.


We used the new "Cloud Connection" Service Principal feature to run a Web v2 API call. The output from the API call contained JSON. However, there did not appear to be a simple way to write this JSON to a Lakehouse file for archive purposes. I had intended to replace the previous JSON file and save a Gzip archive.


We were looking for something similar to the "Copy Data" destination, which allows selecting a Lakehouse file with JSON format and Gzip compression.


Unfortunately, there was no simple way to feed the API JSON data into the "Copy Data" source. The source was limited to Workspace and External connections, with no ability to select the output of a previous step.


We managed to find a workaround by adding a complex Notebook, which received the API JSON as a parameter input.
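For anyone hitting the same wall, the notebook workaround can be sketched roughly as below. This is a minimal illustration, not the exact notebook from the post: the function name `archive_json` and the example Lakehouse path are assumptions, and in Fabric the `json_text` parameter would be supplied by the pipeline (e.g. via the Notebook activity's base parameters, passing the Web activity output as a string).

```python
import gzip
import json
from pathlib import Path


def archive_json(json_text: str, dest_path: str) -> None:
    """Validate a JSON payload, then write it gzip-compressed to dest_path."""
    json.loads(json_text)  # fail fast if the Web activity returned malformed JSON
    Path(dest_path).parent.mkdir(parents=True, exist_ok=True)
    # "wt" writes text through the gzip stream, replacing any previous archive
    with gzip.open(dest_path, "wt", encoding="utf-8") as f:
        f.write(json_text)


# In a Fabric notebook, dest_path would point at the mounted Lakehouse,
# e.g. "/lakehouse/default/Files/archive/response.json.gz" (hypothetical path).
```

Having a native "write to Lakehouse file" option on the activity output would remove the need for this extra notebook entirely.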





Status: Needs Votes
Comments
fbcideas_migusr
New Member

Voted. I think this should apply to any data source. Basically, if we have some data in our data pipeline (e.g. the output of a Web activity, Get Metadata activity, Lookup, a variable, etc.), it should be possible to write that data to a file.

fbcideas_migusr
New Member
Status changed to: Needs Votes