Hi,
Is there any way to ingest XML data from a custom endpoint into a Lakehouse?
Since XML is not a natively supported format, the messages get rejected when the destination is a Lakehouse. I don't need any parsing; saving the message content as a string in a single column is fine.
I can sort of pull this off using a KQL destination, but it splits each line of the XML string into a separate row in the destination table, which is a bit of an issue. Lakehouse is also the preferred output destination.
Hello
How about saving the raw XML output to the Files section of a Lakehouse and then using T-SQL (or another tool of your choice) to unfold or otherwise handle the data?
If you are querying a custom endpoint, it does not sound like a streaming data set. I would look into using a pipeline in Fabric: land the data in raw format in the Files folder of a Lakehouse, or, if you can, process the data in the Copy activity using an XML schema.
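Once the raw files are landed, a notebook can read each XML file unparsed into a one-column row, matching the "whole message as a single string" requirement. Below is a minimal plain-Python sketch of that idea; the folder path, file names, and function name are illustrative, not actual Fabric APIs. (In a Fabric Spark notebook, `spark.read.text(path, wholetext=True)` should achieve the same one-row-per-file result and can be written straight to a Lakehouse table.)

```python
# Hedged sketch: load each raw XML file as a single string "row",
# mirroring the "land raw files, parse later" pipeline approach.
# The temp directory below stands in for the Lakehouse Files area.
import pathlib
import tempfile

def xml_files_to_rows(folder: str) -> list[dict]:
    """Read every .xml file under `folder` unparsed, one row per file."""
    rows = []
    for path in sorted(pathlib.Path(folder).glob("*.xml")):
        rows.append({
            "file_name": path.name,
            # Entire document kept as one string column, no parsing.
            "xml_payload": path.read_text(encoding="utf-8"),
        })
    return rows

# Demo with a temporary folder standing in for /lakehouse/default/Files.
with tempfile.TemporaryDirectory() as d:
    sample = "<order><id>42</id><status>shipped</status></order>"
    (pathlib.Path(d) / "msg1.xml").write_text(sample, encoding="utf-8")
    rows = xml_files_to_rows(d)
    print(rows[0]["xml_payload"])
```

From there, appending the rows to a Delta table gives you the single-column string storage you described, and any parsing can happen later on demand.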
Cheers
Brian