This is my situation:
I used ADF to get a JSON file from an API.
I placed the file in my new Lakehouse files section.
Now I am trying to read the JSON file so I can start getting the tables I wanted and place them in the Tables section.
Unfortunately, I am getting an Unexpected error and I can't find the way around it.
Am I doing something wrong here?
Thanks!
Martin
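For reference, the kind of multi-section JSON described above can be sketched in plain Python. The section and field names below are made up for illustration; the real API response will differ:

```python
import json

# A minimal sketch of a multi-section JSON payload an API might return.
# The structure below is an assumption, not Martin's actual file.
raw = '''
{
  "customers": [{"id": 1, "name": "Ada"}],
  "orders":    [{"id": 10, "customer_id": 1, "total": 25.0}]
}
'''

doc = json.loads(raw)

# Each top-level key is a candidate table for the Lakehouse "Tables" section.
sections = sorted(doc.keys())
print(sections)  # -> ['customers', 'orders']
```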
Solved!
Hi,
This image is the connection window from a Dataflow Gen 2, right?
Could you try this using a pipeline in the same workspace as the data lake?
The connection process is easier and it can get the data from "Files" to "Tables".
As for the exact error you are getting, I don't know the root cause, but if the pipeline solves your problem, you won't need to face this error.
Kind Regards,
Dennes
P.S.: The other time I faced this error, it was related to an outdated data gateway, and an update solved it. But I know that makes no sense in your scenario.
Hi Martin,
You can definitely use Dataflows Gen 2 to connect to your files in the Lakehouse; there is no need to use Pipelines instead.
If you use the Lakehouse connector in Dataflows, you can simply browse the Files section of your Lakehouse and select the files you want to connect to.
If you keep experiencing an unexpected error, please open a ticket so we can investigate.
Hi,
You are right. I double-checked and noticed the access to the files is "hidden" in such a way that I believed it was not possible.
This is the image from the connector:
The way the "Files" folder sits inside a "database" icon, closed by default, makes it seem like it's not possible to access it. But it is.
There is another important detail (which made me miss it): "Files" is only accessible as a source. As a destination, it's not accessible. The image below illustrates the Lakehouse as a destination, with only the "Tables" area available and no "Files" folder:
Kind Regards,
Dennes
Hey!
Have you tried using the Lakehouse connector and its Files section to connect to the files inside it?
If not, please give it a try and let us know if that works for you.
Thanks Dennes.
It looks like you can't use Dataflows in the files section. I was hoping to use it since it'd be easier to read all the sections of the JSON file at once.
Now I have to do a copy per section.
Thanks!
Martin
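The "copy per section" workaround described above can be sketched in plain Python: each top-level section of the payload is split into its own set of rows, one copy per future table. The payload below is hypothetical:

```python
# Hypothetical multi-section payload (section and field names are
# illustrative only, not from the actual API).
payload = {
    "products": [{"sku": "A1", "price": 9.99}],
    "stores":   [{"id": 7, "city": "Oslo"}],
}

# One "copy per section": each top-level section becomes its own list
# of rows, ready to be landed as a separate table in the Lakehouse.
copies = {name: list(rows) for name, rows in payload.items()}

for name, rows in copies.items():
    print(f"{name}: {len(rows)} row(s)")
```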
Hi,
Yes, in this case I believe you are facing a current limitation.
If the copy behaviour/flatten hierarchy doesn't help, you might consider using an Azure Data Factory data flow. I believe it will be able to connect to the Files area of a Lakehouse, although I haven't tested it.
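To make "flatten hierarchy" concrete, here is a minimal sketch of the general idea: collapsing a nested JSON record into a single flat row with dotted column names. The input record is a made-up example, not the actual file:

```python
# Collapse nested objects into one flat row, joining key paths with dots.
# This illustrates the general "flatten hierarchy" idea; the record used
# here is hypothetical.
def flatten(record, prefix=""):
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

row = flatten({"id": 1, "address": {"city": "Oslo", "zip": "0150"}})
print(row)  # -> {'id': 1, 'address.city': 'Oslo', 'address.zip': '0150'}
```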
Kind Regards,
Dennes