Hi there,
I successfully called an API and saved a JSON file in the Files section of the Lakehouse.
Now, I am trying to read the file and unpack the datasets inside.
I'd love to use Dataflows or any other Power Query-like tool.
I did try Dataflows, but it forced me to go to OneDrive. I tried copying the OneLake link, but I keep getting errors.
Any idea what to do next?
Is a notebook my only way to get the data?
Thanks,
Martin
Hi,
You can do it using a pipeline and the Copy activity; you can even use the Copy assistant to help you build it.
Dataflows can't be used for this.
The pipeline needs to be in the same workspace as the lakehouse, and it will ask you whether your source is the "Files" or the "Tables" area. Dataflow Gen2, on the other hand, doesn't need to be in the same workspace, but it only accepts the "Tables" area as a source or destination.
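And if you do end up in a notebook after all, the code equivalent of that copy is only a couple of lines. A rough sketch (the file name and table name below are just placeholders for your own):

# Assumes the notebook is attached to the lakehouse, where `spark` is predefined.
# Read the JSON from the "Files" area and land it as a Delta table in "Tables".
df = spark.read.option("multiline", "true").json("Files/api_response.json")
df.write.format("delta").mode("overwrite").saveAsTable("api_response")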
Kind Regards,
Dennes
Any idea why it is not liking this?
Never mind... wrong order. It's actually different from the documentation online.
Hey!
Have you tried using the Lakehouse connector and the "Files" section inside of it to access the files from your Lakehouse?
If not, please give it a try and let us know if it works for you.
Thanks, Dennes.
It looks like you can't use Dataflows in the Files section. I was hoping to use them, since it'd be easier to read all the sections of the JSON file at once.
Now I have to do a copy per section.
Thanks!
Martin
Thanks for your tips.
I was now able to use a Dataflow to access my Files section. I didn't realize there was a Lakehouse connector; I was going straight to the JSON connector instead.
I also overlooked that the Get Data section has a OneLake data hub (such a newbie).
Now, on to the next issue: you can't pass parameters to a Dataflow yet!
I guess you can save the data somewhere and bring it in as a dataset, but parameters just aren't supported in Fabric Dataflows yet. See https://community.fabric.microsoft.com/t5/Data-Pipelines/Pass-data-pipeline-parameter-to-dataflow-ge...
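In case anyone hits the same wall, the workaround I mean is writing the "parameter" to a small lakehouse table from a notebook (or a pipeline) and having the dataflow read that table. A rough sketch, with made-up table and column names:

# Made-up example: persist the value you wanted to pass as a one-row table
# that the dataflow can read, since dataflows can't take parameters directly.
param_df = spark.createDataFrame([("2024-01-31", "sales")], ["run_date", "section"])
param_df.write.format("delta").mode("overwrite").saveAsTable("dataflow_params")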
So I am back to building an ADF-style copy per section.
But now I have to learn how to create a table in Delta Lake. So far, I haven't been able to. I tried SQL commands, but CREATE isn't available yet.
Hi,
You can use Spark SQL in a notebook to create a table in the lakehouse.
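A minimal sketch (the table and column names are just placeholders):

# Run in a Fabric notebook attached to the lakehouse; `spark` is predefined.
spark.sql("""
    CREATE TABLE IF NOT EXISTS api_sections (
        id INT,
        name STRING,
        amount DOUBLE
    )
    USING DELTA
""")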
Kind Regards,
Dennes