mrojze
Helper I

Loaded a JSON file in OneLake... Now what?

Hi there,

I successfully called an API and saved a JSON file in the Files section of the Lakehouse.

Now, I am trying to read the file and unpack the datasets inside.

I'd love to use Dataflows or any other Power Query-like tool.

I did try Dataflows, but it forced me to go to OneDrive. I tried to copy the OneLake link, but I keep getting errors.

Any idea what to do next?

Is a notebook my only way to get the data?

 

Thanks,

Martin

1 ACCEPTED SOLUTION
DennesTorres
Post Prodigy

Hi,

You can do it using a pipeline and the Copy activity; you can even use the Copy assistant to help you build it.

[screenshot: DennesTorres_0-1692474211807.png]


Dataflows can't be used for this.

The pipeline needs to be in the same workspace as the lakehouse, and it will ask you whether your source is the "Files" or "Tables" area. A Dataflow Gen 2, on the other hand, doesn't need to be in the same workspace, but it only accepts the "Tables" area as source or destination.
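If you do end up in a notebook instead, the equivalent is only a few lines. A minimal PySpark sketch, assuming the JSON was saved as Files/api_response.json (a hypothetical name) in the lakehouse attached to the notebook, where `spark` is the session Fabric notebooks provide:

```python
# Read the JSON from the lakehouse "Files" area; the path is hypothetical.
# "multiline" handles one JSON document spread over several lines.
df = spark.read.option("multiline", "true").json("Files/api_response.json")
df.printSchema()  # inspect what the API actually returned

# Land it in the "Tables" area as a Delta table
df.write.format("delta").mode("overwrite").saveAsTable("api_response")
```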

Kind Regards,

 

Dennes


7 REPLIES
mrojze
Helper I

Any idea why it is not liking this?

 

[screenshot: mrojze_0-1692884439058.png]

 

Never mind... wrong order. It is actually different from the documentation online:

 

[screenshot: mrojze_0-1692884983249.png]

 

miguel
Community Admin

Hey!

Have you tried using the Lakehouse connector and the "Files" section inside of it to access the files from your Lakehouse?

If not, please give it a try and let us know if it works for you.

mrojze
Helper I

Thanks Dennes.

It looks like you can't use Dataflows on the Files section. I was hoping to use one, since it'd be easier to read all the sections of the JSON file at once.

Now I have to do a copy per section, though a notebook could loop over the sections instead, as in the sketch below.
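A rough PySpark sketch of that loop; the file name and section names are made up, so adjust them to whatever the API returned:

```python
import pyspark.sql.functions as F

# Read the whole file once; path and section names are hypothetical.
raw = spark.read.option("multiline", "true").json("Files/api_response.json")

# One Delta table per top-level section of the JSON document
for section in ["customers", "orders"]:
    flat = (
        raw.select(F.explode(section).alias("row"))  # array -> one row per element
           .select("row.*")                          # struct fields -> columns
    )
    flat.write.format("delta").mode("overwrite").saveAsTable(section)
```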

Thanks!

 

Martin

mrojze
Helper I

Thanks for your tips.

I was now able to use a Dataflow to access my Files section. I didn't realize there was a Lakehouse connector; I was going directly to the JSON connector instead.

But also, I had overlooked that in the Get Data section there was a OneLake data hub (such a newbie).

[screenshot: mrojze_2-1692881100726.png]

Now, to the next issue: you can't pass parameters to a Dataflow yet!

I guess you can save the data somewhere and bring it in as a dataset, but you can't pass parameters in a Fabric Dataflow yet. See https://community.fabric.microsoft.com/t5/Data-Pipelines/Pass-data-pipeline-parameter-to-dataflow-ge...

 

I am back to building an ADF copy per section.

But now I have to learn how to create a table in Delta Lake. So far, I haven't been able to. I tried SQL commands, but CREATE is not available yet.

DennesTorres
Post Prodigy
Hi,

You can use Spark SQL in a notebook to create a table in the lakehouse:

[screenshot: DennesTorres_0-1692882043519.png]
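Something along these lines (the table and column names are placeholders):

```python
# Spark SQL run from a notebook cell; the lakehouse stores it as Delta.
spark.sql("""
    CREATE TABLE IF NOT EXISTS api_customers (
        id INT,
        name STRING
    ) USING DELTA
""")
```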


Kind Regards,

 

Dennes
