SyedN
Regular Visitor

Android App - OneLake Integration

Hi All,

I have built an app that stores my phone's SMS and call logs in JSON format, and I am trying to upload that JSON file directly from my app, programmatically (using Java), to OneLake. I have successfully obtained a SAS token, but I am stuck after that step: with a valid SAS token for my OneLake, how can I upload the JSON file to OneLake using Java?

 

I have looked in the OneLake documentation, but it only shows examples in Python, which I am not familiar with. Can anyone help me out with this scenario, please?
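For reference, OneLake exposes an ADLS Gen2-compatible DFS endpoint, so a direct upload with a SAS token can be done with plain Java 11+ `HttpClient` calls (create file, append bytes, flush). The sketch below is an assumption-laden illustration, not a verified implementation: the workspace name, lakehouse name, and file path are placeholders, the SAS token is assumed to be valid for write operations on that path, and the `{lakehouse}.Lakehouse/Files/...` path convention should be checked against the OneLake documentation.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class OneLakeUploader {
    // OneLake's ADLS Gen2-compatible DFS endpoint.
    static final String BASE = "https://onelake.dfs.fabric.microsoft.com";

    /** Builds the request URL: base + workspace + lakehouse item + file path,
     *  plus the operation's query string and the SAS token (without a leading '?'). */
    static String buildUrl(String workspace, String lakehouse, String filePath,
                           String query, String sasToken) {
        return BASE + "/" + workspace + "/" + lakehouse + ".Lakehouse/Files/"
                + filePath + "?" + query + "&" + sasToken;
    }

    /** Uploads a JSON string via the three-step DFS flow: create, append, flush. */
    static void upload(String workspace, String lakehouse, String filePath,
                       String json, String sasToken) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        byte[] body = json.getBytes(StandardCharsets.UTF_8);

        // 1. Create an empty file at the target path.
        send(client, "PUT",
             buildUrl(workspace, lakehouse, filePath, "resource=file", sasToken),
             HttpRequest.BodyPublishers.noBody());
        // 2. Append the JSON bytes at position 0.
        send(client, "PATCH",
             buildUrl(workspace, lakehouse, filePath, "action=append&position=0", sasToken),
             HttpRequest.BodyPublishers.ofByteArray(body));
        // 3. Flush (commit) at the final byte length.
        send(client, "PATCH",
             buildUrl(workspace, lakehouse, filePath,
                      "action=flush&position=" + body.length, sasToken),
             HttpRequest.BodyPublishers.noBody());
    }

    static void send(HttpClient client, String method, String url,
                     HttpRequest.BodyPublisher body) throws Exception {
        HttpRequest req = HttpRequest.newBuilder(URI.create(url))
                .method(method, body).build();
        HttpResponse<String> resp = client.send(req, HttpResponse.BodyHandlers.ofString());
        if (resp.statusCode() >= 300) {
            throw new RuntimeException(method + " failed: "
                    + resp.statusCode() + " " + resp.body());
        }
    }
}
```

`HttpClient` is used rather than `HttpURLConnection` because the latter does not support the PATCH method that the append/flush steps require.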

 

 

1 ACCEPTED SOLUTION
Anonymous
Not applicable

Hi @SyedN,

Here are links with sample code showing how to connect to a lakehouse/data warehouse through the ODBC driver:

Load data to MS Fabric Warehouse from notebook - Stack Overflow

Connect to Fabric lakehouses & warehouses from Python code

You can adapt them to Java by using an ODBC/JDBC driver with the corresponding connection string, data source, and credentials, and then save your data to a table with SQL insert statements.
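As a rough sketch of that JDBC route in Java: Fabric's SQL endpoints speak the SQL Server protocol, so the Microsoft JDBC Driver for SQL Server (`mssql-jdbc`, a runtime dependency not shown here) can be used. The endpoint host, database name, and table below are placeholders, and note that writes generally require a Warehouse endpoint rather than a Lakehouse SQL analytics endpoint, which is read-only; verify the authentication option against your tenant's setup.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FabricJdbcSketch {
    /** Builds a SQL Server-style JDBC URL for a Fabric SQL endpoint. */
    static String buildJdbcUrl(String sqlEndpoint, String database) {
        return "jdbc:sqlserver://" + sqlEndpoint + ":1433;database=" + database
                + ";encrypt=true;trustServerCertificate=false;"
                // Interactive Entra ID auth; other modes (e.g. service principal)
                // may fit an unattended app better.
                + "authentication=ActiveDirectoryInteractive";
    }

    /** Inserts one JSON document into a hypothetical staging table. */
    static void insertJson(String sqlEndpoint, String database, String json)
            throws Exception {
        try (Connection conn =
                     DriverManager.getConnection(buildJdbcUrl(sqlEndpoint, database));
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO dbo.RawLogs (payload) VALUES (?)")) {
            ps.setString(1, json);   // the whole JSON document as one NVARCHAR value
            ps.executeUpdate();
        }
    }
}
```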

If you mean uploading files to the Lakehouse, it is simpler to use a dataflow or data pipeline to get the data from your API into the Lakehouse.

Fabric decision guide - copy activity, dataflow, or Spark - Microsoft Fabric | Microsoft Learn

How to copy data using copy activity - Microsoft Fabric | Microsoft Learn

Regards,

Xiaoxin Sheng

