Hi All,
I have built an app that stores my phone's SMS and call logs and saves them in JSON format. I am trying to upload that JSON file directly from my app programmatically (using Java) to OneLake. I have successfully obtained a SAS token, but I am stuck after that step: with the available SAS token for my OneLake, how can I upload the JSON file to OneLake using Java?
I have looked in the OneLake documentation, but it only shows examples using Python, which I am not familiar with, so can anyone help me out in this scenario, please?
Hi @SyedN,
Here are links with sample code that connects to the lakehouse/data warehouse using the ODBC driver:
Load data to MS Fabric Warehouse from notebook - Stack Overflow
connect to fabric lakehouses warehouses from python code
You can adapt them to Java, using an ODBC/JDBC driver to connect to the source with the corresponding connection string, data source, and credentials, then save your data to a table with an SQL INSERT query.
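As a rough Java sketch of that JDBC approach, assuming the Microsoft SQL Server JDBC driver (mssql-jdbc) is on the classpath; the server endpoint, database, and table names below are placeholders, not values from this thread:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

// Sketch: insert a row into a Fabric warehouse table over JDBC.
// Replace the placeholder server/database/table names with your own.
public class FabricJdbcExample {

    // Builds a SQL Server-style JDBC connection string with AAD interactive auth.
    static String connectionString(String server, String database) {
        return "jdbc:sqlserver://" + server + ":1433;database=" + database
                + ";encrypt=true;authentication=ActiveDirectoryInteractive;";
    }

    public static void main(String[] args) throws Exception {
        String url = connectionString(
                "<your-endpoint>.datawarehouse.fabric.microsoft.com", "MyWarehouse");
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                     "INSERT INTO dbo.CallLogs (Number, DurationSec) VALUES (?, ?)")) {
            ps.setString(1, "+15551234567");
            ps.setInt(2, 42);
            ps.executeUpdate();
        }
    }
}
```

This writes rows into a warehouse table rather than uploading a raw file, so it fits the "save to table via SQL" path described above.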
If you mean to upload files to the Lakehouse, it is simpler to use dataflows/data pipelines to get the data from your API into the Lakehouse.
Fabric decision guide - copy activity, dataflow, or Spark - Microsoft Fabric | Microsoft Learn
How to copy data using copy activity - Microsoft Fabric | Microsoft Learn
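If you do want to upload the JSON file directly from Java with your SAS token, a minimal sketch is possible with only the JDK's HttpClient, since OneLake exposes an ADLS Gen2-compatible REST endpoint (create file, append bytes, then flush). The workspace, lakehouse, file path, and SAS value below are placeholders for illustration:

```java
import java.io.IOException;
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

// Sketch: upload a JSON payload to OneLake Files via the ADLS Gen2 REST API,
// authenticating each call with a SAS token in the query string.
public class OneLakeUploader {
    static final String ONELAKE_DFS = "https://onelake.dfs.fabric.microsoft.com";

    // Builds the request URL: endpoint + lakehouse file path + operation + SAS token.
    static String buildUrl(String workspace, String lakehouse, String filePath,
                           String query, String sasToken) {
        return ONELAKE_DFS + "/" + workspace + "/" + lakehouse + ".Lakehouse/Files/"
                + filePath + "?" + query + "&" + sasToken;
    }

    static void upload(HttpClient client, String workspace, String lakehouse,
                       String filePath, String json, String sas)
            throws IOException, InterruptedException {
        byte[] body = json.getBytes(StandardCharsets.UTF_8);

        // 1. Create an empty file.
        client.send(HttpRequest.newBuilder(URI.create(
                        buildUrl(workspace, lakehouse, filePath, "resource=file", sas)))
                        .PUT(HttpRequest.BodyPublishers.noBody()).build(),
                HttpResponse.BodyHandlers.discarding());

        // 2. Append the JSON bytes at offset 0.
        client.send(HttpRequest.newBuilder(URI.create(
                        buildUrl(workspace, lakehouse, filePath, "action=append&position=0", sas)))
                        .method("PATCH", HttpRequest.BodyPublishers.ofByteArray(body)).build(),
                HttpResponse.BodyHandlers.discarding());

        // 3. Flush at the final length to commit the data.
        client.send(HttpRequest.newBuilder(URI.create(
                        buildUrl(workspace, lakehouse, filePath,
                                "action=flush&position=" + body.length, sas)))
                        .method("PATCH", HttpRequest.BodyPublishers.noBody()).build(),
                HttpResponse.BodyHandlers.discarding());
    }

    public static void main(String[] args) throws Exception {
        // Placeholder names and SAS token -- substitute your own values.
        upload(HttpClient.newHttpClient(), "MyWorkspace", "MyLakehouse",
                "sms_call_logs.json", "{\"logs\":[]}", "sv=...&sig=...");
    }
}
```

Production code would also check the HTTP status of each response and handle retries, but this shows the three-step create/append/flush sequence the DFS endpoint expects.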
Regards,
Xiaoxin Sheng