Hi All,
I have built an app that stores my phone's SMS and call logs in JSON format. I am trying to upload that JSON file directly from my app, programmatically (using Java), to OneLake. I have successfully obtained a SAS token, but I am stuck after that step: with the SAS token for my OneLake, how can I upload the JSON file to OneLake using Java?
I have looked at the OneLake documentation, but it only shows examples in Python, which I am not familiar with, so can anyone help me out with this scenario, please?
Hi @SyedN,
Here are links with sample code that connects to a lakehouse/data warehouse via the ODBC driver:
Load data to MS Fabric Warehouse from notebook - Stack Overflow
connect to fabric lakehouses warehouses from python code
You can adapt them to Java by using the ODBC/JDBC driver to connect to the source with the corresponding connection string, data source, and credentials, then executing a SQL INSERT through the connection to save your data to a table.
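The JDBC route above could be sketched as follows. This is a minimal, hedged example, not a definitive implementation: it assumes the Microsoft SQL Server JDBC driver (`mssql-jdbc`) on the classpath, and the endpoint, warehouse name, table, and columns are all placeholders you would replace with values from your own Fabric portal.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class FabricWarehouseLoader {

    // Builds a SQL Server-style JDBC URL for a Fabric warehouse SQL endpoint.
    // The host and database names here are placeholders, not real values.
    static String buildUrl(String endpoint, String database) {
        return "jdbc:sqlserver://" + endpoint + ":1433;"
             + "database=" + database + ";"
             + "encrypt=true;trustServerCertificate=false;"
             + "authentication=ActiveDirectoryInteractive";
    }

    public static void main(String[] args) throws Exception {
        // Replace with the SQL connection string shown for your warehouse.
        String url = buildUrl("myworkspace.datawarehouse.fabric.microsoft.com",
                              "MyWarehouse");

        // Insert one call-log row; table and columns are illustrative only.
        try (Connection conn = DriverManager.getConnection(url);
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO dbo.CallLogs (caller, callee, started_at) "
               + "VALUES (?, ?, ?)")) {
            ps.setString(1, "+10000000001");
            ps.setString(2, "+10000000002");
            ps.setString(3, "2024-01-01T00:00:00Z");
            ps.executeUpdate();
        }
    }
}
```

Note that `ActiveDirectoryInteractive` authentication additionally requires the MSAL4J library; other authentication modes need different connection properties.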
If you mean uploading files to the Lakehouse, it is simpler to use a dataflow/data pipeline to get the data from your API into the Lakehouse.
Fabric decision guide - copy activity, dataflow, or Spark - Microsoft Fabric | Microsoft Learn
How to copy data using copy activity - Microsoft Fabric | Microsoft Learn
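Since you already have a SAS token, you could also call OneLake's ADLS Gen2-compatible REST endpoint directly from Java, with no SDK. The sketch below is an assumption-laden illustration: the workspace, lakehouse, file path, and SAS token are placeholders, and it assumes the standard DFS create/append/flush sequence works against `onelake.dfs.fabric.microsoft.com` with the SAS appended as a query string. Error handling is omitted, and on Android you would run this off the main thread.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;

public class OneLakeUploader {
    static final String DFS_HOST = "https://onelake.dfs.fabric.microsoft.com";

    // Builds the DFS URL for a file under the lakehouse's Files area,
    // appending the operation query and the caller-supplied SAS token.
    static String fileUrl(String workspace, String lakehouse, String path,
                          String sas, String query) {
        return DFS_HOST + "/" + workspace + "/" + lakehouse
             + ".Lakehouse/Files/" + path + "?" + query + "&" + sas;
    }

    public static void upload(String workspace, String lakehouse, String path,
                              byte[] body, String sas) throws Exception {
        HttpClient client = HttpClient.newHttpClient();

        // 1. Create the (empty) file.
        HttpRequest create = HttpRequest.newBuilder(URI.create(
                fileUrl(workspace, lakehouse, path, sas, "resource=file")))
            .PUT(HttpRequest.BodyPublishers.noBody())
            .build();
        client.send(create, HttpResponse.BodyHandlers.discarding());

        // 2. Append the JSON bytes at offset 0.
        HttpRequest append = HttpRequest.newBuilder(URI.create(
                fileUrl(workspace, lakehouse, path, sas,
                        "action=append&position=0")))
            .method("PATCH", HttpRequest.BodyPublishers.ofByteArray(body))
            .build();
        client.send(append, HttpResponse.BodyHandlers.discarding());

        // 3. Flush at the final offset to commit the appended data.
        HttpRequest flush = HttpRequest.newBuilder(URI.create(
                fileUrl(workspace, lakehouse, path, sas,
                        "action=flush&position=" + body.length)))
            .method("PATCH", HttpRequest.BodyPublishers.noBody())
            .build();
        client.send(flush, HttpResponse.BodyHandlers.discarding());
    }

    public static void main(String[] args) throws Exception {
        byte[] json = "{\"type\":\"call\",\"number\":\"+10000000001\"}"
            .getBytes(StandardCharsets.UTF_8);
        // All names and the SAS value below are placeholders.
        upload("MyWorkspace", "MyLakehouse", "logs/calllog.json", json,
               "sv=2023-11-03&sig=REPLACE_WITH_YOUR_SAS");
    }
}
```

`java.net.http.HttpClient` requires Java 11+; on Android you would swap in OkHttp or `HttpURLConnection` with the same three-request sequence.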
Regards,
Xiaoxin Sheng