Hi,
I am looking for steps to make a direct connection to Oracle Cloud to pull table data into a Fabric lakehouse.
I assume a REST API call using a pipeline could be a workable solution, but I'm still not sure.
I am also open to other ways to establish a connection to Fabric from Oracle Cloud and pull the table data.
Hi @Abhinay_Raj,
As we haven't heard back from you, we wanted to kindly follow up to check whether the solution helped, or let us know if you need any further assistance here.
Your feedback is important to us; looking forward to your response.
Thanks,
Prashanth Are
MS Fabric community support.
Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"
Hi @Abhinay_Raj,
Thanks for reaching out to Microsoft Fabric Community Support.
REST API calls are an excellent choice when flexibility, cross-platform integration, and security are priorities. They are particularly useful for small to medium-sized datasets and custom data retrieval needs.
However, the approach has limitations for high-volume or real-time data transfer due to potential API rate limits and performance bottlenecks. If the use case involves large-scale data or frequent updates, direct database connectivity or batch processing might be more efficient alternatives.
Direct database connectivity offers a more straightforward approach if performance and real-time updates are priorities. Use the Oracle Database connector in Synapse Pipelines or Azure Data Factory to establish a connection.
After the connection is set up, configure a pipeline to query and transfer data into Fabric Lakehouse. Use Delta tables as the destination format for optimized analytics. This approach avoids the overhead of setting up APIs and offers direct data access but requires secure network connectivity between Oracle Cloud and Fabric, such as through a VPN or private link.
File-based integration is a simpler alternative for batch data transfers: export table data from Oracle Cloud to a CSV or Parquet file, store these files in Oracle Object Storage or a shared location accessible to Fabric, and then load them into the Fabric Lakehouse using pipelines or manual uploads.
Use Fabric's transformation capabilities to convert the files into Delta tables. While this method is the easiest to set up, it is less efficient for large datasets and doesn’t support real-time updates. However, it’s suitable for periodic or one-time data transfers where simplicity is a priority.
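For the file-based path, the export step can be as simple as writing rows to CSV, the kind of file you would drop into Oracle Object Storage (or any share Fabric can reach) and then load into the Lakehouse. The table name and columns below are illustrative; only the standard library is used.

```python
# Sketch: export rows to CSV for a batch transfer into Fabric.
import csv
import io

rows = [
    {"emp_id": 1, "name": "Ana", "dept": "HR"},
    {"emp_id": 2, "name": "Omar", "dept": "IT"},
]

buf = io.StringIO()  # in practice, open a file destined for object storage
writer = csv.DictWriter(buf, fieldnames=["emp_id", "name", "dept"])
writer.writeheader()
writer.writerows(rows)
export = buf.getvalue()

# Round-trip check, mirroring what a Fabric pipeline reads back before
# converting the file into a Delta table.
loaded = list(csv.DictReader(io.StringIO(export)))
print(loaded[0]["name"])  # Ana
```

Note that CSV carries no types (everything reads back as strings), which is one reason Parquet is usually the better export format for anything beyond small one-off transfers.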
Choose the method based on the frequency, volume, and complexity of your data needs. Let me know if you need any additional help here.
Thanks,
Prashanth Are
MS Fabric community support.
Did I answer your question? Mark my post as a solution, this will help others!
If my response(s) assisted you in any way, don't forget to drop me a "Kudos"
Assuming you're running an Oracle DB in OCI, you can:
We are looking into the first two and have already done a PoC of the third for bulk data movement.