Business expects to ingest/integrate SAP data in real time.
What is the best approach to bring SAP data in real time, given a 5-minute latency requirement?
We proposed bringing data from SAP ECC to SAP Datasphere and pushing it into ADLS. Using a shortcut, it can then be loaded into Fabric.
(SAP ECC -> SAP Datasphere -> ADLS -> Fabric Lakehouse) - this route may exceed the 5-minute latency.
Is it possible to ingest data via Azure Logic Apps and SAP connectors for real-time sync? How would that handle a huge volume of data?
(SAP ECC -> Logic App -> ADLS -> Fabric)
Databricks offers a Zero Copy feature to bring in SAP data, but our program is based on the Microsoft suite and the customer expects a similar capability in Azure / Fabric. Please suggest the right approach for bringing real-time SAP data into Fabric. The customer is not looking at third-party tools like Dab Nexus etc.
In what scenarios are Azure Logic Apps the best-suited approach?
Solved! Go to Solution.
Hi @v-menakakota: We reached out to MSFT about enabling the SAP Mirroring private preview for a POC. You can close this thread for now; I will come back if there is any specific query.
Hi @BalajiL
Power BI Dataflow Gen2 is a better option for moving data from SAP.
You can achieve near real-time latency. Move data from SAP HANA to Fabric SQL; ETL integration is also available. Data is stored in ADLS Gen2 as Delta Lake.
You can connect with SSMS (SQL Server Management Studio) as well.
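To illustrate the SSMS point above: the same SQL endpoint can also be reached from code. Below is a minimal sketch of building an ODBC connection string for a Fabric SQL endpoint; the server name, database name, and endpoint form are placeholders/assumptions — copy the real connection string from the item's settings in the Fabric portal.

```python
# Sketch: compose an ODBC connection string for a Fabric SQL endpoint.
# Server and database names are placeholders; Entra ID interactive
# authentication is assumed (requires ODBC Driver 18 for SQL Server).

def fabric_sql_connection_string(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};"
        f"Database={database};"
        "Authentication=ActiveDirectoryInteractive;"  # sign in with Entra ID
        "Encrypt=yes;"
    )

conn_str = fabric_sql_connection_string(
    "myworkspace.datawarehouse.fabric.microsoft.com",  # placeholder endpoint
    "SAP_Reporting",                                   # placeholder database
)
```

With the ODBC driver installed, `pyodbc.connect(conn_str)` would open the connection; the same server/database values work in the SSMS connection dialog.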
Thanks @BhaveshPatel for your response. Dataflow Gen2 does not support SAP ECC or the CDC feature, which looks like a limitation. Do you have any experience bringing SAP ECC and SAP HANA data to Datasphere and then to Fabric, considering that SAP is restricting the ODP framework?
Hi @BalajiL ,
Thanks for reaching out to the Microsoft fabric community forum.
Dataflow Gen2 doesn't support SAP ECC or CDC as of now, so that's a limitation for real-time data.
About your question: moving data from SAP ECC or SAP HANA to Datasphere and then to Fabric is possible, but since SAP restricts some parts of the ODP framework, it may not work for all scenarios. Usually, customers use SAP SLT or Datasphere to first land the data in ADLS or Azure SQL, and then connect that to Fabric for reporting.
Currently, there is no direct native option for a real-time connection from SAP ECC to Fabric. The SAP mirroring feature that Microsoft is testing in private preview is expected to support this in the future.
If I have misunderstood your needs or you still have problems, please feel free to let us know.
Best Regards,
Community Support Team
Thanks @v-menakakota for your prompt response and clarification. One question: the SLA to Fabric is a 5-minute latency. In that case, can the route SAP ECC -> Datasphere -> ADLS -> Fabric meet the 5-minute latency? And what about enabling CDC on this route? Please clarify.
The link below gives insights into SAP integration:
SAP ECC → SLT → SAP DataSphere → Azure Data Lake Storage Gen2 → Azure Synapse Analytics → Power BI
In our case, it could be like :
SAP ECC → SLT → SAP Datasphere → Azure Data Lake Storage Gen2 → Shortcut → Fabric Lakehouse → Power BI
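The ADLS-to-Lakehouse hop in the route above is wired up with a OneLake shortcut, which can also be created programmatically via the Fabric REST API. A minimal sketch follows; all IDs, the storage URL, and the subpath are placeholders, and the request shape should be checked against the current Fabric shortcuts API reference.

```python
# Sketch: build the URL and body for creating an ADLS Gen2 shortcut in a
# Fabric lakehouse via the Fabric REST API. The actual POST needs an
# Entra ID bearer token; every identifier below is a placeholder.
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def build_shortcut_request(workspace_id: str, lakehouse_id: str,
                           shortcut_name: str, adls_url: str,
                           subpath: str, connection_id: str):
    url = f"{FABRIC_API}/workspaces/{workspace_id}/items/{lakehouse_id}/shortcuts"
    body = {
        "path": "Files",          # where the shortcut appears in the lakehouse
        "name": shortcut_name,
        "target": {
            "adlsGen2": {
                "location": adls_url,          # e.g. https://<account>.dfs.core.windows.net
                "subpath": subpath,            # container/folder holding the SAP extracts
                "connectionId": connection_id, # Fabric connection to the storage account
            }
        },
    }
    return url, json.dumps(body)

url, payload = build_shortcut_request(
    "<workspace-guid>", "<lakehouse-guid>", "sap_ecc_landing",
    "https://mystorageacct.dfs.core.windows.net", "sap-landing/ecc",
    "<connection-guid>",
)
```

Note that a shortcut only virtualizes the ADLS data; the end-to-end latency is still dominated by how quickly SLT/Datasphere land the deltas in the storage account.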
Hi @BalajiL ,
Since the 5-minute latency and CDC behavior depend on how SAP Datasphere and Fabric integration perform in your setup, it would be best to raise a Microsoft Support Ticket.
Please refer to the link below on how to contact support or raise a support ticket.
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
Best Regards,
Community Support Team
Hi @BalajiL ,
We wanted to follow up and ask whether you have submitted a support ticket. Could you let us know if the issue was resolved after creating the support case? If you have any further questions, please let us know; we can assist you further.
Regards,
Microsoft Fabric Community Support Team.
Hi @BalajiL
So, first of all, if you're interested: Microsoft offers a private preview for SAP mirroring.
Maybe it's something for you.
Here is the form for that.
Now to your question. From experience, I use Azure Logic Apps for other things, for example starting and stopping Fabric capacity.
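For context, that start/stop pattern works by having the Logic App's HTTP action POST to the ARM endpoint for the capacity. A minimal sketch of the URLs involved is below; the subscription, resource group, and capacity name are placeholders, and the `api-version` is an assumption to verify against the current Microsoft.Fabric ARM reference.

```python
# Sketch: build the ARM URLs a Logic App HTTP action can POST to in order
# to suspend or resume a Fabric capacity. All identifiers are placeholders
# and the api-version is an assumption -- check the Microsoft.Fabric docs.

def capacity_action_url(subscription_id: str, resource_group: str,
                        capacity_name: str, action: str) -> str:
    if action not in ("suspend", "resume"):
        raise ValueError("action must be 'suspend' or 'resume'")
    return (
        "https://management.azure.com"
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        "/providers/Microsoft.Fabric/capacities/"
        f"{capacity_name}/{action}"
        "?api-version=2023-11-01"
    )

suspend_url = capacity_action_url("<sub-id>", "rg-fabric", "myfabriccap", "suspend")
resume_url = capacity_action_url("<sub-id>", "rg-fabric", "myfabriccap", "resume")
```

The Logic App would authenticate to ARM with a managed identity, so no secret handling is needed in the workflow itself.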
If you are looking for something native, you could use Azure Data Factory to load data from SAP.
You could then control these from Fabric.
Here is an overview of the connectors for Azure Data Factory
https://learn.microsoft.com/en-us/azure/data-factory/connector-overview
Then, of course, you still have the usual SAP connectors in Fabric.
You can find them here and see what is supported.
https://learn.microsoft.com/en-us/fabric/data-factory/connector-overview
Of course, there are third-party providers that make this possible, but as you write, that is not an option for your customer, as I understand it.
That would be a first approach for you and your customer. But there are certainly other possibilities.
Perhaps you already have some initial ideas.
Best regards
Hi @spaceman127
Thanks for your prompt response. Fabric Data Factory has limited SAP connectors, and Data Factory is suited to batch processing. The ask is how to integrate real-time data from SAP systems into Fabric. SAP mirroring would be a good option, but it is still in private preview. Azure Logic Apps was suggested because it offers bi-directional sync and real-time data, but it is only suitable for low data volumes.
All right,
I would stick with Fabric. Preview features always carry a risk, of course, but it's worth testing.
Dataflow Gen2 in conjunction with Fabric SQL is an alternative that could also work very well, as @BhaveshPatel writes.
He was quicker with the next answer 🙂
Best regards