Hi,
I need to implement a Fabric notebook to read Dynamics 365 Business Central data using the BC API.
I don't know how such an API works. It seems this API is based on an OData service, which is not a very high-performance service.
I'm thinking of creating a notebook for an initial full load and then one for incremental loads.
Has anyone had experience with a notebook plus the BC API, regarding the performance offered, and can offer any suggestions?
Thanks
Simple answer: avoid these Dynamics 365 Business Central (BC) REST APIs in Fabric notebooks.
Ok, but is there an alternative solution, considering performance and costs?
Thanks
Hi @pmscorca,
Integrating Dynamics 365 Business Central (BC) data using the BC API (OData v4) in Microsoft Fabric notebooks is a common and supported approach.
However, due to OData's performance constraints, especially with large datasets, careful implementation is essential.
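As a minimal sketch of what such a notebook call looks like, assuming a requests-based approach: the `<tenant-id>` and `<environment>` placeholders must be replaced with your own values, the entity name in the docstring is just an example, and you need to supply a valid Azure AD bearer token yourself.

```python
import requests

# Base URL for the BC API v2.0; tenant and environment are placeholders
# you must replace with your own values.
BASE_URL = ("https://api.businesscentral.dynamics.com/v2.0/"
            "<tenant-id>/<environment>/api/v2.0")

def build_query(select=None, filter_=None, top=None):
    """Assemble OData query options ($select, $filter, $top) as request params."""
    params = {}
    if select:
        params["$select"] = ",".join(select)
    if filter_:
        params["$filter"] = filter_
    if top:
        params["$top"] = top
    return params

def get_page(entity, token, params=None):
    """Fetch one page of an entity set (e.g. 'salesInvoices') with a bearer token.

    Returns the rows of this page and the @odata.nextLink of the next page
    (None on the last page).
    """
    resp = requests.get(f"{BASE_URL}/{entity}",
                        headers={"Authorization": f"Bearer {token}"},
                        params=params, timeout=60)
    resp.raise_for_status()
    body = resp.json()
    return body["value"], body.get("@odata.nextLink")
```

Limiting columns with `$select` and rows with `$filter` is the main lever you have against OData's per-request overhead.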
Thanks,
Prashanth Are
MS Fabric community support
Hi, in terms of performance and costs, is it better to implement a notebook that uses the BC API or to create a data pipeline (copy job) that uses the Fabric OData connector?
The BC API (version 2.0) and the Fabric OData connector are both based on the OData v4 protocol.
Thanks
Hi @pmscorca,
Key considerations include:
Pagination Handling: OData responses are paged; proper looping with @odata.nextLink is necessary.
Filtering and $select: Use query options to limit data volume and improve performance.
Initial vs. Incremental Load: Design a full load for historical data and a separate incremental load based on modified timestamps.
Error and Retry Logic: Implement robust error handling for timeouts or throttling.
Data Volume Management: For large volumes, consider batching the load or using staging layers.
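The pagination, retry, and incremental-load points above can be sketched as follows. This is an illustrative outline, not a definitive implementation: the `lastModifiedDateTime` field is the standard change-tracking field on BC API v2.0 entities (verify it exists on the entity you read), and the backoff values are arbitrary choices.

```python
import time
import requests

def fetch_all(first_url, headers, session=None, max_retries=5):
    """Follow @odata.nextLink pages until exhausted, backing off on HTTP 429."""
    session = session or requests.Session()
    rows, url = [], first_url
    while url:
        for attempt in range(max_retries):
            resp = session.get(url, headers=headers, timeout=120)
            if resp.status_code == 429:        # throttled: exponential backoff
                time.sleep(2 ** attempt)
                continue
            resp.raise_for_status()
            break
        else:
            raise RuntimeError(f"Still throttled after {max_retries} attempts")
        body = resp.json()
        rows.extend(body["value"])             # accumulate this page
        url = body.get("@odata.nextLink")      # absent on the last page
    return rows

def incremental_filter(watermark_iso):
    """$filter expression for rows changed since the last successful run."""
    return f"lastModifiedDateTime gt {watermark_iso}"
```

For the incremental load, persist the highest `lastModifiedDateTime` seen in each run and pass `incremental_filter(watermark)` as the `$filter` of the next run.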
Thanks,
Prashanth Are
MS Fabric community support
Ok, but in terms of performance and costs, is it better to implement a notebook that uses the BC API or to create a data pipeline (copy job) that uses the Fabric OData connector? Thanks
Hi @pmscorca,
In terms of performance, the Data Pipeline approach generally provides better results. It is optimized for bulk data movement, supports built-in parallelism, and is tuned for efficient handling of OData sources like Dynamics 365 Business Central. This makes it more suitable for high-volume or full-load scenarios.
When considering cost, Data Pipelines again come out ahead. They have lower orchestration overhead and are designed to be lightweight and efficient for recurring ETL tasks. Since they don’t require custom compute or extended execution times like notebooks, they’re usually more cost-effective.
In terms of reliability, especially when dealing with rate-limiting, API throttling, or complex retry strategies, Notebooks are more reliable. They allow full control over retry logic, timeouts, and error handling mechanisms, which is essential for robust data extraction from business APIs.
Finally, when it comes to simplicity, Data Pipelines are the preferred choice. They provide a user-friendly interface, come with built-in OData connectors, and handle authentication and API interactions automatically. This makes them well-suited for quick setup and straightforward data integration tasks without the need for complex configurations.
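As an illustration of the retry control a notebook gives you that a pipeline's built-in policy does not, here is a small helper that prefers a server-sent `Retry-After` header (which throttled endpoints may return on HTTP 429) and falls back to capped exponential backoff; the 60-second cap is an arbitrary choice for this sketch.

```python
def retry_after_seconds(headers, attempt, cap=60.0):
    """Seconds to wait before the next retry: honor the server's
    Retry-After header if present, else use capped exponential backoff."""
    if "Retry-After" in headers:
        return float(headers["Retry-After"])
    return min(cap, 2.0 ** attempt)
```

In a pipeline the equivalent behavior is fixed by the connector's retry policy; in a notebook you decide it yourself, which is what makes notebooks the more flexible option under heavy throttling.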
Thanks,
Prashanth Are
MS Fabric community support
Ok, so to read structured data such as the BC data from some large tables, using a data pipeline with the OData connector seems to be the best possible solution in terms of performance and costs, doesn't it?
Hi, looking at Data Factory, it seems that the OData connector is supported by data pipelines and Dataflows Gen2 but not by copy jobs, while the Dynamics 365 Business Central connector is only supported by Dataflows Gen2.
Is the official documentation about these connectors correct?
It would be very useful to be able to use a copy job with these two connectors against a BC source.
Thanks