
bco_lde
Helper I

Performance of Power BI connecting to Business Central / OData / ways to cope?

Hi there,
for Power BI and Business Central both claiming to be part of the same stack, performance sure leaves a lot to be desired.

Examples:
When I connect to a Business Central table via the standard integrated API connector, the full data is loaded. In my case the table is project tasks; the whole table is "only" 47 MB, yet the connection takes minutes to refresh the data.

When I instead use the OData link, connect via OData as the source, and integrate filters directly into the query, I can cut the downloaded data to under 5 MB, less than 10% of the original volume.
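For reference, this is roughly what that filtered OData source looks like in Power Query (M). A hedged sketch only: the tenant, environment, company, and field names below are placeholders, not the actual setup. Putting `$filter`/`$select` into the feed URL is what makes Business Central apply the filter server-side, so only the reduced rows travel over the wire:

```
// Hypothetical OData feed for BC project tasks; tenant/company/field
// names are placeholders. The $select/$filter options are evaluated by
// Business Central, not by Power BI after download.
let
    BaseUrl = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/Production/ODataV4/Company('CRONUS')/JobTasks",
    Query   = "?$select=jobNo,jobTaskNo,description&$filter=jobNo eq 'J00010'",
    Source  = OData.Feed(BaseUrl & Query, null, [Implementation = "2.0"])
in
    Source
```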

I would have assumed that transforming and cleaning up datasets in Power BI Desktop would also "cleanse" my data request, but it does not seem to work that way at all: Power BI always loads the complete data, then transforms it.

Since OData falls under the same general API restrictions (number of simultaneous calls, transfer-rate limits, etc.):

What is the most effective and efficient way to build Business Central datasets that span several tables? Since every table has to be called as an individual data source, I quickly run into the aforementioned limitations.

Is it even possible to work around this without developing custom API data sources (i.e., writing my own AL-based pages that combine tables into a single API source)?
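One possible middle ground that avoids writing AL: if the exposed entity has navigation properties, OData's `$expand` can pull a header table together with its related lines in a single call instead of two data sources. A sketch under placeholder assumptions (whether a `jobTasks` navigation property exists depends on the page actually exposed):

```
// Hypothetical single call returning jobs together with their job tasks
// via $expand, replacing two separate data sources / two API calls.
// Entity and property names are illustrative placeholders.
let
    Url    = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/Production/ODataV4/Company('CRONUS')/Jobs"
             & "?$expand=jobTasks&$select=no,description",
    Source = OData.Feed(Url, null, [Implementation = "2.0"])
in
    Source
```

The expanded lines arrive as a nested table column that can then be expanded in Power Query without a second request.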

thanks in advance




 


3 REPLIES
bco_lde
Helper I

I fail to understand what kind of example you would need to understand the problem.
The dataset is large. Power BI always loads the complete data before the transformation step, and it does this for every data refresh: it loads the data, completes the transformation, then refreshes the reports with the newly transformed data.

The API limits of Business Central are strict: timeout limits, bandwidth restrictions, a limited number of simultaneous calls. A dataset with two tables uses two calls; five tables, five calls. If the call limit is reached, no new call can be opened until other calls finish. And if a call takes a long time because the table is too big, it runs into a timeout; a timeout on one call means the complete data refresh simply fails, and the call is not restarted.

This means: limit the number of calls as much as possible, and limit the volume of data as much as possible.

As stated, the built-in Power BI -> BC connector seems to always load the complete data source before applying the transformation, and it does this for each refresh.

Hence my question: how can I optimize datasets without developing my own, optimized API directly in BC? Is there even a way to do so?
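Not a full answer, but one pattern that at least guarantees the filter is sent with the request on every refresh is to call the OData endpoint through `Web.Contents` with the query options passed as a record, then parse the JSON yourself. A sketch under the same placeholder assumptions as before (tenant, company, and field names are illustrative):

```
// Hypothetical raw OData call; $select/$filter are part of the HTTP
// request, so BC never returns the full table. Using RelativePath and
// Query (rather than concatenating the URL) keeps the call refreshable
// in the Power BI service.
let
    Source = Json.Document(
        Web.Contents(
            "https://api.businesscentral.dynamics.com/v2.0/<tenant>/Production/ODataV4",
            [
                RelativePath = "Company('CRONUS')/JobTasks",
                Query = [
                    #"$select" = "jobNo,jobTaskNo,description",
                    #"$filter" = "jobNo eq 'J00010'"
                ]
            ]
        )
    ),
    Rows = Table.FromRecords(Source[value])
in
    Rows
```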

thanks in advance

v-henryk-mstf
Community Support

Hi @bco_lde ,

 

The information you have provided does not make the problem clear to me. Could you please explain with an example?

 

Looking forward to your reply.


Best Regards,
Henry

 

