
bco_lde
Helper I

Performance of Power BI connecting to Business Central / OData: ways to cope?

Hi there,

For Power BI and Business Central both claiming to be the same stack, performance sure leaves a lot to be desired.

Examples:
When I connect to a Business Central table via the standard integrated API connector, the full data is loaded. In my example the table is project tasks; the whole table is "only" 47 MB, yet the connection takes minutes to refresh the data.

When I instead use the OData link, connect via OData as the source, and integrate filters directly into the query, I can cut the downloaded data to under 5 MB. That is not even 10% of the data.
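For reference, a minimal sketch of what "filters directly in the query" can look like in Power Query M, assuming a hypothetical tenant/environment and placeholder entity and field names (`ProjectTasks`, `Job_No`, etc. are assumptions, not actual names from my setup):

```
let
    // Hypothetical BC OData V4 endpoint; <tenant>, company name and
    // field names are placeholders - adjust to your own environment.
    Url = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/production/ODataV4/"
        & "Company('CRONUS')/ProjectTasks"
        & "?$filter=Job_No eq 'J00010'&$select=Job_No,Job_Task_No,Description",
    Source = OData.Feed(Url, null, [Implementation = "2.0"])
in
    Source
```

With `$filter` and `$select` in the URL, the server only sends the matching rows and columns, which is where the 47 MB to under 5 MB reduction comes from.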

I would assume that transforming and cleaning up datasets in Power BI Desktop would also "cleanse" my data request, but it does not seem to work that way at all: Power BI always loads the complete data, then transforms it.
Which is just...
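Whether a transformation step reduces the actual request depends on query folding: steps applied to an OData.Feed source can fold back into `$filter`/`$select` on the wire, while the same steps over some connectors run locally after the full download. A hedged sketch, again with placeholder entity and column names:

```
let
    Source = OData.Feed(
        "https://api.businesscentral.dynamics.com/v2.0/<tenant>/production/ODataV4/"
            & "Company('CRONUS')/ProjectTasks",
        null, [Implementation = "2.0"]),
    // If these steps fold, they are translated into ?$filter=...&$select=...
    // and only the reduced rows and columns are downloaded.
    Filtered = Table.SelectRows(Source, each [Status] = "Open"),
    Reduced = Table.SelectColumns(Filtered, {"Job_No", "Job_Task_No", "Description", "Status"})
in
    Reduced
```

Recent Power BI Desktop builds show step folding indicators in the Applied Steps pane, which is one way to check whether a given step folded or forced a full download.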

So, since OData falls under the same restrictions as the general API limitations (number of synchronous calls, transfer-rate limits, etc.):

What is the most effective and efficient way to build Business Central datasets that span several tables? Since every table has to be called as an individual data source, I quickly run into the aforementioned limitations.

Is it even possible to work around that without developing direct API data sources (meaning writing my own AL-code-based pages) and combining tables into a single API source?

Thanks in advance
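One option that stops short of writing custom AL pages: the standard Business Central API v2.0 entities expose related tables through OData `$expand`, so a parent entity and its lines can come back in a single call instead of two separate data sources. A sketch, where `<tenant>` and `<company-id>` are placeholders:

```
let
    // One call returns sales orders together with their lines via $expand,
    // rather than querying header and lines as two separate sources.
    Url = "https://api.businesscentral.dynamics.com/v2.0/<tenant>/production/api/v2.0/"
        & "companies(<company-id>)/salesOrders?$expand=salesOrderLines",
    Source = OData.Feed(Url, null, [Implementation = "2.0"])
in
    Source
```

This only helps where the standard API already models the relationship; arbitrary table combinations would still need a custom AL query or API page.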




 


3 REPLIES
bco_lde
Helper I

I fail to understand what kind of example you would need to understand the problem.
The dataset is larger than x in size.
Power BI always loads the complete data before the transformation step, and it does this for every data refresh: it loads the data, completes the transformation, then refreshes reports with the new transformed data.
The API limits of Business Central are strict: timeout limits, bandwidth restrictions, a limited number of synchronous calls.
So a dataset with two tables uses two calls; five tables, five calls.
If the call limit is reached, no new call is allowed to be opened until other calls have finished.
And if a call needs a long time because the table is too big, it runs into a timeout.
A timeout for one call in the dataset means the complete data refresh simply fails; the call is not restarted.

This means: limit the number of calls as much as possible, and limit the volume of data as much as possible.

As stated, the built-in Power BI -> BC connector seems to always load the complete data source before applying the transformation, and it does this for each refresh.

Hence my question: how can I optimize datasets without developing my own, optimized API directly in BC?
Is there even a way to do so?

Thanks in advance

v-henryk-mstf
Community Support

Hi @bco_lde ,

The information you have provided does not make the problem clear to me. Could you please explain with an example?

Looking forward to your reply.

Best Regards,
Henry

