Hi, I am seeking advice on whether there are any best practices, or anything I should be aware of, in this situation:
We are looking to simplify our data setup. Currently we have unstable SQL tables that often break down (they don't update from source, calculations break), which means we have to bring in pricey external consultants to fix them.
Most of the tables are built by querying REST APIs, and given that Power BI can get data from a REST API directly, I thought that could be a good replacement for our unstable SQL tables.
We are looking at REST API calls that each pull in a maximum of around 50-100 MB.
Does this approach raise any red flags for you? Is there anything I should build into it to make it more solid? Any advice is much appreciated.
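For context, a pull like this is usually paginated rather than fetched in one 50-100 MB call. A rough sketch in Python of what the loop looks like (the endpoint, the "page"/"per_page" parameters, and the "items" field are all invented for illustration; Power Query's Web.Contents follows the same pattern):

```python
import json
import urllib.parse


def build_url(base, params):
    """Append query parameters to a base URL."""
    return base + "?" + urllib.parse.urlencode(params)


def fetch_all_pages(fetch, base_url, page_size=1000):
    """Pull an API page by page until a short page signals the end.

    `fetch` is any callable taking a URL and returning a JSON string,
    so the paging logic stays testable without a live endpoint.
    """
    rows, page = [], 1
    while True:
        url = build_url(base_url, {"page": page, "per_page": page_size})
        payload = json.loads(fetch(url))
        batch = payload.get("items", [])
        rows.extend(batch)
        if len(batch) < page_size:  # last (short or empty) page
            break
        page += 1
    return rows
```

Wrapping the actual HTTP call behind `fetch` also gives you one place to add retries and timeouts, which is where most "unstable table" problems come from.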
Thank you,
Hi @jabrolsen - yes and no. Unfortunately it depends on the API source and authentication, so Power Query may not be the best option. I would consider the Data Factory option instead: export the API responses to a data lake, then use dataflows to transform them. The Data Factory approach may give you better options for managing the API calls and authentication, and Microsoft Fabric makes this option easier to consider.
Many thanks
Daryl
So I have already made the connection in a test setup, so I know that I can pull the data through Power Query - authentication was not an issue.
But regarding the calls: would the Data Factory option enable me to modify the call so that, for example, it only gets data that has changed since the last pull?
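The usual approach to this, whatever tool runs the call, is a watermark: store the timestamp of the last successful pull and ask the API only for rows changed after it. This assumes the API exposes a changed-since filter; the "modified_since" parameter and "modified_at" field below are invented for illustration. A sketch:

```python
def fetch_changed(fetch, base_url, state):
    """Fetch only rows changed since the stored watermark, then advance it.

    `state` is any dict-like store that survives between runs;
    `fetch` is a callable taking (url, params) and returning rows.
    """
    since = state.get("watermark", "1970-01-01T00:00:00Z")
    rows = fetch(base_url, {"modified_since": since})
    if rows:
        # ISO-8601 UTC timestamps compare correctly as plain strings
        state["watermark"] = max(r["modified_at"] for r in rows)
    return rows
```

Power BI's own incremental refresh works on a similar idea (filtering on RangeStart/RangeEnd parameters), but a pipeline-level watermark like this also keeps the raw extracts small in the lake.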