Hello,
I need help improving our company's process with Power BI. Here is our current workflow:
We use the Power BI Online Service as a solution for our clients. Each client has their own workspace, and within each workspace there are two dataflows: an ERP flow and an API flow.
Both flows are scheduled to refresh several times a day and handle data processing and transformation. Each client also has their own dataset, which receives data from the ERP and API flows, transforms it further (e.g., creating new tables), and ultimately feeds the client's report, which comprises several dashboards. Each client workspace therefore contains four artifacts: the ERP flow, the API flow, the dataset, and the report.
So far, so good. However, our main challenge is the version update process, which includes bug fixes, new dashboards, and new functionalities. Currently, we have to manually update the artifacts for each client workspace one by one. We have optimized many tasks via API (such as importing dataflows, PBIX files, scheduling updates, updating parameters, etc.), but the process remains quite labor-intensive and slow.
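For reference, our current per-workspace automation looks roughly like the sketch below (workspace IDs, the token, the dataset name, and file paths are placeholders, and error handling and the multipart upload encoding are omitted):

```python
import urllib.request

POWER_BI_API = "https://api.powerbi.com/v1.0/myorg"

def import_pbix_url(workspace_id: str, dataset_name: str) -> str:
    """Imports endpoint for one workspace; CreateOrOverwrite replaces the
    existing dataset/report during a version update."""
    return (f"{POWER_BI_API}/groups/{workspace_id}/imports"
            f"?datasetDisplayName={dataset_name}"
            f"&nameConflict=CreateOrOverwrite")

def update_parameters_body(params: dict) -> dict:
    """Request body for datasets/{id}/Default.UpdateParameters, used to set
    per-client values such as company parameters."""
    return {"updateDetails": [{"name": n, "newValue": v}
                              for n, v in params.items()]}

def deploy(token: str, clients: dict, pbix_path: str) -> None:
    """Push the same PBIX to every client workspace, then set that client's
    parameters. `clients` maps workspace_id -> parameter dict."""
    for workspace_id, params in clients.items():
        req = urllib.request.Request(
            import_pbix_url(workspace_id, "ClientModel"),
            # NOTE: the Imports API expects the PBIX file as
            # multipart/form-data; the encoding is elided here for brevity.
            headers={"Authorization": f"Bearer {token}"},
            method="POST",
        )
        urllib.request.urlopen(req)
        # ...then poll the import status and POST update_parameters_body(params)
        # to datasets/{datasetId}/Default.UpdateParameters.
```

This works, but it still runs once per workspace, so the total runtime and failure surface grow linearly with the number of clients.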
My question is: Using Microsoft's tools (whether Power BI, Fabric, or Azure), is there a way to streamline this operation? Given that the structure of the ERP flow, API flow, dataset, and reports is the same for all clients (only the queries differ in the ERP flow and company parameters in the API flow), is there a way to update the structure for all clients at once?
Our number of clients is rapidly increasing (which is excellent), but it's becoming increasingly challenging to sustain our version update process. Any guidance on how to address this issue would be greatly appreciated.
Thank you!
Have you considered PBIP/PBIR/TMDL and the git integration options?
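To sketch how that could scale: keep a single PBIP/TMDL project in a git branch, connect every client workspace to that branch via Fabric's Git integration, and script the rollout against the Fabric REST API. This is a minimal sketch under assumptions: the `updateFromGit` endpoint name is from the Fabric Git APIs, while the token and workspace IDs are placeholders:

```python
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def update_from_git_url(workspace_id: str) -> str:
    """Fabric Git API endpoint that applies the connected branch's committed
    item definitions (PBIP/TMDL) to a workspace."""
    return f"{FABRIC_API}/workspaces/{workspace_id}/git/updateFromGit"

def roll_out(token: str, workspace_ids: list) -> None:
    """Apply the latest committed version to every client workspace."""
    for ws in workspace_ids:
        req = urllib.request.Request(
            update_from_git_url(ws),
            data=b"{}",  # the real call takes remoteCommitHash etc. in the body
            headers={"Authorization": f"Bearer {token}",
                     "Content-Type": "application/json"},
            method="POST",
        )
        urllib.request.urlopen(req)
```

The client-specific differences you mention (ERP queries, company parameters) can stay out of git entirely by modelling them as dataset parameters and setting them per workspace after each rollout.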