Hello,
I need help improving our company's process with Power BI. Here is our current workflow:
We use the Power BI Online Service as a solution for our clients. Each client has their own workspace, and within each client workspace there are two dataflows: an ERP flow and an API flow.
Both flows are scheduled to refresh multiple times a day and handle data cleansing and processing. Additionally, each client has their own dataset, which receives data from the ERP and API flows, transforms it (e.g., creating new tables), and ultimately feeds the respective report, comprising various dashboards. Therefore, each client workspace contains four artifacts: the ERP flow, the API flow, the dataset, and the report.
So far, so good. However, our main challenge is the version update process, which includes bug fixes, new dashboards, and new functionalities. Currently, we have to manually update the artifacts for each client workspace one by one. We have optimized many tasks via API (such as importing dataflows, PBIX files, scheduling updates, updating parameters, etc.), but the process remains quite labor-intensive and slow.
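For reference, the per-workspace API steps you describe can be driven from one loop, so a single template update fans out to every client. This is only a minimal sketch: the workspace list, parameter name (`CompanyParameter`), dataset display name, and token handling are all assumptions, and error handling/retries are omitted. It uses two documented Power BI REST endpoints, "Imports - Post Import In Group" and "Datasets - Update Parameters In Group".

```python
# Hypothetical rollout sketch: push one updated PBIX template to every
# client workspace, then set that client's parameters on the dataset.
# Workspace IDs, parameter names, and the token are placeholders.

BASE = "https://api.powerbi.com/v1.0/myorg"

def import_url(workspace_id: str, display_name: str) -> str:
    """URL for 'Imports - Post Import In Group' (PBIX upload, overwrite)."""
    return (f"{BASE}/groups/{workspace_id}/imports"
            f"?datasetDisplayName={display_name}&nameConflict=CreateOrOverwrite")

def update_params_url(workspace_id: str, dataset_id: str) -> str:
    """URL for 'Datasets - Update Parameters In Group'."""
    return f"{BASE}/groups/{workspace_id}/datasets/{dataset_id}/Default.UpdateParameters"

def roll_out(workspaces, pbix_path, token):
    # requests is a third-party dependency; imported here so the URL
    # builders above stay usable without it.
    import requests
    headers = {"Authorization": f"Bearer {token}"}
    for ws in workspaces:
        # 1) Overwrite the dataset/report from the shared template PBIX.
        with open(pbix_path, "rb") as f:
            requests.post(import_url(ws["id"], "ClientModel"),
                          headers=headers, files={"file": f})
        # 2) Re-apply this client's parameter values (names are illustrative).
        requests.post(update_params_url(ws["id"], ws["dataset_id"]),
                      headers=headers,
                      json={"updateDetails": [
                          {"name": "CompanyParameter", "newValue": ws["company"]}
                      ]})
```

The point of the loop is that the list of workspaces becomes the only per-client input; everything else is the shared template.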
My question is: Using Microsoft's tools (whether Power BI, Fabric, or Azure), is there a way to streamline this operation? Given that the structure of the ERP flow, API flow, dataset, and reports is the same for all clients (only the queries differ in the ERP flow and company parameters in the API flow), is there a way to update the structure for all clients at once?
Our number of clients is rapidly increasing (which is excellent), but it's becoming increasingly challenging to sustain our version update process. Any guidance on how to address this issue would be greatly appreciated.
Thank you!
Have you considered PBIP/PBIR/TMDL and the git integration options?
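For context, and with the caveat that exact file names vary by Power BI Desktop version: saving a report as a Power BI Project (.pbip) produces a plain-text folder layout roughly like the one below, which can live in git. With the TMDL format enabled, the semantic model is split into readable files, so a structural change becomes an ordinary commit that can be merged or deployed to each client's workspace instead of hand-editing artifacts one by one.

```
ClientModel.pbip
ClientModel.Report/
    definition.pbir
    report.json
ClientModel.SemanticModel/
    definition.pbism
    definition/
        model.tmdl
        tables/Sales.tmdl
```

The folder and table names here are illustrative, not prescriptive.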