I want to create a data flow like:
Trigger --> Modify the data --> Store modified data in SharePoint
1. Trigger the flow when a SharePoint list gets updated/modified.
2. On this trigger, execute a Power BI report (uploaded to the Power BI service). In the Power BI report, I will make some modifications to the data.
3. The modified data should be stored back in SharePoint.
Is this possible, or is there anything similar that can achieve this?
Solved! Go to Solution.
Hey,
Thanks for getting back!
Yes, considering that using Power Automate is a mandatory aspect and the original source would stay as-is, you can just leverage Power Automate to do the transformations rather than bringing Power BI in between. You can consume data from multiple sources and merge and transform it via Power Automate:
https://www.encodian.com/blog/working-with-file-contents-and-files-in-power-automate/
https://learn.microsoft.com/en-us/azure/logic-apps/logic-apps-control-flow-branches
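The merge step those links describe amounts to joining records from two sources on a shared key. A minimal sketch of that logic, in Python purely to illustrate (in Power Automate you would express the same thing with "Select" and "Filter array" actions; the field names `id`, `name`, and `dept` are made-up examples):

```python
def merge_by_key(primary, secondary, key):
    """Left-join two lists of dicts on `key`.

    Rows from `secondary` are folded into matching rows of `primary`;
    unmatched primary rows pass through unchanged.
    """
    lookup = {row[key]: row for row in secondary}
    return [{**row, **lookup.get(row[key], {})} for row in primary]

a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
b = [{"id": 1, "dept": "Sales"}]
# merge_by_key(a, b, "id")
# -> [{"id": 1, "name": "Alice", "dept": "Sales"}, {"id": 2, "name": "Bob"}]
```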
Hey,
Can you be a little clearer about what you mean by #2:
"2. On this trigger, execute Power BI report (uploaded in Power BI service). In Power BI report, I will make some modifications to the data."
Do you mean you need to refresh the Power BI report and write back into Power BI?
2. On this trigger, execute Power BI report (uploaded in Power BI service).
In the Power BI report, I will make some modifications to the data.
This part is not clear to me. Maybe you mean "Initiate a Power BI dataset refresh"?
Note that dataset refreshes are processed asynchronously. You would have to include logic in your flow to wait for the completion of the refresh before proceeding with the next step.
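The wait-for-completion logic can be sketched like this (Python, just to illustrate the pattern; in a real flow you would implement the same loop with a "Do until" action). The status values are the ones the Power BI REST API reports in a dataset's refresh history; how you fetch them is left to the caller:

```python
import time

def poll_refresh_status(get_status, interval_s=10, max_attempts=30):
    """Poll until the latest dataset refresh finishes.

    get_status: callable returning one of "Unknown" (still running),
    "Completed", "Failed", or "Disabled" -- the status values a
    Power BI refresh history entry can report.
    """
    for _ in range(max_attempts):
        status = get_status()
        if status != "Unknown":  # "Unknown" means the refresh is in progress
            return status
        time.sleep(interval_s)
    raise TimeoutError("refresh did not finish in time")

# In a real flow, get_status would call
# GET https://api.powerbi.com/v1.0/myorg/datasets/{datasetId}/refreshes?$top=1
# and return the "status" field of the newest entry.
```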
Yes, on the trigger there will be a data refresh to read the latest data, along with the steps below.
In step 2, there will be some data modification required, like masking some data and adding a few custom columns.
For this, I was exploring a Power BI report, or any other possible way.
After these data modifications, I need the modified data to be stored back to SharePoint.
I still don't understand why you want to do it this way, but you can use the "Execute a DAX query against a dataset" action to pull the Power BI data.
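The same capability is exposed through the Power BI REST API's `executeQueries` endpoint, which is what that flow action uses under the hood. A minimal sketch of the request body and response handling (the DAX query and table name `'Sales'` are placeholders, and authentication is omitted):

```python
def build_dax_request(dax_query):
    """Request body for POST .../datasets/{datasetId}/executeQueries."""
    return {
        "queries": [{"query": dax_query}],
        "serializerSettings": {"includeNulls": True},
    }

def extract_rows(response_json):
    """Pull the row dicts out of an executeQueries response."""
    return response_json["results"][0]["tables"][0]["rows"]

# Example payload -- the flow action posts something like this for you:
body = build_dax_request("EVALUATE TOPN(100, 'Sales')")
```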
In step 2, there will be some data modification required, like masking some data and adding a few custom columns.
For this, I was exploring a Power BI report, or any other possible way.
After these data modifications, I need the modified data to be stored back to SharePoint.
But you can also achieve column addition, masking, etc. directly via Power Automate, Logic Apps, etc.
Why use an intermediate layer of Power BI for transformations when you can leverage transformation tools directly?
I do have a few constraints for this project:
- This data will be merged with one or two other data sources (not SharePoint).
- There can't be any modifications to the origin, as it is application-generated data.
- The origin data is not in table format; it is either list or comma-separated data.
- The requirement was to implement using only Power Automate & Power BI.
That is why I was exploring this path.
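For comma-separated origin data, the masking and custom-column step amounts to parsing the text into rows and rewriting fields. A sketch of that logic (Python just to show the shape; the column names `AccountNo` and `Source` are invented examples, and in Power Automate you would do the same with "Compose"/"Select" actions or expressions):

```python
import csv
import io

def mask(value, keep=4):
    """Mask all but the last `keep` characters (e.g. account numbers)."""
    return "*" * max(len(value) - keep, 0) + value[-keep:]

def transform(raw_csv):
    """Parse comma-separated origin data, mask a column, add a custom column."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    for row in rows:
        row["AccountNo"] = mask(row["AccountNo"])  # mask sensitive field
        row["Source"] = "app-feed"                 # example custom column
    return rows

data = "Name,AccountNo\nAlice,12345678\nBob,98765432\n"
# transform(data)[0]
# -> {"Name": "Alice", "AccountNo": "****5678", "Source": "app-feed"}
```

The resulting row dicts could then be written back to a SharePoint list item by item.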
Maybe it is more productive to cut Power BI from this scenario and go to the data source directly.