I have a report that uses only dataflows as data sources. The dataset loads several tables from different dataflows; one of them is a large fact table with a dozen million rows. Occasionally new requirements come in and I have to update the model. I copy the dataflow and then change the dataflow Id in the report in Power BI Desktop, which I then refresh locally (sometimes unsuccessfully: the table is large and, on top of that, I have about 10 calculated columns that all evaluate during the refresh and crash it).
Is there any way of changing the dataflow in a dataset without refreshing locally before I upload to the service, so that I can refresh it there instead? I tried the REST API option to update data sources for a dataset, but since I only use dataflows I only get a single data source Id back in the response (https://docs.microsoft.com/en-us/rest/api/power-bi/datasets/update-datasources-in-group). I also see that a request to "https://api.powerbi.com/v1.0/myorg/groups/{group}/datasets/upstreamDataflows" returns all of the dataflow components of a dataset with their respective workspace IDs and dataflow IDs, but there is no option to update any of them.
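For reference, the upstreamDataflows endpoint mentioned above is read-only: it lists dataset-to-dataflow links for the whole workspace, and the docs expose no PATCH/PUT counterpart to rewrite a link. A minimal sketch of building that GET call (the workspace id and token are placeholders, not values from this thread):

```python
import json
import urllib.request  # only needed if you actually send the request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_upstream_dataflows_request(group_id: str, token: str):
    """Build the GET request for the Datasets - GetDatasetToDataflowsLinksInGroup API.

    The endpoint lists dataset -> dataflow links for every dataset in
    the workspace; there is no corresponding write endpoint, which is
    why the links cannot be updated this way.
    """
    url = f"{API_ROOT}/groups/{group_id}/datasets/upstreamDataflows"
    headers = {"Authorization": f"Bearer {token}"}
    return url, headers

# To actually call it (requires a valid Azure AD access token):
# url, headers = build_upstream_dataflows_request("<workspace-id>", token)
# with urllib.request.urlopen(urllib.request.Request(url, headers=headers)) as r:
#     links = json.load(r)["value"]
#     # each item carries datasetObjectId, dataflowObjectId, workspaceObjectId
```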
Does anyone know how I can change the dataflow source for a table on the service, without updating the report locally, through the REST API or any external tool?
Solved!
Use service parameters: define the workspace Id and dataflow Id as Power Query parameters in the model, so that changing the source only means changing the parameter values. You can then update them on the service (dataset settings > Parameters, or the Update Parameters REST API) and refresh there, with no local refresh needed.
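Building on that answer: once the IDs are Power Query parameters, they can be changed programmatically with the Datasets - Update Parameters In Group REST API. A minimal sketch of the call, assuming parameter names `DataflowWorkspaceId` and `DataflowId` (use whatever names your model actually defines); the dataset still needs a service-side refresh afterwards for the new values to take effect:

```python
import json

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def build_update_parameters_call(group_id: str, dataset_id: str, new_values: dict):
    """Build (url, body) for the Datasets - UpdateParametersInGroup API.

    new_values maps Power Query parameter names, as defined in the
    model, to their new string values. POST the body to the url with
    an Authorization: Bearer <token> header, then trigger a refresh.
    """
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/Default.UpdateParameters"
    body = {
        "updateDetails": [
            {"name": name, "newValue": value} for name, value in new_values.items()
        ]
    }
    return url, json.dumps(body)

# Example payload (IDs are placeholders):
# url, body = build_update_parameters_call(
#     "<workspace-id>", "<dataset-id>",
#     {"DataflowWorkspaceId": "<new-ws-guid>", "DataflowId": "<new-df-guid>"},
# )
```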