Hi everyone.
We are successfully importing a dataflow using the Post Import in Group API. Its data source is an Azure SQL DB. However, for the dataflow to properly load & be refreshed, we need to supply credentials for the data source(s). This can be done manually - either by trying to edit the dataflow entity or by opening up the dataflow settings. But can this be done programmatically?
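For context, a minimal sketch of the import call being described. Note the API detail that importing a dataflow requires `datasetDisplayName=model.json`; the group ID and token below are placeholders, and the code only builds the request URL rather than sending it.

```python
# Hedged sketch of the "Post Import In Group" call for a dataflow.
# GROUP_ID and ACCESS_TOKEN are placeholder values, not real ones.
GROUP_ID = "00000000-0000-0000-0000-000000000000"
ACCESS_TOKEN = "<bearer-token>"

# For dataflow imports, datasetDisplayName must be set to "model.json".
import_url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/imports"
    "?datasetDisplayName=model.json"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
# The dataflow definition (model.json) is then POSTed as multipart/form-data,
# e.g. with the `requests` library:
#   requests.post(import_url, headers=headers,
#                 files={"file": open("model.json", "rb")})
print(import_url)
```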
For datasets, it seems that one can use the Update Data Sources in Group API to edit the credentials and provide a connection string. But there doesn't seem to be an equivalent for dataflows.
Ideally, we want a fully-automated way of creating a ready-to-use dataflow, but we can't seem to figure out a way of setting the credentials programmatically.
Does anyone have a solution to this?
Thanks,
Ed
Figured it out. To update any datasource credentials (Dataflow/Dataset) programmatically, you need to first get the datasourceId and the gatewayId for the Dataflow or Dataset. For a Dataflow, you should use the Get Dataflow Data Sources API. For a Dataset, use the Get Dataset Data Sources API.
Then, for the datasource whose credentials you want to update, use the Gateways Update Data Source API, using the corresponding datasourceId and gatewayId returned from the first call.
It was the Gateways bit I was initially missing - but having found that, everything works as expected.
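The two-step flow above can be sketched as follows. This is only an illustrative outline, assuming placeholder IDs and credentials: step 1 calls Get Dataflow Data Sources to read back `datasourceId` and `gatewayId`, and step 2 builds the PATCH body for Gateways - Update Datasource. The code constructs the URLs and payload without sending them.

```python
import json

# Placeholder identifiers -- substitute your own workspace and dataflow IDs.
GROUP_ID, DATAFLOW_ID = "<group-id>", "<dataflow-id>"

# Step 1: Get Dataflow Data Sources -- the response items carry both
# datasourceId and gatewayId.
datasources_url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/dataflows/{DATAFLOW_ID}/datasources"
)

# Suppose the GET above returned this entry (illustrative values):
datasource = {"datasourceId": "<datasource-id>", "gatewayId": "<gateway-id>"}

# Step 2: Gateways - Update Datasource (a PATCH request) with new credentials.
update_url = (
    f"https://api.powerbi.com/v1.0/myorg/gateways/{datasource['gatewayId']}"
    f"/datasources/{datasource['datasourceId']}"
)
body = {
    "credentialDetails": {
        "credentialType": "Basic",
        # Per the API reference, "credentials" is itself a serialized
        # JSON string, not a nested object.
        "credentials": json.dumps({
            "credentialData": [
                {"name": "username", "value": "<sql-user>"},
                {"name": "password", "value": "<sql-password>"},
            ]
        }),
        "encryptedConnection": "Encrypted",
        "encryptionAlgorithm": "None",
        "privacyLevel": "None",
    }
}
print(update_url)
```

Sending `body` as the JSON payload of a PATCH to `update_url` (with a bearer token) is what sets the credentials without ever opening the dataflow settings UI.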
Hi @ed-freeman ,
If you are not familiar with the REST API or programming, I would not recommend trying this method.
AFAIK, the simpler way to achieve automatic refresh is to create a dataflow based on the SQL connector and configure the corresponding data source credentials. (Azure SQL does not require a gateway to handle refresh.)
Creating and using dataflows in Power BI
Regards,
Xiaoxin Sheng
Hi Xiaoxin,
Thank you for the reply!
We know we can use "Push Datasets" to create datasets in the Power BI Service. But we're not creating a dataset, we're creating a dataflow.
We are currently successfully using the API to import the dataflow. The dataflow "structure" is imported fine, but we don't know of any way to set up the credentials programmatically. We know we can set up the credentials manually, but that's exactly what we want to avoid.
If you know of any way to set up dataflow credentials programmatically, we'd love to hear how!
Thanks again,
Ed
Hi @ed-freeman ,
In fact, the official Power BI documentation already describes how to get an access token (tutorial step 2). Please check the following document to learn more (it includes sample code):
Step 2: Get an authentication access token
After that, you only need to write a timer-based loop that invokes the getToken function to acquire a new token before the current one expires.
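As an illustrative sketch of that token step, the request below targets the Azure AD token endpoint with the client-credentials flow; the tenant ID, client ID, and secret are placeholders, and the code only assembles the form body rather than POSTing it.

```python
from urllib.parse import urlencode

# Placeholder app-registration values -- substitute your own.
TENANT_ID, CLIENT_ID, CLIENT_SECRET = "<tenant>", "<client-id>", "<secret>"

token_url = f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/v2.0/token"
form = urlencode({
    "grant_type": "client_credentials",
    "client_id": CLIENT_ID,
    "client_secret": CLIENT_SECRET,
    "scope": "https://analysis.windows.net/powerbi/api/.default",
})
# POST `form` as application/x-www-form-urlencoded; the JSON response
# carries access_token and expires_in -- request a fresh token before
# expires_in seconds have elapsed.
print(token_url)
```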
Regards,
Xiaoxin Sheng
Hi @v-shex-msft,
Thank you for your response, but I think you're misunderstanding my problem. I'm not trying to use the Push Data capability.
I have an Azure SQL Database. I'm programmatically creating a Dataflow (using the Post Import in Group API) which connects to that database. However, before I can access the dataflow in the Power BI Service, I need to manually enter the credentials (username & password) for the Azure SQL Database. I am wondering whether this part can be done programmatically.
Regards,
Ed