In Power BI Desktop, I built a simple one-table dataset that connects to a dataflow in the Service through a gateway. On the Desktop, it connects and refreshes. When I publish the simple report, the gateway connection turns into a Personal Cloud Connection and then cannot refresh because there is no personal cloud connection. There is only one table in the data model, and it comes from the gateway. Why does it keep changing to a Personal Cloud Connection and creating a new personal cloud connection?
After loading to the Service
How it looks in the Desktop
Dataflow selected in the Desktop report
Thanks for the follow up. Using a semantic model works as expected. It would be a lot cleaner on the admin side if a gateway could be used. If we change the server, we will be stuck changing all these connections. I thought the gateway would have resolved that issue.
Hi @bdnypa,
Thanks for the clarification, and it’s good to hear that using the semantic model works as expected.
Your expectation about the gateway helping with easier administration is understandable. However, the gateway mainly provides secure connectivity between Power BI and the data source, and it does not automatically abstract or handle server name changes unless the gateway data source configuration and semantic model are updated accordingly.
To minimize maintenance when server changes occur, the recommended approach is to use a centralized semantic model and connect reports to it instead of directly connecting to the server. This way, if the server changes, you only need to update the connection in the semantic model or gateway data source, and all dependent reports will continue to work without requiring individual updates.
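That single-point update can also be scripted. Below is a minimal sketch using the documented Power BI REST API endpoint `POST /datasets/{id}/Default.UpdateDatasources` to repoint a centralized semantic model's SQL connection at a new server. The function names, the `TOKEN` placeholder, and the example server/database names are illustrative, not part of the thread; a valid Azure AD access token is assumed.

```python
# Sketch: swap the SQL server behind a centralized semantic model in one call,
# so dependent thin reports keep working without individual updates.
# Assumes a valid Azure AD bearer token with Dataset.ReadWrite.All scope.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"


def build_server_swap(old_server, new_server, database):
    """Payload for Datasets - Update Datasources: select the existing SQL
    datasource by its current connection details, then give the new ones."""
    return {
        "updateDetails": [
            {
                "datasourceSelector": {
                    "datasourceType": "Sql",
                    "connectionDetails": {
                        "server": old_server,
                        "database": database,
                    },
                },
                "connectionDetails": {
                    "server": new_server,
                    "database": database,
                },
            }
        ]
    }


def update_datasources(dataset_id, token, payload):
    """POST the swap; after this, take over / re-bind credentials as needed."""
    req = urllib.request.Request(
        f"{API}/datasets/{dataset_id}/Default.UpdateDatasources",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)
```

Note that after changing the server, the gateway data source (or stored credentials) must still match the new connection details, as described above.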
Thank you again for using the Microsoft Fabric Community Forum.
Hi @bdnypa,
Just checking in to see if the issue has been resolved on your end. If the earlier suggestions helped, that's great to hear! And if you are still facing challenges, feel free to share more details; happy to assist further.
Thank you.
Hi @bdnypa,
Just wanted to follow up. If the shared guidance worked for you, that's wonderful; hopefully it also helps others looking for similar answers. If there's anything else you'd like to explore or clarify, don't hesitate to reach out.
Thank you.
Hi @bdnypa,
Thank you for reaching out to the Microsoft Fabric Community Forum.
What is happening here is by design. Since your dataset in Desktop connects to a Power BI Dataflow, the Service treats it as a cloud data source (extensionDataSourceKind: “PowerBI”). Even though the Dataflow itself uses an on-premises SQL source via a gateway, that gateway is only required for the Dataflow's own refresh. The dataset connecting to the Dataflow is considered a cloud-to-cloud connection, so it automatically maps to a Personal Cloud Connection in the Service.
To resolve the refresh issue, you don’t need to configure a gateway at the dataset level. Instead, go to the semantic model settings in the Service and update the Data source credentials for the Power BI source using Organizational Account (OAuth2). Once credentials are set correctly, refresh should succeed. If your intention is to use a gateway at the dataset level, you would need to connect directly to SQL Server from Desktop rather than through a Dataflow.
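You can confirm this mapping yourself with the documented `GET /datasets/{id}/datasources` endpoint: a dataset built on a Dataflow reports an "Extension" datasource of kind "PowerBI" rather than a gateway-bound SQL source. The sketch below is illustrative; the helper names and the `TOKEN` placeholder are not from this thread, and the exact response field names should be checked against the current REST API reference.

```python
# Sketch: list a dataset's data sources and pick out the ones the Service
# treats as cloud sources (and therefore maps to a cloud connection).
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"


def cloud_mapped_sources(datasources):
    """Return the data sources reported as Power BI extension sources.

    A dataset that consumes a Dataflow shows datasourceType "Extension"
    with kind "PowerBI" -- this is the cloud-to-cloud connection that gets
    a Personal Cloud Connection, not the Dataflow's own gateway binding.
    """
    return [
        ds for ds in datasources
        if ds.get("datasourceType") == "Extension"
        and ds.get("connectionDetails", {}).get("kind") == "PowerBI"
    ]


def get_datasources(dataset_id, token):
    """Call GET /datasets/{id}/datasources; needs Dataset.Read.All scope."""
    req = urllib.request.Request(
        f"{API}/datasets/{dataset_id}/datasources",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

If the list comes back with such an entry, the fix described above (setting OAuth2 credentials on the Power BI source in the semantic model settings) applies; a gateway binding at the dataset level would not.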
Hope that clarifies. Let us know if you have any doubts regarding this. We will be happy to help.
Thank you for using the Microsoft Fabric Community Forum.
I'm not doubting it - it just presents me with some questions. By OAuth, I assume it means the user's ID and password, which I think would be an administrative issue, since every user who publishes would need access to the data.
What we are trying to do is create datasets that can automatically refresh from existing data processes in SQL and publish them in PBI. The dataflow seemed a natural fit, until our first user published and the data did not refresh.
What is the best way to provide users with datasets that they do not control, that do not require their own personal authorization, and that connect to SQL data that can be refreshed automatically? After research, it sounded like a dataflow was the answer, until this surprising twist.
The workaround is as you suggested. I created the dataset on my desktop as a semantic model and published the SM. That keeps the gateway connection and does not create a second semantic model when they use it for their reports. I don't find this a straightforward and clean solution, as I should just be able to make the dataflow and have that shared. For this, we need PBI Desktop and to update the appropriate workspaces. It is interesting to me that one changes to the cloud connection while the other retains its connection, when they are both really doing the same thing.
Hi @bdnypa, you can create a cloud connection that can be used by semantic models with the dataflow as the source. Select the Fabric Dataflows connector.
Create the cloud connection and share the connection with the users making use of it.
Map your semantic model to the dataflow connection.
Hi @bdnypa,
Thank you for the detailed follow-up.
When a dataset consumes a Dataflow, it becomes its own refresh object in the Service and does not inherit the gateway credentials used by the Dataflow. That is why OAuth is required at the dataset level and why it appears as a cloud connection. This behaviour is by design, as Dataflows handle transformation, while credential storage and refresh governance are enforced at the semantic model layer.
If your goal is centralized control (automatic SQL refresh, no per-user authorization, and users not owning datasets), the recommended approach is to create a single centrally managed semantic model connected directly to SQL via the Enterprise Gateway, configure scheduled refresh once under a BI/service account, and allow users to build thin reports using that shared dataset.
This ensures refresh runs centrally, avoids personal credentials, prevents duplicate semantic models, and provides a clean, governed enterprise setup.
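The "configure scheduled refresh once" step can be automated too. Here is a hedged sketch using the documented `PATCH /datasets/{id}/refreshSchedule` endpoint; the helper names and the `TOKEN` placeholder are illustrative, and the token is assumed to belong to the central BI/service account so no personal credentials are involved.

```python
# Sketch: set a scheduled refresh once, centrally, on the shared semantic
# model, so thin reports built on it never need their own refresh setup.
import json
import urllib.request

API = "https://api.powerbi.com/v1.0/myorg"


def build_refresh_schedule(days, times, time_zone="UTC"):
    """Payload for Datasets - Update Refresh Schedule.

    days  -- e.g. ["Monday", "Wednesday", "Friday"]
    times -- e.g. ["06:00", "18:00"]
    """
    return {
        "value": {
            "enabled": True,
            "days": days,
            "times": times,
            "localTimeZoneId": time_zone,
        }
    }


def set_refresh_schedule(dataset_id, token, schedule):
    """PATCH the schedule onto the dataset under the service account."""
    req = urllib.request.Request(
        f"{API}/datasets/{dataset_id}/refreshSchedule",
        data=json.dumps(schedule).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="PATCH",
    )
    urllib.request.urlopen(req)
```

Run once by the BI account against the centrally managed model; users building thin reports inherit the refreshed data without any per-user authorization.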
Thanks again for using the Microsoft Fabric Community Forum.