LeadingEdge
New Member

Shared developer access to a PostgreSQL dataset

Hi All: we are planning on using Power BI to report against a PostgreSQL (12.0) data warehouse. The data warehouse is about 25GB but growing at ~50% annually. We have 5 developers and ~80 viewers of reports.

 

We'd like the developers to query against a common dataset instead of using multiple PBIX copies, and need the data refreshed in near real-time. The data warehouse is on a local server. 

 

What is the recommended solution for this? Do we need to create a cloud-based copy of the data warehouse with regular updates, and create a shared workspace and dataset pointing to the cloud-based copy?

LeadingEdge
New Member

Thanks Ed...my fear is that the dataset would be too large to effectively manage from a single user's desktop, and we need the data updated daily, if not more often.

That was why I also recommended incremental refresh.

 

As far as one user doing the modeling, it depends on the org. There can, and often should, be a difference between a data modeler and a report developer. And with Premium or PPU workspaces, you can edit models in the service via tools like Tabular Editor, so the model is no longer on the desktop. Additionally, more than one data modeler can modify it, though they would need to coordinate changes. Adding/changing tables is a pretty big deal that can have wide-ranging impacts; editing measures less so. You can get really sophisticated with this, using DevOps to manage the model changes.

Data modeling and data refreshes are not related, especially when they are all done in the cloud.

None of this works with a Pro-license workspace (it requires Premium or PPU), but I suspect the size of your model precludes Pro anyway.




Thanks for the valuable info, Ed! I found this note this morning: https://docs.microsoft.com/en-us/power-query/connectors/postgresql

 

It looks like I could build a direct connection to our local data warehouse using an "on-premises data gateway", but I'm not sure whether this gets me to the same place as your solution. We would want only one developer managing the data model, at least to start. And yes, we're not concerned with the licensing tier for the developers at this point.

 

I'm more interested in getting the data refreshed frequently and into a common dataset, so that we can build near-real-time reporting capability for internal use.

Sorry - I incorrectly assumed you were refreshing the data directly to the service via a gateway. Bad assumption on my part! Glad you found that.

 

Yes, you ABSOLUTELY need that. That was assumed in my post. So, in summary, two ways you can try this:

  1. Create dataflows to the PostgreSQL data using the gateway.
  2. Set up incremental refresh on those dataflows for the FACT tables so only recent data is constantly refreshed. You define "recent" - 3 days, 10 weeks, 12 months, whatever. Older data after the first big refresh is static. (See the M sketch after this list.)
  3. Connect multiple Power BI PBIX files to the dataflows so end users can create reports.
  4. Publish the reports to the service.
  5. Use Power Automate to automatically refresh reports when the dataflow successfully refreshes.
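
A minimal Power Query (M) sketch of what the dataflow query behind steps 1-2 could look like. The server, database, table, and column names here are placeholders, and RangeStart/RangeEnd are the two datetime parameters Power BI requires for incremental refresh:

    let
        // Connect through the gateway to the on-prem PostgreSQL server
        // ("dw-server" and "warehouse" are placeholder names)
        Source = PostgreSQL.Database("dw-server", "warehouse"),
        FactSales = Source{[Schema = "public", Item = "fact_sales"]}[Data],
        // Incremental refresh reloads only rows inside the current
        // RangeStart/RangeEnd window; older partitions stay static.
        // [load_date] is a placeholder for your fact table's date column.
        Recent = Table.SelectRows(
            FactSales,
            each [load_date] >= RangeStart and [load_date] < RangeEnd
        )
    in
        Recent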

Alternatively, skip dataflows:

  1. Users connect PBIX files to the PostgreSQL database.
  2. Publish to the service.
  3. Grant users rights to the gateway.
  4. Users set up scheduled refreshes.
  5. You can set up incremental refresh here too, so your PostgreSQL server isn't getting hammered for 100% of the data with each refresh.

There are many other alternatives. The advantage of the dataflow scenario is that there is only one connection to the on-prem database; all other refreshes are cloud-to-cloud, as the Power BI datasets refresh against the dataflow.
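
For step 3 of the dataflow route, this is roughly the M that Power BI Desktop generates when you connect a PBIX to a dataflow entity (the GUIDs and the "FactSales" entity name are placeholders that Get Data fills in for you):

    let
        Source = PowerPlatform.Dataflows(null),
        Workspaces = Source{[Id = "Workspaces"]}[Data],
        // Placeholder IDs - Desktop inserts the real workspace and
        // dataflow GUIDs when you pick them in the navigator
        Workspace = Workspaces{[workspaceId = "00000000-0000-0000-0000-000000000000"]}[Data],
        Dataflow = Workspace{[dataflowId = "00000000-0000-0000-0000-000000000000"]}[Data],
        FactSales = Dataflow{[entity = "FactSales", version = ""]}[Data]
    in
        FactSales

Because this query reads from the dataflow's storage rather than from PostgreSQL, each dataset refresh is cloud-to-cloud and never touches the gateway.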




Thanks Ed, you have clarified the data flow nicely.

Great! Start a new thread about the dataflow should you have any questions or run into issues. 



edhans
Super User

You might want a golden dataset: someone creates a dataset that fits everyone's needs and sets up incremental refresh so each refresh only pulls the latest data (x days, weeks, or months). That would all work within Power BI.
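
As a sketch of the incremental refresh mechanics: Power BI requires two Date/Time parameters named exactly RangeStart and RangeEnd, which the service overrides for each partition at refresh time. In M section syntax, with placeholder initial values:

    section GoldenDataset;

    // Only the names and the Date/Time type matter; the service
    // substitutes the real window boundaries during each refresh
    shared RangeStart = #datetime(2024, 1, 1, 0, 0, 0)
        meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true];

    shared RangeEnd = #datetime(2025, 1, 1, 0, 0, 0)
        meta [IsParameterQuery = true, Type = "DateTime", IsParameterQueryRequired = true];

Every table that should refresh incrementally then filters its rows by these parameters, as in the dataflow sketch earlier in the thread.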

 

You can use dataflows or other data warehouse tools to regularly sync your data and have the devs query against those tables. But that can be a very, VERY large discussion of multiple products, costs, and capabilities.



