We are implementing Power BI in our company. We do a lot of self-service reporting and some centralized design and deployment. We have tried keeping the data in our data warehouse and bringing it into reports with DirectQuery, but performance is really poor that way. We also tried creating Datamarts with the most frequently used tables; that works a little better, but we are still using DirectQuery against the Datamarts and performance is intermittent. The point of the Datamarts was to avoid uploading and scheduling many semantic models and to share the data with all reports that need those data sets. However, based on what we have seen, Datamarts have been a less practical approach. So my question, based on your experience: what has been the best way to handle large amounts of data coming from multiple tables in an environment where we need self-service?
Keep using your central data warehouse, but connect to it in import mode. Import mode loads the data into the in-memory model at refresh time, so report visuals query the cached model instead of sending a round trip to the warehouse for every interaction, which is usually where the DirectQuery performance pain comes from.