We are implementing Power BI in our company. We do a lot of self-service reporting and some centralized design and deployment. We have tried keeping the data in our data warehouse and bringing it into the reports with DirectQuery, but the performance is really poor. We also tried creating Datamarts with the most frequently used tables; that works a little better, but we are still using DirectQuery against the Datamarts and we get intermittent performance. The point of the Datamarts was to avoid having to load and schedule many Semantic Models, and to share the data with all reports needing those data sets. However, based on what we have seen, Datamarts have been a less practical approach. So my question: based on your experience, what has been the best way to handle large amounts of data coming from multiple tables in an environment where we need self-service?
Keep using your central data warehouse, but connect to it in import mode. Import mode caches the data in the in-memory engine, so report visuals query that cache instead of hitting the warehouse on every interaction, which is usually where the DirectQuery slowness comes from.