We are implementing Power BI in our company. We do a lot of self-service reporting and some centralized design and deployment. We have tried keeping the data in our data warehouse and bringing it into the reports with DirectQuery, but the performance is really poor. We also tried creating Datamarts with the most frequently used tables; that works a little better, but we are still using DirectQuery against the Datamarts and we get intermittent performance. The point of the Datamarts was to avoid having to upload and schedule many semantic models, and to share the data with all reports needing those data sets. However, based on what we have seen, Datamarts have been a less practical approach. So my question: based on your experience, what has been the best way to handle large amounts of data coming from multiple tables in an environment where we need self-service?
Keep using your central data warehouse, but connect to it in import mode. Import mode caches the data in the semantic model's in-memory engine, so report queries don't hit the warehouse at all; you just set up a scheduled refresh to keep the model current. DirectQuery performance is only ever as good as the source can answer each visual's query, which is usually why it feels slow.