I have a fact table of 100 million rows and much shorter dimension tables. The dimension tables are Import, and the fact table is DirectQuery.
I understand that minimizing the column count in the fact table is good for memory footprint and performance. I currently have 6 columns, but I could get down to 5 if I make one of them a computed value (DAX).
All tables come in via an ODBC connector.
What is better (pros/cons)?
A. Pre-computing the 6th column on the server (and perhaps paying the cost of moving more data each time the DirectQuery runs)
B. Creating a calculated column in DAX in Power BI
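For reference, option B would be a row-level DAX calculated column on the fact table, something like the sketch below (the table and column names here are hypothetical, not from my actual model):

```dax
// Hypothetical calculated column on the DirectQuery fact table.
// Row-level only: each row's value is computed from other columns
// of the same row, which is the typical shape for this scenario.
ExtendedAmount = Fact[Quantity] * Fact[UnitPrice]
```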
Why are you doing DQ for the fact table? You should consider moving that to import mode, and use incremental refresh for faster refreshes. Either way, you should avoid DAX calculated columns, and move it upstream (in your data source, or in your query if you move it to import mode).
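If you do move the fact table to import mode, "moving it upstream in your query" could look like the Power Query sketch below, which also filters on the `RangeStart`/`RangeEnd` parameters that incremental refresh requires. The DSN, SQL, and column names are placeholders, not your actual source:

```powerquery-m
// Hypothetical sketch: compute the 6th column in the source query
// and filter by RangeStart/RangeEnd so Power BI can partition the
// 100M-row fact table for incremental refresh.
let
    Source = Odbc.Query(
        "dsn=MyFoundryDsn",
        "SELECT Quantity, UnitPrice, Quantity * UnitPrice AS ExtendedAmount, OrderDate FROM FactSales"
    ),
    Filtered = Table.SelectRows(
        Source,
        each [OrderDate] >= RangeStart and [OrderDate] < RangeEnd
    )
in
    Filtered
```

This keeps the computation out of DAX entirely and lets the VertiPaq engine compress the precomputed column like any other imported column.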
Pat
Good tips, thanks. And I don't want to leave your question unanswered: we use DQ for two reasons.
1. With 100M rows, import puts us over the 10 GB workspace limit.
2. The ODBC connector does not work in the Power BI web service. It connects to a Palantir Foundry cloud data lake, and neither import nor DQ refresh is supported yet for Palantir Foundry in the service. We can use ODBC on Desktop.
We are working with MSFT to set up a gateway, but our bureaucracy means that will take months.