Greetings!
Friends, tell me how to overcome the error "Cyclic dependency detected". I am creating two calculated columns that compute the same value in different units: the first column calculates sales in rubles, and the second calculates sales in weight units.
How can I avoid this error?
The DAX is below:
-- first DAX
Sales in rubles =
VAR CompanyID = 'Sales '[Company]
VAR ContractID = 'Sales '[Document]
VAR ContractDate = 'Sales '[Date_document]
RETURN
    CALCULATE(
        SUM('Sales '[Sales_rub]),
        FILTER(
            ALL('Full'[Company], 'Full'[Document], 'Full'[Date_document], 'Full'[Date_approval]),
            'Full'[Company] = CompanyID
                && 'Full'[Document] = ContractID
                && 'Full'[Date_document] = ContractDate
                && 'Full'[Date_document] >= 'Full'[Date_approval]
        )
    )
The error raised is "cyclic dependency detected".
-- second DAX
Sales in kilograms =
VAR CompanyID = 'Sales '[Company]
VAR ContractID = 'Sales '[Document]
VAR ContractDate = 'Sales '[Date_document]
RETURN
    CALCULATE(
        SUM('Sales '[Sales_kg]),
        FILTER(
            ALL('Full'[Company], 'Full'[Document], 'Full'[Date_document], 'Full'[Date_approval]),
            'Full'[Company] = CompanyID
                && 'Full'[Document] = ContractID
                && 'Full'[Date_document] = ContractDate
                && 'Full'[Date_document] >= 'Full'[Date_approval]
        )
    )
Following up on my last post... Here's how to make PBI work with incremental refresh when using files: Chris Webb's BI Blog: Keep The Existing Data In Your Power BI Dataset And Add New Data To It Using I...
And here's the article that Chris refers to at the beginning of his article: Incremental refresh for files in a Folder or SharePoint - Power BI — Powered Solutions
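Broadly, the trick those articles describe is to filter the source file list on each file's "Date modified" against the RangeStart/RangeEnd parameters that incremental refresh requires, so only recently changed files get re-read. A minimal hedged sketch in Power Query M (the folder path is made up, and RangeStart/RangeEnd must already be defined as datetime parameters):

// Re-read only files whose modification date falls inside the refresh window
let
    Files = Folder.Files("C:\Data\SalesReports"),
    InWindow = Table.SelectRows(
        Files,
        each [Date modified] >= RangeStart and [Date modified] < RangeEnd
    )
in
    InWindow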
As far as I understand, incremental refresh is only available on Premium plans and in the cloud service, and we use Desktop.
We plan to switch to Report Server.
It's a very, very, very bad idea to put CALCULATE in a calculated column of a fact table. You'll feel the heat very soon: fact tables are usually very big in terms of the number of rows, and forcing the engine to perform millions (or many more) of context transitions is nothing more than asking for trouble with performance. And, of course, the use of CALCULATE is also the source of your trouble with cyclic dependencies: context transition makes each such column depend on every other column of the table, including the other calculated column, so the two columns end up depending on each other.
I really like these answers. But what's the solution?
Knowing what is good or bad is useful, but it doesn't solve the problem.
Please let me ask you this question: why don't you use the power of Power Query to calculate these columns? That is in fact the very place to calculate such things, especially on fact tables. The calculation will not only be faster (which is important when refreshing the model); the column will also be compressed in an optimal way, which means your DAX will be more performant.
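To make that concrete, here is a rough sketch in Power Query M, not a tested implementation: the query names Sales and Full, the three join keys, and the one-approval-date-per-match assumption are all inferred from the DAX above.

// Compute both columns at refresh time instead of in DAX
let
    // Bring Date_approval from Full onto each Sales row (left outer join on the assumed keys)
    Merged = Table.NestedJoin(
        Sales, {"Company", "Document", "Date_document"},
        Full, {"Company", "Document", "Date_document"},
        "FullRow", JoinKind.LeftOuter
    ),
    Expanded = Table.ExpandTableColumn(Merged, "FullRow", {"Date_approval"}),
    // Keep each amount only when the document date is on or after the approval date
    AddRub = Table.AddColumn(
        Expanded, "Sales in rubles",
        each if [Date_approval] <> null and [Date_document] >= [Date_approval]
            then [Sales_rub] else null,
        type number
    ),
    AddKg = Table.AddColumn(
        AddRub, "Sales in kilograms",
        each if [Date_approval] <> null and [Date_document] >= [Date_approval]
            then [Sales_kg] else null,
        type number
    )
in
    AddKg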
We have fact tables divided by year into separate tables (2020, 2021). Only the 2021 table receives new data.
In Power BI, we have a fact table built with the formula UNION(2020, 2021).
It's not clear how to use Power Query here, since the combined table doesn't exist in it; there are only the two per-year tables (2020, 2021). How do I calculate, for example, a column for a fact table that is not physically present in PQ?
I can see you are doing things the wrong way all along. If your model's tables are in Import mode (and I assume they are), then all of them come in through Power Query, and that is where all of your ETL should happen, not in DAX. UNION on two fact tables in DAX? A very bad idea. Please do all your data preparation in Power Query; this is the right and correct way. Power Query is a data mashup engine for a reason, while DAX is for Data Analysis (Data Analysis eXpressions). Please note that if you don't build your PBI correctly from the start, you'll have huge problems later down the line. Trust me: you don't want to do things incorrectly.
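For instance, the append could be one small extra query. A sketch, with the query names #"Sales 2020" and #"Sales 2021" assumed from the per-year split described above:

// Append the per-year queries in Power Query instead of a DAX UNION
let
    Combined = Table.Combine({#"Sales 2020", #"Sales 2021"})
in
    Combined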
OK, then how do you load data from 500 Excel files of the same format into Power Query? You would be refreshing the report for 3 hours every time a new file appears or a single file in the folder is updated.
When a new Excel file appears in the folder, only the data for 2021 (the current year) should be refreshed; the split into several tables by year was made precisely so that the old data is not re-read.
When analyzing the data, we work with a common table covering all periods. If you have a better solution, tell me how to do this task correctly.
I've created 500 Excel files in a folder on my PC, each containing a table with 20 columns (a mix of integers, floats and text) and 365 rows. I used the Folder connector in Power Query to load this data, automatically consolidate it into one table, and then load it into PBI Desktop. A full refresh of all the data takes no more than about 10 seconds.
So, the question is: how many rows do you have in each of the files that make you worried about the refresh time? I'm not sure how to do an incremental refresh when loading data from Excel files, or if it's possible at all; I'll have to find out. But if you don't have monstrous Excel files with tens of thousands of rows, then I can't see why you couldn't do what I've done...
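For reference, the consolidation I did amounts to something like this sketch (the folder path is made up, and taking each workbook's first sheet is an assumption; real files may need per-sheet handling):

// Consolidate every .xlsx in a folder into a single table
let
    Files = Folder.Files("C:\Data\SalesReports"),
    ExcelOnly = Table.SelectRows(Files, each Text.EndsWith([Extension], ".xlsx")),
    // Parse each workbook and take its first sheet or table
    Parsed = Table.AddColumn(
        ExcelOnly, "Data",
        each Excel.Workbook([Content], true){0}[Data]
    ),
    Combined = Table.Combine(Parsed[Data])
in
    Combined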
The fact is that we have 500 files, each with 10-15 thousand rows, and the refresh takes 3 hours. I say this not because I'm making it up, but because that's how it is in practice.
10 thousand rows is not a monstrous file; monstrous is 800 thousand and millions.
And some files are even spread across a few tabs, as people like to do. I'll look at your link about incremental refresh, but as far as I understand it requires a field with the date a row was added, and we don't have one.
More precisely, there is one, but it doesn't match the current date.
That's why we decided to do it the only way that seemed possible.
@ReyCarter, try new columns like:
Sales in rubles =
CALCULATE(
    SUM('Sales '[Sales_rub]),
    FILTER(
        'Full',
        'Full'[Company] = EARLIER('Full'[Company])
            && 'Full'[Document] = EARLIER('Sales '[Document])
            && 'Full'[Date_document] = EARLIER('Sales '[Date_document])
            && 'Full'[Date_document] >= EARLIER('Full'[Date_approval])
    )
)
Sales in kilograms =
CALCULATE(
    SUM('Sales '[Sales_kg]),
    FILTER(
        'Full',
        'Full'[Company] = EARLIER('Full'[Company])
            && 'Full'[Document] = EARLIER('Sales '[Document])
            && 'Full'[Date_document] = EARLIER('Sales '[Date_document])
            && 'Full'[Date_document] >= EARLIER('Full'[Date_approval])
    )
)
It does not work. The error is:
"A single value for column 'Company' in table 'Full' cannot be determined. This can happen when a measure formula refers to a column that contains many values without specifying an aggregation such as min, max, count, or sum to get a single result."