We are looking at a 3rd-party Data Lake where the reference tables are denormalized (i.e., multi-purpose). I'm trying to understand the best approach to handling these tables with respect to filtering in both direct query and import scenarios.
The Data Lake has tables such as:
[ProjectView] - essentially merged 1:1 metadata, one row per project, with project ID as the key
[ProjectValueListItems] - multi-value drop down lists and their respective values per project (if any).
[ProjectCustomFieldValues] - single-value custom fields and their assigned values per project (if any)
[ProjectValueListItems] holds fields like "client type" or "business sector" where one or more values might be chosen per project. So the table has many fields per project and many values per field, like:
ProjectID, ValueListName, ValueListValue
00001, ClientType, Private
00001, BusinessSector, Construction
00001, BusinessSector, Engineering
00001, ClientType, Municipal
00002, BusinessSector, Engineering
etc...
[ProjectCustomFieldValues] holds any custom date, text, or number fields linked to project records, along with their values (stored as text). So it has many field names per project and their respective values, like:
ProjectID, FieldName, FieldValue
00001, MarketingNeeded, Yes
00001, InterviewDate, 5/2/24
00002, MarketingNeeded, No
etc...
We obviously want to build reports where we can filter projects by specific multi-valued fields [e.g. show me projects where the BusinessSector field is in (Engineering, Construction)]. We'd also filter the single-valued field table [e.g. show me projects where InterviewDate > 1/1/24].
My question is on the page filtering: How do we best handle referencing these lookup tables?
We need to filter [ProjectValueListItems] multiple times, once per value list: e.g., where ProjectValueListItems.ValueListName = "ClientType" and ProjectValueListItems.ValueListValue is in the selected ClientType values, and then again where ValueListName = "BusinessSector" and the value is in the selected BusinessSector values.
If our report is import based, we have flexibility in transforming the data to duplicate or reference the [ProjectValueListItems] table multiple times, once per value list. For a "ClientType" field filter we could duplicate or reference the original table, create a new column that only holds the value where ValueListName = "ClientType", then delete the blanks, etc. That would give us an index of ClientType values per project. We would have to repeat the same process, generating one new table per multi-valued field. Is this the best way to handle import-based filtering in a schema like this, or am I missing something?
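For illustration, the per-list table I'm describing would be roughly equivalent to a calculated table like this (just a sketch using the sample names above; in practice we'd build it with Power Query steps):

```
-- Rough sketch of one per-value-list lookup table. "ClientType" is one
-- of the value lists from the sample data; one such table would be
-- generated per multi-valued field and related back to [ProjectView]
-- on ProjectID.
ClientTypeLookup =
SELECTCOLUMNS (
    FILTER (
        ProjectValueListItems,
        ProjectValueListItems[ValueListName] = "ClientType"
    ),
    "ProjectID", ProjectValueListItems[ProjectID],
    "ClientType", ProjectValueListItems[ValueListValue]
)
```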
For direct query, transform options are limited and there seems to be no option to reduce tables (or I'm missing something). We can't create the custom lookups in the same manner; is there any "trick" to using direct query with a schema like this, where you normally have to split out multiple conditions per reference table?
I hope this question is clear - thanks for your time!
Hi @tbakrf ,
According to your description, for import mode you can copy the [ProjectValueListItems] table once per ValueListName, then create a relationship between each of these new tables and the [ProjectView] table using ProjectID as the key. This lets you filter projects based on the values in those tables. In each new table you can also create calculated columns to simplify filtering; for example, in the ClientType table, create a column that flags the rows where ValueListValue matches the desired filter criteria.

For direct query mode, you can use parameterized queries to dynamically filter data based on user input, which lets you apply multiple conditions without copying the table. If your data source supports it, use a composite model to combine direct query and import modes; this allows you to import the smaller, frequently filtered tables while keeping the larger tables in direct query mode. You can also use DAX measures to create dynamic filters, for example a measure that checks whether a project meets multiple conditions, and then use that measure as a filter on your visuals.
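As a rough illustration of the measure idea (the measure name and the filter values are made up from the sample data in your post, and it assumes a relationship from [ProjectView] to [ProjectValueListItems] on ProjectID):

```
-- Hypothetical measure: returns 1 for projects that have a
-- BusinessSector row equal to Engineering or Construction in
-- ProjectValueListItems, evaluated per project via the relationship.
Has Selected Business Sector =
IF (
    CALCULATE (
        COUNTROWS ( ProjectValueListItems ),
        ProjectValueListItems[ValueListName] = "BusinessSector",
        ProjectValueListItems[ValueListValue]
            IN { "Engineering", "Construction" }
    ) > 0,
    1,
    0
)
```

You can then set a visual-level filter of "Has Selected Business Sector is 1" on a visual of projects, which works in both import and direct query without duplicating the table.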
Best regards,
Albert He
If this post helps, then please consider accepting it as the solution to help other members find it more quickly.
Thank you - this definitely sounds like the right answer - it's much easier with the "buzz words" out in the open.
I was going to hold off accepting it until I could try these scenarios, but I'll come back if I get stuck or need clarity. Much appreciated!