I am trying to create a table that shows the next 7 days (including today's date), and is conditionally formatted based on the second column's value.
The Excel report that feeds it comes from a daily downloaded file whose columns progressively change.
The error I receive when adding today's file (2/11) to the folder is that it is missing a column called 2/10 (yesterday's date).
Is there a way to have Power Query read the Excel file and simply return the headings from column 3 onward, regardless if they are titled the same?
Solved! Go to Solution.
If you still want to query just one file, but have it dynamically select the latest one (based on last-created or some other datetime metadata column), then this will work as you intend ONLY IF you inject some additional steps up front that read the files in the folder and then connect to the latest one.
Once you query for the latest file and download its binary (edit: by "download" I mean get it into Power Query, e.g. Web.Contents("<some url to a file online>")), the steps as provided should work. How you do that is specific to the data source: e.g., SharePoint will look a bit different from your desktop files and at the very least requires a different connector to query folder contents and get the xlsx binary.
If, however, you want to query a folder, combine all the xlsx files, and perform the above, then it's similar to what I described above (step 1: query the folder, step 2: get a single file binary, step 3: perform the transformation provided in the solution), except you need to do something a little different at step 2: 2a) get a table/list of file binaries, 2b) parse, normalize, and combine them into one table.
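Sketching the "latest file" route in M. This is a minimal illustration only: the folder path and sheet name are assumptions, and it assumes "Date modified" reliably identifies the newest daily download.

```
let
    // hypothetical local folder holding the daily downloads
    Folder = Folder.Files("C:\DailyDownloads"),
    KeepXlsx = Table.SelectRows(Folder, each [Extension] = ".xlsx"),
    // newest file first, based on file metadata
    Sorted = Table.Sort(KeepXlsx, {{"Date modified", Order.Descending}}),
    // step 2: get the single latest binary
    LatestBinary = Sorted{0}[Content],
    // step 3 starts here, same as the solution below
    Source = Excel.Workbook(LatestBinary, null, true),
    Sheet1_Sheet = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
    PromoteHeaders = Table.PromoteHeaders(Sheet1_Sheet, [PromoteAllScalars=true])
in
    PromoteHeaders
```

From PromoteHeaders onward you can continue with the dynamic-column steps as provided.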
Try something like this. I've included examples of typing dynamic columns, transforming all dynamic columns, and transforming specific dynamic columns:
let
Source = Excel.Workbook(File.Contents("C:\******\******\******\Book1.xlsx"), null, true),
Sheet1_Sheet = Source{[Item="Sheet1",Kind="Sheet"]}[Data],
PromoteHeaders = Table.PromoteHeaders(Sheet1_Sheet, [PromoteAllScalars=true]),
// specify columns that do not change from refresh to refresh
StableCols_Names = {"Material", "DOH"},
// and manually specify static columns' types
StableCols_Types = {type text, type number},
// grab dynamic column names
DynCols_Names = List.Buffer( List.RemoveMatchingItems(
Table.ColumnNames(PromoteHeaders),
StableCols_Names
) ),
// assuming these are all numbers - may need to update if different logic applies
// if needed, can implement a prefix on column names that you then use to set the type
DynCols_Types = List.Generate(
()=>0, each _< List.Count(DynCols_Names) ,
each _+1, each type number
),
// transform column type using static and dynamic names/types
DynFixTypes = Table.TransformColumnTypes(
PromoteHeaders,
List.Zip( {
StableCols_Names & DynCols_Names,
StableCols_Types & DynCols_Types
} )
),
// example: multiply all dynamic columns by 2 and switch out nulls for 0
DynUpdateAllDays = [
target_transform = { each ( _ * 2) ?? 0, type number },
transform_list = List.Transform( DynCols_Names, each {_} & target_transform ),
invoke_transform = Table.TransformColumns( DynFixTypes, transform_list )
] [invoke_transform],
// example: divide the first and last dynamic columns (days) by 100 and convert to %
DynUpdateSpecificDays = [
target_cols_i = {0,6}, // first and last days
target_cols = List.Transform( target_cols_i, each DynCols_Names{_} ),
target_transform = { each Percentage.From( _ / 100 ), Percentage.Type },
transform_list = List.Transform( target_cols, each {_} & target_transform ),
invoke_transform = Table.TransformColumns( DynFixTypes, transform_list )
] [invoke_transform]
in
DynUpdateSpecificDays
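For completeness, here's a minimal sketch of the folder-combine variant (steps 2a/2b I mentioned earlier). The folder path and sheet name are assumptions for illustration. Table.Combine matches columns by name and fills nulls where a file lacks a given day's column, which the dynamic typing steps above can then pick up:

```
let
    // hypothetical local folder holding the daily downloads
    Folder = Folder.Files("C:\DailyDownloads"),
    KeepXlsx = Table.SelectRows(Folder, each [Extension] = ".xlsx"),
    // 2a) parse each workbook's Sheet1 into a table with promoted headers
    Parsed = List.Transform(
        KeepXlsx[Content],
        each Table.PromoteHeaders(
            Excel.Workbook(_, null, true){[Item="Sheet1",Kind="Sheet"]}[Data],
            [PromoteAllScalars=true]
        )
    ),
    // 2b) combine into one table; missing columns become null
    Combined = Table.Combine(Parsed)
in
    Combined
```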
Thanks @MarkLaf - I will try this.
One clarification: if I have the data source set to a folder instead of a specific sheet, with the intention of dropping a daily download into the folder and having Power BI extract the data, will the query steps above still work?
Hi @BakerE
Following up to confirm if the earlier responses addressed your query. If not, please share your questions and we’ll assist further.