Hi,
I have the following array loaded in Power BI
[{"hours":[{"startTime":"13:00:00","endTime":"17:00:00","weekdays":["MON"]},{"startTime":"10:00:00","endTime":"17:00:00","weekdays":["TUE"]},{"startTime":"10:00:00","endTime":"17:00:00","weekdays":["WED"]},{"startTime":"10:00:00","endTime":"17:00:00","weekdays":["THU"]},{"startTime":"10:00:00","endTime":"21:00:00","weekdays":["FRI"]},{"startTime":"10:00:00","endTime":"17:00:00","weekdays":["SAT"]},{"startTime":"12:00:00","endTime":"17:00:00","weekdays":["SUN"]}]}]z
Power BI recognizes this as a string within the table, therefore I cannot expand it.
What is the best way to expand this array into multiple columns inside the table?
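A quick check outside Power BI (sketched here in Python, with a trimmed one-day sample of the value) confirms that the string is valid JSON apart from the stray trailing "z" - which is why a parser rejects it as-is:

```python
import json

# Trimmed sample of the posted value; note the stray trailing "z".
raw = '[{"hours":[{"startTime":"13:00:00","endTime":"17:00:00","weekdays":["MON"]}]}]z'

try:
    json.loads(raw)                 # fails: the trailing "z" is not valid JSON
    data = None
except json.JSONDecodeError:
    data = json.loads(raw[:-1])     # parses fine once the last character is dropped
```

So the underlying data is well-formed JSON; only the trailing character blocks parsing.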
Hey @Anonymous ,
Similar to what @Pragati11 mentioned - it is difficult to really assess your situation without some more details on how this data is structured and being loaded.
That being said, if this is just coming through as a single cell/array and the data you provided is static, this might be a workable solution for you.
So right now we are starting with this data
We can split the data at each day using the "}," delimiter in Power Query
Add each column as a unique query using "Add as New Query"
Here is an example using column 2
Convert to a table with comma as the delimiter
We can then do a few custom columns
Start Time
Text.AfterDelimiter([Column1], ":")
End Time
Text.AfterDelimiter([Column2], ":")
Weekday
Text.AfterDelimiter([Column3], ":")
Then we can remove the old columns
And remove the brackets from the Weekday column. You will now have this data left
You can then take all of the steps from the Advanced Editor and copy and paste them for the other columns from your original table, saving you some work
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wiq6OUcrILy0qjlGyArGLSxKLSkIyc1OB/BglQ2MrAwMgilHSiVFKzUtBSJgjSZSnpmanJFaCjYhR8vX3i1GKrdXBNMyAdMNCQl2pZ1i4qwsVXeYRSqJhRoY4DXML8qSey4IdQ3AYZkSGYaHg2ATCKqXYWAA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Test_Data = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Test_Data", type text}}),
#"Split Column by Delimiter" = Table.SplitColumn(#"Changed Type", "Test_Data", Splitter.SplitTextByDelimiter("},", QuoteStyle.Csv), {"Test_Data.1", "Test_Data.2", "Test_Data.3", "Test_Data.4", "Test_Data.5", "Test_Data.6", "Test_Data.7"}),
#"Test_Data 3" = #"Split Column by Delimiter"[Test_Data.2],
#"Converted to Table" = Table.FromList(#"Test_Data 3", Splitter.SplitTextByDelimiter(","), null, null, ExtraValues.Error),
#"Changed Type1" = Table.TransformColumnTypes(#"Converted to Table",{{"Column1", type text}, {"Column2", type text}, {"Column3", type text}}),
#"Added Custom" = Table.AddColumn(#"Changed Type1", "StartTime", each Text.AfterDelimiter([Column1], ":")),
#"Added Custom1" = Table.AddColumn(#"Added Custom", "EndTime", each Text.AfterDelimiter([Column2], ":")),
#"Added Custom2" = Table.AddColumn(#"Added Custom1", "Weekday", each Text.AfterDelimiter([Column3], ":")),
#"Removed Columns" = Table.RemoveColumns(#"Added Custom2",{"Column1", "Column2", "Column3"}),
#"Replaced Value" = Table.ReplaceValue(#"Removed Columns","[","",Replacer.ReplaceText,{"Weekday"}),
#"Replaced Value1" = Table.ReplaceValue(#"Replaced Value","]","",Replacer.ReplaceText,{"Weekday"})
in
#"Replaced Value1"
Once you've done that with each weekday, go ahead and append your data and it will look like this (I only did Tuesday and Wednesday)
I realize this is a clunky solution, but let us know if it works. If not, we can keep noodling away at it!
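As a sanity check of the split-and-extract logic in the query above, here is a rough Python analogue (not Power Query - just the same string operations, using a trimmed two-day sample of the posted data):

```python
# Trimmed two-day sample of the posted value (no trailing "z" here, for simplicity).
sample = ('[{"hours":[{"startTime":"13:00:00","endTime":"17:00:00","weekdays":["MON"]},'
          '{"startTime":"10:00:00","endTime":"17:00:00","weekdays":["TUE"]}]}]')

# Split at each day, like Splitter.SplitTextByDelimiter("},") in the query.
days = sample.split("},")

# Take the second day (Test_Data.2 in the query) and split it on commas.
col1, col2, col3 = days[1].split(",")

# Text.AfterDelimiter(..., ":") keeps the text after the FIRST colon; then
# strip the leftover quotes and brackets, like the Replaced Value steps.
start = col1.split(":", 1)[1].strip('"')
end   = col2.split(":", 1)[1].strip('"')
day   = col3.split(":", 1)[1].strip('[]"}')
```

This yields `10:00:00`, `17:00:00`, and `TUE` for the second day, matching what the Power Query steps produce for one appended column.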
Hi all,
Sorry for my late response.
Thank you very much for all your help guys. Both solutions are working fine for me! 🙂
Hi @Anonymous ,
Try this custom column:
let
Source = Table.FromRows(Json.Document(Binary.Decompress(Binary.FromText("i45Wiq6OUcrILy0qjlGyArGLSxKLSkIyc1OB/BglQ2MrAwMgilHSiVFKzUtBSJgjSZSnpmanJFaCjYhR8vX3i1GKrdXBNMyAdMNCQl2pZ1i4qwsVXeYRSqJhRoY4DXML8qSey4IdQ3AYZkSGYaHg2ATCKqXYWAA=", BinaryEncoding.Base64), Compression.Deflate)), let _t = ((type nullable text) meta [Serialized.Text = true]) in type table [Column1 = _t]),
#"Changed Type" = Table.TransformColumnTypes(Source,{{"Column1", type text}}),
#"Added Custom" = Table.AddColumn(#"Changed Type", "Custom", each Table.FromRecords(Json.Document(Text.RemoveRange([Column1], Text.Length([Column1]) - 1)){0}[hours])),
#"Expanded Custom" = Table.ExpandTableColumn(#"Added Custom", "Custom", {"startTime", "endTime", "weekdays"}, {"startTime", "endTime", "weekdays"}),
#"Expanded weekdays" = Table.ExpandListColumn(#"Expanded Custom", "weekdays")
in
#"Expanded weekdays"
Hi @Anonymous ,
A quick question here: how are you importing this array into Power BI? Is the whole thing in a text file that you are then bringing into Power BI, or some other file format?
Thanks,
Pragati
Hi @Pragati11
Thank you for your reply.
I am using an ODBC connection to import this data into Power BI.
Hi @Anonymous ,
ODBC connection to a database?
I am interested in knowing if this whole value is stored in a single column?
Thanks,
Pragati
It's not really a connection to a database; this situation is a little bit more complex.
The original data is stored in a NoSQL database in AWS. We use "Glue" from AWS to put this data into "Athena". After this, I can import the data with a Simba Athena ODBC connection in Power BI.