Hello,
I am reaching out to discuss a challenge I've encountered while working with binary data from an Oracle Database, following up on a question I posted last week [link to the post: https://community.fabric.microsoft.com/t5/Desktop/Accessing-Binary-Columns-from-Oracle-Database/m-p/...].
I've successfully accessed the binary data in my table, and it is now formatted as follows:
| SERIAL_NUM | TEST_DATE | TEST_TIME | DATA_POINTS | All |
|---|---|---|---|---|
| 55559 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55573 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55569 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55563 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55520 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55562 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55464 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55505 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55585 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
| 55554 | 08-19-2021 | 10:42:27 | 1087 | [Table] |
In the "ALL" column, each table contains data with around 1000 columns. For example, the table associated with serial number 55559 looks like this:
| Data 1 | Data 2 | Seconds |
|---|---|---|
| 752 | 784 | 0 |
| 263 | 4 | 1 |
| 343 | 462 | 2 |
| 498 | 512 | 3 |
| 367 | 93 | 4 |
| 645 | 137 | 5 |
| 154 | 980 | 6 |
| …. | …. | …. |
| 906 | 83 | 1078 |
| 433 | 260 | 1079 |
| 338 | 911 | 1080 |
| 582 | 401 | 1081 |
| 367 | 204 | 1082 |
| 114 | 383 | 1083 |
| 853 | 349 | 1084 |
| 657 | 350 | 1085 |
| 738 | 24 | 1086 |
| 890 | 667 | 1087 |
And this is what I have in Power Query when I filter by one serial number to see the data (DATA_POINTS represents the number of rows inside each nested table).
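That single-serial view corresponds roughly to the following M expression (a sketch for illustration only; Source stands in for the step that produces the table above, and 55559 is just the example serial):

// Hypothetical: inspect the nested table for one serial number
PeekOne = Table.SelectRows(Source, each [SERIAL_NUM] = 55559){0}[All]
// returns the nested ~1,087-row table for serial 55559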
With around 3 million unique serial numbers, each expanding to around 1,000 rows, the resulting dataset is on the order of 3 billion rows. I've attempted to expand the table column in Power Query, but the process takes a very long time.
My primary goal is to optimize this expansion process, as I need the data in the "Data 1" and "Data 2" columns of each nested table in "ALL". This data will be used for plotting a line chart and performing some simple calculations.
However, when attempting to expand the table, Power Query remains stuck at the loading message for over 3 hours, with no indication of progress or remaining rows.
I would appreciate any insights you may have regarding this issue. Thank you.
@lbendlin At what step, please? It won't work on the source; it gives an Expression.Error: We cannot convert a value of type Table to type Binary.
Then use Table.Buffer instead.
How much memory does your PC have? Try using Binary.Buffer or Table.Buffer.
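Applied to the query shown further down, that buffering advice would look roughly like this (a sketch; the step and column names are taken from the poster's query below, and this is not a guaranteed fix):

// Read each binary blob into memory once so the decoding steps below
// don't re-stream the data from Oracle on every access
#"Buffered Arrays" = Table.TransformColumns(
    #"Changed Type",
    {
        {"DATA_ARRAY1", Binary.Buffer},
        {"DATA_ARRAY2", Binary.Buffer}
    }
)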
Hi @lbendlin, here are my device specs:
I am using Table.Buffer and List.Buffer, but it's downloading a billion rows in 1-row increments. I am using the following code now:
let
Source = ...,
GSDATA = ...,
....= ...,
#"Removed Other Columns" = Table.SelectColumns(ECHEM_DTL,{"SERIAL_NUM", "TEST_DATE", "TEST_TIME", "CFG_FILE", "DATA_ARRAY1", "DATA_ARRAY2", "DATA_POINTS"}),
#"Changed Type" = Table.TransformColumnTypes(#"Removed Other Columns",{{"TEST_DATE", type date}, {"TEST_TIME", type time}}),
#"Added Custom3" = Table.AddColumn(
#"Removed Other Columns",
"ALL",
each Table.Buffer(Table.AddIndexColumn(
Table.Combine(
List.Transform(
List.Zip(
{
List.Buffer(List.FirstN(
List.Transform(
Binary.Split([DATA_ARRAY1], 4),
each BinaryFormat.ByteOrder(
BinaryFormat.Record([val = BinaryFormat.SignedInteger32]),
ByteOrder.LittleEndian
)(_)[val]
),
[DATA_POINTS]
)),
List.Buffer(List.FirstN(
List.Transform(
Binary.Split([DATA_ARRAY2], 4),
each BinaryFormat.ByteOrder(
BinaryFormat.Record([val = BinaryFormat.SignedInteger32]),
ByteOrder.LittleEndian
)(_)[val]
),
[DATA_POINTS]
))
}
),
(x) => Table.FromRows({x}, {"DATA1", "DATA2"})
)
),
"Secondes",
1,
1
)
)),
#"Expanded ALL" = Table.ExpandTableColumn(
#"Added Custom3",
"ALL",
{"DATA1", "DATA2", "Secondes"},
{"DATA1", "DATA2", "Secondes"}
),
#"Removed Columns" = Table.RemoveColumns(#"Expanded ALL",{"CFG_FILE", "DATA_ARRAY1", "DATA_ARRAY2","DATA_POINTS"})//Here I am removing the binary columns
in
#"Removed Columns"use a Binary.Buffer on the source.
@lbendlin Still very slow. Do you think it would be possible to achieve the result by passing the serial number as a parameter from the report to Power Query? I tried this, but DirectQuery won't allow me to do the transformation on the binary data. I think expanding the tables only for the serial numbers the users choose could eliminate the need to deal with billions of rows.
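If a single serial number (or a short list) can be supplied, filtering right after the source step keeps everything small; a sketch, where SelectedSerial stands in for a hypothetical report-driven parameter and GSDATA for the Oracle table step named in the query above:

SelectedSerial = 55559, // hypothetical value supplied by a parameter
#"Filtered Serial" = Table.SelectRows(GSDATA, each [SERIAL_NUM] = SelectedSerial)
// Filtering directly after the source step gives the Oracle connector a chance
// to fold the predicate into the query, so only the matching rows are
// downloaded and the binary decode runs on ~1,000 rows instead of billions.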
Please provide a meaningful amount of sample data that illustrates the issue.