Anonymous
Not applicable

Table Expansion Optimization in Power Query

Hello,

I am reaching out to discuss a challenge I've encountered while working with binary data from an Oracle Database, following up on a question I posted last week [link to the post: https://community.fabric.microsoft.com/t5/Desktop/Accessing-Binary-Columns-from-Oracle-Database/m-p/...].

I've successfully accessed the binary data in my table, and it is now formatted as follows:

SERIAL_NUM   TEST_DATE    TEST_TIME   DATA_POINTS   All
55559        08-19-2021   10:42:27    1087          [Table]
55573        08-19-2021   10:42:27    1087          [Table]
55569        08-19-2021   10:42:27    1087          [Table]
55563        08-19-2021   10:42:27    1087          [Table]
55520        08-19-2021   10:42:27    1087          [Table]
55562        08-19-2021   10:42:27    1087          [Table]
55464        08-19-2021   10:42:27    1087          [Table]
55505        08-19-2021   10:42:27    1087          [Table]
55585        08-19-2021   10:42:27    1087          [Table]
55554        08-19-2021   10:42:27    1087          [Table]


In the "ALL" column, each table contains data with around 1000 columns. For example, the table associated with serial number 55559 looks like this:

Data 1 / Data 2   Seconds
752784            0
2634              1
343462            2
498512            3
36793             4
645137            5
154980            6
…                 …
90683             1078
433260            1079
338911            1080
582401            1081
367204            1082
114383            1083
853349            1084
657350            1085
73824             1086
890667            1087


And this is what I have in Power Query when I filter on a single serial number (DATA_POINTS is the number of rows inside each nested table):

[screenshot: sarah931_0-1707095252017.png]

With around 3 million unique serial numbers and each serial number expanding to around 1,000 rows, the result is roughly 3 billion rows. I've attempted to expand the table in Power Query, but the process takes a very long time.

My primary goal is to optimize this expansion, as I need the "Data 1" and "Data 2" values inside the nested tables in the "ALL" column. The data will be used for plotting a line chart and for some simple calculations.

However, when attempting to expand the table, it remains stuck on this message for over 3 hours, with no indication of progress or rows remaining:

[screenshot: sarah931_1-1707094010577.png]

I would appreciate any insights you may have regarding this issue. Thank you.

@lbendlin 
@Ahmedx 

 



8 REPLIES
Anonymous
Not applicable

@lbendlin It didn't change the speed much; it's still very slow.

Anonymous
Not applicable

@lbendlin At what step, please? It won't work on the source; it gives an Expression.Error: We cannot convert a value of type Table to type Binary.

lbendlin
Super User

Then use Table.Buffer instead.

lbendlin
Super User

How much memory does your PC have? Try using Binary.Buffer or Table.Buffer.
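
For reference, a minimal sketch of where each buffer could go, assuming the step and column names from the query shown below. Binary.Buffer expects a binary value (hence the "cannot convert a value of type Table to type Binary" error when it is applied to a whole table); it belongs on the blob columns, while Table.Buffer is the table-level counterpart:

    // Sketch only: step and column names are assumed from the query below.
    // Option 1: pin the trimmed table in memory before the expensive custom column.
    Buffered = Table.Buffer(#"Removed Other Columns"),
    // Option 2: pin each blob per row so Binary.Split reads from memory
    // rather than re-requesting the source stream.
    #"Split Blob" = Table.AddColumn(
        Buffered,
        "Chunks",
        each Binary.Split(Binary.Buffer([DATA_ARRAY1]), 4)
    )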

 

Anonymous
Not applicable

Hi @lbendlin, here are my device specs:

[screenshot: sarah931_0-1707143320061.png]


I am using Table.Buffer and List.Buffer, but it's downloading billions of rows in 1-row increments. I am using the following code now:

let
    Source = ...,
    GSDATA = ...,
    ....= ...,
    // Keep only the columns needed downstream
    #"Removed Other Columns" = Table.SelectColumns(ECHEM_DTL, {"SERIAL_NUM", "TEST_DATE", "TEST_TIME", "CFG_FILE", "DATA_ARRAY1", "DATA_ARRAY2", "DATA_POINTS"}),
    #"Changed Type" = Table.TransformColumnTypes(#"Removed Other Columns", {{"TEST_DATE", type date}, {"TEST_TIME", type time}}),
    // Decode both blobs as little-endian 32-bit integers, pair them up,
    // and index the pairs to produce the Seconds column
    #"Added Custom3" = Table.AddColumn(
        #"Changed Type",
        "ALL",
        each Table.Buffer(
            Table.AddIndexColumn(
                Table.Combine(
                    List.Transform(
                        List.Zip(
                            {
                                List.Buffer(List.FirstN(
                                    List.Transform(
                                        Binary.Split([DATA_ARRAY1], 4),
                                        each BinaryFormat.ByteOrder(
                                            BinaryFormat.Record([val = BinaryFormat.SignedInteger32]),
                                            ByteOrder.LittleEndian
                                        )(_)[val]
                                    ),
                                    [DATA_POINTS]
                                )),
                                List.Buffer(List.FirstN(
                                    List.Transform(
                                        Binary.Split([DATA_ARRAY2], 4),
                                        each BinaryFormat.ByteOrder(
                                            BinaryFormat.Record([val = BinaryFormat.SignedInteger32]),
                                            ByteOrder.LittleEndian
                                        )(_)[val]
                                    ),
                                    [DATA_POINTS]
                                ))
                            }
                        ),
                        (x) => Table.FromRows({x}, {"DATA1", "DATA2"})
                    )
                ),
                "Seconds",
                1,
                1
            )
        )
    ),
    #"Expanded ALL" = Table.ExpandTableColumn(
        #"Added Custom3",
        "ALL",
        {"DATA1", "DATA2", "Seconds"},
        {"DATA1", "DATA2", "Seconds"}
    ),
    // Remove the binary columns once they have been decoded
    #"Removed Columns" = Table.RemoveColumns(#"Expanded ALL", {"CFG_FILE", "DATA_ARRAY1", "DATA_ARRAY2", "DATA_POINTS"})
in
    #"Removed Columns"




lbendlin
Super User

Use a Binary.Buffer on the source.

Anonymous
Not applicable

@lbendlin Still very slow. Do you think it would be possible to achieve the result by passing the serial number as a parameter from the report to Power Query? I tried this, but DirectQuery won't let me do the transformation on the binary data. I think expanding the tables only for the serial numbers the users choose could eliminate the need to deal with billions of rows.
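
A rough sketch of that idea in Import mode, assuming a query parameter named SelectedSerial (a hypothetical name): filtering SERIAL_NUM before the decode step means only the matching rows are downloaded and expanded, provided the filter folds back to Oracle:

    let
        Source = ...,  // same elided Oracle source as in the query above
        // SelectedSerial is a hypothetical parameter bound to the report;
        // filtering early gives the predicate a chance to fold into the
        // Oracle query before any binary decoding happens.
        Filtered = Table.SelectRows(ECHEM_DTL, each [SERIAL_NUM] = SelectedSerial)
        // ...then decode and expand only the filtered rows, as above...
    in
        Filtered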

lbendlin
Super User

Please provide a meaningful amount of sample data that illustrates the issue.
