omarecd
Advocate I

Basic manipulation of data from events

Hello, hello.
I am receiving data from Azure Event Hubs here in Fabric, in an Eventstream.

 

a. The data that I receive has a lot of columns that I don't need. This step is simple, because using a filter I can just remove the columns that I don't need.


 

b. The challenge that I have comes after. As is the case with messages from Event Hubs, all my data is encoded using base64, so I basically need to add a new column whose value is the existing column decoded from base64. In KQL, can I decode from base64?

 

What would be the simplest way to do it? Also important: I would like this new column to be populated automatically each time a new message arrives...

Thanks for your kind feedback.

1 ACCEPTED SOLUTION
omarecd
Advocate I

Hello @Anonymous.

I was checking, and actually there is a nice way to have these automatic updates, without going to external services or adding extra complexity.
In KQL, we just need to use an "Update Policy" and that is all! 👏

 

Have a look here below:
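A minimal sketch of such an update policy, assuming a source table RawEvents with a base64-encoded column EncodedData and a target table DecodedEvents (illustrative names, not from the original post):

// 1. Target table for the decoded rows
.create table DecodedEvents (DecodedText: string, value: string, at: datetime)

// 2. Function that reshapes a raw row into the decoded form
.create-or-alter function DecodeEvents() {
    RawEvents
    | extend DecodedText = base64_decode_tostring(EncodedData)
    | extend Parsed = parse_json(DecodedText)
    | project DecodedText, value = tostring(Parsed.value), at = todatetime(Parsed.at)
}

// 3. Attach the update policy: the function runs automatically on each ingestion into RawEvents
.alter table DecodedEvents policy update
@'[{"IsEnabled": true, "Source": "RawEvents", "Query": "DecodeEvents()", "IsTransactional": false}]'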

 

Omar C.


3 REPLIES

omarecd
Advocate I

Hello @Anonymous, thanks a lot for your great and valuable feedback.

 

I am quite new to the community, and now I can see that it is a great thing! 👏
Yes! The decoding from base64 is working very well, thanks for the tip.

 

When I do the decoding, I always get a JSON payload that has two fields, value and at. With KQL it is also possible to retrieve these two, which is great.

 

Using an extend I can add the new fields as new columns, which is fantastic!

 

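A minimal sketch of that query, assuming the raw table is RawEvents and the base64 column is EncodedData (illustrative names):

RawEvents
| extend DecodedText = base64_decode_tostring(EncodedData)
| extend Parsed = parse_json(DecodedText)
| extend value = tostring(Parsed.value), at = todatetime(Parsed.at)
| project-away Parsed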

 

The next step would be to save this somewhere (I imagine in a new table with the format that I want), and then pass it to Power BI, where it would finally be displayed.

 

I could do that now by running the KQL query, and it would be 'ok'; however, my question remains the same: how could I keep doing this transformation continuously? Or should I do it manually each time? I am only using Fabric, so I have no access to Azure tools. I have been learning a lot about this whole data engineering process, and I am also curious to know whether other people have the same challenge and how they solve it... For the manual version, see the sketch just below.
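As a sketch of that manual run, a .set-or-append command can materialize the query results into a table (same illustrative names as above):

// One-off materialization: appends the transformed rows (creates the table if it does not exist)
.set-or-append DecodedEvents <|
    RawEvents
    | extend DecodedText = base64_decode_tostring(EncodedData)
    | extend Parsed = parse_json(DecodedText)
    | project DecodedText, value = tostring(Parsed.value), at = todatetime(Parsed.at)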

 

Let's keep the case open a little bit longer; maybe other interested people can also have a look.

 

In any case, I will report any progress! 👏

 

Greetings from Belgium 🇧🇪 

Omar C.

Anonymous
Not applicable

Hi @omarecd,

In KQL, can I decode from base64?

Yes, you can use the base64_decode_tostring() function to decode Base64 data straight to text. For example:

// Sample data
let eventData = datatable (EncodedData: string)
[
    "U29tZSBleGFtcGxlIGJhc2U2NCBlbmNvZGVkIHN0cmluZw==",
    "QW5vdGhlciBlbmNvZGVkIGV4YW1wbGU="
];

// Decoding Base64: base64_decode_tostring() returns the decoded UTF-8 text directly
eventData
| extend DecodedText = base64_decode_tostring(EncodedData)

(The query returns "Some example base64 encoded string" and "Another encoded example" in the DecodedText column.)


And about:

I would like this new column to be populated automatically each time that a new message arrives...

There is probably no way to accomplish this using only the KQL database. Perhaps you can try using Azure Data Explorer (ADX) or Azure Functions to achieve your desired results.

You can set up a continuous export or a scheduled job in Azure Data Explorer (ADX) to process the data stream.

Here's an example of a continuous export in ADX:

// Continuous export writes query results to an external table on a schedule
.create-or-alter continuous-export MyContinuousExport
to table DecodedEventDataExternal
with (intervalBetweenRuns=5m)
<| myEventStream
| extend DecodedText = base64_decode_tostring(EncodedData)

However, whether you choose Azure Data Explorer (ADX) or Azure Functions, it is beyond the scope of the technical support this forum can provide, and the suggestions I can offer are very limited. I suggest you ask in the related forums specifically whether you can use these tools to automatically fill in the columns. Thank you!

Best Regards,
Dino Tao
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.
