jaryszek
Memorable Member

Write-back to a Power BI dataset on Lakehouse

Hi guys,

Is it possible to have a write-back feature within a Power BI report based on Direct Lake over OneLake?

Or does it have to go through the SQL endpoint?

Can anybody help and point me to some docs about it?

Best,
Jacek

1 ACCEPTED SOLUTION

Hi @jaryszek,

The usual workaround is to write the data to the underlying OneLake/Lakehouse instead of to the model itself. For example, users can upload Excel or CSV files, and you can then use Fabric Dataflows, Notebooks (Spark/Python/SQL), or Translytical Task Flows to ingest that data into Lakehouse Delta tables. The Direct Lake semantic model then reads the updated data after a refresh.
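To make that ingest step concrete, here is a minimal local sketch using only the Python standard library as a stand-in. The file contents, table, and column names are illustrative assumptions; in a real Fabric notebook you would read the uploaded file with Spark and append to a Delta table instead (the equivalent Spark calls are shown in the docstring).

```python
import csv
import io

# Expected schema of the Lakehouse target table (illustrative assumption).
EXPECTED_COLUMNS = ["product_id", "quantity", "price"]

def ingest_csv(uploaded_csv_text, target_rows):
    """Validate an uploaded CSV against the target schema and append its
    rows to the target table (here just a list of dicts as a stand-in).

    In a Fabric notebook this step would look roughly like:
        df = spark.read.option("header", True).csv("Files/uploads/data.csv")
        df.write.format("delta").mode("append").saveAsTable("sales_writeback")
    """
    reader = csv.DictReader(io.StringIO(uploaded_csv_text))
    if reader.fieldnames != EXPECTED_COLUMNS:
        raise ValueError(f"unexpected columns: {reader.fieldnames}")
    appended = 0
    for row in reader:
        target_rows.append(row)
        appended += 1
    return appended

upload = "product_id,quantity,price\nA1,3,9.99\nB2,1,4.50\n"
table = []                      # stand-in for the Delta table
n = ingest_csv(upload, table)   # n == 2 rows appended
```

After the append, a Direct Lake model pointed at the Delta table picks up the new rows on the next refresh/reframe, which is the key difference from classic Import-mode write-back.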

 

Thanks,

Prashanth


5 REPLIES
amitchandak
Super User

@jaryszek, using Translytical Task Flows, you can. I have tried it with a Warehouse and a SQL DB, but you can also write to Lakehouse files.

Power BI Data Write-back Feature via Translytical Task Flows: https://youtu.be/2giHs13KUDI


thank you

rohit1991
Super User

Hi @jaryszek,

 

Direct Lake and Lakehouse datasets do not support write-back directly from a Power BI report. Direct Lake is read-only, so you cannot update tables in OneLake from visuals or DAX. Write-back is only possible through an external endpoint, such as SQL, an API, Power Apps, or a custom web service, that writes the data back to the Lakehouse/SQL. Power BI itself cannot write data back to a Direct Lake model today.
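As a concrete illustration of that external-endpoint pattern, here is a small sketch using sqlite3 as a local stand-in for the writable SQL target (in Fabric that would be a Warehouse or SQL database, reached from a Power Apps visual, an API, or a translytical task flow; the table and column names here are made up):

```python
import sqlite3

# Local stand-in for the writable SQL target (a Fabric Warehouse / SQL DB).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE forecast_adjustments (region TEXT, adjustment REAL)")

def write_back(conn, region, adjustment):
    """What the external endpoint does on behalf of the report user:
    insert the edited value into a table the semantic model later reads."""
    conn.execute(
        "INSERT INTO forecast_adjustments (region, adjustment) VALUES (?, ?)",
        (region, adjustment),
    )
    conn.commit()

write_back(conn, "EMEA", 1.05)
rows = conn.execute(
    "SELECT region, adjustment FROM forecast_adjustments"
).fetchall()
# rows == [('EMEA', 1.05)]
```

The point of the design is that Power BI never writes to the model: the report only triggers the endpoint, the endpoint writes to a table, and the model picks the change up on its next refresh.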



Thank you,

so what could be a workaround for writing back from an Excel/CSV file into a Power BI semantic model on Direct Lake over OneLake?

Best,
Jacek

