mhaupt
Frequent Visitor

How to use polars write_parquet with Lakehouse

Hi there,

 

Hoping someone can help. I want to write a parquet file to a Lakehouse, but can't see how to include storage options (access token, use_fabric_endpoint). Polars does this as part of its "write_delta" method, e.g.

 

polardataframe.write_delta(target=url, mode=mode, storage_options=storage_options, delta_write_options=delta_write_options)

 

, but "polardataframe.write_parquet" doesn't have the 'storage_options'  field.

 

Note my motivation for using parquet rather than Delta tables is so I can implement partitions for faster querying. If some kind person could tell me how to write Delta tables with partitions, that would also solve my problem. I did try adding "partition_by" as a delta_write_options entry, but Fabric would only accept a blank list.

 

Some forums suggest using "df.to_pandas().to_parquet(filepath, storage_options={...}) ", but I don't know what to put in the filepath or storage options.
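From other posts it looks like it would be something like the below, but I haven't been able to verify it (paths, the partition column, and the assumption that this runs inside a Fabric notebook are all placeholders):

# Option 1: with a default Lakehouse attached to the notebook, the Files area is
# mounted locally, so no storage_options are needed at all
polardataframe.to_pandas().to_parquet(
    "/lakehouse/default/Files/sales",   # a directory, because partition_cols splits into subfolders
    partition_cols=["year"],            # hypothetical partition column
)

# Option 2: go through the OneLake abfss endpoint; storage_options are passed to fsspec/adlfs
filepath = "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/<lakehouse>.Lakehouse/Files/sales/sales.parquet"
storage_options = {
    "account_name": "onelake",
    "account_host": "onelake.dfs.fabric.microsoft.com",  # point adlfs at OneLake instead of regular ADLS
}
polardataframe.to_pandas().to_parquet(filepath, storage_options=storage_options)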

 

Others suggest "... stop using the storage_options parameter and just use the abfs open handler for both reading and writing", which sounds like a solution, but I've never (consciously) used abfs before and wouldn't know where to begin.
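If I understand that suggestion, it means fsspec's abfs filesystem, and it might look something like this (untested on my side, reusing the placeholder path and options from above); it would also avoid the pandas round trip:

import fsspec

# fsspec resolves abfss:// to the adlfs filesystem and handles auth/transport;
# polars only needs a writable file-like object
with fsspec.open(filepath, mode="wb", **storage_options) as f:
    polardataframe.write_parquet(f)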

 

1 ACCEPTED SOLUTION
mhaupt
Frequent Visitor

Actually, I managed to apply partitions using write_delta. The issue was that I had been trying to apply partitions to an existing table rather than a new one. D'oh! 😛
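For anyone who finds this later, it looks roughly like this (a sketch; your mode, partition columns, and storage_options will differ):

# Delta partitioning is fixed when the table is created, so partition_by only
# takes effect on a brand-new table (or when the table is recreated/overwritten)
polardataframe.write_delta(
    target=url,
    mode="error",  # the default: fail rather than touch an existing table
    storage_options=storage_options,
    delta_write_options={"partition_by": ["year", "month"]},
)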


4 REPLIES

Anonymous
Not applicable

Hi @mhaupt ,

Glad to know that your query got resolved. Please continue using Fabric Community for your further queries.

Anonymous
Not applicable

Hi @mhaupt ,

Thanks for using Fabric Community.
As I understand it, you are trying to write a partitioned parquet file to a Microsoft Fabric Lakehouse.

Can you please check this doc - Microsoft Fabric: using Notebooks and Table Partitioning to Convert Files to Tables?
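The gist of that article is something like the below sketch, run in a Fabric notebook with a default Lakehouse attached (the file path, partition column, and table name are just examples):

# read raw parquet files from the Lakehouse Files area and save them as a
# managed Delta table partitioned by a column, so queries can prune partitions
df = spark.read.parquet("Files/sales/*.parquet")
df.write.mode("overwrite").format("delta").partitionBy("year").saveAsTable("sales_partitioned")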



Hope this gives you some ideas for your query. Do let me know in case of further queries.

ta!
