mhaupt
Frequent Visitor

How to use polars write_parquet with Lakehouse

Hi there,

 

Hoping someone can help.  I want to write a parquet file to a Lakehouse, but can't see how to include storage options (access token, use_fabric_endpoint).  Polars supports these as part of its "write_delta" call, e.g.

 

polardataframe.write_delta(target=url, mode=mode, storage_options=storage_options, delta_write_options=delta_write_options)

 

However, "polardataframe.write_parquet" doesn't have a 'storage_options' parameter.

 

Note my motivation for using parquet rather than Delta tables is so I can implement partitions for faster querying.  If some kind person could tell me how to write Delta tables with partitions, that would also solve my problem.  I did try adding "partition_by" as a delta_write_options entry, but Fabric would only accept an empty list.

 

Some forums suggest using "df.to_pandas().to_parquet(filepath, storage_options={...})", but I don't know what to put in the filepath or storage_options.
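On the "what goes in filepath" question: OneLake exposes a Lakehouse's Files area at an abfss URL with a documented shape. The helper below is illustrative (the workspace and lakehouse names are placeholders, not from this thread), and the commented token line shows one commonly used way to get a credential inside a Fabric notebook — treat both as assumptions to verify:

```python
def onelake_files_url(workspace: str, lakehouse: str, relative_path: str) -> str:
    """Build the abfss URL for a file under a Lakehouse's Files area."""
    return (
        f"abfss://{workspace}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse}.Lakehouse/Files/{relative_path}"
    )

url = onelake_files_url("MyWorkspace", "MyLakehouse", "exports/data.parquet")
print(url)
# → abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse.Lakehouse/Files/exports/data.parquet

# Inside a Fabric notebook a token is typically obtained with
# notebookutils.credentials.getToken("storage"); which storage_options
# keys it goes into depends on the filesystem library underneath
# (adlfs for pandas, object-store for write_delta), so check that
# library's docs rather than copying keys from here.
```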

 

Others suggest "... stop using the storage_options parameter and just use the abfs.open handler for both reading and writing", which sounds like a solution, but I've never (consciously) used abfs before and wouldn't know where to begin.

 

1 ACCEPTED SOLUTION
mhaupt
Frequent Visitor

Actually I managed to apply partitions using write_delta.  The issue was that I had been trying to apply partitions to an existing table rather than a new one. D'oh!  😛


4 REPLIES

Anonymous
Not applicable

Hi @mhaupt ,

Glad to know that your query got resolved. Please continue to use the Fabric Community for any further queries.

Anonymous
Not applicable

Hi @mhaupt ,

Thanks for using Fabric Community.
As I understand it, you are trying to write a partitioned parquet file to a Microsoft Fabric Lakehouse.

Can you please check this doc - Microsoft Fabric: using Notebooks and Table Partitioning to Convert Files to Tables 



Hope this gives you some ideas for your query. Do let me know in case of further queries.

ta!
