msprog
Advocate II

Fabric pyspark notebook session config

Hi,

We are using MS Fabric PySpark notebooks. How can I get the Spark session configuration in the notebook?

Specifically, I need to know whether Parquet predicate pushdown is enabled.

Please advise how I can retrieve all the session configurations.

Thanks

1 ACCEPTED SOLUTION
v-tsaipranay
Community Support

Hi @msprog ,

Thank you for reaching out to the Microsoft Fabric Community.

 

In Microsoft Fabric PySpark Notebooks, you can retrieve all current Spark session configurations using the command spark.conf.getAll(), which returns a complete dictionary of configuration settings for your active session.

To check specifically whether Parquet predicate pushdown is enabled (an optimization that improves performance by applying filters at the file-scan level, so less data is read from the Parquet files), query the configuration key spark.sql.parquet.filterPushdown with spark.conf.get("spark.sql.parquet.filterPushdown").

 

If this returns true, the pushdown feature is enabled and functioning. If it returns false, the feature is disabled. If you find it's disabled but need it enabled for performance reasons, you can update the setting directly within the notebook using spark.conf.set("spark.sql.parquet.filterPushdown", "true").

 

This ensures that your Parquet data processing is optimized to read only the necessary data, improving both performance and resource efficiency.


 

I hope these suggestions give you a good idea; if you need any further assistance, feel free to reach out.

If this post helps, then please give us Kudos and consider accepting it as a solution to help other members find it more quickly.

 

Thank you. 


2 REPLIES
msprog
Advocate II

@v-tsaipranay thanks for this, very helpful. One observation when I ran the scripts:

When I run spark.conf.get("spark.sql.parquet.filterPushdown") I get the value true.

However, when I run spark.sparkContext.getConf().getAll(), I get around 300 session configurations, but spark.sql.parquet.filterPushdown is not present among those key-value pairs.

I was expecting it to be present. Am I missing something here, please?

