I'm trying to connect Snowflake with Spark notebooks in Fabric.
I have an executable PySpark script that already runs in Azure Databricks.
When I use the same script in a Fabric notebook, I get this error:
No CA bundle file is found in the system. Set REQUESTS_CA_BUNDLE to the file.
Could you please help with this?
I'm stuck right at the initial phase.
I pointed REQUESTS_CA_BUNDLE at the CA bundle that ships with the certifi package.
My issue got resolved 👍
import os
import certifi

# Point REQUESTS_CA_BUNDLE at certifi's bundled CA certificates so TLS verification can find a trust store.
os.environ['REQUESTS_CA_BUNDLE'] = certifi.where()
print(certifi.where())
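For reference, here is a minimal sketch of how the fix fits into a Snowflake read from a Fabric Spark notebook. The connection values (sfURL, sfUser, and so on) are placeholders, and it assumes the Spark Snowflake connector is available in the notebook's Spark environment; the key point is to set REQUESTS_CA_BUNDLE before the connection is opened.

import os
import certifi

# Set the CA bundle *before* any Snowflake connection is established.
os.environ['REQUESTS_CA_BUNDLE'] = certifi.where()

# Hypothetical connection details -- replace with your own account values.
sf_options = {
    "sfURL": "<account>.snowflakecomputing.com",
    "sfUser": "<user>",
    "sfPassword": "<password>",
    "sfDatabase": "<database>",
    "sfSchema": "<schema>",
    "sfWarehouse": "<warehouse>",
}

# Read a table through the Spark Snowflake connector, using the notebook's
# built-in `spark` session (assumes the connector is installed in the environment).
df = (
    spark.read.format("snowflake")
    .options(**sf_options)
    .option("dbtable", "<table>")
    .load()
)
df.show()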
Hi @muzaffar527_hot,
AFAIK, the Fabric environment is not fully identical to Databricks.
Your error message can be caused by a referenced object that does not exist, or by not having sufficient permissions to operate on it (some objects already exist but the dev team has not released them to normal users).
Regards,
Xiaoxin Sheng