I am trying to get data from Snowflake (with OAuth authentication) and write it to the default lakehouse with PySpark:
import http.client
import json
import snowflake.connector as sf

# Get an OAuth access token
conn = http.client.HTTPSConnection("XXX")
payload = f"XXX"
headers = {'content-type': "application/x-www-form-urlencoded"}
conn.request("POST", "/oauth/token", payload, headers)
res = conn.getresponse()
data = res.read()
json_data = json.loads(data)
TOKEN = json_data["access_token"]

# Connect to Snowflake with the OAuth token
conn = sf.connect(account="XXX",
                  user=CLIENT_ID + "@clients",
                  authenticator="oauth",
                  token=TOKEN,
                  database="XXX",
                  schema="XXX",
                  warehouse="XXX")

# Run the query and fetch the result as a pandas DataFrame
cur = conn.cursor()
cur.execute(pSqlquery)
df = cur.fetch_pandas_all()

# Convert to a Spark DataFrame and write it to the default lakehouse as a delta table
spark_df = spark.createDataFrame(df)
spark_df.write.mode("overwrite").format("delta").saveAsTable(pLakehousedestinationtable)

# Close the cursor and connection
cur.close()
conn.close()
When I limit the query to 100 rows, everything works fine. But if I want to load more than 800 rows, I get the following error:
No CA bundle file is found in the system. Set REQUESTS_CA_BUNDLE to the file.
Failed to fetch the large result set batch 01b8638a-0000-9655-0020-6a8300058006_0%2Fmain%2Fdata_0_0_1 for the 1 th time, backing off for 1s for the reason: '254007: The certificate is revoked or could not be validated: hostname=rhoyejsfcb1stg.blob.core.windows.net'
Traceback (most recent call last):
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/result_batch.py", line 332, in _download
response = session.request("get", **request_data)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/requests/adapters.py", line 486, in send
resp = conn.urlopen(
^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 715, in urlopen
httplib_response = self._make_request(
^^^^^^^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 404, in _make_request
self._validate_conn(conn)
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/urllib3/connectionpool.py", line 1058, in _validate_conn
conn.connect()
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/vendored/urllib3/connection.py", line 419, in connect
self.sock = ssl_wrap_socket(
^^^^^^^^^^^^^^^^
File "/home/trusted-service-user/cluster-env/clonedenv/lib/python3.11/site-packages/snowflake/connector/ssl_wrap_socket.py", line 93, in ssl_wrap_socket_with_ocsp
raise OperationalError(
snowflake.connector.errors.OperationalError: 254007: The certificate is revoked or could not be validated: hostname=rhoyejsfcb1stg.blob.core.windows.net
Does anyone have an idea what the problem could be?
I use snowflake-connector-python version 3.12.3 as a public library in the environment.
Thanks!
After investigation with MS and Snowflake support, it ended with the workaround mentioned in this topic: Solved: Re: No CA bundle file is found in the system. Set ... - Microsoft Fabric Community
The error is specific to MS Fabric, as it does not occur in a local test.
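For reference, a minimal sketch of the kind of environment-variable workaround the error message itself points at, i.e. setting REQUESTS_CA_BUNDLE before the Snowflake connection is opened. Using certifi for the bundle path is an assumption; see the linked thread for the exact steps.

import os
import certifi  # assumed available, as it is a dependency of the Snowflake connector

# Point the connector's vendored requests at a CA bundle before connecting to Snowflake
os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()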
Hi @Anonymous, how can I check this? It is an external Snowflake database; we only have read access to the DB.
Hi @cw88,
I mean you can check the result table's data size in the lakehouse.
As you said, loading more than 800 rows is what seems to trigger the issue, so you can try to get fewer than 800 rows and check the file data size in the lakehouse. (You can right-click on the delta table and choose 'View files' to show the files view in the list.)
In addition, you can also use a notebook with the mssparkutils library to achieve this (see the link and the rough sketch below):
pyspark - Microsoft Fabric Lakehouse size and table sizes - Stack Overflow
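A minimal sketch, assuming the notebook has a default lakehouse attached and reusing the pLakehousedestinationtable name from your code; the relative Tables/ path is an assumption about where the delta table lives:

from notebookutils import mssparkutils

# List the files behind the delta table in the default lakehouse and sum their sizes
files = mssparkutils.fs.ls(f"Tables/{pLakehousedestinationtable}")
total_bytes = sum(f.size for f in files if not f.isDir)
print(f"{total_bytes / 1024:.1f} KB across {len(files)} top-level entries")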
Regards,
Xiaoxin Sheng
Hi @Anonymous,
The data size is 16 KB, so that should not be the problem 😞
Hi @cw88,
Were any changes or updates applied to the credentials you used? How long does the operation take? BTW, how did you configure the Spark session and timeout? Can you please share some more detail?
Regards,
Xiaoxin Sheng
Hello @Anonymous,
Were any changes or updates applied to the credentials you used? --> No
How long does the operation take? --> Processing takes 2-3 seconds
BTW, how did you configure the Spark session and timeout? Can you please share some more detail? --> No custom settings
Hi @cw88,
It seems there are no special changes in your configuration.
For this scenario, I suppose this may be related to the http.client method. Have you tried other methods (e.g. snowflake.connector) or data drivers (e.g. JDBC) to get data from Snowflake? A rough sketch of one alternative follows.
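For illustration only, a minimal sketch of reading the same query through the Snowflake Spark connector (which uses the JDBC driver underneath), assuming the connector (net.snowflake.spark.snowflake) is available on the Fabric Spark cluster; it reuses CLIENT_ID, TOKEN, pSqlquery and pLakehousedestinationtable from the question, and the account URL is a placeholder:

sf_options = {
    "sfURL": "XXX.snowflakecomputing.com",  # account URL, placeholder
    "sfUser": CLIENT_ID + "@clients",
    "sfAuthenticator": "oauth",
    "sfToken": TOKEN,
    "sfDatabase": "XXX",
    "sfSchema": "XXX",
    "sfWarehouse": "XXX",
}

# Let Spark pull the result set directly instead of going through pandas
spark_df = (
    spark.read.format("net.snowflake.spark.snowflake")
    .options(**sf_options)
    .option("query", pSqlquery)
    .load()
)
spark_df.write.mode("overwrite").format("delta").saveAsTable(pLakehousedestinationtable)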
Regards,
Xiaoxin Sheng
Hi @cw88,
Have you checked the data size of these records? I'd suggest checking whether they stay within the 16 MB row size limit of the Snowflake database:
What is max row size of a table in Snowflake Database? - Stack Overflow
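One rough way to check, reusing the cursor and pSqlquery from the question; the JSON-serialization trick only approximates the row size, and wrapping pSqlquery as a subquery assumes it is a plain SELECT without a trailing semicolon:

# Approximate the size of the widest row returned by the query via its JSON representation;
# this is only a rough proxy for Snowflake's 16 MB row size limit.
cur.execute(f"SELECT MAX(LENGTH(TO_JSON(OBJECT_CONSTRUCT(*)))) FROM ({pSqlquery})")
print("Largest row (approx.):", cur.fetchone()[0], "characters")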
Regards,
Xiaoxin Sheng
I came across this other post also mentioning the CA error. Does this also work for you: https://community.fabric.microsoft.com/t5/Data-Science/No-CA-bundle-file-is-found-in-the-system-Set-...?
Hi @FabianSchut, the CA is not the problem. The problem is the "Failed to fetch the large result set batch ..." error.