I was trying to use PySpark and Spark SQL to write data and create a table in my lakehouse, and it gave me an error. I tried both the .save and .saveAsTable options, but neither worked: .save gave me a bad request error, and .saveAsTable gave me a forbidden error. This happened when I created the lakehouse with schema enabled.
However, when I created the lakehouse without schema enabled, there was no error; it ran successfully and created the table.
What is the issue exactly? Is it a permission problem, or something else? Please help me out here.
TIA.
Resolution: The issue is caused by the use of internal APIs. In some cases the internal APIs return a 403 when a request is routed to a datacenter where the user is not present. The fix is to move to the public APIs; the code path already exists, but the switch needs to be enabled from the workload code.
%%pyspark
!echo "spark.trident.pbiApiVersion=v1">>/home/trusted-service-user/.trident-context
The fix will be deployed to all production regions within 3 to 4 weeks.
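The shell command above simply appends the `spark.trident.pbiApiVersion=v1` setting to the `.trident-context` file. Re-running the cell would append the line again, so here is a minimal sketch (plain Python; the helper name and the temp-file demonstration are my own, not from the official workaround) that applies the setting idempotently:

```python
from pathlib import Path

def set_pbi_api_version(context_path: str, version: str = "v1") -> None:
    """Append spark.trident.pbiApiVersion to the context file, only once."""
    line = f"spark.trident.pbiApiVersion={version}"
    path = Path(context_path)
    existing = path.read_text().splitlines() if path.exists() else []
    if line not in existing:  # skip if the setting was already applied
        with path.open("a") as f:
            f.write(line + "\n")

# In a Fabric notebook the file is /home/trusted-service-user/.trident-context;
# point context_path at a temp file if you just want to try the logic locally.
```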
Hi @pavarayaaa ,
Thanks for the reply from v-ayaanali / arlindTrystar.
I'm just following up to ask whether the problem has been solved.
Some other users and I have provided workarounds; if one of them works for you, could you accept that answer as the solution to help other members find it faster?
Thank you very much for your cooperation!
Best Regards,
Yang
Community Support Team
If any post helped, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
This is a bug; they have not resolved it yet.
Check this: https://community.fabric.microsoft.com/t5/Data-Engineering/An-error-occurred-while-calling-o321-sql/...
It is similar to your problem. We had to redo our work and re-create the lakehouses without schema enabled.
Hi @pavarayaaa ,
Thanks for the reply from pawelpo .
When schema support is enabled, the system may apply stricter checks to the data being written, so make sure the data conforms to the schema.
I created a lakehouse with the schema preview enabled and did a lot of testing. I found that "saveAsTable" fails when the column names contain invalid characters:
AnalysisException: Found invalid character(s) among ' ,;{}()\n\t=' in the column names of your schema.
Using this statement works fine:
df = spark.read.format("csv").option("header", "false").load("Files/2019.csv")
# Create a new table
df.write.format("delta").saveAsTable("test2")
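If you do hit the invalid-character AnalysisException, one way around it is to clean the column names before calling saveAsTable. A minimal sketch, operating on the name strings only so it runs without Spark (the helper name and the underscore replacement are my own choices, not from the documentation):

```python
import re

# Delta rejects these characters in column names: space , ; { } ( ) \n \t =
INVALID = re.compile(r"[ ,;{}()\n\t=]")

def sanitize_column(name: str) -> str:
    """Replace forbidden characters with underscores and tidy the result."""
    cleaned = INVALID.sub("_", name)
    return re.sub(r"_+", "_", cleaned).strip("_")

# Usage with a PySpark DataFrame df:
# df = df.toDF(*[sanitize_column(c) for c in df.columns])
# df.write.format("delta").saveAsTable("test2")
```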
If you have any other questions please feel free to contact me.
Best Regards,
Yang
Community Support Team
If any post helped, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
This does not address the problem: the user is getting bad request and forbidden errors, not an AnalysisException.
Hi @pavarayaaa ,
Thanks for the reply from arlindTrystar .
If none of the workarounds we provided works, creating the lakehouse without schema enabled may be your last option.
Best Regards,
Yang
Community Support Team
If any post helped, please consider accepting it as the solution to help other members find it more quickly.
If I have misunderstood your needs or you still have problems, please feel free to let us know. Thanks a lot!
I can reproduce your errors. This looks like a bug to me: as Microsoft describes in the Fabric documentation, it should be possible to create new tables using saveAsTable (Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn). Consider reporting it to Microsoft as a bug.
Also, keep in mind that lakehouse schemas are currently in public preview and have many limitations (Lakehouse schemas (Preview) - Microsoft Fabric | Microsoft Learn).
HTH,
Pawel