Hi,
I'm not sure whether I'm posting this in the right place; I couldn't find a dedicated forum for PySpark, so I'm asking here.
In summary, I want to write a Spark DataFrame to a directory as Delta output with PySpark, but I get the error "Authentication Failed with Bearer token is not available in request" and can't get much further from there.
Can anyone help?
Thanks,
Tolga
I was able to resolve it as well. Basically, you need to add the lakehouse as a source in the notebook, and then it will work.
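If attaching a lakehouse isn't convenient, a fully qualified OneLake path can also be used so the write doesn't depend on a default lakehouse. This is only a sketch; the workspace and lakehouse names below are placeholders:

# Hypothetical example: write the DataFrame as a Delta table using an
# explicit OneLake ABFSS path instead of a relative "Tables/..." path.
# "MyWorkspace" and "MyLakehouse" are placeholders for your own names.
abfss_path = "abfss://MyWorkspace@onelake.dfs.fabric.microsoft.com/MyLakehouse.Lakehouse/Tables/WriteTest"
df_bonus.write.format("delta").mode("overwrite").save(abfss_path)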
I think that's because you haven't specified the area (Files/Tables) to which you want to write the data.
Assuming you want to create a delta table in a managed area of the lakehouse (Tables), try this:
df_bonus.write.format("delta").save("Tables/WriteTest")
Make sure you have your lakehouse pinned in the Lakehouse explorer on the left.
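To confirm the write succeeded, the table can be read back from the managed area, for example:

# Read the Delta table back from the managed Tables area to verify the write
# (assumes the lakehouse is attached to the notebook).
df_check = spark.read.format("delta").load("Tables/WriteTest")
df_check.show(5)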
I am having the same issue. I am simply reading a file from the Lakehouse, and I made sure the file exists and the path is correct.
Hi @LineshGajeraq,
I solved this issue many months ago 🙂 Please try this PySpark code.
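A minimal sketch along these lines reads a file from the attached lakehouse's Files area (the path below is a placeholder, assuming a CSV with a header row):

# Hypothetical example: read a CSV file from the lakehouse Files area using a
# relative path; "Files/sample.csv" is a placeholder for your own file.
df = spark.read.format("csv").option("header", "true").load("Files/sample.csv")
df.show(5)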