I'm creating a DataFrame from data in the silver lakehouse (dev_silver_lk), which is the notebook's default lakehouse.
But I want to write it out as a dimension Delta table in the gold lakehouse, dev_gold_lk.
I have also mounted that lakehouse to the notebook (both lakehouses have the Schemas preview feature enabled).
And I just can't seem to do it.
Surely this should be possible?
I suspect it might be because my lakehouses have preview schemas. I tried this in a PySpark notebook and it didn't work.
You could try saveAsTable:
dfBlockersKey_reordered.write \
.format("delta") \
.mode("overwrite") \
.saveAsTable("dev_timesheet_gold_lh.dimCustomer")
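One thing worth checking: with the Schemas preview enabled, tables sit under a schema (dbo by default), so saveAsTable usually needs a three-part name (lakehouse.schema.table) rather than the two-part lakehouse.table form. A minimal sketch of composing that name (the helper function and the dbo default are assumptions on my part, not something from your setup):

```python
# Sketch: building the table identifier for saveAsTable when the
# lakehouse "Schemas" preview is enabled. With schemas on, tables live
# under a schema (default "dbo"), so the reference passed to
# saveAsTable needs three parts: lakehouse.schema.table.
# The helper name and the "dbo" default are assumptions.

def qualified_table_name(lakehouse: str, table: str, schema: str = "dbo") -> str:
    """Compose a lakehouse.schema.table identifier for a schema-enabled lakehouse."""
    return f"{lakehouse}.{schema}.{table}"

# For the gold lakehouse in the question, the write would target:
target = qualified_table_name("dev_gold_lk", "dimCustomer")
print(target)  # dev_gold_lk.dbo.dimCustomer

# The write itself would then look like (only runnable inside a Fabric notebook):
# dfBlockersKey_reordered.write \
#     .format("delta") \
#     .mode("overwrite") \
#     .saveAsTable(target)
```

If the two-part name fails with a "table or schema not found" style error while the three-part name works, that points at the schemas preview being the cause.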