So MSFT exposed Lakehouse Schemas as a preview product in the last month or so... Have you used this feature?
I have, and let me say, I don't understand why MSFT feels the need to push something out in Preview and mark it in their roadmap as 'shipped' when it doesn't work!
Can I create a schema? yes...
Can I write a table to a schema? yes...
Can I literally do anything else? no!
Cannot see it in the SQL endpoint, cannot do cross-lakehouse queries, cannot build a report on top of these schema-enabled tables...
Why does MSFT insist on shipping 'broken' products, even in preview?! Do they want feedback or frustration?
Dazed and confused as usual. And this is from someone who has used Fabric since June of last year when it went into Public Preview... great product, but poorly released features. 😞
Solved! Go to Solution.
Hi @kely,
We're really sorry the preview feature caused you any inconvenience.
From your description, this is likely related to changes in the Lakehouse schema preview feature: it may not yet be compatible with the original features, and each side needs to be re-adapted to the other version.
Also, a 'shipped' mark on the preview-feature roadmap may not mean the feature is fully finished and released to the general environment. (Features are tested internally first and released later if testing goes well.)
Note that these 'shipped' items may also be rolled back for internal reasons or compatibility issues, and the roadmap status may not be updated in time.
Regards,
Xiaoxin Sheng
Hello everyone! Being a fairly stubborn guy, and knowing that Fabric sometimes has more than one way to do this, I tested this pyspark and BANG it worked!
The notebook mounts Lakehouse 1 and writes to Lakehouse 2 with schemas enabled... and it works!
sql = """
select *
from TableInLakehouse1
"""
df = spark.sql(sql)

# (Over)write the table into the schema-enabled target lakehouse
# (BusinessModel_DataProduct). Replace the bracketed placeholders with
# your own workspace and lakehouse GUIDs.
target_path = (
    "abfss://[target lakehouse 2 workspace guid]"
    "@onelake.dfs.fabric.microsoft.com/"
    "[target lakehouse 2 guid]/Tables/dim/schematesttable"
)
df.write.mode("overwrite").option("overwriteSchema", "true").format("delta").save(target_path)
display(df)
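The key to the workaround above is the OneLake abfss URI, which embeds the workspace GUID, the lakehouse GUID, and a Tables/&lt;schema&gt;/&lt;table&gt; suffix. A small helper to assemble that URI, as a sketch (the GUIDs in the example are placeholders, not real values):

```python
def onelake_table_path(workspace_guid: str, lakehouse_guid: str,
                       schema: str, table: str) -> str:
    """Build the OneLake abfss URI for a schema-qualified Delta table,
    following the pattern used in the workaround above."""
    return (
        f"abfss://{workspace_guid}@onelake.dfs.fabric.microsoft.com/"
        f"{lakehouse_guid}/Tables/{schema}/{table}"
    )

# Example with placeholder GUIDs -- substitute your own:
path = onelake_table_path(
    "11111111-2222-3333-4444-555555555555",   # workspace GUID (placeholder)
    "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee",   # lakehouse GUID (placeholder)
    "dim",
    "schematesttable",
)
# df.write.mode("overwrite").option("overwriteSchema", "true") \
#   .format("delta").save(path)
```

Keeping the path construction in one place makes it easier to swap targets when you promote the same write between workspaces.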
OK, I understand this... if you want feedback on this feature, you can see the support ticket I submitted: 2408020040011302. Your reply will not be "accepted as a solution", sorry.
@Anonymous I hope you can get this new feature working properly and provide SOME idea of when it will be fully functioning and in GA?
Hi @kely,
I checked the official documentation, blog, and release plans, but none of them mention this.
I suppose it may be related to the feature's development progress and completeness. (As you said, a lot of old Lakehouse features are still not compatible with it.)
Regards,
Xiaoxin Sheng
WARNING: once this solution code was executed, I could see the table under the schema in the Lakehouse, BUT I cannot see it in the SQL endpoint, which means the table is currently inaccessible to a report! (i.e. still a broken and not fully functional feature)
sorry all! 😞
Well, until this is fixed, I cannot use this feature... which is sad.