Hello,
I have created a table in a Lakehouse using a notebook. The table is created successfully.
The issue is:
the data is visible in the Lakehouse, but these tables are not visible in the SQL analytics endpoint (SQL EP).
This is happening with all the tables I am writing using the notebook.
Any suggestions?
Hi @nilendraFabric
I tried to reproduce your scenario, and it works fine for me.
Steps to create a table using a Notebook:
# Get the Spark session (one is already available in Fabric notebooks)
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()

# Build a small sample DataFrame
data = [(1, "Product A", 100), (2, "Product B", 150)]
columns = ["ID", "Product", "Sales"]
df = spark.createDataFrame(data, columns)

# Save it as a managed Delta table so it is registered in the Lakehouse
df.write.format("delta").mode("overwrite").saveAsTable("TestTable")
********************************************************
If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.
I am facing the same issue, but it's intermittent.
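When the problem is intermittent, it is often the asynchronous metadata sync between the Lakehouse and the SQL analytics endpoint lagging behind the write. A generic workaround is to poll with backoff before concluding the table is missing; here is a sketch where `check_visible` is a placeholder callable you would implement yourself (e.g. a query against the SQL endpoint):

```python
import time

def wait_for_table(check_visible, retries=5, delay=2.0):
    """Poll a visibility check with exponential backoff.

    check_visible: zero-argument callable that returns True once the
    table appears (e.g. a lookup against INFORMATION_SCHEMA.TABLES on
    the SQL analytics endpoint).
    Returns True if the table became visible within the retry budget.
    """
    for attempt in range(retries):
        if check_visible():
            return True
        # Back off: delay, 2*delay, 4*delay, ...
        time.sleep(delay * (2 ** attempt))
    return check_visible()
```

This does not fix the sync itself, but it distinguishes "not yet synced" from "never appears", which are very different problems to report.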
Hi @nilendraFabric
If the suggested approach above works for you, could you please accept it as the solution?
Best regards,
Cheri Srikanth