nilendraFabric
Super User

Table not visible in SQL EP : The requested Lakehouse operation failed. A retry of the operation may

 

Hello,

I have created a table in a Lakehouse using a notebook:

    df.write.format("delta").mode("append").saveAsTable("BudgetData_API")

The table is created successfully.

The issue is shown in the screenshots below.

[Screenshots: the error in the SQL analytics endpoint]

Data is visible in the Lakehouse, but these tables are not visible in the SQL EP.

This is happening with all the tables I am writing using the notebook.

Any suggestions?

 

 

1 ACCEPTED SOLUTION
v-csrikanth
Community Support

Hi @nilendraFabric 
I tried to reproduce your scenario and it works fine for me.
Steps to create a table using a notebook:

  1. Inside the Lakehouse, go to Notebooks > + New Notebook.
  2. Attach the notebook to the Lakehouse.
  3. Run the following Python code:
    ********************************************************

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    data = [(1, "Product A", 100), (2, "Product B", 150)]
    columns = ["ID", "Product", "Sales"]

    df = spark.createDataFrame(data, columns)

    df.write.format("delta").mode("overwrite").saveAsTable("TestTable")

    ********************************************************
  4. Refresh and confirm that TestTable appears under the Tables section in your Lakehouse.
    [Screenshot: TestTable listed under Tables in the Lakehouse]

If the above information is helpful, please give us Kudos and mark the response as the Accepted Solution.
Best Regards,
Community Support Team _ C Srikanth.


3 REPLIES
nilendraFabric
Super User

I am facing the same issue, but it is intermittent.

v-csrikanth
Community Support

Hi @nilendraFabric 
If the above suggested approach works for you, could you please accept it as the solution?

Best regards,
Cheri Srikanth

