nilendraFabric
Super User

Table not visible in SQL EP: The requested Lakehouse operation failed. A retry of the operation may

 

Hello,

I have created a table in the Lakehouse using a notebook:

    df.write.format("delta").mode("append").saveAsTable("BudgetData_API")

The table is created successfully.

The issue is:

[Screenshots: nilendraFabric_1-1742474614613.png, nilendraFabric_2-1742474630504.png, nilendraFabric_0-1742474501289.png]

Data is visible in the Lakehouse, but these tables are not visible in the SQL analytics endpoint.

This is happening with all the tables I am writing from the notebook.

Any suggestions?

 

 

1 ACCEPTED SOLUTION
v-csrikanth
Community Support

Hi @nilendraFabric,
I tried to reproduce your scenario and it works fine for me.
Steps to create a table using a notebook:

  1. Inside the Lakehouse, go to Notebooks > + New Notebook.
  2. Attach the notebook to the Lakehouse.
  3. Run the following Python code:
    ********************************************************

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    data = [(1, "Product A", 100), (2, "Product B", 150)]
    columns = ["ID", "Product", "Sales"]

    df = spark.createDataFrame(data, columns)

    df.write.format("delta").mode("overwrite").saveAsTable("TestTable")

    ********************************************************
  4. Refresh and confirm that TestTable appears under the Tables section in your Lakehouse.
    [Screenshot: vcsrikanth_0-1742548247392.png]

     


If the above information is helpful, please give us Kudos and mark the response as the accepted solution.
Best Regards,
Community Support Team _ C Srikanth.


3 REPLIES
nilendraFabric
Super User

I am facing the same issue, but it's intermittent.

v-csrikanth
Community Support

Hi @nilendraFabric,
If the above suggested approach works for you, could you please accept it as the solution?

Best regards,
Cheri Srikanth

