GammaRamma
Helper I

Error using DESCRIBE HISTORY on lakehouse with schemas

I am using a schema-enabled lakehouse. When I set my lakehouse as the default for the notebook, this works:

DESCRIBE HISTORY tableName LIMIT 1

 

But when I try this syntax:

DESCRIBE HISTORY lakehouseName.schemaName.tableName LIMIT 1

 

I get the following error:

Illegal table name lkh_bronze_dri_rawData.dbo.myTable(line 1, pos 17)

== SQL ==
DESCRIBE HISTORY lkh_bronze_dri_rawData.dbo.myTable LIMIT 1
-----------------^^^

io.delta.sql.parser.DeltaSqlAstBuilder.$anonfun$visitTableIdentifier$1(DeltaSqlParser.scala:433)
org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:160)
io.delta.sql.parser.DeltaSqlAstBuilder.visitTableIdentifier(DeltaSqlParser.scala:430)
io.delta.sql.parser.DeltaSqlAstBuilder.$anonfun$visitDescribeDeltaHistory$3(DeltaSqlParser.scala:388)
scala.Option.map(Option.scala:230)
io.delta.sql.parser.DeltaSqlAstBuilder.$anonfun$visitDescribeDeltaHistory$1(DeltaSqlParser.scala:388)
org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:160)
io.delta.sql.parser.DeltaSqlAstBuilder.visitDescribeDeltaHistory(DeltaSqlParser.scala:386)
io.delta.sql.parser.DeltaSqlAstBuilder.visitDescribeDeltaHistory(DeltaSqlParser.scala:156)
io.delta.sql.parser.DeltaSqlBaseParser$DescribeDeltaHistoryContext.accept(DeltaSqlBaseParser.java:316)
org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:18)
io.delta.sql.parser.DeltaSqlAstBuilder.$anonfun$visitSingleStatement$1(DeltaSqlParser.scala:426)
org.apache.spark.sql.catalyst.parser.ParserUtils$.withOrigin(ParserUtils.scala:160)
io.delta.sql.parser.DeltaSqlAstBuilder.visitSingleStatement(DeltaSqlParser.scala:426)
io.delta.sql.parser.DeltaSqlAstBuilder.visitSingleStatement(DeltaSqlParser.scala:156)
io.delta.sql.parser.DeltaSqlBaseParser$SingleStatementContext.accept(DeltaSqlBaseParser.java:179)
org.antlr.v4.runtime.tree.AbstractParseTreeVisitor.visit(AbstractParseTreeVisitor.java:18)
io.delta.sql.parser.DeltaSqlParser.$anonfun$parsePlan$1(DeltaSqlParser.scala:78)
io.delta.sql.parser.DeltaSqlParser.parse(DeltaSqlParser.scala:113)
io.delta.sql.parser.DeltaSqlParser.parsePlan(DeltaSqlParser.scala:77)
com.microsoft.azure.synapse.ml.predict.PredictParser.parsePlan(PredictParser.scala:19)
org.apache.spark.sql.SparkSession.$anonfun$sql$2(SparkSession.scala:633)
org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:120)
org.apache.spark.sql.SparkSession.$anonfun$sql$1(SparkSession.scala:632)
org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:827)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:630)
org.apache.spark.sql.SparkSession.sql(SparkSession.scala:671)
org.apache.livy.repl.SQLInterpreter.execute(SQLInterpreter.scala:163)
org.apache.livy.repl.Session.$anonfun$executeCode$1(Session.scala:893)
scala.Option.map(Option.scala:230)
org.apache.livy.repl.Session.executeCode(Session.scala:890)
org.apache.livy.repl.Session.$anonfun$execute$12(Session.scala:585)
org.apache.livy.repl.Session.withRealtimeOutputSupport(Session.scala:1144)
org.apache.livy.repl.Session.$anonfun$execute$3(Session.scala:585)
scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
scala.concurrent.Future$.$anonfun$apply$1(Future.scala:659)
scala.util.Success.$anonfun$map$1(Try.scala:255)
scala.util.Success.map(Try.scala:213)
scala.concurrent.Future.$anonfun$map$1(Future.scala:292)
scala.concurrent.impl.Promise.liftedTree1$1(Promise.scala:33)
scala.concurrent.impl.Promise.$anonfun$transform$1(Promise.scala:33)
scala.concurrent.impl.CallbackRunnable.run(Promise.scala:64)
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
java.base/java.lang.Thread.run(Thread.java:829)
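For reference, issuing the same statements from a PySpark cell with spark.sql goes through the same Delta SQL parser, so it should behave the same way (a minimal sketch; the names below are placeholders, not my real ones):

# Expected to work when the lakehouse is attached as the notebook default:
spark.sql("DESCRIBE HISTORY tableName LIMIT 1").show(truncate=False)

# Expected to hit the same "Illegal table name" parse error, since the fully
# qualified form goes through the same Delta SQL parser:
spark.sql("DESCRIBE HISTORY lakehouseName.schemaName.tableName LIMIT 1").show(truncate=False)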

Any help is appreciated. Thanks!


5 REPLIES
Anonymous
Not applicable

Hi @GammaRamma,

 

This is just a follow-up to ask whether the problem has been solved.

 

If it has been, could you accept the helpful reply as the solution, or share your own solution, so that other members can find it faster?

 

Thank you very much for your cooperation!

 

Best Regards,
Yang
Community Support Team

 


Anonymous
Not applicable

Hi @GammaRamma,

 

I ran it successfully using your syntax.

[screenshot of the successful run]

 

Please check that there are no errors in your syntax and that the lakehouse name, schema name, and table name have been entered correctly.

 

If you are not sure, you can use "Load data" >> "Spark" on the table and paste the generated name into your statement.

[screenshot of the "Load data" >> "Spark" option on a table]
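The generated cell looks roughly like this (the exact text may differ slightly; the names below are placeholders), and the fully qualified name it contains is the one to reuse in DESCRIBE HISTORY:

# Cell generated by "Load data" >> "Spark" on a table (approximate):
df = spark.sql("SELECT * FROM lakehouseName.schemaName.tableName LIMIT 1000")
display(df)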

 

If you have any other questions, please feel free to contact me.

 

Best Regards,
Yang
Community Support Team

 


Hello @Anonymous, thank you for your reply.

It is strange that it works for you and not for me. I just retried it using the Load data feature and copied the name, and I get the same error. My colleagues have tried on their side and get the same error too. Other operations on the tables in my lakehouse using the lakehouse.schema.table notation work without any problem.
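As a possible workaround in the meantime, reading the history through the Delta Python API by path should avoid the Delta SQL parser entirely (a sketch; the OneLake path below is a placeholder, not a real one):

from delta.tables import DeltaTable

# Placeholder OneLake path to the table folder; substitute real workspace,
# lakehouse, schema, and table names.
table_path = (
    "abfss://<workspace>@onelake.dfs.fabric.microsoft.com/"
    "<lakehouse>.Lakehouse/Tables/<schema>/<table>"
)

# history(1) returns a DataFrame with the most recent commit on the Delta table.
DeltaTable.forPath(spark, table_path).history(1).show(truncate=False)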

So what configuration could be different between our environments?

Thank you!

Anonymous
Not applicable

Hi @GammaRamma,

 

This is my Runtime version:

[screenshot: Runtime version]

This is my Spark pool configuration:

[screenshot: Spark pool configuration]

I turned on high concurrency mode:

[screenshot: high concurrency mode enabled]

This is my region and service version:

[screenshot: region and service version]
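If it helps to compare environments, the Spark version of the current session can be printed from a notebook cell (a minimal check; as far as I know, Fabric Runtime 1.2 ships Spark 3.4 with Delta 2.4, and Runtime 1.3 ships Spark 3.5 with Delta 3.x):

# Print the Spark version of the running session to compare environments.
print(spark.version)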

 

Given how little information we have at this time, I can't tell what is causing your problem. You can create a support ticket for free, and a dedicated Microsoft engineer will help you solve it.

 

It would be great if you share the root cause or solution in this thread once you find it, to help others with similar problems.

 

Power BI Support link: https://powerbi.microsoft.com/en-us/support/

 

For instructions on creating a support ticket, please refer to How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn.

 

Thank you for your understanding.

 

Best Regards,
Yang
Community Support Team

 


OK, it was the Spark runtime version. I was on Runtime 1.2; it started working when I switched to 1.3.

Thanks!
