
kely
Frequent Visitor

SQL Endpoint no longer working

Was using SQL endpoints a couple of weeks ago against a Lakehouse... this has stopped working this week (week ending Friday, June 29, 2023)… I used to be able to view/query data in tables that actually have data, but no longer can… getting an error from a SQL query or when trying to preview a table (see error message below).

NOTE: the Lakehouse tables work fine even though the SQL endpoint does not

 

Error handling external file: 'Invalid: Parquet file size is 0 bytes'. File/External table name: 'dbo.EmployeeEmploymentStatus'.

Msg 15813, Level 16, State 1, Code line 1

Statement ID: {72AD18B6-E608-4F03-A86B-1494151FB6A9} | Query hash: 0x1B8F8F8EF72745BE | Distributed request ID: {27AF8F33-4597-4CD8-8D8F-32B6CF0F7CBD}

Msg 15806, Level 0, State 1, Code line 1
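A minimal way to confirm the NOTE above (Lakehouse reads work, SQL endpoint does not) is to preview the same table from a Fabric notebook attached to the Lakehouse. This is only a sketch; the table name is taken from the error message, and the notebook-provided spark session is assumed:

# Sketch: preview the table from a notebook attached to the Lakehouse.
# 'spark' is the session the Fabric notebook provides.
df = spark.read.table("EmployeeEmploymentStatus")

# If the Delta table itself is healthy, these succeed even while the
# SQL endpoint raises "Parquet file size is 0 bytes".
df.printSchema()
df.limit(10).show(truncate=False)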

 

1 ACCEPTED SOLUTION
kely
Frequent Visitor

HINT: Also adding a suspicion that it has to do with the Lakehouse table having a 'bit' value that was copied from a data source type of 'boolean'... if the Lakehouse table has no bit values, the SQL endpoint works...
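One rough way to test that suspicion is to list which Lakehouse tables actually contain boolean columns and compare that list against the tables that fail in the SQL endpoint. A sketch only, assuming a notebook attached to the affected Lakehouse:

from pyspark.sql.types import BooleanType

# Sketch: flag every table in the attached Lakehouse that has at least
# one boolean column, to compare against the SQL endpoint failures.
for t in spark.catalog.listTables():
    df = spark.read.table(t.name)
    bool_cols = [f.name for f in df.schema.fields
                 if isinstance(f.dataType, BooleanType)]
    if bool_cols:
        print(t.name, "->", bool_cols)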


2 REPLIES 2
kely
Frequent Visitor

More investigation on this repeating and intermittent issue...

 

  • On first read of a new table from OData to Delta table in the Fabric Lakehouse
    • When we are reading an OData source (we suspect this may happen on most/any source), we have a mix of varchar, boolean, and datetime columns from our source

                                

[screenshot: kely_0-1689283811434.png]

 

    • SOME of the columns (especially the datetime and Boolean columns) MAY have a NULL value in them…

                             

[screenshot: kely_1-1689283811437.png]

 

    • We believe the COPY activity in Fabric pipelines is reading that data type with a NULL and translating it to a varchar (likely some kind of default data type in the Delta table target)
    • This means that the above fields that are Boolean or datetime get set to NULL in the target Delta table on the INITIAL read from the source
    • After looking at several tables, comparing our OData source to the Delta Lake target, this seems to be a consistent data type setting pattern (which is ☹)
        [screenshot: Capture.JPG — data type comparison between the OData source and the Delta table]

  (in this example, you can see some datetime and boolean fields are coming through correctly and some are not!)

 

    • So on first-time execution of a new table from scratch, we see no errors in loading or updating (using PySpark/Spark SQL)… however…

 

  • On subsequent INCREMENTAL runs of that same table from OData to Delta table, here is what is likely happening:
    • The ‘new’ rows from the source are read, the engine now sees datetimes or Booleans, and it writes a NEW .parquet file with THAT data type (different from the one that was written in the initial load)
    • When we go to do any transformations, updates, reads, or SQL endpoint querying… these processes expect ALL rows from all parquet files in that table to have the same data type… and they don’t! And bang! You get an error like we’ve been seeing since we started using Fabric (see the sketch after the error message below for one way to pin the column types up front):

“py4j.GatewayConnection.run(GatewayConnection.java:238)\n\tat java.lang.Thread.run(Thread.java:750)\nCaused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 1 in stage 24.0 failed 4 times, most recent failure: Lost task 1.3 in stage 24.0 (TID 266) (vm-cdb84346 executor 1): org.apache.spark.sql.execution.QueryExecutionException: Parquet column cannot be converted in file abfss://e7fe3852-3049-4795-a43f-17985a4fa98d@onelake.dfs.fabric.microsoft.com/d1d79935-5097-42b6-bc91-7d57b9ab5a52/Tables/EmployeeEmploymentStatus_Landed/f41f826c-1558-4e73-be43-7007b874c32d.parquet. Column: [Base_Rate_Manually_Set], Expected: string, Found: BOOLEAN\n\tat”
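If that is what is happening, one possible mitigation is to pin the column types explicitly right after the initial load, so the first parquet files and every later incremental file agree on the physical types. This is only a sketch of the idea as a notebook step: the _Landed table name comes from the error path above, while the column/type map and the _Typed target table are illustrative, not our real schema.

from pyspark.sql import functions as F

# Illustrative column/type map -- Base_Rate_Manually_Set comes from the
# error above; the datetime column is a made-up example. Replace with
# the real source schema.
expected_types = {
    "Base_Rate_Manually_Set": "boolean",
    "Employment_Start_Date": "timestamp",
}

# Read whatever the initial Copy activity load produced
# (possibly string-typed, NULL-filled columns).
df = spark.read.table("EmployeeEmploymentStatus_Landed")

# Cast each column to its intended type so every parquet file written
# from here on carries the same physical types.
for col_name, col_type in expected_types.items():
    if col_name in df.columns:
        df = df.withColumn(col_name, F.col(col_name).cast(col_type))

# Write to a correctly-typed table (hypothetical name) that downstream
# steps and the SQL endpoint can query instead of the landed table.
(df.write
   .format("delta")
   .mode("overwrite")
   .saveAsTable("EmployeeEmploymentStatus_Typed"))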

 

 

