Support Name Column Mapping Mode in SQL Endpoints for Lakehouse

Name Column Mapping Mode is not supported in SQL Endpoints for Lakehouse, so tables with unsupported column names (spaces, special characters, capitalization) do not work.


To reproduce the behaviour:


1. Create a table in Databricks with column mapping enabled.

%python
spark.conf.set("spark.databricks.delta.properties.defaults.minWriterVersion", 5)
spark.conf.set("spark.databricks.delta.properties.defaults.minReaderVersion", 2)
spark.conf.set("spark.databricks.delta.properties.defaults.columnMapping.mode", "name")

2. Add shortcut tables from Databricks to Synapse Lakehouse. The data shows correctly.


3. Go to the SQL Endpoint. The tables fail to load with the error: "Corrective Action: Recreate the table without column mapping property."


4. This has a knock-on effect: data fails to feed into the datasets and Power BI.
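Until column mapping is supported, one workaround for the error above is to recreate the table without the column mapping property, which in turn means the column names themselves must be SQL-endpoint-safe (no spaces, special characters, or case mismatches). A minimal sketch of a name sanitizer that could be used when rewriting such a table (the helper name and the exact sanitization rules are assumptions for illustration, not part of any product):

```python
import re

def sanitize_column_name(name: str) -> str:
    """Make a column name safe for a SQL endpoint that rejects spaces,
    special characters, and mixed case (assumption based on the
    limitation described above)."""
    # Replace anything that is not alphanumeric or underscore
    safe = re.sub(r"[^0-9A-Za-z_]", "_", name)
    # Collapse runs of underscores and trim them from the ends
    safe = re.sub(r"_+", "_", safe).strip("_")
    # Lowercase to avoid capitalization mismatches
    safe = safe.lower()
    # Column names should not start with a digit
    if safe and safe[0].isdigit():
        safe = "_" + safe
    return safe or "_col"

# Example: map original Delta column names to SQL-safe names,
# e.g. before df.withColumnsRenamed(...) and a rewrite without
# column mapping
columns = ["Order ID", "total $ amount", "CustomerName"]
renamed = {c: sanitize_column_name(c) for c in columns}
```

With names rewritten this way, the table no longer needs `columnMapping.mode = "name"`, so the SQL Endpoint and downstream datasets can read it.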

Status: Completed
Comments
CharlesWebbMSFT
Microsoft Employee
We've shipped this idea.
fbcideas_migusr
New Member
Status changed to: Completed