Name column mapping mode is not supported in SQL Endpoints for Lakehouse, so tables whose column names rely on mapping (spaces, special characters, differences in capitalization) do not work.
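A quick way to see which names are affected: the sketch below assumes (based on the symptoms above, not an official specification) that a column name needs mapping whenever it contains anything beyond letters, digits, and underscores.

```python
import re

# Assumed rule, inferred from the symptoms described above: a column name
# can be stored without column mapping only if it is limited to letters,
# digits, and underscores. Anything else forces name-based mapping.
def needs_column_mapping(column_name: str) -> bool:
    return re.fullmatch(r"[A-Za-z0-9_]+", column_name) is None

print(needs_column_mapping("order_id"))      # plain name, no mapping needed
print(needs_column_mapping("Order Total"))   # space forces column mapping
```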
To reproduce the behaviour:
1. Create a table in Databricks with column mapping enabled.
%python
spark.conf.set("spark.databricks.delta.properties.defaults.minWriterVersion", 5)
spark.conf.set("spark.databricks.delta.properties.defaults.minReaderVersion", 2)
spark.conf.set("spark.databricks.delta.properties.defaults.columnMapping.mode", "name")
2. Add shortcut tables from Databricks to Synapse Lakehouse. The data shows correctly.
3. Go to the SQL Endpoint: the tables fail to load with an error. Corrective action: recreate the table without the column mapping property.
4. This has a knock-on effect: the data fails to feed into downstream datasets and Power BI.
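Before recreating tables, it helps to know which ones were created with column mapping. A minimal sketch, assuming the standard Delta log layout (one JSON action per line under `<table>/_delta_log/`, with the table configuration in the `metaData` action); the sample commit entry here is a fabricated stand-in for illustration.

```python
import json

# Hypothetical helper: scan the lines of a Delta commit file and report
# whether the table uses name-based column mapping, which the Lakehouse
# SQL Endpoint cannot read.
def uses_name_column_mapping(commit_lines) -> bool:
    for line in commit_lines:
        entry = json.loads(line)
        meta = entry.get("metaData")
        if meta is not None:
            conf = meta.get("configuration", {})
            return conf.get("delta.columnMapping.mode") == "name"
    return False

# Fabricated commit entry mimicking the real log layout (assumption).
sample_commit = [
    json.dumps({"metaData": {"configuration": {"delta.columnMapping.mode": "name"}}}),
]
print(uses_name_column_mapping(sample_commit))  # True -> table will break the endpoint
```

Tables flagged this way are the ones to recreate (with sanitized column names) so the SQL Endpoint, and everything downstream of it, loads again.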