Column mapping mode ("name") is not supported by the SQL endpoint for a Lakehouse, so tables whose column names require it (spaces, special characters, differing capitalization) do not work.
To reproduce the behaviour:
1. Create a table in Databricks with column mapping enabled.
%python
spark.conf.set("spark.databricks.delta.properties.defaults.minWriterVersion", 5)
spark.conf.set("spark.databricks.delta.properties.defaults.minReaderVersion", 2)
spark.conf.set("spark.databricks.delta.properties.defaults.columnMapping.mode", "name")
2. Add shortcuts to the Databricks tables in the Synapse Lakehouse. The data displays correctly.
3. Go to the SQL endpoint. The tables fail to load, with the error: "Corrective Action: Recreate the table without column mapping property."
4. This has a knock-on effect: data fails to flow into downstream datasets and Power BI reports.
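Since the error's corrective action is to recreate the table without the column mapping property, one workaround is to sanitise column names before writing the Delta table, so that name-mode mapping is never needed. The helper below is a minimal sketch; `sanitize_column_name` and the renaming loop in the comment are illustrative, not part of any Databricks or Fabric API:

```python
import re

def sanitize_column_name(name: str) -> str:
    """Lower-case the name and replace runs of spaces/special
    characters with a single underscore, so the resulting Delta
    table does not need columnMapping.mode = "name"."""
    cleaned = re.sub(r"[^0-9a-z]+", "_", name.strip().lower())
    return cleaned.strip("_")

# Hypothetical usage in a Databricks notebook, before saving the table:
#   for old in df.columns:
#       df = df.withColumnRenamed(old, sanitize_column_name(old))
#   df.write.format("delta").saveAsTable("my_schema.my_table")
```

With names normalised this way, the table can be recreated with default reader/writer versions and shortcuts to it should load in the SQL endpoint.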