Hi, does anyone know if there's a limitation on table columns in the SQL endpoints of a Lakehouse? I've got a table which tracks county statistics that has 720 columns. I was able to load this table with no issues using a notebook. See image 1 below - the table (Raw_CHS) shows up fine in the Lakehouse explorer, and displays preview data with no issues. Also I'm able to query it from a 2nd notebook and build additional dimension tables from it.
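For reference, a minimal sketch of how a wide CSV like this might be loaded into a Lakehouse managed table from a Fabric notebook (the file path and read options here are assumptions, not the exact code used):

```python
# Minimal sketch (hypothetical path): load a wide CSV into a Lakehouse
# managed Delta table from a Fabric notebook, where `spark` is predefined.
df = (
    spark.read
    .option("header", True)       # first row holds the column names
    .option("inferSchema", True)  # let Spark guess the column types
    .csv("Files/chs/county_stats.csv")  # hypothetical file location
)

# saveAsTable with no explicit path creates a managed Delta table,
# which the SQL analytics endpoint is expected to expose.
df.write.mode("overwrite").format("delta").saveAsTable("Raw_CHS")
```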
However, I'm unable to get the table to show up in the SQL endpoint, see image 2 below.
Also, a "select top 100 * from Raw_CHS" fails (image 3) with an Invalid Object Name error.
Any ideas? I was unable to find any documentation describing a limit on the number of columns a table can have.
Thanks,
Scott
Thanks for using Microsoft Fabric Community.
Apologies for the issue that you are facing.
As I understand, you are able to see the "Raw_CHS" table in the Lakehouse explorer, but the same table is not shown in the SQL endpoint.
Could you please confirm the type of the table "Raw_CHS"? Is it a managed table or an external table?
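One way to check this from a notebook, assuming the table lives in the default schema, is to describe it and look at the "Type" row, which reads MANAGED or EXTERNAL:

```python
# DESCRIBE TABLE EXTENDED includes a "Type" row (MANAGED or EXTERNAL)
spark.sql("DESCRIBE TABLE EXTENDED Raw_CHS").show(100, truncate=False)
```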
Hi @v-cboorla-msft , it's definitely a managed table. The only thing "odd" about it is that it has a large number of columns - 700 or so. I hadn't seen this on any other table, so I'm kinda assuming the large # of columns is what did it.
I'm happy to open a support ticket, or take a Teams call from anyone that needs more details.
Thanks,
Scott
Thanks for your response.
We are reaching out to the internal team to get more information related to your query and will get back to you as soon as we have an update.
Apologies for the delay in response.
Following up to see whether you have a resolution for your issue yet.
If you have found a resolution, please share it with the community, as it can be helpful to others.
If you haven't found a resolution, please go ahead and raise a support ticket to reach our support team: https://support.fabric.microsoft.com/support
Please provide the ticket number here so we can keep an eye on it.
Thank you.
Hi @v-cboorla-msft - I actually don't think this is an issue with "too many columns". Instead, I think it's an issue with combining data from multiple .csv input files, where many of them don't have all the columns. I think something is going wrong during the data load to the table (via a notebook using dataframes), but no error is being shown.
I suspect this because I loaded "only" the latest file - and everything worked fine.
Trying now to figure out how to load all of the years' .csv files, plugging in NULLs for columns that don't exist. But I'm not at all fluent in PySpark, so I'm really struggling.
Long and short though, I don't believe it's a limit on the number of columns - everything works fine even with 700 columns if I stick to just a single year.
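One possible approach, sketched here with hypothetical file paths: read each year's file separately, then combine the DataFrames with unionByName, which with allowMissingColumns=True fills NULL for any column a given file does not contain:

```python
from functools import reduce

# Hypothetical per-year file locations
paths = [
    "Files/chs/chs_2021.csv",
    "Files/chs/chs_2022.csv",
    "Files/chs/chs_2023.csv",
]

# Read each file on its own so every DataFrame keeps its own column set
dfs = [spark.read.option("header", True).csv(p) for p in paths]

# unionByName(..., allowMissingColumns=True) aligns columns by name and
# fills NULL for any column a particular year's file is missing
combined = reduce(
    lambda left, right: left.unionByName(right, allowMissingColumns=True),
    dfs,
)

# Write the combined result back out as a managed Delta table
combined.write.mode("overwrite").format("delta").saveAsTable("Raw_CHS")
```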
Thanks,
Scott