I have created a gold_dev_lh, and my first test is to manually add a file via Get data and upload a small csv. It's just not working. The file just sits in Upload files and nothing happens.
Upload files
Current uploads

File Name | Lakehouse Name | Progress
test.csv | gold_dev_lh | 25 KB / 25 KB
Is this another bug?
I added a file to my silver lakehouse yesterday, so I know it can work. I just can't fathom why I can't add anything into the gold lakehouse I just set up.
Hi @DebbieE, thank you for reaching out to the Microsoft Community Forum.
This is unlikely to be a permissions or configuration issue, especially if both Lakehouses were created in the same workspace and under the same user. It's almost certainly a UI glitch rather than a backend failure: the upload is probably completing, but the file isn't showing because of a rendering issue in the upload pane. To confirm whether the file actually uploaded, open a notebook in the same workspace and run the following code:
display(dbutils.fs.ls("Files/"))
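Note that dbutils is a Databricks utility and may not resolve in a Fabric notebook. As an alternative, here is a minimal sketch, assuming the mssparkutils helper is pre-loaded in your Fabric notebook session (as it normally is), listing the same folder:

# mssparkutils is pre-loaded in Fabric notebook sessions, no import needed
for item in mssparkutils.fs.ls("Files/"):
    # each entry exposes the file name, size (in bytes) and full path
    print(item.name, item.size, item.path)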
If your file, like test.csv, appears in the output, the upload succeeded and it's only the UI that failed to reflect it. From there, you can read the file and promote it to a Delta table. If the file doesn't appear, try uploading again. If your Fabric UI is set to another language, switch it to English, and use an incognito window in Chrome or Edge to bypass any cached UI issues. After uploading, refresh the page immediately. These steps often resolve the display issue and make the file visible.
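For the promotion step, here is a minimal sketch, assuming the upload landed at Files/test.csv, that the CSV has a header row, that spark is the session pre-created in a Fabric notebook, and that "test" is just a placeholder table name:

# Read the uploaded CSV from the Lakehouse Files area
df = spark.read.option("header", "true").csv("Files/test.csv")

# Save it as a managed Delta table so it appears under Tables
# and can be queried from SQL or downstream reports
df.write.format("delta").mode("overwrite").saveAsTable("test")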
If this helped solve the issue, please consider marking it "Accept as Solution" so others with similar queries can find it more easily. If not, please share more details; I'm always happy to help.
Thank you.
I already tried this, but I get ModuleNotFoundError: No module named 'pyspark.dbutils'. I don't think this is available in Fabric. It's in Databricks, so I can't do that?
I tried again this morning and it still doesn't work. This is a bit ridiculous. Is this a known bug?
As an addition, I thought I would try loading a small csv into silver to check that this still works. Hours after trying again with gold, I can see this:
Files are currently being uploaded to Fabric from multiple Lakehouses. This will affect the upload time of files to dev_timesheet_silver_lh until other files have finished.
So... this isn't good. It's a tiny csv file.