We've recently moved data into Azure Synapse (via Azure Synapse Link). I periodically (every other day or so?) receive errors like this:
Failure details: The last refresh attempt failed because of an internal service error. This is usually a transient issue. If you try again later and still see this message, contact support.
DataSource.Error: Microsoft SQL: Cannot bulk load because the file "https://ilconsumerdls.dfs.core.windows.net/dataverse-glynlyon-glynlyon2/lead/2022-10.csv" could not be opened. Operating system error code 12(The access code is invalid.). Statement ID: {AE1D6FCE-71A3-41B8-80D3-FC4EFC6B3865} | Query hash: 0x435D07F88FE741DE | Distributed request ID: {C3BA2156-131D-4E5B-93D7-D8892A087212}. Total size of data scanned is 1771 megabytes, total size of data moved is 64 megabytes, total size of data written is 0 megabytes.
It does seem to be transient, but I'm curious as to why it's happening.
Through further Googling, it would seem this is a bug. A workaround is to use the _partitioned equivalent of the Synapse tables. The non-partitioned version is updated in near real-time, whereas the _partitioned version is updated hourly, so there is less chance of an update failing the data refresh, although this isn't sustainable for me.
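For anyone wanting to try the workaround, here's a minimal sketch against the Synapse serverless SQL endpoint. It assumes your Dataverse lake database exposes the entity as dbo.lead with a dbo.lead_partitioned sibling (adjust schema and table names to your environment):

```sql
-- Near-real-time table: its underlying CSVs can be rewritten mid-read,
-- which is what appears to trigger "Cannot bulk load ... could not be
-- opened" during a Power BI refresh.
-- SELECT * FROM dbo.lead;

-- Hourly-snapshot equivalent: reads from stable hourly partitions, so a
-- concurrent Dataverse write is far less likely to invalidate a file
-- while the refresh is reading it.
SELECT *
FROM dbo.lead_partitioned;
```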
Take a look at the link below to see how other customers are facing the same issue:
https://learn.microsoft.com/en-us/answers/questions/778004/reading-data-from-synapse-lake-database-a...
I've recently switched over to using Synapse and am facing the same issue.
Google returns a number of different answers, but the one I found that had the most legs is that the failing CSV file is being updated at the same time you are trying to read from it.
I decided to test this theory. I refreshed a Power BI report and received the same failure as yours (with a different file path, obviously). I then navigated to the entity's CSV file in Synapse and saw it had indeed been updated while Power BI was refreshing, which seems to confirm the theory.
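If you want to reproduce the read outside of Power BI, a minimal sketch is to probe the failing file directly from the serverless SQL endpoint (the URL below is the one from the error in this thread; substitute your own failing path, and note this assumes your login can authenticate to the storage account):

```sql
-- Read the exact CSV the refresh complained about. If the Dataverse
-- export happens to be rewriting it at that moment, this query should
-- fail with the same "Cannot bulk load ... could not be opened" error.
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://ilconsumerdls.dfs.core.windows.net/dataverse-glynlyon-glynlyon2/lead/2022-10.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0'
) AS r;
```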
I find this surprising, as it would mean you cannot refresh or build against the source (Synapse) and have any Power Query evaluation complete while end-users are updating the raw source (which gets pushed through to Synapse).
I'm now attempting to understand whether our Synapse configuration is correct or needs further tweaking.
I'll be interested to hear if you've had to modify any settings, @D_PBI. I too wondered if the error is produced by the file being in use (i.e. being written to).
Hi @dross, according to your error details, it seems the refresh failure is caused by an inability to open the file path https://ilconsumerdls.dfs.core.windows.net/dataverse-glynlyon-glynlyon2/lead/2022-10.csv. If it is convenient, you can go to the corresponding path to check whether the file can be opened normally. You can also check whether this file is different from other files in the same path.
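As a rough sketch of that check from the serverless SQL endpoint (assuming the wildcard path below matches your folder layout and your login can authenticate to the storage account), you can scan every file in the folder and compare the failing one against its siblings:

```sql
-- List each CSV under the lead folder with its row count; a file that
-- errors here, or whose row count looks anomalous, is the one to
-- inspect further in storage.
SELECT
    r.filepath() AS file_path,
    COUNT(*)     AS row_count
FROM OPENROWSET(
    BULK 'https://ilconsumerdls.dfs.core.windows.net/dataverse-glynlyon-glynlyon2/lead/*.csv',
    FORMAT = 'CSV',
    PARSER_VERSION = '2.0'
) AS r
GROUP BY r.filepath();
```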
Best Regards,
Aniya Zhang
If this post helps, then please consider accepting it as the solution to help the other members find it more quickly.