I sometimes get the error below when I hit "Refresh" on a large .pbix file (~1.8 GB .pbix, over 30 GB of raw source data):
"Memory error: Memory Allocation failure. Try simplifying or reducing the number of queries..."
The whole refresh takes about 2 hours. The memory error seems more likely when I am away from my PC with the screen locked,
but there seems to be no problem when I am actively working at the PC...
A "debug.log" file was also found in the same folder as the .pbix file:
debug.log -
"[0611/160027.784:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)
[0705/092804.140:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)
[0707/120147.347:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)
[0708/085610.830:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)
[0809/141903.435:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)
[0812/183403.637:ERROR:directory_reader_win.cc(43)] FindFirstFile: The system cannot find the path specified. (0x3)"
Thanks!
@Anonymous , I face this issue very often with a file of similar size, and reducing the data works for me. I start seeing it once the file passes 2.2 GB.
Do you reduce it at the data source, or just drop columns after import?
I already dropped some unused columns after import, actually.
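For what it's worth, dropping columns late in the query (or in the model) does not always reduce refresh pressure: the full rows may still be pulled from the source first. A common recommendation is to remove the columns in the earliest Power Query step, so the step can fold back to the source. A minimal sketch in Power Query M, where the server, database, table, and column names are all hypothetical placeholders:

```
let
    // Hypothetical source; replace with your own server/database
    Source = Sql.Database("myserver", "mydb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the columns the report actually needs, as the first
    // transform step, so the selection can fold to the source query
    Trimmed = Table.SelectColumns(Sales, {"OrderDate", "CustomerID", "Amount"})
in
    Trimmed
```

On a foldable source (SQL, etc.) this becomes a narrower SELECT, so the unused columns never leave the database.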
@Anonymous , I am actually reducing the data (rows). Ideally, we should use a deployment pipeline to move from Dev to Prod, so the data gets loaded in the service rather than in Desktop. Another way is to use dataflows with the new DirectQuery option.
But I think both are Premium features.
My options as of now are DirectQuery to the source, or a dataflow to reduce the data.
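To reduce rows in Desktop itself, one option is to filter early in Power Query so only the needed slice is imported. A sketch, again with hypothetical source and column names:

```
let
    // Hypothetical source; replace with your own server/database
    Source = Sql.Database("myserver", "mydb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Import only recent rows; on a foldable source this filter
    // becomes a WHERE clause, so the older rows are never transferred
    Recent = Table.SelectRows(Sales, each [OrderDate] >= #date(2023, 1, 1))
in
    Recent
```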
@amitchandak FYI, the online service on shared capacity also has a 2-hour limit on data refresh,
unless you have purchased Premium capacity, which supports refreshes of up to 5 hours.
That is why I would like to find a workaround/solution on the Desktop version.
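One Desktop-side pattern worth trying (a sketch under assumptions, not a guaranteed fix for the memory error): define `RangeStart`/`RangeEnd` date/time parameters and filter the fact table on them. In Desktop you keep the parameters set to a small window so refreshes stay fast and within memory; the same parameter names are what Power BI's incremental refresh feature keys on if you later configure it in the service. All table and column names below are hypothetical:

```
// RangeStart and RangeEnd are DateTime parameters created via
// Manage Parameters, e.g. #datetime(2024, 1, 1, 0, 0, 0)
let
    Source = Sql.Database("myserver", "mydb"),
    Sales = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the rows inside the parameter window; note the
    // half-open interval (>= start, < end) to avoid double-counting
    Windowed = Table.SelectRows(
        Sales,
        each [OrderDateTime] >= RangeStart and [OrderDateTime] < RangeEnd)
in
    Windowed
```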