I am trying to process a large volume of data (61.3 GB) in an Analysis Services table. The data has 300 million records and 21 columns. Whenever I try to process the data using SQL Server Analysis Services, I get the following error:
Failed to save modifications to the server. Error returned: 'The number of items in the list is too large. Buffered lists can support up to 2147483647 items and streamed lists can support up to 4503599627370496 items.. The exception was raised by the IDbCommand interface.
I need to process this volume of data. What are the options for pushing this amount of data into an Analysis Services table? Please help.
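One hedged way to read this error (an assumption on my part, not confirmed by the message itself): if the buffered list counts individual cells rather than rows, then 300 million rows across 21 columns already exceeds the 2,147,483,647-item limit, even though the row count alone does not:

```python
# Sanity-check arithmetic for the buffered-list limit quoted in the error.
BUFFERED_LIMIT = 2_147_483_647   # max items in a buffered list, per the error text
rows = 300_000_000               # row count from the question
columns = 21                     # column count from the question

cells = rows * columns
print(cells)                   # 6300000000
print(cells > BUFFERED_LIMIT)  # True: cell count exceeds the buffered limit
print(rows > BUFFERED_LIMIT)   # False: row count alone is within the limit
```

If that reading is right, it also explains why splitting the table into smaller partitions (as in the accepted workaround below in the thread is reported to do) keeps each buffered batch under the limit.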
Hi @rakesmanna,
Did you connect to SQL Server Analysis Services using DirectQuery mode or Import mode?
Regards,
Yuliana Gu
Thanks @v-yulgu-msft
Actually, I am importing data from Azure Data Lake Storage Gen1, where only Import mode is available. To overcome the problem, I partitioned the table into smaller chunks so that no partition contains more than 10 million records. Then I created a smaller table with the same structure but only a few records, and imported that smaller table first. Finally, I used SQL Server Analysis Services to process all the partitions for the original data.
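The partitioning arithmetic described above can be sketched as follows. This is a minimal illustration, not the actual SSAS partition definition: the 10-million-row cap and the 300-million-row total come from the thread, while the helper name and the row-range representation are my own. Each range would correspond to one partition's source filter (for example, a WHERE clause on a row-number or date key).

```python
def partition_ranges(total_rows: int, max_rows_per_partition: int = 10_000_000):
    """Split a row count into contiguous [start, end) ranges, each holding at
    most max_rows_per_partition rows, one range per intended SSAS partition."""
    ranges = []
    start = 0
    while start < total_rows:
        end = min(start + max_rows_per_partition, total_rows)
        ranges.append((start, end))
        start = end
    return ranges

# 300 million rows split at a 10-million-row cap -> 30 partitions
chunks = partition_ranges(300_000_000)
print(len(chunks))   # 30
print(chunks[0])     # (0, 10000000)
print(chunks[-1])    # (290000000, 300000000)
```

Processing each partition separately then keeps every buffered batch well under the limit quoted in the error message, which matches the workaround reported above.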