I can’t insert a large amount of data into a worksheet.

The error is “Insufficient memory to continue the execution of the program”.

I’m using Excel 2007. My array has 393,210 rows and 29 columns.

First I tried to insert it in a single operation (Range = (object) array).

After that I divided the array into several parts.

My code:

```
// Fill the worksheet in chunks of at most row_limit rows per write
const int row_limit = 283500;
int row_start = 0;
int row_end;
int number_of_iterations = (int)Math.Ceiling((double)dtable.Rows.Count / (double)row_limit);
for (int idx = 0; idx < number_of_iterations; idx++)
{
    if (idx == number_of_iterations - 1)
        row_end = dtable.Rows.Count - 1;
    else
        row_end = row_start + row_limit - 1;

    // Copy the current chunk from the DataTable into a 2D object array
    object[,] arr = new object[row_end - row_start + 1, dtable.Columns.Count];
    int arr_row = 0;
    for (int r = row_start; r <= row_end; r++)
    {
        for (int c = 0; c < dtable.Columns.Count; c++)
        {
            arr[arr_row, c] = dtable.Rows[r][c];
        }
        arr_row++;
    }

    // Write the whole chunk to the sheet in one assignment
    ((Excel.Range)ex_sheet.get_Range(
        (Excel.Range)ex_sheet.Cells[row_start + 2, 1],
        (Excel.Range)ex_sheet.Cells[row_end + 2, dtable.Columns.Count])).Value = arr;
    row_start = row_end + 1;
}
```

The constant row_limit sets the maximum number of rows inserted in one operation. If the number is 283,000 or less, everything is OK, but if it is more than 283,500 I get the error.

As I found out here, the worksheet size limit is “1,048,576 rows by 16,384 columns”.

What is the reason for this error?

Excel 2007 is limited to about 2GB of memory (it is a 32-bit program), but in practice it often runs out of memory well before reaching that limit due to memory fragmentation problems.

Internally, Excel uses a sparse-array scheme to track the cells in use on a worksheet. So although the published limits of 1,048,576 rows and 16,384 columns are correct, it is not possible to use all of those cells without exceeding the 2 GB limit (you would need over 1,000 GB).
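As a rough back-of-the-envelope check (assuming on the order of 64 bytes of workbook memory per populated cell, which is an illustrative assumption, not a published figure):

$$1{,}048{,}576 \times 16{,}384 = 2^{34} = 17{,}179{,}869{,}184 \text{ cells}$$

$$17{,}179{,}869{,}184 \text{ cells} \times 64 \text{ bytes/cell} = 2^{40} \approx 1.1 \times 10^{12} \text{ bytes} \approx 1{,}024 \text{ GB}$$

which is consistent with the "over 1,000 GB" figure and far beyond a 32-bit process's 2 GB address space.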

The 64-bit versions of Excel 2010 and 2013 have much greater memory limits.

See http://www.decisionmodels.com/memlimitsc.htm for some details of memory limits by Excel version.
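One practical workaround, sketched below rather than a definitive fix, is to write in smaller chunks and explicitly release each interop `Range` object after the write, so that COM runtime callable wrappers do not accumulate in the 32-bit process. This assumes `ex_sheet` is an `Excel.Worksheet` and `dtable` a `System.Data.DataTable`, as in the question; the chunk size of 50,000 is an arbitrary choice well below the failure point.

```
// Sketch: smaller chunks plus explicit COM cleanup.
// Requires a reference to Microsoft.Office.Interop.Excel.
const int row_limit = 50000; // assumption: comfortably below ~283,500
int row_start = 0;
while (row_start < dtable.Rows.Count)
{
    int row_end = Math.Min(row_start + row_limit - 1, dtable.Rows.Count - 1);

    // Copy the chunk into a 2D object array
    object[,] arr = new object[row_end - row_start + 1, dtable.Columns.Count];
    for (int r = row_start; r <= row_end; r++)
        for (int c = 0; c < dtable.Columns.Count; c++)
            arr[r - row_start, c] = dtable.Rows[r][c];

    // Write the chunk, then release the interop references
    Excel.Range range = (Excel.Range)ex_sheet.get_Range(
        (Excel.Range)ex_sheet.Cells[row_start + 2, 1],
        (Excel.Range)ex_sheet.Cells[row_end + 2, dtable.Columns.Count]);
    range.Value2 = arr;
    System.Runtime.InteropServices.Marshal.ReleaseComObject(range);

    row_start = row_end + 1;
}
```

Releasing the `Range` after each chunk keeps the number of live COM references bounded; it will not raise the 2 GB ceiling, but it can delay the point at which fragmentation exhausts it.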

Tags: c#