I have a big problem with Python, openpyxl and Excel files. My objective is to write some calculated data to a preconfigured Excel template: I load the template and write the data to it. There are two problems:
- I’m talking about writing Excel workbooks with more than 2 million cells, divided into several sheets.
- I can do this successfully, but the waiting time is unacceptable.
I don’t know of any other way to solve this problem. Maybe openpyxl is not the solution. I have tried writing in xlsb, but I don’t think openpyxl supports that format. I have also tried the optimized writer and reader, but the problem comes when I save, because of the amount of data. Oddly, the output file is 10 MB at most. I’m very stuck. Do you know another way to do this?
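For reference, the approach I’m using is essentially the following (paths, dimensions, and the fill values here are placeholders, not my real data):

```python
import openpyxl


def fill_template(template_path, output_path, rows=1000, cols=10):
    # Load the preconfigured template; in default mode openpyxl
    # keeps every cell of the workbook in memory at once.
    wb = openpyxl.load_workbook(template_path)
    ws = wb.active

    # Write the calculated data cell by cell (dummy values here).
    for r in range(1, rows + 1):
        for c in range(1, cols + 1):
            ws.cell(row=r, column=c).value = r * c

    # Saving is where the time goes once there are millions of cells.
    wb.save(output_path)
```

With several sheets of this size, the save step alone takes far longer than computing the data.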
Thanks in advance.
The file size isn’t really the issue when it comes to memory use; what matters is the number of cells held in memory. Your use case will push openpyxl to its limits: at the moment it is designed to support either optimised reading or optimised writing, but not both at the same time. One thing you might try is reading with use_iterators=True; this gives you a generator that you can feed to xlsxwriter, which should be able to write the new file for you. xlsxwriter is currently significantly faster than openpyxl at creating files. The solution isn’t perfect, but it might work for you.
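A minimal sketch of that read-then-stream approach, assuming both libraries are installed. Note that use_iterators=True was renamed read_only=True in later openpyxl releases, and xlsxwriter’s constant_memory option keeps the writer’s footprint flat by flushing each row to disk as it is written. The function name and paths are placeholders:

```python
import openpyxl
import xlsxwriter


def stream_copy(src_path, dst_path):
    # Streaming read: rows are yielded lazily instead of loading
    # every cell into memory (use_iterators=True in openpyxl 1.x,
    # read_only=True in 2.x and later).
    src = openpyxl.load_workbook(src_path, read_only=True)

    # constant_memory mode writes each row straight to disk, so
    # memory use stays flat regardless of workbook size.
    dst = xlsxwriter.Workbook(dst_path, {"constant_memory": True})

    for ws in src.worksheets:
        out = dst.add_worksheet(ws.title)
        for r, row in enumerate(ws.iter_rows()):
            for c, cell in enumerate(row):
                if cell.value is not None:
                    out.write(r, c, cell.value)

    dst.close()
    src.close()
```

One constraint of constant_memory mode is that rows must be written in order, which iter_rows() naturally satisfies.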