It is very unlikely that you get an out-of-memory error just from processing a ZIP file. The Java classes ZipFile
and ZipEntry
don't contain anything that could possibly fill up 613 MB of memory.
What could exhaust your memory is keeping the decompressed files of the ZIP archive in memory, or - even worse - keeping them as an XML DOM, which is very memory-intensive.
Switching to another ZIP library will hardly help. Instead, you should look into changing your code so that it processes the ZIP archive and the contained files like streams and only keeps a limited part of each file in memory at a time.
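As a rough sketch of that streaming approach, using the standard java.util.zip.ZipInputStream: each entry is read in fixed-size chunks, so only one small buffer is held in memory at a time (the chunk handling below is just a placeholder for whatever you do with the data).

```java
import java.io.IOException;
import java.io.InputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class ZipStreamDemo {

    /**
     * Walks every entry of the archive, reading it in 8 KB chunks.
     * Returns the total number of decompressed bytes, as a stand-in
     * for real per-chunk processing.
     */
    static long processEntries(InputStream source) throws IOException {
        long totalBytes = 0;
        byte[] buffer = new byte[8192];
        try (ZipInputStream zip = new ZipInputStream(source)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                int read;
                while ((read = zip.read(buffer)) != -1) {
                    // Hand the chunk to your processing/upload logic here
                    // instead of merely counting bytes.
                    totalBytes += read;
                }
                zip.closeEntry();
            }
        }
        return totalBytes;
    }
}
```

The key point is that memory use is bounded by the buffer size, not by the size of the archive or of any single entry.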
BTW: It would be nice if you could provide more information about the huge ZIP files (do they contain many small files or a few large files?) and about what you do with each ZIP entry.
Update:
Thanks for the additional information. It looks like you keep the contents of the ZIP file in memory (although that somewhat depends on the implementation of the S3Object
class, which I don't know).
It's probably best to implement some sort of batching as you propose yourself. You could for example add up the decompressed size of each ZIP entry and upload the files every time the total size exceeds 100 MB.
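One way such batching could look, with a hypothetical uploadBatch callback standing in for your actual S3 upload call, and the size limit passed in as a parameter (e.g. 100L * 1024 * 1024 for the 100 MB suggested above):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.zip.ZipEntry;
import java.util.zip.ZipInputStream;

public class BatchingUploader {

    /** Placeholder for the actual S3 upload call (hypothetical). */
    interface BatchSink {
        void uploadBatch(List<byte[]> files) throws IOException;
    }

    /**
     * Streams the archive, accumulates decompressed entries, and flushes
     * a batch whenever the accumulated size reaches limitBytes.
     * Returns the number of batches uploaded.
     */
    static int processInBatches(InputStream source, long limitBytes, BatchSink sink)
            throws IOException {
        List<byte[]> batch = new ArrayList<>();
        long batchSize = 0;
        int batches = 0;
        byte[] buffer = new byte[8192];
        try (ZipInputStream zip = new ZipInputStream(source)) {
            ZipEntry entry;
            while ((entry = zip.getNextEntry()) != null) {
                ByteArrayOutputStream file = new ByteArrayOutputStream();
                int read;
                while ((read = zip.read(buffer)) != -1) {
                    file.write(buffer, 0, read);
                }
                batch.add(file.toByteArray());
                batchSize += file.size();
                // Flush once the accumulated decompressed size crosses the limit.
                if (batchSize >= limitBytes) {
                    sink.uploadBatch(batch);
                    batches++;
                    batch = new ArrayList<>();
                    batchSize = 0;
                }
                zip.closeEntry();
            }
        }
        // Upload whatever is left over after the last full batch.
        if (!batch.isEmpty()) {
            sink.uploadBatch(batch);
            batches++;
        }
        return batches;
    }
}
```

This way, peak memory is bounded by roughly one batch plus the entry currently being decompressed, regardless of the total archive size.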
--------------------------------
The above explains the cause on the Java side; I'm not sure whether the same cause applies on Android.
I haven't found a solution yet.