Rabatscher Michael
2008-06-25 10:37:11 UTC
Hi all!
I'm using an Apache module and want to implement file uploads.
These files are quite big (up to 100 MB) and many concurrent
uploads must be supported. The problem is that the WebBroker
module is quite memory intensive, and many of our analysis
modules are memory intensive too.
I found out that the Apache module first copies the whole
request into a string (fContent), which is then parsed by the
ContentParser (a derivative of Matlus's content parser). This
parser in turn moves the content into a TMemoryStream, so the
data is held in memory at least twice, which can lead to memory
problems.
Has anyone written a more intelligent parser, or could I reduce
the memory footprint by using memory-mapped files (instead of
the fContent variable) for large data chunks?
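For illustration, here is a minimal sketch of the kind of wrapper
I have in mind (assuming a Windows Apache; TMappedFile is just a
name I made up, and it assumes the upload has already been spooled
to a temp file, so the parser could read from the mapped view
instead of fContent):

unit MappedUpload;

interface

uses
  Windows, SysUtils;

type
  { Hypothetical read-only memory-mapped file wrapper; not part of
    WebBroker. Exposes the file contents as a pointer + size so a
    parser can scan it without copying it into a string or stream. }
  TMappedFile = class
  private
    FFile: THandle;
    FMapping: THandle;
    FData: PByte;
    FSize: Int64;
  public
    constructor Create(const FileName: string);
    destructor Destroy; override;
    property Data: PByte read FData;
    property Size: Int64 read FSize;
  end;

implementation

constructor TMappedFile.Create(const FileName: string);
begin
  inherited Create;
  // Open the spooled upload file for shared, read-only access.
  FFile := CreateFile(PChar(FileName), GENERIC_READ, FILE_SHARE_READ,
    nil, OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  if FFile = INVALID_HANDLE_VALUE then
    RaiseLastOSError;
  // Query the 64-bit file size (low and high DWORD parts).
  Int64Rec(FSize).Lo := GetFileSize(FFile, @Int64Rec(FSize).Hi);
  // Map the whole file read-only; note this fails for empty files.
  FMapping := CreateFileMapping(FFile, nil, PAGE_READONLY, 0, 0, nil);
  if FMapping = 0 then
    RaiseLastOSError;
  FData := MapViewOfFile(FMapping, FILE_MAP_READ, 0, 0, 0);
  if FData = nil then
    RaiseLastOSError;
end;

destructor TMappedFile.Destroy;
begin
  if FData <> nil then
    UnmapViewOfFile(FData);
  if FMapping <> 0 then
    CloseHandle(FMapping);
  if (FFile <> 0) and (FFile <> INVALID_HANDLE_VALUE) then
    CloseHandle(FFile);
  inherited;
end;

end.

The OS would then page the data in and out on demand, so the
per-request footprint stays small even with many 100 MB uploads
in flight, provided the parser works on Data/Size directly.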
kind regards
Mike