
javascript – Reduce memory consumption in PHP while handling uploads via php://input

Posted by: admin, July 12, 2020


I have nginx 1.0.5 + php-cgi (PHP 5.3.6) running.
I need to upload ~1GB files (with 1–5 parallel uploads).
I am trying to implement uploading of big files through an ajax upload. Everything works, but PHP eats a lot of memory for each upload. I have set memory_limit = 200M, but it only works up to an uploaded file size of ~150MB; if the file is bigger, the upload fails. I could keep raising memory_limit, but I think that is the wrong way, since PHP could then eat all available memory.
I use this PHP code (simplified) to handle uploads on the server side:

// Stream the raw request body to a temp file in 100KB chunks.
$input = fopen('php://input', 'rb');
$file = fopen('/tmp/' . $_GET['file'] . microtime(), 'wb');
while (!feof($input)) {
    fwrite($file, fread($input, 102400));
}
fclose($input);
fclose($file);
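For context, the client side of such a raw-body ajax upload might look like the sketch below. The `upload.php` endpoint name is an assumption; the `file` query parameter matches the `$_GET['file']` in the PHP handler above.

```javascript
// Pure helper: build the upload URL for a given file name.
// 'upload.php' is an assumed endpoint name, not from the question.
function buildUploadUrl(name) {
  return 'upload.php?file=' + encodeURIComponent(name);
}

// Browser-only part: send the File object as the raw request body,
// which is exactly what the PHP side reads back from php://input.
function uploadRaw(file) {
  const xhr = new XMLHttpRequest();
  xhr.open('POST', buildUploadUrl(file.name));
  // Raw body, no multipart encoding.
  xhr.setRequestHeader('Content-Type', 'application/octet-stream');
  xhr.send(file);
  return xhr;
}
```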

And here is my nginx config (abridged):

user www-data;
worker_processes 100;
pid /var/run/nginx.pid;

events {
        worker_connections 768;
        # multi_accept on;
}

http {

        # Basic Settings

        sendfile on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 65;
        types_hash_max_size 2048;
        client_max_body_size 2g;
        # server_tokens off;
        server_names_hash_max_size 2048;
        server_names_hash_bucket_size 128;

        # server_names_hash_bucket_size 64;
        # server_name_in_redirect off;

        include /etc/nginx/mime.types;
        default_type application/octet-stream;

        # Logging Settings

        access_log /var/log/nginx/access.log;
        error_log /var/log/nginx/error.log;

        # Gzip Settings

        gzip on;
        gzip_disable "msie6";

        include /etc/nginx/conf.d/*.conf;
        include /etc/nginx/sites-enabled/*;
}


server {
    listen  80;
    server_name srv.project.loc;

    # Define root
    set $fs_webroot "/home/andser/public_html/project/srv";
    root $fs_webroot;
    index   index.php;

    # robots.txt
    location = /robots.txt {
        alias $fs_webroot/deny.robots.txt;
    }

    # Domain root
    location / {
        if ($request_method = OPTIONS ) {
            add_header Access-Control-Allow-Origin "http://project.loc";
            add_header Access-Control-Allow-Methods "GET, OPTIONS, POST";
            add_header Access-Control-Allow-Headers "Authorization,X-Requested-With,X-File-Name,Content-Type";
            #add_header Access-Control-Allow-Headers "*";
            add_header Access-Control-Allow-Credentials "true";
            add_header Access-Control-Max-Age "10000";
            add_header Content-Length 0;
            add_header Content-Type text/plain;
            return 200;
        }
        try_files $uri $uri/ /index.php?$query_string;
    }

    #error_page  404  /404.htm

    location ~ index.php {
        fastcgi_index   index.php;
        fastcgi_param   SCRIPT_FILENAME $fs_webroot/$fastcgi_script_name;
        include fastcgi_params;
        fastcgi_param   REQUEST_METHOD  $request_method;
        fastcgi_param   PATH_INFO   $fastcgi_script_name;

        add_header Pragma no-cache;
        add_header Cache-Control no-cache,must-revalidate;
        add_header Access-Control-Allow-Origin *;
        #add_header Access-Control-Allow-Headers "Content-Type, X-Requested-With, X-File-Name";
    }
}

Does anybody know a way to reduce PHP's memory consumption here?

Answers:

There's a hack: fake the Content-Type header, changing it from application/octet-stream to multipart/form-data. This stops PHP from populating $HTTP_RAW_POST_DATA. More details: https://github.com/valums/file-uploader/issues/61.
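Sketched in JavaScript, the hack amounts to one header line. The function and URL names below are illustrative, not from the linked issue; the substantive part is the faked Content-Type value.

```javascript
// The value PHP 5.x is tricked into seeing, so it does not copy the
// request body into $HTTP_RAW_POST_DATA.
function fakeContentType() {
  return 'multipart/form-data';
}

// Browser-only part: upload the raw file bytes under the faked header.
function uploadWithFakedType(url, file) {
  const xhr = new XMLHttpRequest();
  xhr.open('POST', url);
  xhr.setRequestHeader('Content-Type', fakeContentType());
  xhr.send(file); // body is still the raw file, readable via php://input
  return xhr;
}
```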


I have been in the same situation before, and this is what I did: split the files into chunks during the upload process.

A good example is Plupload (http://www.plupload.com/index.php), or try a Java applet such as JUpload (http://jupload.sourceforge.net), which also has resume capability when there are network issues, etc.

The most important point is that if you want your files uploaded via a web browser, there is nothing stopping you from doing so in chunks.
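A minimal sketch of browser-side chunking, assuming 5MB chunks and a hypothetical `offset` query parameter for the server to append at; the byte-range math is the only part taken as given.

```javascript
// Pure helper: compute [start, end) byte ranges covering totalSize.
function chunkRanges(totalSize, chunkSize) {
  const ranges = [];
  for (let start = 0; start < totalSize; start += chunkSize) {
    ranges.push([start, Math.min(start + chunkSize, totalSize)]);
  }
  return ranges;
}

// Browser-only part: slice the File and send each piece in turn.
// Synchronous XHR is used only to keep the sketch short.
function uploadInChunks(file, url, chunkSize = 5 * 1024 * 1024) {
  for (const [start, end] of chunkRanges(file.size, chunkSize)) {
    const xhr = new XMLHttpRequest();
    xhr.open('POST', url + '?offset=' + start, false);
    xhr.send(file.slice(start, end));
  }
}
```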


Why don't you try using Flash to upload huge files? For example, you could try SWFUpload, which has good support for PHP.