
php – Import large file on MySQL DB

Posted by: admin, July 12, 2020

Questions:

I want to run about 50,000 INSERT queries against a MySQL database. I have two options:

1- Directly import the (.sql) file:
This fails with the following error:
” You probably tried to upload too large file. Please refer to documentation for ways to workaround this limit. ”

2- Use PHP code to run these queries in chunks read from the (.sql) file.
Here is my code:

<?php

// Configure DB
include "config.php";

// Get file data
$file = file('country.txt'); 

// Set pointers & position variables
$position = 0;
$eof = 0; 

while ($eof < sizeof($file))
{
    for ($i = $position; $i < ($position + 2); $i++)
    {
        if ($i < sizeof($file))
        {
            $flag = mysql_query($file[$i]);

            // mysql_query() returns false on failure, so test the value
            // itself -- isset($flag) would be true even after an error.
            if ($flag)
            {
                echo "Insert Successfully<br />";
                $position++;
            }

            else
            {
                echo mysql_error() . "<br>\n";
            }
        }

        else
        {
            echo "<br />End of File";
            break;
        }
    }

    $eof++;
}

?>

But a memory size error occurs, even though I have extended the memory limit from 128M to 256M and even 512M.

Then I thought that if I could load a limited number of rows from the (.sql) file, say 1000 at a time, and execute those queries, it might be possible to import all the records from the file to the DB.
But I have no idea how to track the current position in the file, or how to update the start and end positions so that previously fetched rows are not fetched again from the .sql file.

Answers:

Here is the code you need, now prettified! =D

<?php

include('config.php');

$file = @fopen('country.txt', 'r');

if ($file)
{
    while (!feof($file))
    {
        $line = trim(fgets($file));

        // Skip blank lines -- mysql_query('') would report an error.
        if ($line === '')
        {
            continue;
        }

        $flag = mysql_query($line);

        // Check the return value itself: mysql_query() returns false
        // on error, so isset($flag) would always be true.
        if ($flag)
        {
            echo 'Insert Successfully<br />';
        }

        else
        {
            echo mysql_error() . '<br/>';
        }

        flush();
    }

    fclose($file);
}

echo '<br />End of File';

?>

Basically it’s a less greedy version of your code: instead of loading the whole file into memory, it reads and executes the SQL statements in small chunks (one line at a time).
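If you would rather work in batches of 1000 lines, as the question suggests, you do not need to track offsets by hand: an open file handle already remembers its position between reads. A minimal sketch of that idea (read_chunk is a made-up helper name, not part of any API):

```php
<?php
// Read up to $limit non-empty lines from an already-open file handle.
// The handle keeps track of the position itself, so repeated calls
// naturally continue where the previous one stopped -- no manual
// start/end bookkeeping is needed.
function read_chunk($handle, int $limit): array
{
    $lines = [];
    while (count($lines) < $limit && ($line = fgets($handle)) !== false) {
        $line = trim($line);
        if ($line !== '') {
            $lines[] = $line;
        }
    }
    return $lines;
}

// Usage sketch (country.txt as in the question):
// $handle = fopen('country.txt', 'r');
// while ($chunk = read_chunk($handle, 1000)) {
//     foreach ($chunk as $query) {
//         // run $query against the database here
//     }
// }
// fclose($handle);
```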

Answer:

Instead of loading the entire file into memory, which is what happens when using the file function, a possible solution would be to read it line by line, using a combination of fopen, fgets, and fclose — the idea being to read only what you need, deal with the lines you have, and only then read the next couple of ones.

Additionally, you might want to take a look at this answer: Best practice: Import mySQL file in PHP; split queries

There is no accepted answer yet, but some of the given answers might already help you…
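One common refinement when splitting queries into batches is to wrap each batch in a transaction, so MySQL commits once per thousand statements instead of once per row. A hedged sketch using PDO (run_in_batches is a made-up helper name; PDO works with both MySQL and SQLite DSNs):

```php
<?php
// Sketch: run an array of SQL statements in batches, one transaction
// per batch, so a 50,000-statement import commits in large chunks
// instead of autocommitting every single row.
// run_in_batches() is illustrative, not from the question.
function run_in_batches(PDO $db, array $statements, int $batchSize = 1000): int
{
    $done = 0;
    foreach (array_chunk($statements, $batchSize) as $batch) {
        $db->beginTransaction();
        foreach ($batch as $sql) {
            $db->exec($sql);
            $done++;
        }
        $db->commit();
    }
    return $done;
}
```

With MySQL you would construct the PDO object with a mysql: DSN, e.g. new PDO('mysql:host=localhost;dbname=test', $user, $pass).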

Answer:

Use the command-line client; it is far more efficient and should easily handle 50K inserts:

 mysql -u user -p database_name < dump.sql
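If the dump is still too big to import in one go, it can also be split on the command line first. A sketch (assumes a Unix-like system with split, and that the dump holds one statement per line; the sample dump and the commented mysql loop are illustrative):

```shell
# Toy stand-in for a big dump: 2,500 one-line INSERT statements.
seq 1 2500 | sed 's/.*/INSERT INTO t VALUES (&);/' > dump.sql

# Split into pieces of 1000 lines each: chunk_aa, chunk_ab, chunk_ac
split -l 1000 dump.sql chunk_

# Then import each piece in turn (needs the mysql client and real
# credentials):
# for f in chunk_*; do
#     mysql -u user -p database_name < "$f"
# done
```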

Answer:

I recently read about inserting lots of queries into a database too quickly. The article suggested using the sleep() (or usleep()) function to pause between queries so as not to overload the MySQL server.
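That suggestion could look like the following sketch (throttled_batches and the 0.1-second pause are illustrative choices, not from the article; the callable stands in for whatever function actually runs one query):

```php
<?php
// Sketch: run statements in batches and pause briefly after each
// batch so a long import does not monopolize the MySQL server.
// throttled_batches() is a made-up name; $run is whatever executes
// one SQL statement (e.g. a wrapper around mysqli_query).
function throttled_batches(array $statements, callable $run, int $batchSize = 1000, int $pauseMicros = 100000): int
{
    $done = 0;
    foreach (array_chunk($statements, $batchSize) as $batch) {
        foreach ($batch as $sql) {
            $run($sql);
            $done++;
        }
        usleep($pauseMicros); // e.g. 100,000 us = 0.1 s between batches
    }
    return $done;
}
```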