
converting database from mysql to mongoDb

Posted by: admin November 1, 2017

Questions:

Is there an easy way to migrate a database from MySQL to MongoDB?

Or better, can anyone suggest a good tutorial for doing it?

Answers:

Is there an easy way to migrate a database from MySQL to MongoDB?

Method #1: export from MySQL in CSV format and then use the mongoimport tool. However, this does not always work well in terms of handling dates or binary data.

Method #2: script the transfer in your language of choice. Basically you write a program that reads everything from MySQL one element at a time and then inserts it into MongoDB.

Method #2 is better than #1, but it is still not adequate.
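In outline, Method #2 amounts to a read-transform-insert loop. A minimal sketch of the transform step, with the read and write sides stubbed out (the table and field names are invented here; the full Node.js answer further down shows the real drivers):

```javascript
// Shape of Method #2: pull rows from MySQL, reshape each row into a
// document, then insert the documents into MongoDB. Only the reshaping
// step is real in this sketch.
function transformRow(row) {
    return {
        _id: row.id,                        // promote the SQL primary key to _id
        name: row.name,
        created: new Date(row.created_at)   // turn date strings into real Date objects
    };
}

// Stand-in for the result of "SELECT id, name, created_at FROM users"
var mysqlRows = [{ id: 7, name: 'Ann', created_at: '2017-11-01T00:00:00Z' }];

var docs = mysqlRows.map(transformRow);
console.log(docs[0]._id, docs[0].created instanceof Date); // → 7 true
```

In a real migration the `mysqlRows` array would come from the MySQL driver and `docs` would go to `collection.insertMany`.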

MongoDB uses collections instead of tables. MongoDB does not support joins. In every database I’ve seen, this means that your data structure in MongoDB is different from the structure in MySQL.

Because of this, there is no “universal tool” for porting SQL to MongoDB. Your data will need to be transformed before it reaches MongoDB.
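For example, a one-to-many join in MySQL typically becomes an embedded array in MongoDB. A minimal sketch of that transformation (the table and field names are made up for illustration):

```javascript
// Turn two joined MySQL result sets (users and their orders) into
// MongoDB-style documents with the orders embedded in each user.
function embedOrders(userRows, orderRows) {
    return userRows.map(function (user) {
        return {
            _id: user.id,                  // reuse the SQL primary key as _id
            name: user.name,
            orders: orderRows
                .filter(function (o) { return o.user_id === user.id; })
                .map(function (o) { return { item: o.item, qty: o.qty }; })
        };
    });
}

var users  = [{ id: 1, name: 'Ann' }, { id: 2, name: 'Bob' }];
var orders = [
    { user_id: 1, item: 'book', qty: 2 },
    { user_id: 1, item: 'pen',  qty: 1 },
    { user_id: 2, item: 'mug',  qty: 3 }
];

var docs = embedOrders(users, orders);
console.log(JSON.stringify(docs[0]));
// → {"_id":1,"name":"Ann","orders":[{"item":"book","qty":2},{"item":"pen","qty":1}]}
```

This is exactly the kind of restructuring no generic tool can decide for you: whether orders belong embedded in users, or in their own collection, depends on how your application queries them.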

Answers:

If you’re using Ruby, you can also try Mongify.

It’s a super simple way to transform your data from an RDBMS to MongoDB without losing anything.

Mongify will read your MySQL database and build a translation file for you; all you have to do is map how you want your data transformed.

It supports:

  • Auto-updating IDs (to BSON ObjectId)
  • Updating referencing IDs
  • Type casting values
  • Embedding tables into other documents
  • Before-save filters (to allow manual changes to the data)
  • and much, much more…

Read more about it at: http://mongify.com/getting_started.html

There is also a short 5 min video on the homepage that shows you how easy it is.

Answers:

MongoVUE’s free version can do this automatically for you.

It can connect to both databases and perform the import.

Answers:

I am kind of partial to Talend Open Studio for these kinds of migration jobs. It is an Eclipse-based solution for creating data migration “scripts” in a visual way. I do not like visual programming, but this is a problem domain where I make an exception.

Adrien Mogenet has created a MongoDBConnection plugin for MongoDB.

It is probably overkill for a “simple” migration, but it is a cool tool.

Mind, however, that the suggestion of Nix will probably save you time if it is a one-off migration.

Answers:

You can use the QCubed (http://qcu.be) framework for that. The procedure would be something like this:

  1. Install QCubed (http://www.thetrozone.com/qcubed-installation)
  2. Do the codegen on your database. (http://www.thetrozone.com/php-code-generation-qcubed-eliminating-sql-hassle)
  3. Take your database offline from the rest of the world so that only one operation runs at a time.
  4. Now write a script that reads all rows from all tables of the database and uses getJson on all objects to get the JSON. You can then convert the data to an array and push it into MongoDB!

Answers:

If you are looking for a tool to do it for you, good luck.

My suggestion is to just pick your language of choice, and read from one and write to another.

Answers:

If I could quote Matt Briggs (it solved my problem one time):

The driver way is by FAR the most straightforward. The import/export tools are fantastic, but only if you are using them as a pair. You are in for a wild ride if your table includes dates and you try to export from the db and import into mongo.

You are lucky too, being in C#. We are using Ruby, and have a 32-million-row table we migrated to mongo. Our ending solution was to craft an insane SQL statement in Postgres that output JSON (including some pretty kludgy things to get dates going properly) and piped the output of that query on the command line into mongoimport. It took an incredibly frustrating day to write, and is not the sort of thing that can ever really be changed.

So if you can get away with it, use ADO.NET with the mongo driver. If not, I wish you well 🙂

(note that this is coming from a total mongo fanboi)
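To illustrate the date problem the quote mentions: MySQL serializes DATETIME columns as plain 'YYYY-MM-DD HH:MM:SS' strings, which end up stored as strings rather than BSON dates unless a script converts them explicitly. A minimal sketch (assuming the values are in UTC):

```javascript
// Convert a MySQL DATETIME string into a real Date object before insertion,
// so it is stored as a BSON date rather than a plain string.
function mysqlDatetimeToDate(value) {
    // 'YYYY-MM-DD HH:MM:SS' → ISO 8601; the trailing 'Z' assumes UTC values.
    return new Date(value.replace(' ', 'T') + 'Z');
}

var d = mysqlDatetimeToDate('2017-11-01 09:30:00');
console.log(d.toISOString()); // → 2017-11-01T09:30:00.000Z
```

If your MySQL server stores local times instead of UTC, the offset handling would need to change accordingly.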

MySQL is very similar to other SQL databases, so I refer you to the topic:
Convert SQL table to mongoDB document

Answers:

You can use the following project. It requires a Solr-like configuration file to be written. It’s very simple and straightforward.

http://code.google.com/p/sql-to-mongo-importer/

Answers:

Try this:
Automated conversion of a MySQL dump to Mongo updates using simple r2n mappings.
https://github.com/virtimus/mysql2mongo

Answers:

Here’s what I did with Node.js for this purpose:

var mysql = require('mysql');
var MongoClient = require('mongodb').MongoClient;

function getMysqlTables(mysqlConnection, callback) {
    mysqlConnection.query("show full tables where Table_Type = 'BASE TABLE';", function(error, results, fields) {
        if (error) {
            callback(error);
        } else {
            var tables = [];
            results.forEach(function (row) {
                for (var key in row) {
                    if (row.hasOwnProperty(key)) {
                        if(key.startsWith('Tables_in')) {
                            tables.push(row[key]);
                        }
                    }
                }
            });
            callback(null, tables);
        }
    });

}

function tableToCollection(mysqlConnection, tableName, mongoCollection, callback) {
    // Backtick-quote the table name in case it collides with a reserved word.
    var sql = 'SELECT * FROM `' + tableName + '`;';
    mysqlConnection.query(sql, function (error, results, fields) {
        if (error) {
            callback(error);
        } else {
            if (results.length > 0) {
                mongoCollection.insertMany(results, {}, function (error) {
                    if (error) {
                        callback(error);
                    } else {
                        callback(null);
                    }
                });
            } else {
                callback(null);
            }
        }
    });
}

MongoClient.connect("mongodb://localhost:27017/importedDb", function (error, db) {
    if (error) throw error;

    var MysqlCon = mysql.createConnection({
        host: 'localhost',
        user: 'root',
        password: 'root',
        port: 8889,
        database: 'dbToExport'
    });

    MysqlCon.connect();

    getMysqlTables(MysqlCon, function(error, tables) {
        if (error) throw error;

        var jobs = 0;
        tables.forEach(function(table) {
            var collection = db.collection(table);
            ++jobs;
            tableToCollection(MysqlCon, table, collection, function(error) {
                if (error) throw error;
                --jobs;
            });
        });

        // Wait for all jobs to complete before closing the database connections.
        // The counter is incremented synchronously in the loop above, so it
        // cannot hit zero before every table has been scheduled.
        var interval = setInterval(function() {
            if (jobs <= 0) {
                clearInterval(interval);
                console.log('done!');
                db.close();
                MysqlCon.end();
            }
        }, 300);
    });
});

Answers:

If anyone’s still looking for a solution, I found that the easiest way is to write a PHP script to connect to your SQL DB, retrieve the information you want using the usual SELECT statement, transform the information into JSON using PHP’s JSON encode functions, and simply output your results to a file or directly to MongoDB. It’s actually pretty simple and straightforward; the only thing to do is to double-check your output against a JSON validator. You may have to use functions such as explode to replace certain characters and symbols to make it valid. I have done this before; I don’t currently have the script at hand, but from what I can remember it was literally half a page of code.

Oh, also remember that Mongo is a document store, so some data mapping is required to make your data acceptable to Mongo.
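The same idea works in Node.js: emit one JSON document per line (newline-delimited JSON), which mongoimport accepts directly. A minimal sketch with hard-coded stand-ins for a real SELECT result:

```javascript
// Dump query results as newline-delimited JSON, one document per line,
// ready to be piped into `mongoimport`. The rows here are hard-coded
// stand-ins for what the MySQL driver would return.
function rowsToNdjson(rows) {
    return rows.map(function (row) { return JSON.stringify(row); }).join('\n');
}

var rows = [
    { id: 1, name: 'Ann', city: 'Oslo' },
    { id: 2, name: 'Bob', city: 'Lima' }
];

console.log(rowsToNdjson(rows));
// → {"id":1,"name":"Ann","city":"Oslo"}
//   {"id":2,"name":"Bob","city":"Lima"}
```

Using JSON.stringify per row sidesteps the hand-validation step: the output is valid JSON by construction, with special characters escaped for you.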