
How to merge local and live databases?

Posted by: admin November 5, 2017


We’ve been developing for WordPress for several years and whilst our workflow has been upgraded at several points there’s one thing that we’ve never solved… merging a local WordPress database with a live database.

So I’m talking about having a local version of the site where files and data are changed, whilst the data on the live site is also changing at the same time.

All I can find is the perfect-world scenario of pulling the site down, nobody (not even customers) touching the live site, then pushing the local site back up. I.e. copying one thing over the other.

How can this be done without running a tonne of mysql commands? (It feels like they could fall over if they’re not properly checked!) Can this be done via Gulp (I’ve seen it mentioned) or a plugin?

Just to be clear, I’m not talking about pushing/pulling data back and forth via something like WP Migrate DB Pro, BackupBuddy or anything similar – this is a merge, not replacing one database with another.

I would love to know how other developers get around this!

File changes are fairly simple to get around, it’s when there’s data changes that it causes the nightmare.

WP Stagecoach does do a merge but you can’t work locally, it creates a staging site from the live site that you’re supposed to work on. The merge works great but it’s a killer blow not to be able to work locally.

I’ve also been told by the developers that datahawk.io will do what I want but there’s no release date on that.


It sounds like VersionPress might do what you need:

VersionPress staging

A couple of caveats: I haven’t used it, so can’t vouch for its effectiveness; and it’s currently in early access.


Important: take a backup of the live database before merging local data into it.

Following these steps should migrate a large percentage of the data and merge it into the live site:

  1. In the wp back-end of the local site, go to Tools->Export.
  2. Select the "All content" radio button (if not selected by default).
  3. This will produce an XML file containing all the local data, comprising all default post types and custom post types.
  4. Open this XML file in Notepad++ or any editor and find-and-replace the local URL with the live URL.
  5. Now visit the live site and import the XML under Tools->Import.
  6. Upload the files (images) manually.
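Step 4 can be scripted rather than done by hand in an editor. A minimal sketch (the URLs and sample content below are placeholders, not from any real site):

```python
def replace_site_url(xml_text, local_url, live_url):
    """Swap every occurrence of the local site URL for the live one
    in a WordPress export file's contents."""
    return xml_text.replace(local_url, live_url)

# Illustrative fragment of an export file:
sample = '<guid>http://mysite.local/?p=12</guid>'
print(replace_site_url(sample, "http://mysite.local", "https://mysite.com"))
# <guid>https://mysite.com/?p=12</guid>
```

In practice you would read the whole export file, run it through the function, and write the result back out before importing it on the live site.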

This will bring a large percentage of the data from local to live.

For the rest of the data you will have to write custom scripts.

The risk factors are:

  1. When uploading the images from local to live, images with the same name will be overwritten.
  2. WordPress saves image data in post_meta as serialized data, which should be taken care of when uploading the database.
  3. The serialized data in post_meta for post_type="attachment" stores 3 or 4 resized dimensions of each image.
  4. When importing the data, usernames or email addresses may collide with existing users (WordPress enforces unique usernames and emails), in which case those users will not be imported.
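Risks 2 and 3 come down to the fact that PHP-serialized strings record their byte length, so a naive find-and-replace on URLs corrupts them. A sketch of a length fixer (the regex is deliberately simplistic and assumes no `";` sequence occurs inside the stored strings; real tools unserialize properly):

```python
import re

def fix_serialized_lengths(data):
    """Recompute the byte length in PHP-serialized strings (s:N:"...";)
    after a find-and-replace has changed their contents.
    Simplistic sketch: assumes no '";' inside the strings themselves."""
    return re.sub(
        r's:\d+:"(.*?)";',
        lambda m: 's:%d:"%s";' % (len(m.group(1).encode("utf-8")), m.group(1)),
        data,
    )

# A URL replacement changes the string length, breaking the s:17 prefix:
meta = 's:17:"http://local.test";'
meta = meta.replace("http://local.test", "https://live.example.com")
print(fix_serialized_lengths(meta))  # s:24:"https://live.example.com";
```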

If I were you I’d do the following (slow, but it affords you the greatest chance of success).

First off, set up a third database somewhere. Cloud services would probably be ideal, since you could get a powerful server with an SSD for a couple of hours. You’ll need that horsepower.

Second, we’re going to mysqldump the first DB and pipe the output into our cloud DB.

mysqldump -u user -ppassword dbname | mysql -u root -ppass -h somecloud.db.internet dbname

Now we have a full copy of DB #1. If your cloud supports snapshotting data, be sure to take one now.

The last step is to write a PHP script that, slowly but surely, selects the data from the second DB and writes it to the third. We want to do this one record at a time. Why? Well, we need to maintain the relationships between records. So let’s take comments and posts. When we pull post #1 from DB #2, it won’t be able to keep ID #1 because DB #1 already has a post with that ID. So post #1 becomes, say, post #132. That means all the comments for post #1 now need to be written as belonging to post #132. You’ll also have to pull the records for the users who made those posts, because their user IDs will also change.

There’s no easy fix for this, but the WP structure isn’t terribly complex. Building a simple loop to pull the data and translate it shouldn’t be more than a couple of hours of work.
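The loop described above can be sketched like this (SQLite stands in for MySQL, and the schema is pared down to the columns that matter; everything beyond the WordPress table and column names is illustrative):

```python
import sqlite3

target = sqlite3.connect(":memory:")   # the merged (third) database
source = sqlite3.connect(":memory:")   # DB #2, being merged in

for db in (target, source):
    db.execute("CREATE TABLE wp_posts "
               "(ID INTEGER PRIMARY KEY AUTOINCREMENT, post_title TEXT)")
    db.execute("CREATE TABLE wp_comments "
               "(comment_ID INTEGER PRIMARY KEY AUTOINCREMENT, "
               "comment_post_ID INTEGER, comment_content TEXT)")

# The target already holds a post from DB #1, occupying ID 1.
target.execute("INSERT INTO wp_posts (post_title) VALUES ('Live post')")

# DB #2 has its own post #1, with a comment pointing at it.
source.execute("INSERT INTO wp_posts (post_title) VALUES ('Local post')")
source.execute("INSERT INTO wp_comments (comment_post_ID, comment_content) "
               "VALUES (1, 'Nice!')")

# Copy posts one record at a time, recording old ID -> new ID.
post_id_map = {}
for old_id, title in source.execute("SELECT ID, post_title FROM wp_posts"):
    cur = target.execute("INSERT INTO wp_posts (post_title) VALUES (?)", (title,))
    post_id_map[old_id] = cur.lastrowid   # post #1 becomes post #2 here

# Comments follow, with their foreign key rewritten through the map.
for post_id, content in source.execute(
        "SELECT comment_post_ID, comment_content FROM wp_comments"):
    target.execute("INSERT INTO wp_comments (comment_post_ID, comment_content) "
                   "VALUES (?, ?)", (post_id_map[post_id], content))

print(post_id_map)  # {1: 2}
```

The same old-ID-to-new-ID map would be kept for users and any other table whose primary key is referenced elsewhere.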


If I understand you, to merge local and live databases I have so far used other software such as Navicat Premium, which has a Data Sync feature.


This can be achieved live using Spring XD: create a JDBC stream to pull data from one database and insert it into the other. (This acts as streaming, so you don’t have to disturb either environment.)


The first thing you need to do is assess whether it would be easier to do some copy-paste data entry instead of a migration script. Sometimes the best answer is to suck it up and do it manually using the CMS interface. This avoids any potential conflicts with merging primary keys, but you may need to watch for references like the creator of a post or similar data.

If it’s just outright too much to manually migrate, you’re stuck with writing a script or finding one that is already written for you. Assuming there’s nothing out there, here’s what you do…


1) Make a list of what you need to transfer. Do you need users, posts, etc.? Find the database tables and add them to the list.

2) Make a note of all possible foreign keys in the database tables being merged into the new database. For example, wp_posts has post_author referencing wp_users. These will need specific attention during the migration. Use the database schema documentation to help find them.

3) Once you know what tables you need and what they reference, you need to write the script. Start by figuring out what content is new for the other database. The safest way is to do this manually with some kind of side-by-side list. However, you can come up with your own rules on how to automatically match table rows. Maybe check for $post1->post_content === $post2->post_content in cases where the text needs to be identical. The only catch here is that primary/foreign keys are off limits for these rules.

4) How do you merge new content? The general idea is that all primary keys will need to be changed for any new content. You want to insert everything except the post’s id into the new database. The auto-increment will create the new id, so you won’t need the previous id (unless you want it for script output/debug).

5) The tricky part is handling the foreign keys. This process is going to vary wildly depending on what you plan on migrating. What you need to know is which foreign key goes to which (possibly new) primary key. If you’re only migrating posts, you may need to hard-code a user id to user id mapping for the post_author column, then use this to replace the values.

But what if I don’t know the user ids for the mapping because some users also need to be migrated?

This is where it gets tricky. You will need to first define the merge rules to see if a user already exists. For new users, you need to record the id of each newly inserted user. Then, after all users are migrated, the post_author value will need to be replaced wherever it references a newly merged user.
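The user-merge rule plus post_author rewrite can be sketched with plain data structures (matching on email is just one possible merge rule; all ids, emails, and titles below are made up):

```python
# Step 5 sketch: merge users by email, then rewrite post_author.
live_users = {1: "alice@example.com"}                       # already on live
local_users = {1: "alice@example.com", 2: "bob@example.com"}  # being merged in
local_posts = [{"post_title": "Hello", "post_author": 2}]

user_id_map = {}
next_live_id = max(live_users) + 1
for old_id, email in local_users.items():
    existing = next((lid for lid, e in live_users.items() if e == email), None)
    if existing is not None:
        user_id_map[old_id] = existing    # merge rule: same email = same user
    else:
        live_users[next_live_id] = email  # genuinely new user gets a fresh id
        user_id_map[old_id] = next_live_id
        next_live_id += 1

# Rewrite the foreign key on every migrated post through the map.
for post in local_posts:
    post["post_author"] = user_id_map[post["post_author"]]

print(user_id_map)                    # {1: 1, 2: 2}
print(local_posts[0]["post_author"])  # 2
```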

6) Write and test the script! Test it on dummy databases first. And again, make backups before using it on your databases!


I’ve done something similar with an ETL (Extract, Transform, Load) process when I was moving data from one CMS to another.

Rather than writing a script I used the Pentaho Data Integration (Kettle) tool.

The idea of ETL is pretty much straightforward:

  • Extract the data (for instance from one database)
  • Transform it to suit your needs
  • Load it to the final destination (your second database).

The tool is easy to use and it allows you to experiment with various steps and outputs to investigate the data. Once you design the right ETL process, you are ready to merge those databases of yours.
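The three ETL stages can be reduced to a tiny sketch (SQLite stands in for the two databases, and the URL rewrite is a placeholder transform):

```python
import sqlite3

src = sqlite3.connect(":memory:")  # source database
dst = sqlite3.connect(":memory:")  # final destination
src.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, url TEXT)")
dst.execute("CREATE TABLE posts (id INTEGER PRIMARY KEY, url TEXT)")
src.execute("INSERT INTO posts (url) VALUES ('http://mysite.local/?p=1')")

# Extract
rows = src.execute("SELECT id, url FROM posts").fetchall()
# Transform: rewrite URLs for the live domain (placeholder domains)
rows = [(i, u.replace("http://mysite.local", "https://mysite.com"))
        for i, u in rows]
# Load
dst.executemany("INSERT INTO posts (id, url) VALUES (?, ?)", rows)

print(dst.execute("SELECT url FROM posts").fetchone()[0])
# https://mysite.com/?p=1
```

A tool like Kettle lets you build each of these stages graphically and inspect the data between them, rather than coding the pipeline by hand.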


How can this be done without running a tonne of mysql commands?

No way. If both the local and live sites are running at the same time, how can you avoid ending up with the same IDs holding different content?


If you want to do this, you could use MySQL replication; I think it will help you merge the different MySQL databases.