The other day I was testing a feature on my development box when I realized my local data was really out of date, and if I was going to get anywhere with my testing, I needed some recent data from production. The problem is that our main MySQL database is just enormous, and restoring a 10GB dump takes hours on even the fastest box.

Sure, you can point and laugh about why we're not using a better database, like maybe PostgreSQL, but since our sites are based on WordPress, we're stuck with MySQL and have to make the best of it.

Luckily I remembered that command-line arguments exist, and looked to see whether there was a way to export just a single table with mysqldump. Sure enough, there is. The syntax goes like this:

mysqldump [OPTIONS] database [tables]

Of course that example sucks, so here's a better one -- if you want to back up only the wp_users table from your WordPress database named wordpressdb, you'd do something closer to this, which exports it all into a file called users.bak:

mysqldump -uUser -pPassword -hHostname wordpressdb wp_users > users.bak

You'd want to replace User, Password, and Hostname with the DB_USER, DB_PASSWORD, and DB_HOST values from your wp-config.php, of course.
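If it's been a while since you've opened that file, those constants live near the top of wp-config.php and look something like this (the values here are placeholders, obviously):

define( 'DB_NAME', 'wordpressdb' );
define( 'DB_USER', 'User' );
define( 'DB_PASSWORD', 'Password' );
define( 'DB_HOST', 'Hostname' );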

If you wanted to export multiple tables -- for instance, once you realize that exporting wp_users without wp_usermeta won't get you very far -- you simply add the extra table names to the command:

mysqldump -uUser -pPassword -hHostname wordpressdb wp_users wp_usermeta > users.bak
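And since the whole point of this exercise is avoiding a giant transfer, it's worth noting you can compress the dump on its way out. This is just a sketch using the same placeholder credentials, but piping through gzip works fine:

mysqldump -uUser -pPassword -hHostname wordpressdb wp_users wp_usermeta | gzip > users.bak.gz

The .gz file is what you'd actually copy over; SQL dumps tend to shrink dramatically under gzip.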

To import that backup later on, transfer it wherever you want to use it, and run something like this:

mysql -uUser -pPassword -hHostname databasename < users.bak
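One caveat: mysql won't create the target database for you, so if databasename doesn't exist yet on the box you're importing to, create it first with something like:

mysql -uUser -pPassword -hHostname -e "CREATE DATABASE IF NOT EXISTS databasename"

Also worth knowing: mysqldump writes DROP TABLE IF EXISTS statements by default, so importing the backup will replace any existing copies of those tables. And if you went the gzip route above, you can feed the compressed file straight in without ever unpacking it:

gunzip -c users.bak.gz | mysql -uUser -pPassword -hHostname databasename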

And that's the story of how a couple of tables got exported and my local database is finally up to date.