Back Up MySQL Data, Migrate, and Restore into a New Database

June 18, 2019
Let’s walk through the process step by step.

1. Backup the data

The first step is to take a dump of the data you want to transfer. For that, we will use the mysqldump command. Its basic syntax is:
mysqldump -u [username] -p [database] > dump.sql
If the database is on a remote server, either log in to that system over ssh and run the command there, or use the -h and -P options to specify the remote host and port respectively:
mysqldump -P [port] -h [host] -u [username] -p [database] > dump.sql
The command offers many options; let’s go through the major ones by use case.

Backing up specific Databases

mysqldump -u [username] -p [database] > dump.sql
This command dumps the specified database to the file.
You can specify multiple databases for the dump using the following command:
mysqldump -u [username] -p --databases [database1] [database2] > dump.sql
You can use the --all-databases option to back up all databases on the MySQL instance.
mysqldump -u [username] -p --all-databases > dump.sql

Specific tables

The commands above dump all the tables in the specified database. If you need to back up only specific tables, use the following command:
mysqldump -u [username] -p [database] [table1] [table2] > dump.sql

Custom query

If you want to back up data matching a custom condition, use the --where option provided by mysqldump. The value is the body of a WHERE clause (without the WHERE keyword); remember to quote string literals:
mysqldump -u [username] -p [database] [table1] --where="condition" > dump.sql

Example: mysqldump -u root -p testdb table1 --where="mycolumn = 'myvalue'" > dump.sql

Note

  1. By default, mysqldump includes DROP TABLE and CREATE TABLE statements in the created dump. Hence, if you are using incremental backups or specifically want to restore data without deleting existing data, make sure you use the --no-create-info option while creating the dump.
mysqldump -u [username] -p [database] --no-create-info > dump.sql
  2. If you need to copy just the schema but not the data, use the --no-data option while creating the dump.
    mysqldump -u [username] -p [database] --no-data > dump.sql
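If you run backups regularly, it helps to date-stamp and compress the dump files. The following is a minimal sketch of such a script; the user name, database name, and backup directory are hypothetical placeholders, and the script only prints the command it would run so you can verify it before removing the echo:

```shell
#!/bin/sh
# Sketch of a date-stamped, compressed backup (hypothetical credentials/paths).
# The command is only printed here; drop the echo to actually execute it.
DB_USER="backup_user"              # assumed MySQL user
DB_NAME="testdb"                   # assumed database name
BACKUP_DIR="/var/backups/mysql"    # assumed backup location

STAMP=$(date +%F)                  # e.g. 2019-06-18
OUTFILE="$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz"

# gzip compresses the dump on the fly; --single-transaction gives a
# consistent snapshot of InnoDB tables without locking them.
echo "mysqldump --single-transaction -u $DB_USER -p $DB_NAME | gzip > $OUTFILE"
```

A script like this can then be scheduled with cron so old dumps are never silently overwritten.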

2. Copying the database dump to the destination server

Once you have created the dump to your specification, the next step is to transfer the dump file to the destination server. We will use the scp command for that.
scp -P [port] [dump_file].sql [username]@[servername]:[path on destination]
Note that -P here is the SSH port of the destination server (22 by default), not the MySQL port.
Examples:
scp dump.sql root@130.243.18.62:/var/data/mysql
scp -P 2222 dump.sql root@130.243.18.62:/var/data/mysql
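For large dumps it is worth verifying that the file arrived intact before restoring it. A minimal sketch using sha256sum follows; a throwaway temp file stands in for dump.sql so the example is self-contained, and the destination path shown in the comment is the hypothetical one from the scp example above:

```shell
#!/bin/sh
# Sketch: checksum the dump before and after transfer and compare the values.
# A temp file stands in for dump.sql to keep the example self-contained.
DUMP=$(mktemp)
echo "CREATE TABLE t1 (id INT);" > "$DUMP"

SUM=$(sha256sum "$DUMP" | awk '{print $1}')
echo "local checksum: $SUM"

# After the scp, run the same command on the destination and compare:
#   sha256sum /var/data/mysql/dump.sql
rm -f "$DUMP"
```

If the two checksums match, the transfer did not corrupt or truncate the file.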

3. Restoring the dump

The last step is restoring the data on the destination server. The mysql client can load the dump directly:
mysql -u [username] -p [database] < [dump_file].sql
Example:
mysql -u root -p testdb < dump.sql
If your dump includes multiple databases, don’t specify a database in the above command.
mysql -u root -p < dump.sql
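When the two servers can reach each other directly, steps 1–3 can also be collapsed into a single pipeline that streams the dump over ssh without writing an intermediate file. The sketch below only prints the pipeline as a template; the host is the hypothetical one from the scp example, and in practice you would supply passwords via an option file such as ~/.my.cnf, since interactive -p prompts do not mix well with pipes:

```shell
#!/bin/sh
# Sketch: stream the dump straight into the destination over ssh,
# skipping the intermediate dump file. Printed, not executed, as a template.
SRC_DB="testdb"                    # assumed source database
DEST_HOST="root@130.243.18.62"     # hypothetical destination host

PIPELINE="mysqldump -u root -p $SRC_DB | ssh $DEST_HOST \"mysql -u root -p $SRC_DB\""
echo "$PIPELINE"
```

This avoids needing disk space for the dump on either machine, at the cost of not keeping a backup file you could restore from later.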

Need to do it regularly? There is a better way!

Following the above steps, you can migrate your database between two servers easily, but it can become quite cumbersome as a recurring task. An all-in-one solution like Hevo takes care of this effortlessly and helps manage all your data pipelines in an elegant and fault-tolerant manner.
This is how the process looks when done through Hevo:
  • Provide your Source and Destination server MySQL credentials and connection details to Hevo.
  • Write any business logic transformations if needed in the Hevo UI.
  • Lastly, define your tables and fields mapping for your databases in Hevo UI.
That’s it! You are all set.
Hevo will automatically catalog all your table schemas and will do all the necessary transformations for moving data between these MySQL instances. Hevo will fetch the data from your source MySQL server in an incremental fashion and restore that seamlessly onto the destination MySQL instance. Hevo will also alert you if there are any schema changes or network failures through email and Slack.
All of this can be achieved from the Hevo UI, no need to manage any servers or cron jobs.
You can also check our detailed tutorial on How to move data from MySQL to Amazon Redshift and analysis on whether you should build or buy your own data pipelines.

Source: hevodata.com
