Sample database preparation
- Setup:
  1.1 Clone this repo: https://github.com/datacharmer/test_db
  1.2 mysql < employees.sql (run from inside the cloned repo, since employees.sql sources its data files by relative path)
- Check database info
MariaDB [(none)]> use employees;
MariaDB [employees]> select count(1) from salaries;
+----------+
| count(1) |
+----------+
| 2844047 |
+----------+
Let's pick the salaries table, since it has about 2.8 million rows, and migrate it into another database named 'test'.
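One step worth making explicit before Case 1: the import errors out with "Unknown database" if the target database does not exist yet, so create it first. A minimal sketch (the helper name is mine, not from this walkthrough):

```shell
# Create the target database if it is missing; `mysql test < dump.sql`
# fails otherwise. Helper name is illustrative, not from the article.
create_target_db() {
  mysql -u root -p -e "CREATE DATABASE IF NOT EXISTS ${1:-test};"
}
# usage: create_target_db test
```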
Case 1: export table(s) and import them into another database with the same table structure.
Use mysqldump
Export:
mysqldump -u root -p employees salaries > /tmp/salaries.sql
Note: the `--databases` option would write CREATE DATABASE and USE `employees` statements into the dump, which on import would redirect the data back into the source database. Passing the database and table as plain arguments avoids that, which matters here since the goal is to load into a different database.
There are many more options; e.g., you can add `--insert-ignore` when you need it.
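A sketch combining a couple of common mysqldump flags (the wrapper name is mine; pick only the flags you actually need):

```shell
# Illustrative wrapper showing extra mysqldump flags:
#   --insert-ignore       write INSERT IGNORE so duplicate rows are skipped
#   --single-transaction  consistent InnoDB snapshot without locking the table
dump_with_options() {
  mysqldump -u root -p --single-transaction --insert-ignore employees salaries \
    > /tmp/salaries.sql
}
# usage: dump_with_options
```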
Import:
mysql -u root -p test < /tmp/salaries.sql
The entire export-and-import process took under 10 seconds (1 CPU, 2 GB RAM).
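To confirm the copy landed intact, compare row counts on the source and target. A sketch assuming both databases live on the same local server (the helper name is mine):

```shell
# Print the source and target row counts; they should match (2844047 above).
# -N suppresses the column-name header for cleaner output.
compare_counts() {
  mysql -u root -p -N -e \
    "SELECT COUNT(1) FROM employees.salaries; SELECT COUNT(1) FROM test.salaries;"
}
# usage: compare_counts
```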
mysqldump is the standard approach for backing up and restoring tables with the same structure. It is fast enough for a single table with millions of rows, but if you have multiple such tables, consider running mysqldump on each table in parallel to speed up the process.
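The parallel idea can be sketched with `xargs -P` (the function name and table list are my own examples; this assumes credentials in ~/.my.cnf, since an interactive `-p` prompt would block the parallel runs):

```shell
# Dump each listed table of a database in parallel, 4 at a time,
# writing one file per table. Assumes credentials in ~/.my.cnf.
dump_tables() {
  db="$1"; shift
  printf '%s\n' "$@" |
    xargs -P 4 -I {} sh -c "mysqldump -u root $db {} > /tmp/{}.sql"
}
# usage: dump_tables employees salaries titles dept_emp
```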