We wanted to migrate from one database to another. For the sake of simplicity, let's say we moved from MySQL to PostgreSQL. Some of our tables had 100 million rows, so this was not an easy migration. Many things can go wrong, because even in the best-case scenario it takes at least an hour to move this amount of data. Since it's really sensitive production data, we wanted to know about every possible issue before it happened. This allowed us to plan failover scenarios.
So the question was: how do we test with this amount of data? We created sqlfuzz, a tool which fills the necessary data into the database very quickly.
Let’s say we have a sample employees table created:
CREATE TABLE employees (
emp_no INT NOT NULL,
birth_date DATE NOT NULL,
first_name VARCHAR(14) NOT NULL,
last_name VARCHAR(16) NOT NULL,
gender ENUM ('M','F') NOT NULL,
hire_date DATE NOT NULL,
PRIMARY KEY (emp_no)
);
We want to test it with 50,000 rows, so we run sqlfuzz with the following command:
$ sqlfuzz -u test -p test -d test -h 127.0.0.1 -t employees -n 50000 -w 100
2021/03/23 13:14:20 Fuzzing employees table taken: 7.557800032s
It finished in under 8 seconds, and we got the expected result:
mysql> select count(*) from employees;
+----------+
| count(*) |
+----------+
|    50000 |
+----------+
1 row in set (0.01 sec)
The installation process is easy. Under the GitHub project we created binaries for most OS/architecture pairs. Each release includes them, so after downloading one you can start fuzzing tables right away.
Unsupported OS/architecture pairs can also be built easily, since it's written in Go and the only requirement is Go 1.15+.
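Building from source should look roughly like the following; the `<org>` placeholder is illustrative, so substitute the actual repository URL from the GitHub project:

```shell
# Clone the repository (replace <org> with the real GitHub organization) and build.
git clone https://github.com/<org>/sqlfuzz.git
cd sqlfuzz
go build ./...

# Cross-compiling for an unsupported OS/architecture pair only needs the
# standard Go environment variables, e.g. for Linux on ARM64:
GOOS=linux GOARCH=arm64 go build ./...
```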
We warmly welcome any feedback or issues on the GitHub project, and we will try to implement suggestions as soon as possible. We are working on the project to make production-replicated testing easier. GitHub stars are also welcome as support, to help the tool become more widely known. If the community grows, the increased feedback will help us create the best tool for this purpose.