Best way to transfer millions of rows from one database to another

Question:

There are two databases: DatabaseA and DatabaseB with the same schema. These databases have millions of rows.

The plan is to convert them into one single database (DatabaseCommon) and multiple schemas (schemaA and schemaB).

What is the best way to transfer data, indexes, etc. from the existing databases to DatabaseCommon?

Is there a Postgres script that can handle this use-case?

We can have downtime. There will be around 20 databases. Users are restricted to their own schema.

Answer:

First rename the schemas in the current databases to have the names you want to end up with:

alter schema public rename to schema_a;

Do the same in the other databases. Whether you need to change your apps, SQL functions, etc. depends on the coding standards you used in the first place. This can probably be done with the system online, provided your apps can cope and you juggle search_path appropriately.
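One way to handle the search_path juggling is to set it per role, so each application keeps finding its tables under the new schema name. A minimal sketch in shell, assuming databases named database_a/database_b used by roles app_user_a/app_user_b (substitute your real names); it previews the psql commands rather than running them:

```shell
# Sketch only: database and role names (database_a, app_user_a, ...) are
# assumptions. The script builds the psql invocations and echoes them as a
# preview; replace echo with eval once you have verified them.
CMDS=""
for suffix in a b; do
    for sql in \
        "alter schema public rename to schema_${suffix};" \
        "alter role app_user_${suffix} set search_path = schema_${suffix};"
    do
        cmd="psql -d database_${suffix} -c \"${sql}\""
        CMDS="${CMDS}${cmd}
"
        echo "$cmd"    # preview only; swap for: eval "$cmd"
    done
done
```

Setting search_path on the role (rather than in every session) means the apps need no code change as long as they connect as their own role and use unqualified table names.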

Then lock people out of each database in turn, and use pg_dump to dump out the schema and data, and load it into the successor database.

pg_dump database_a -n schema_a | psql -1 database_common
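With around 20 databases, the dump-and-load step is worth scripting. A sketch under the same naming assumption (database_a holds schema_a, and so on); it only previews each pipeline, since you will want to run them one database at a time while that database is locked down. The -n flag restricts the dump to the named schema, and psql -1 wraps the restore in a single transaction so a failure leaves database_common unchanged:

```shell
# Sketch only: suffixes and database names are assumptions. Each command is
# echoed as a preview; replace echo with eval to actually run the migration.
CMDS=""
for suffix in a b; do
    cmd="pg_dump database_${suffix} -n schema_${suffix} | psql -1 database_common"
    CMDS="${CMDS}${cmd}
"
    echo "$cmd"    # preview only; swap for: eval "$cmd"
done
```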

If you have extensions in special schemas, elaborate systems of roles and grants, and the like, you might need some extra work to migrate them cleanly. There is no substitute for having a test/QA database and using it for testing and practicing.
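One detail on roles and grants: they are cluster-wide objects, so pg_dump does not copy them. If database_common lives in a different cluster from the source databases, carry the roles over first with pg_dumpall. A sketch with assumed host names (source_host, target_host), again shown as a preview:

```shell
# Roles, memberships, and tablespaces are cluster-wide; pg_dumpall -g
# (--globals-only) dumps just those. Host names here are assumptions.
CMD="pg_dumpall --globals-only -h source_host | psql -h target_host -d postgres"
echo "$CMD"    # preview only; run it by hand once verified against a test cluster
```

If everything is in one cluster already, the roles are there and this step is unnecessary.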
