Recover Single Table From Large Postgres Database

Do you run a large Postgres database and can't spare the time for a full restore? Then you may have found a quick solution.

"Hey, I deleted my data. Can you get it back?"

If you're the owner of a large database, you probably shudder when this ticket lands on your desk. But if you have a handful of backups, you might be in for a treat with this post.

In this post, we'll go over how to recover a few rows from a single table in a 2TB Postgres database.

1. Download a custom format backup locally

Our app is hosted on Heroku, so we take daily backups of our database with pg_dump. These are stored in pg_dump's custom format, the most flexible of its output formats: it's compressed and lets pg_restore pull out individual tables.

Whilst we could rely on Heroku's Continuous Protection, which lets you fork your database to an earlier point in time (handy for undoing an accidental TRUNCATE), that only goes back a few days and doesn't cover us if we need to recover data from several weeks ago.

To download the latest backup, we can use the Heroku CLI:

heroku pg:backups:download

2. Create a new local Postgres database

We'll need a new local Postgres database to restore our single table into. To create one, run:

createdb temporary-recovery-db

3. Create a temporary table to recover to

Next, create a table matching the schema of the one we're recovering. For our users table:

CREATE TABLE users (
    id SERIAL PRIMARY KEY,
    name TEXT
);

By going this route, you avoid a full restore tripping over foreign keys and other constraints.

4. Import the backup into the temporary table

Now that we have a table to restore into, we can import just that table from the backup:

pg_restore --data-only -t users -d temporary-recovery-db backup-dump-name.dump

Then connect with psql (psql temporary-recovery-db) and check all the data is there:

SELECT COUNT(*) FROM users;

Note that this command only restores the single table. Even on a 2TB database it takes just a minute or two (most of that is pg_restore finding the right part of the dump). That's far faster than a full restore, which could take days on some machines.

5. Export to CSV

Now that we have our data in a table, we can export it to CSV with the following command:

psql temporary-recovery-db -c "COPY (SELECT * FROM users) TO STDOUT WITH CSV HEADER" > users.csv
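
Before moving the file anywhere, it's worth a quick sanity check that the export has the columns and rows you expect. A minimal Ruby sketch, where the EXPECTED_HEADERS list and check_export helper are illustrative, not part of the CLI workflow:

```ruby
require 'csv'

# Columns we expect in the export -- adjust this to your table's schema.
EXPECTED_HEADERS = %w[id name].freeze

# Parse the CSV text, verify the header row, and return the row count.
def check_export(csv_text)
  rows = CSV.parse(csv_text, headers: true)
  raise "unexpected columns: #{rows.headers.inspect}" unless rows.headers == EXPECTED_HEADERS
  rows.size
end
```

Running `puts check_export(File.read('users.csv'))` then tells you how many rows are ready to import, and fails loudly if the export doesn't look like the table you meant to recover.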

6. Import into the production database

After you've found your customer's data, move the file into your app repository, upload it to production, and write a little script to import the old rows. We'll use Ruby for this:

require 'csv'

CSV.foreach('path/to/your/csv/users.csv', headers: true) do |row|
  # row is a CSV::Row, so convert it to a plain attributes hash
  User.create!(row.to_h)
end
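
If the export picked up columns your model no longer has, or you only want some of them back, it can help to whitelist attributes before handing each row to the model. A small sketch, where ALLOWED stands in for your real column list:

```ruby
require 'csv'

# Columns we're willing to write back -- a stand-in list; use your model's.
ALLOWED = %w[id name].freeze

# Turn a CSV::Row into a plain attributes hash, dropping anything else.
def attributes_for(row)
  row.to_h.slice(*ALLOWED)
end
```

In the import loop above you'd then call `User.create!(attributes_for(row))` instead of passing the whole row through.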

7. Sleep

You've done it! You've recovered your customer's data. Now you can sleep easy knowing you've saved the day.
