We can use a tool called replibyte to fix two problems in one go. replibyte can back up data from Postgres to S3 storage, where S3 is a storage protocol supported by lots of providers. replibyte also gives us tools to manage the size of the data and to obscure sensitive fields (which it does with what it calls transformers).

Ideally we want our backups to run with minimum database privileges, so the backup user can read production data but never change it. You could add the following to your database migrations.
-- A read-only login role for replibyte to connect with when taking backups.
CREATE ROLE readonly LOGIN ENCRYPTED PASSWORD '****************';
-- Grant SELECT, and nothing else, on the existing tables and sequences.
GRANT SELECT ON ALL TABLES IN SCHEMA public TO readonly;
GRANT SELECT ON ALL SEQUENCES IN SCHEMA public TO readonly;
We can create a generic configuration file in crates/db/replibyte.yaml like so. It references environment variables only, so no secrets are committed to the repository, and $SOURCE_DATABASE_URL should point at the readonly role we just created.
# This configuration file is for https://www.replibyte.com/ a tool to
# seed your databases with your production data while keeping sensitive data safe.
#
# optional - encrypt data in the datastore
encryption_key: $PRIVATE_ENC_KEY
source:
  # A connection to your prod DB. Make it readonly.
  connection_uri: $SOURCE_DATABASE_URL
datastore:
  # It says AWS but you can use any S3 compatible service by setting the endpoint
  aws:
    bucket: $S3_BUCKET
    region: $S3_REGION
    credentials:
      access_key_id: $ACCESS_KEY_ID
      secret_access_key: $SECRET_ACCESS_KEY
    endpoint:
      custom: 'https://your-s3-compatible-endpoint'
# If you want to populate a dev database
destination:
  connection_uri: $DEST_DATABASE_URL
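With this file in place, backups and restores are driven from the replibyte CLI. The commands below are a sketch: the subcommand names match recent replibyte releases, but check `replibyte --help` if yours differ, and the environment variables are assumed to already be set (or injected by your secrets manager, as described below).

```bash
# List the transformers available for obscuring sensitive fields.
replibyte -c crates/db/replibyte.yaml transformer list

# Create a dump of the source database and push it to the S3 datastore.
replibyte -c crates/db/replibyte.yaml dump create

# List the dumps stored in the datastore.
replibyte -c crates/db/replibyte.yaml dump list

# Restore the latest dump into the destination database ($DEST_DATABASE_URL).
replibyte -c crates/db/replibyte.yaml dump restore remote -v latest
```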
Now create your S3 bucket with your cloud provider.

To generate an encryption key you can use any tool that produces a long random string; one option is sketched below.
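For example, openssl can produce a suitably long random value. This is just one option, not something replibyte mandates:

```bash
# Generate 32 random bytes and print them as a 64-character hex string,
# which can then be stored as PRIVATE_ENC_KEY.
openssl rand -hex 32
```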
Add your secrets to your secrets manager. Here we're using Cloak. If you're using Cloak, add each of the environment variables referenced in the configuration above as secrets so they are injected when replibyte runs.
Run replibyte with this configuration and you should see your database schema.
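For local experimentation without a secrets manager, the same variables can be exported directly in your shell. The values below are placeholders, not real credentials:

```bash
# Placeholders only; in production these come from your secrets manager.
export PRIVATE_ENC_KEY="<output of openssl rand -hex 32>"
export SOURCE_DATABASE_URL="postgres://readonly:password@prod-db-host:5432/app"
export DEST_DATABASE_URL="postgres://postgres:password@localhost:5432/app_dev"
export S3_BUCKET="my-backup-bucket"
export S3_REGION="us-east-1"
export ACCESS_KEY_ID="<access key for the bucket>"
export SECRET_ACCESS_KEY="<secret key for the bucket>"
```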