[12:01:29] eljojo: Cache_Money: because we currently don't have many users and database integrity is crucial, I back up 4 times a day. It might be overkill, but it doesn't really hurt us.
[12:01:50] eljojo: Cache_Money: I'm backing up to S3 and have a "rule" set up, so backups older than a week go to Glacier
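For reference, the kind of S3 lifecycle rule described here can be set programmatically; a minimal boto3 sketch, where the bucket name and key prefix are placeholders rather than his actual setup:

    import boto3

    # Transition backup objects to Glacier once they are older than a week,
    # matching the "rule" described in the chat. Bucket and prefix are made up.
    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="my-backup-bucket",  # placeholder
        LifecycleConfiguration={
            "Rules": [
                {
                    "ID": "archive-old-backups",
                    "Filter": {"Prefix": "backups/"},
                    "Status": "Enabled",
                    "Transitions": [{"Days": 7, "StorageClass": "GLACIER"}],
                }
            ]
        },
    )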
[12:06:38] eljojo: Cache_Money: I actually don't know how to tell you the size of my db. Any easy way of measuring it? I'm using Postgres
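One easy way to measure it is to ask Postgres itself via pg_database_size; a minimal sketch using psycopg2, with the database name as a placeholder:

    import psycopg2

    # Report the on-disk size of the current database in human-readable form.
    conn = psycopg2.connect(dbname="myapp_production")  # placeholder name
    with conn.cursor() as cur:
        cur.execute("SELECT pg_size_pretty(pg_database_size(current_database()))")
        print(cur.fetchone()[0])  # e.g. "100 MB"
    conn.close()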
[12:09:02] eljojo: so, my database is 100 megabytes, but I guess my rows are pretty similar, because the resulting tar file, encrypted with PGP, is 14 megabytes
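A rough sketch of that kind of dump-and-encrypt pipeline, assuming pg_dump's tar format and GPG; the database name and key recipient are placeholders, not his actual script:

    import subprocess

    # Dump the database in tar format and pipe it straight into GPG,
    # producing an encrypted archive like the one described above.
    dump = subprocess.Popen(
        ["pg_dump", "--format=tar", "myapp_production"],  # placeholder db name
        stdout=subprocess.PIPE,
    )
    with open("backup.tar.gpg", "wb") as out:
        subprocess.run(
            ["gpg", "--encrypt", "--recipient", "backups@example.com"],  # placeholder key
            stdin=dump.stdout,
            stdout=out,
            check=True,
        )
    dump.stdout.close()
    if dump.wait() != 0:
        raise RuntimeError("pg_dump failed")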
[12:10:31] eljojo: the dump probably doesn't include the index data (just the index definitions), which would reduce the size. Yes, I've had to re-create a database from the backups and it was very useful to have them
[12:20:10] eljojo: Cache_Money: I have about 40 models and around 400 users, but my app is slightly uncommon: we do price comparison
[12:22:22] eljojo: Cache_Money: we're currently just crawling two sites, but it runs "constantly". I have Sidekiq jobs that run evenly spread across the whole week
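To illustrate the spacing idea only (the real app does this with Sidekiq in Ruby; this sketch just shows how evenly spaced weekly start times can be computed):

    from datetime import datetime, timedelta

    # Given n recurring crawl jobs, compute start times spread evenly
    # across a seven-day window.
    def weekly_offsets(n_jobs):
        step = timedelta(days=7) / n_jobs
        return [i * step for i in range(n_jobs)]

    week_start = datetime(2014, 1, 6)  # an arbitrary Monday
    for offset in weekly_offsets(4):
        print(week_start + offset)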
[21:10:41] eljojo: I just wanted to know, maybe someone's doing an event here in Berlin or something, just to upgrade my machines.