lkv lkv
Thu Dec 7 08:25:51 PST 2006
Hi guys,

I'm looking at a few ideas on how to back up
PostgreSQL data incrementally, using some sort of
checkpoints. The first thing that came to mind is to
use Slony log shipping to produce checkpoints
which I can replay later.
(The databases already use Slony for
replication.)

The reason for this is to avoid the need to
take full-fledged dumps, which can consume quite
some space when stored. The archival server would
also not be of much use if the dumps are zipped,
so the requirements are: small in size, and preferably
plain text (say, SQL queries).

So my plan is the following. Restart the shipping
every night at, say, 3am and transfer the shipping log
produced for the 24-hour period to a server where it can be
stored on an archival storage system. The storage system
in question is called Venti[1]. The fileserver itself
will compress and store the logs in a namespace
like /backup/db/foo/ddmmyyyy/checkpoint.
Every week/month a new dump is taken in order to
keep the interval between checkpoints small for
when the logs need to be replayed.
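As a rough sketch, the nightly filing step could look something like the
following (the helper name, database name, and log path are hypothetical;
the Slony shipping restart itself is assumed to happen separately, and the
log is kept as plain text since the Venti fileserver does its own
compression):

```python
import shutil
from datetime import date
from pathlib import Path

def archive_checkpoint(log_path, db_name, backup_root="/backup/db"):
    """File one day's shipping log under the
    <backup_root>/<db_name>/<ddmmyyyy>/checkpoint naming scheme.

    The log is copied as-is (plain text); compression is left to
    the archival fileserver.
    """
    stamp = date.today().strftime("%d%m%Y")
    dest_dir = Path(backup_root) / db_name / stamp
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / "checkpoint"
    shutil.copyfile(log_path, dest)
    return dest
```

A cron entry at 3am would then restart the shipping and call something
like `archive_checkpoint("/var/log/slony/shipping.log", "foo")`.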

Does this seem doable to you?

thanks in advance,
Lou

More information about the Slony1-general mailing list