@Pete: actually archiving database content elsewhere could have its merits, in some cases.
Imagine a huge and constantly changing database of invoices, classifieds, messages, history data, etc. Perhaps not the best possible examples, but in any case something that can grow into a vast mass of records. Unless you keep adding extra muscle to the machine running your database (in which case there would only be theoretical limits to worry about), operations could become unbearably slow in the long run.
To avoid that, you could decide not to keep records older than, say, two years in your production database. If you don't actually want to destroy old records completely, you'd need a way to move them aside (or archive them) so that you can still fetch something from them later (it doesn't have to be easy, though).
Admittedly not the most common use case, but not entirely unimaginable either.
As for the solution, there are quite a few possibilities. In addition to deleting pages periodically, you could do one or more of these:
exporting pages via the API into CSV or XML file(s) (see the first sketch after this list)
duplicating existing tables for local "snapshots"
performing regular SQL dumps (typically exporting content into .sql files)
using pages to store data from other pages in large chunks of CSV/JSON (or a custom fieldtype, per Pete's idea; see the second sketch after this list)
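To give a rough idea of the first option, here's a minimal sketch of exporting old pages to CSV via the ProcessWire API and then trashing them. The invoice template, the amount field, and the file paths are my own assumptions for the example, so adjust everything to your setup and test with trashing before switching to real deletes:

```php
<?php

// Bootstrap ProcessWire from outside a template file; the path is an
// assumption, point it at your own site's index.php.
include '/path/to/your/site/index.php';

$pages = wire('pages');

// Hypothetical selector: invoices created more than two years ago.
// For a really big result set you'd process this in smaller batches
// (e.g. with a limit in the selector).
$old = $pages->find("template=invoice, created<" . strtotime("-2 years"));

// Write selected fields to a dated CSV file (path and fields are examples).
$fp = fopen('/path/to/archive/invoices-' . date('Y-m-d') . '.csv', 'w');
fputcsv($fp, array('id', 'title', 'created', 'amount'));

foreach ($old as $p) {
    fputcsv($fp, array($p->id, $p->title, date('Y-m-d', $p->created), $p->amount));
    // Move the original to the trash; switch to $p->delete()
    // once you trust the export.
    $pages->trash($p);
}

fclose($fp);
```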
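And a similar sketch of the last option: collapsing a batch of old pages into a single "archive" page that stores their data as one JSON chunk. The archive-chunk template, the /archives/ parent and the archive_data textarea field are assumptions of mine and would need to exist in your site first:

```php
<?php

// Bootstrap ProcessWire as in the previous sketch (path is an assumption).
include '/path/to/your/site/index.php';

$pages = wire('pages');

// Collect data from old pages into a plain array (field names are examples).
$rows = array();
$old = $pages->find("template=invoice, created<" . strtotime("-2 years"));
foreach ($old as $p) {
    $rows[] = array(
        'id'      => $p->id,
        'title'   => $p->title,
        'created' => $p->created,
        'amount'  => $p->amount,
    );
}

// Store the whole batch as one JSON chunk on a new archive page.
$archive = new Page();
$archive->template = 'archive-chunk';
$archive->parent = $pages->get('/archives/');
$archive->title = 'Invoices archived ' . date('Y-m-d');
$archive->archive_data = json_encode($rows);
$archive->save();

// Once the archive page is saved, the originals can be trashed or deleted.
foreach ($old as $p) $pages->trash($p);
```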
In any case, none of this is really going to be an issue before you've got a lot of data, and by a lot I mean millions of pages, even. Like Pete said, caching methods, whether the built-in ones or ProCache, will keep typical sites very slick even with huge amounts of content.
If your content structure is static (i.e. new fields are added and old ones removed or renamed very rarely), a custom fieldtype is a good option, and so is a custom database table; there's a rough sketch of the latter below. Which of these makes sense depends on the kind of content you're storing and the features of the service you're building.
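For the custom table route, ProcessWire's $database API variable (a PDO instance) keeps things fairly simple. Again just a sketch under my own assumptions: the invoice_archive table and its columns, the invoice template and the amount/notes fields are made up for the example, and it assumes a ProcessWire version recent enough to provide $database:

```php
<?php

// Bootstrap ProcessWire as before (path is an assumption).
include '/path/to/your/site/index.php';

$database = wire('database'); // WireDatabasePDO, i.e. a PDO instance

// One-time setup: a plain table living alongside ProcessWire's own schema.
$database->exec("
    CREATE TABLE IF NOT EXISTS invoice_archive (
        id INT UNSIGNED NOT NULL PRIMARY KEY,
        title VARCHAR(255) NOT NULL,
        created DATETIME NOT NULL,
        data MEDIUMTEXT
    ) ENGINE=InnoDB DEFAULT CHARSET=utf8
");

// Insert archived rows with a prepared statement.
$stmt = $database->prepare(
    "INSERT INTO invoice_archive (id, title, created, data)
     VALUES (:id, :title, :created, :data)"
);

foreach (wire('pages')->find("template=invoice, created<" . strtotime("-2 years")) as $p) {
    $stmt->execute(array(
        ':id'      => $p->id,
        ':title'   => $p->title,
        ':created' => date('Y-m-d H:i:s', $p->created),
        // Remaining fields dumped as JSON; amount and notes are example fields.
        ':data'    => json_encode(array('amount' => $p->amount, 'notes' => $p->notes)),
    ));
}
```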