Here's an old post I made on the subject,
You're making the big assumptions that hard drives never fail (wrong) and that all the involved parts (mechanical, electronical, power supply, firmware etc.) will always do their job perfectly (proved wrong everyday).
Stuff about hard drives remembering everything
Some say that they proactively "catch" such errors before they occur and manage to copy everything over to a new hard disk before it's too late, and so on. Yeah right, do you get a new hard disk every 6 months just to be on the safe side/rotate them? Especially if you keep all of your eggs in one basket (your new 1.33 TB drive, of unknown future reliability), you could lose a lot of data in a single incident.
IMO, it's better to scatter/differentiate backups: in general, only data that I burned to (quality) optical media, or scattered/replicated across multiple machines/hard disks and even floppies, survived my changing machines and the at least 4 major hard disk crashes/errors leading to total data loss that I've had in the last 6 years. Just ONE drive? Not even close.
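The "don't keep it on just one drive" point can be made with back-of-envelope arithmetic. A minimal sketch, assuming each copy fails independently with the same annual probability (the 5% figure below is purely hypothetical, not a measured drive failure rate):

```python
# Probability of losing data stored on N independent copies:
# the data is lost only if *every* copy fails (independence assumed).
def loss_probability(p_fail: float, copies: int) -> float:
    """Chance that all `copies` fail, each with probability `p_fail`."""
    return p_fail ** copies

p = 0.05  # hypothetical annual failure probability per copy
for n in (1, 2, 3):
    print(f"{n} copies: {loss_probability(p, n):.6f}")
# One copy loses data at p; three scattered copies at p**3 -- orders
# of magnitude safer, as long as the failures really are independent
# (which is exactly why you scatter across media types and machines).
```

Real drive failures are not fully independent (same batch, same power supply, same house fire), so treat this as an upper bound on how much replication alone helps.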
Actually, we've hit a technology brick wall, at least for semiconductor-based digital electronics. That's why most improvements tend to be incremental (e.g. adding more cores or making bigger memory chips) rather than genuinely innovative breakthroughs (e.g. you won't see something akin to going from 1-bit color to 32-bit color today, or the invention of a totally new kind of memory, like going from iron core memory to 6-cell registers, or a totally new architecture, like the Amiga vs single-CPU "dumb" PCs).
Um.... In my 28 years I've seen a lot of new things being invented and still being invented even now. I've lived in 2 different countries with clashing government systems, and I've seen things all around me dynamically changing constantly.
On the software front, things aren't getting much better either: most new "killer apps" are mostly mobile apps or their web equivalents with dumbed-down interfaces and an emphasis on heavy database/personal data usage, rather than, say, bringing something like Doom into a world dominated by 2D platformers, and even modern OSes are trying to follow this trend.
There seems to be a higher priority on "getting it all together/connected" and cobbling together ever more complex, monstrous, elephantine apps that use a thousand enterprisey APIs and frameworks just because the incremental hardware upgrades allow it, rather than advancing the state of the art in software design with something truly innovative. Even exploiting multicore/parallel programming is trapped by rigid paradigms and limited by logical constraints (tons of inherently serial work which cannot be parallelized).
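The serial-work limit alluded to above is the classic Amdahl's law argument: if a fraction of the program simply cannot be parallelized, adding cores hits a hard ceiling. A minimal sketch (the 10% serial fraction is an illustrative assumption):

```python
# Amdahl's law: with serial fraction s, the speedup on n cores is
# 1 / (s + (1 - s) / n), which approaches 1/s no matter how many
# cores you throw at the problem.
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# Even with 90% of the work parallelizable, 16 cores give well
# under 16x, and the limit is 10x regardless of core count:
for n in (1, 4, 16, 1024):
    print(f"{n:5d} cores -> {amdahl_speedup(0.10, n):.2f}x")
```

This is why "just add more cores" is an incremental win at best: the serial 10% in this example caps the whole program at 10x forever.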
Last edited by Maes on Jan 25 2013 at 09:01