I have always been an early adopter. I owned an AIM-65 (an early 6500-series development system) and a Commodore 64, wrote programs on an Apple II at work, and had a first-generation IBM PC on my desk (4.77 MHz 8088, dual low-density floppies, no hard disk). From those prehistoric times to the present, software has always seemed to fall short of the potential of the underlying hardware. Not that the 8086 architecture is the epitome of elegance.
I have suffered through DOS, Windows 3.1, Windows 98, Windows NT, Windows XP, and Windows Vista on the PC, while being exposed to the much more stable VMS and Unix operating systems on larger machines. Why, oh why, is Windows still so buggy and unstable? Vista is a step backwards, so I keep XP chugging along on most of my machines at work and at home. Even so, it seems I can't go too many years without having to reload XP once it gets too gummed up from exposure to viruses and spyware, and from all my installs and removals of software I evaluate or use.
Reloading, reactivating, and trying to get back to where I was has always been a major pain, but with Windows, too much damage accumulates too easily, and at some point the system becomes unmaintainable. The average user is clueless and doesn't know how to maintain their machine. Even with obtrusive antivirus and antispyware software, their machines quickly get gummed up beyond repair, and all too soon they either buy a new machine or suffer a major loss on reload because they rarely back up their stuff adequately. That is, if they even still have half of the original disks for all the accumulated "stuff" on their hard drives.
Apple does a much better job with their operating systems, but I don't like their higher prices, proprietary hardware, and closed-system attitude toward development on that platform. Still, their machines are arguably worth it, and I have no problem recommending them to new users.
There has to be a better way, and I've found a few, which I'll report on shortly.