At work, I’ve been using the same computer since I joined the company in January 2002. Being in full low-budget mode, they gave me the cheapest desktop money could buy. Usually a programmer’s desktop is slightly beefier than the target platform, so that it can comfortably pretend to be the target computer while simultaneously running all the programming programs. Mine has always been significantly slower.
So I’ve been petitioning for a new computer. One that can simultaneously run my increasingly taxing development software while pretending to be a network of servers. We agreed that once I’d finished a crucial project, I’d get a new machine. A really fancy machine.
I’ve owned mostly Macs and Amigas, and my experience with mainstream PCs has usually been negative. Once you get a PC that mostly works, it’s fine. Just don’t upgrade it or use a weird keyboard, operating system, or hard disk. Whereas Macs and other custom systems have one company in charge of everything, PCs consist of dozens of components slapped together with nobody in charge of overall quality. (Microsoft works around Intel, Intel tries to work around Microsoft, the memory manufacturers don’t talk to the disk manufacturers, etc.) In theory Dell, Gateway, and Lenovo should be in charge, since they sell you the machine, but they have the luxury of being able to blame Intel, Microsoft, or others for a variety of woes. In contrast, if something goes wrong with a Mac, Apple gets the blame, period.
In January I got a build-to-order dual-core Athlon with the latest fast disk drives and four gigabytes of memory. It’s getting returned today because I haven’t gone more than a day without it crashing on me. Among the issues I’ve discovered:
- The motherboard didn’t like the hard disks. It turned out the manufacturer had already fixed this, and we installed the patch.
- The AMD Athlon processors crash when they send too many zeroes all at once to the memory chips. Mind you, the whole point of the processor (and of digital computers in general) is to send zeroes and ones here and there. You’d think this would be unacceptable. But it’s a known bug, and it’s been around for over a year. The only reason it’s not more of a problem is that most of the time the computer gets delayed or interrupted before it can transmit a trillion zeroes in a row. The more inefficient the rest of the computer, the more likely it is to get interrupted. But since I was running the latest version of Linux (instead of Microsoft Windows) with vast amounts of memory (which is initially all zeroes), I hit this frequently.
Dan (the system administrator) installed the latest version of Ubuntu Linux and ran it in 32-bit mode (instead of the bleeding-edge 64-bit mode). This made it just inefficient enough that we’d be less likely to run into the Athlon bug.
After over a week of back-and-forth between Dan and the store that built the computer, I finally got it back. My first test was to install my home directory from the old machine (backed up to DVD) onto the new machine.
It crashed before it could finish copying my files. I don’t know why or how, but that’s one too many problems. It passed all the tests that General Nanosystems (the local company that built it) could throw at it. But they didn’t try writing all zeroes. Nor did they try sitting down and using it as a programmer’s desktop. And the latter is what I actually need it to do.
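For what it’s worth, a zero-writing test isn’t hard to put together. Here’s a minimal sketch in C of the kind of thing I mean; it assumes the trigger really is a long uninterrupted run of zero writes to memory, and the buffer size and pass count are numbers I made up, not anything General Nanosystems used:

```c
/*
 * zerofill.c -- a minimal sketch of a zero-writing stress test,
 * not the diagnostic the shop actually ran. It repeatedly fills a
 * large buffer with zeroes; a healthy machine finishes every pass.
 */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define BUF_SIZE ((size_t)1 << 30)  /* 1 GiB per pass; size to taste */
#define PASSES   32

int main(void)
{
    unsigned char *buf = malloc(BUF_SIZE);
    if (buf == NULL) {
        fprintf(stderr, "could not allocate %zu bytes\n", BUF_SIZE);
        return 1;
    }

    for (int pass = 1; pass <= PASSES; pass++) {
        memset(buf, 0, BUF_SIZE);   /* the long run of zeroes */
        printf("pass %d/%d complete\n", pass, PASSES);
        fflush(stdout);             /* record how far we got before any crash */
    }

    free(buf);
    puts("all passes completed without incident");
    return 0;
}
```

On a healthy machine every pass completes; on this one, I’d expect the box to lock up somewhere in the middle of a memset. The buffer is deliberately much larger than any cache, so the zeroes actually reach the memory chips.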
So I’m back to my four-year-old $500 computer. When he gets the chance, Dan is going to very carefully spec out a new machine for me. There’s a good chance it will be a server in a desktop’s body, rather than an ultra-high-end gaming machine with mediocre graphics and sound. Which means that, on paper, it will be very similar, but with parts that cost twice as much to do the same thing. But those parts will be at the low end of what’s used in $1000-$10,000 machines, rather than the high end of what’s used in $500-$2000 machines that people don’t mind rebooting all the time.