In Tracy Kidder’s book The Soul of a New Machine you can read how IBM, in its heyday, didn’t recognize what the market wanted. They owned the Big Iron world and pretty much ignored the minicomputer. That opening allowed DEC, Data General and others to change computing.
But IBM learned from that lesson. When they introduced their PC in 1981, they went with an industry standard architecture. This allowed clone makers such as Compaq to spring up and let component manufacturers get a piece of the pie. The move also greatly expanded the pie, so everyone benefited. With standard components and software, a consumer could shop around and find a variety of add-ons, all at competing prices.
Apple didn’t get this. They claimed to produce the “computer for the rest of us” but by rigidly controlling not only the hardware but also the software, they doomed themselves to the 3% or so of the market they now hold.
Plus, “the rest of us” often couldn’t afford Apple’s products. With no alternatives and few third-party vendors making compatible goods, Apple’s prices stayed well above those of the IBM clones.
Now, after 30 years, they might be catching on. With an Intel CPU inside and the ability to natively boot Windows XP, the Mac will finally be able to run the plethora of Windows software that is out there. Years ago there were always a few good apps – usually graphics apps – that only ran on a Mac. But in recent years, for anything you can find on the Mac, you can find many more choices (and usually less expensive ones!) in the Windows or open source world.
Is it too little too late? So far the voters in the Great Lakes Geek survey don’t really care that the Mac will be able to boot to XP. What do you think?