Setting Apple Apart
With the iBook, 1999
Clams, Ice Cubes, and Sunflowers
Ever since the introduction of the iMac in 1998, Jobs and Jony Ive had made beguiling design a signature of Apple’s computers. There was a consumer laptop that looked like a tangerine clam, and a professional desktop computer that suggested a Zen ice cube. Like bell-bottoms that turn up in the back of a closet, some of these models looked better at the time than they do in retrospect, and they show a love of design that was, on occasion, a bit too exuberant. But they set Apple apart and provided the publicity bursts it needed to survive in a Windows world.
The Power Mac G4 Cube, released in 2000, was so alluring that one ended up on display in New York’s Museum of Modern Art. An eight-inch perfect cube the size of a Kleenex box, it was the pure expression of Jobs’s aesthetic. The sophistication came from minimalism. No buttons marred the surface. There was no CD tray, just a subtle slot. And as with the original Macintosh, there was no fan. Pure Zen. “When you see something that’s so thoughtful on the outside you say, ‘Oh, wow, it must be really thoughtful on the inside,’” he told Newsweek. “We make progress by eliminating things, by removing the superfluous.”
The G4 Cube was almost ostentatious in its lack of ostentation, and it was powerful. But it was not a success. It had been designed as a high-end desktop, but Jobs wanted to turn it, as he did almost every product, into something that could be mass-marketed to consumers. The Cube ended up not serving either market well. Workaday professionals weren’t seeking a jewel-like sculpture for their desks, and mass-market consumers were not eager to spend twice what they’d pay for a plain vanilla desktop. Jobs predicted that Apple would sell 200,000 Cubes per quarter. In its first quarter it sold half that. The next quarter it sold fewer than thirty thousand units. Jobs later admitted that he had overdesigned and overpriced the Cube, just as he had the NeXT computer. But gradually he was learning his lesson. In building devices like the iPod, he would control costs and make the trade-offs necessary to get them launched on time and on budget.
Partly because of the poor sales of the Cube, Apple produced disappointing revenue numbers in September 2000. That was just when the tech bubble was deflating and Apple’s education market was declining. The company’s stock price, which had been above $60, fell 50% in one day, and by early December it was below $15.
None of this deterred Jobs from continuing to push for distinctive, even distracting, new design. When flat-screen displays became commercially viable, he decided it was time to replace the iMac, the translucent consumer desktop computer that looked as if it were from a Jetsons cartoon. Ive came up with a model that was somewhat conventional, with the guts of the computer attached to the back of the flat screen. Jobs didn’t like it. As he often did, both at Pixar and at Apple, he slammed on the brakes to rethink things. There was something about the design that lacked purity, he felt. “Why have this flat display if you’re going to glom all this stuff on its back?” he asked Ive. “We should let each element be true to itself.”
Jobs went home early that day to mull over the problem, then called Ive to come by. They wandered into the garden, which Jobs’s wife had planted with a profusion of sunflowers. “Every year I do something wild with the garden, and that time it involved masses of sunflowers, with a sunflower house for the kids,” she recalled. “Jony and Steve were riffing on their design problem, then Jony asked, ‘What if the screen was separated from the base like a sunflower?’ He got excited and started sketching.” Ive liked his designs to suggest a narrative, and he realized that a sunflower shape would convey that the flat screen was so fluid and responsive that it could reach for the sun.
In Ive’s new design, the Mac’s screen was attached to a movable chrome neck, so that it looked not only like a sunflower but also like a cheeky Luxo lamp. Indeed it evoked the playful personality of Luxo Jr. in the first short film that John Lasseter had made at Pixar. Apple took out many patents for the design, most crediting Ive, but on one of them, for “a computer system having a movable assembly attached to a flat panel display,” Jobs listed himself as the primary inventor.
In hindsight, some of Apple’s Macintosh designs may seem a bit too cute. But other computer makers were at the other extreme. It was an industry that you’d expect to be innovative, but instead it was dominated by cheaply designed generic boxes. After a few ill-conceived stabs at painting on blue colors and trying new shapes, companies such as Dell, Compaq, and HP commoditized computers by outsourcing manufacturing and competing on price. With its spunky designs and its pathbreaking applications like iTunes and iMovie, Apple was about the only place innovating.
Intel Inside
Apple’s innovations were more than skin-deep. Since 1994 it had been using a microprocessor, called the PowerPC, that was made by a partnership of IBM and Motorola. For a few years it was faster than Intel’s chips, an advantage that Apple touted in humorous commercials. By the time of Jobs’s return, however, Motorola had fallen behind in producing new versions of the chip. This provoked a fight between Jobs and Motorola’s CEO Chris Galvin. When Jobs decided to stop licensing the Macintosh operating system to clone makers, right after his return to Apple in 1997, he suggested to Galvin that he might consider making an exception for Motorola’s clone, the StarMax Mac, but only if Motorola sped up development of new PowerPC chips for laptops. The call got heated. Jobs expressed his opinion that Motorola’s chips sucked. Galvin, who also had a temper, pushed back. Jobs hung up on him. The Motorola StarMax was canceled, and Jobs secretly began planning to move Apple off the Motorola-IBM PowerPC chip and to adopt, instead, Intel’s. This would not be a simple task. It was akin to writing a new operating system.
Jobs did not cede any real power to his board, but he did use its meetings to kick around ideas and think through strategies in confidence, while he stood at a whiteboard and led freewheeling discussions. For eighteen months the directors discussed whether to move to an Intel architecture. “We debated it, we asked a lot of questions, and finally we all decided it needed to be done,” board member Art Levinson recalled.
Paul Otellini, who was then president and later became CEO of Intel, began huddling with Jobs. They had gotten to know each other when Jobs was struggling to keep NeXT alive and, as Otellini later put it, “his arrogance had been temporarily tempered.” Otellini has a calm and wry take on people, and he was amused rather than put off when he discovered, upon dealing with Jobs at Apple in the early 2000s, “that his juices were going again, and he wasn’t nearly as humble anymore.” Intel had deals with other computer makers, and Jobs wanted a better price than they had. “We had to find creative ways to bridge the numbers,” said Otellini. Most of the negotiating was done, as Jobs preferred, on long walks, sometimes on the trails up to the radio telescope known as the Dish above the Stanford campus. Jobs would start the walk by telling a story and explaining how he saw the history of computers evolving. By the end he would be haggling over price.
“Intel had a reputation for being a tough partner, coming out of the days when it was run by Andy Grove and Craig Barrett,” Otellini said. “I wanted to show that Intel was a company you could work with.” So a crack team from Intel worked with Apple, and they were able to beat the conversion deadline by six months. Jobs invited Otellini to Apple’s Top 100 management retreat, where Otellini donned one of the famous Intel lab coats that looked like a bunny suit and gave Jobs a big hug. At the public announcement in 2005, the usually reserved Otellini repeated the act. “Apple and Intel, together at last,” flashed on the big screen.
Bill Gates was amazed. Designing crazy-colored cases did not impress him, but a secret program to switch the CPU in a computer, completed seamlessly…