Five years ago today, Steve Jobs passed away — just one day after Apple introduced Siri on the new iPhone 4S. While the pundit press immediately dreamed of a rapid collapse of Apple in his absence, the company Jobs cofounded instead soared to become the largest and most profitable ever. Why were they all so very wrong?
Steve Jobs and the Confident No
Prior to his premature passing, Jobs left his mark on an incredible four decades of the PC industry. In the 1970s, he played an outsized role as a young, newly minted executive in the sprouting of Silicon Valley. He witnessed—and materially participated in—the birth of an entirely new personal computing industry, observing firsthand what worked and what didn't when selling technology products to mainstream users.
In the 1980s he continued to grow up alongside the PC industry. Faced with looming competition from an entrenched IBM, he drove the underdog Apple's ambitious level of investment in the Macintosh. That new computing platform delivered a unique set of human interface guidelines that defined the "right way" to present information, to respond to a human user and to provide feedback. It also defined the wrong ways: it said "No" to developer "freedoms," tightly reining in platform behaviors to establish a consistent, intuitive way for users to work.
The Macintosh also said No to a variety of legacy layers. It said No to standard 5.25-inch floppies and wasn't compatible with earlier Apple II software or the emerging standards of the IBM PC. It initially said No to the command line as a fallback to the Mac Desktop.
Jobs' "courage of conviction" in a variety of decisions related to the Macintosh and its future produced a solid user environment for desktop publishing but eventually met with resistance inside Apple, where his views on how to invest the company's profits and how to sell new hardware weren't universally shared. In 1985, Jobs chose to leave Apple and start over on a project he could more completely control: NeXT.
At NeXT, Jobs—along with a wealth of engineering talent poached from Apple—boldly charted a new future for workplace computing into the 1990s, building powerful, networked systems with sophisticated software development frameworks running on a solid Unix foundation. NeXT relaxed some of the Macintosh era's No decisions (like the Unix command line), but advanced a variety of even bolder design choices.
As a smaller player in a restricted market (Apple effectively limited NeXT to selling mostly to higher education), it was harder for NeXT to confidently and successfully express No in terms of design. But in 1996, when Apple acquired NeXT and brought Jobs back home to Cupertino, Jobs regained the ability to confidently define product and platform in ways that liberally said No.
Saying No is critical to good design and engineering.
Saying No is critical to good design and engineering. Without courage backed by informed conviction, products lack clear definition and purpose. The 1990s Apple was already suffering from a Culture of Yes, where products like the Newton (and even the Copland future of Mac OS) enthusiastically agreed to lots of objectives without being able to complete them. No isn't just negative; it's formative. Saying No means you can concentrate on a limited number of Yes features. No is an expression of mature restraint.
Contempt for Jobs, and afterward
Jobs’ charisma and confidence in delivering products that were not afraid to say No attracted lots of criticism from detractors, particularly those who supported rival firms promising an ability to say Yes to everything.
Since Jobs' death, tech punditry has constantly hammered Apple for doing—or not doing—what, in their opinion, "Steve would have done." Yet over the last two years of Jobs' life, the tech media almost unanimously criticized his every step: from a contemptuously skeptical reception of 2010's iPad, so ignorantly dismissive of Apple's new tablet that Jobs later confided to his biographer it left him feeling annoyed and depressed, to the extended, contrived "scandal" of AntennaGate that desperately aimed to derail iPhone 4 later that same year.
Across 2011, they took gleeful delight in prying into Jobs' failing health as he struggled with cancer, posting photos alongside their personal speculation about how soon he might die. Make no mistake: the punditry serving the commodity PC industry loathed Jobs as deeply and as blindly as American conservatives hate Hillary Clinton. They even employed the same playbook: a snowstorm of invented scandals absurdly named after Nixon's Watergate, personal vilifications scapegoating him for every ill in the world (including suicides in China), and trolling health concerns.
However, as soon as Jobs actually passed away, they immediately spun around and pretended to respect his accomplishments, with the transparent purpose of suggesting that without him, Apple would quickly run out of ideas and die itself.
Suddenly, 2007’s iPhone and 2010’s iPad were the only accomplishments Apple had made in recent history, and there was no evidence of a new form factor with the power to similarly revolutionize the world. And without Jobs, how could there ever be a new one?!
This was, of course, just another lie. Jobs didn't single-handedly invent the iPhone and then sit down to hand-draw the iPad as an encore. As was long rumored—and later revealed in court documents—teams at Apple had initially developed a "Safari Pad" prototype, then adapted that technology to deliver a new class of smartphone, which the company realized it could more effectively bring to market and sell.
The iPhone and iPad revolutions were not two strokes of hardware genius that could never be duplicated. Both were simply marketable instances of a new kind of technology portfolio Apple began building on the foundation of the Macintosh and its NeXT-derived development platform.
The bold step was the development of a mobile-first technology platform: iOS. It had a clear strategy, it had strong definition, and most importantly, it was not afraid to say No.
Steve Jobs' greatest accomplishment was not a series of remarkable hardware product introductions
Steve Jobs' greatest accomplishment was not a series of remarkable hardware product introductions. It was instead a continuum of regimented platform development working to make practical application of the most promising technology advances as they became available. It just so happened that the Macintosh was followed by NeXT, then the iMac, iPod and MacBooks, and then iOS devices. It was all a stream of technology investment that flowed toward where the puck was headed: first office desktops, then portables, then mobile devices.
Apple is now evolving to deliver ultra-mobile wearables with Apple Watch and AirPods. The company isn't lacking a Jobs-worthy vision for the next new hardware shape. It has distilled and cultivated the very thinking that delivered all of those examples of success that occurred under Jobs' direction. And central to that vision is the ability to say No.
My name is No. My sign is No. My number is No.
When Jobs introduced iPhone in 2007, he highlighted three key features: a “revolutionary mobile phone” that also combined a “widescreen touch-control iPod” and a “breakthrough Internet communications device.”
That was marketing. What actually made iPhone radically different from previous smartphones was its ambitious leap in computing sophistication: it packed more processing power and system memory than anything from Nokia, Samsung, Sony, Palm, BlackBerry or Microsoft, and it loaded a powerful computing platform capable of running an actual web browser and desktop-class email, something no other maker had considered feasible given the limited processing power of existing mobile phones.
What Apple managed to shoehorn into the tiny device was impressive, but even more "courageous" were the bold decisions about "features" Apple expressly opted to leave out. These included a variety of industry checkboxes that everyone else had deemed essential for selling phones, a point particularly emphasized by PR journalists tasked with carrying water for Apple's competitors.
iPhone omissions included support for BlackBerry Enterprise Server messaging and the signature BlackBerry physical keyboard that RIM had popularized (both of which had converted business users into "CrackBerry" addicts). iPhone also skipped the WAP "baby Internet" of simplified, mobile-only websites (and Japan's own i-mode baby Internet) and made no pretense of ever running existing mobile software created for Sun's JavaME, Adobe's Flash Lite, Nokia's Symbian, Palm OS or Windows Mobile.
At the time, virtually every “smartphone” was trying to run JavaME and many had licensed Flash Lite. Palm had even jumped ship to bundle Windows Mobile on its phones in a bid for relevancy in the enterprise. Here was Apple with a device that not only snubbed the status quo in smartphones, but even failed to support GSM’s vision for MMS picture messaging and the entire global deployment of CDMA, hailed as the future of mobile networks.
If we weren’t today looking back with perfect hindsight, it might seem hard to believe that a company the size of Apple back in 2006 was aiming to gut the entire mobile industry with a disrupting product that didn’t do any of the things that seemed—by way of assumptions grounded on rival vendors’ marketing—essential at the time.
Saying No was core to the successful launch of iPhone
However, saying No was core to the successful launch of iPhone because it enabled Apple to focus on differentiating features rather than being consumed with trying to maintain parity with the shifting specifications of its rivals. The allure of iPhone was the magical speed and simplicity of its multitouch user interface, the capability of its Mail, Safari and Maps client apps, and its extension of the iPod ecosystem as a music and movie player.
iPhone wasn’t Apple’s first No
Years earlier, Apple had launched iPod with plenty of No as well. It purposely didn't play Windows Media DRM, even though the commodity chips it used were capable of doing so. As a result, the popularity of iPod meant that iTunes and DRM-free sources of music and videos would not only sustain iPods but also ensure that compatible music and movies remained available for Mac users. Microsoft had been working to wield its PC monopoly to make the Mac irrelevant as a media playback system, but iPods and iTunes broke right through that strategy.
Similarly, if Apple had saddled iPhone with compatibility for Java applets or Flash content, it would have been held at the whim of Sun or Adobe, a history Apple had already experienced on the Mac—where an alarming number of security flaws and performance issues were rooted in its support for Java and Flash plugins that Apple couldn't force its users to live without. On iOS, Apple could start over and do things right. Priority number one: no dependence on incompetent partners' middleware platforms.
This "new" mobile platform strategy had been core to Jobs' vision for decades. At times it had appeared to be a mistake, and it was widely considered a core reason NeXT failed to accomplish much commercially in its first decade. In hindsight, however, the work done at NeXT was far superior to the parallel developments at Apple, where from 1986 to 1996 Jobs' Strategy of No was replaced with a new Culture of Yes, Sure, Why Not!?
After Jobs, Apple turned strong Yes
While today's pundits seem to think that Apple and IBM remained enemies right up until Tim Cook forged the Mobile First partnership for iOS apps in 2014, the reality is that Apple began working to make Macs more "Yes" in a variety of ways in the late 1980s. It worked to make the Mac desktop run on top of Unix machines, and partnered with IBM on a series of levels ranging from PowerPC chips to the OS/2 microkernel to Taligent software development frameworks to Kaleida Labs multimedia.
The post-Jobs Apple worked on its own vision of making Macs run old software next to modern new software; considered running multiple platforms (Unix, Mac OS, OS/2) on the same hardware; layered various strategies for electronic messaging, group chats, modular software, media sharing and networking; and even pursued advanced research into virtual reality and the organization and indexing of data for V-Twin search.
Between 1986 and 1996, Apple released various small-batch hardware products that it struggled to market and sell in sustainable quantities, and it eventually worked with licensee partners to co-develop Mac- and Newton-branded devices, ranging from wireless tablets built by Motorola to Mac-based game console boxes built by Bandai. If this all sounds familiar, it's because the non-Jobs Apple was virtually identical to today's Google: the epitome of Yes-Yes, as the search giant demonstrated yesterday.
All that Yes nearly killed the old Apple. In trying to do everything, it was accomplishing nearly nothing. It was spending millions on research and technology but was unable to effectively bring much of it to market. Further, its shifting strategies were turning off third-party developers and rendering it unfit for use in the Enterprise, where random roadmap changes are a liability, not a feature. These are all problems Google's Yes-Yes! Android also suffers, for the same reasons.
By 1996, Apple was facing a serious strategic crisis. A series of chief executives had aimed to sell off the company as a technology portfolio to IBM, Sun or Oracle, or to simply slash through the wasteful spending that was getting the company nowhere (which is exactly what Google has been doing over the last couple of years). In parallel, Jobs' NeXT offered Apple an alternative: a streamlined future strategy for rebuilding the Mac into a modern platform consumers would want to buy.
Jobs brings Apple back with a strong No
Getting Apple back on track would require discipline and focus. Jobs famously slashed away Apple's clones and the company's own confusing array of Mac models and sub-brands, and terminated all sorts of internal developments ranging from QuickDraw 3D to PowerTalk to the "Yes, to everything" Newton tablet. The remaining focus allowed Apple to target what consumers were actually ready to buy: the luggable new iMac, an easy-to-use PC for accessing the Internet; revamped, sleek new PowerBooks for powerful mobile computing; and a few years later, the very mobile iPod for taking music on the go.
As the 2000s began, the new Apple made its first major misstep: the Power Mac G4 Cube, a premium throwback to 1990s desktop computing offered to a market that was not only being rocked by the Dotcom Collapse but was also decisively going mobile. Fortunately, Apple largely recognized the clear and incessant march toward mobility; desktop PCs were not the future. Apple also muddled through the Mac mini, Xserve and Mac Pro, but didn't make any of those niche products central to its computing strategy.
Sales of iPods and PowerBooks, augmented with increasing sales of consumer iBooks and then MacBooks, had primed Apple to focus on integrated software and hardware targeting mobility, via battery chemistry, efficient computing and a reduction of size and weight. There were lots of No decisions enabling that focus on mobility.
Apple could have simply scaled down the mobile Mac to deliver the same thing Microsoft and its partners had been working on: Pocket PC PDAs with Compaq and convertible laptop-tablets with Samsung. These products were heavy, thick, expensive and not very powerful, but dutifully bundled a stylus just as the Newton had a decade prior. They were Yes to the core: everything anyone could ask for. Tech media pundits incessantly praised them despite their limited utility and unclear value.
In stark contrast, Jobs' Apple of No worked to strip down the Mac platform and build a new mobile-centric user interface driven by multitouch rather than a keyboard, trackpad or 1990s stylus. At the same time, it leveraged powerful NeXT development frameworks to enable very light and efficient devices to run powerful desktop-class apps.
However, Apple’s first prototype for a new, highly mobile tablet capable of running Safari (but not all Mac legacy software) ran into its own No: it wasn’t clear who would pay for it. It was, however, becoming clear that if the device could be made even smaller, it could be paired with a phone and iPod features to make a very powerful new class of smartphone. Rather than being seen as a not-very-powerful Mac tablet that couldn’t run Mac software, it could be a very powerful new iPhone mobile enough to carry anywhere, at all times.
That was the birth of iPhone, but more importantly it was the birth of iOS.
No vs Yes
Apple's effective use of No to reach engineering milestones resulted in a deliverable, salable smartphone that was strongly differentiated from its competitors. After the iPhone appeared, Microsoft's Windows Mobile partners—including Samsung and HTC—scrambled to polish their basic devices and claim some Yes connection to running Windows desktop software (they couldn't) or providing a stylus (a liability, not a feature). Many then turned to Symbian, hoping Nokia's basic PDA platform could mount a defensive strike.
By the end of 2009, all existing software platform alternatives to iOS were facing extinction. A variety of failed Windows Mobile and Symbian licensees decided to try supporting Google’s Android, which delivered the closest approximation of an iPhone-like product. However, Google, like the previous decade’s Apple, was all about Yes rather than making strong technology leadership decisions.
Android worked to be all things to everyone, promising that software could be found anywhere and liberally passed around without the pesky centralized security of a Walled Garden. Yes to trackballs, styluses and physical slide-out keyboards. Yes to carrier restrictions on WiFi. Yes to cost-cutting by manufacturers installing paltry RAM or faking hardware features to save money.
Predictably, the Yes Android campaign ended in lots of broken promises. But while Google was trying to help foreign manufacturers copy the iPhone, Apple was working to develop a new computing form factor that leveraged many of the advantages of the iPhone with the greater canvas of a tablet. Once iPad proved successful, Android cloners tried to copy it too, with an expanded Yes strategy that heaped contempt upon Apple's intentional lack of support for Adobe Flash.
Google said Yes to Flash, Yes to multiple windows, Yes to lots of ports and removable memory cards, and Yes to detachable battery packs. Rather than replicating iPad's success, Android 3.0's tablet push was a huge and embarrassing flop throughout 2011. Today, after years of Yes tablets, Android still has no real platform strength in tablet apps, and the enterprise shuns Android tablets as resolutely as it shunned oddball Mac users in the 1990s.
In parallel with Google, Microsoft tried to reestablish a market for Windows Mobile and also said Yes to a more mobile Windows for tablets. Yes to no compromises. Yes to Intel desktop chips, and Yes to ARM chips. Yes to Surface RT tablets that couldn’t run legacy Windows apps.
Customers said yes to Apple’s No and no to all the Yes products.
Apple’s iOS says Yes only when it can
Apple's willingness to say No is not an arbitrary gimmick. Many of the No decisions evident on the original iPhone were later reversed as circumstances and market power enabled Apple to expand its functionality and broaden its availability; that included subsequent support for MMS, BES and CDMA, among many other examples.
The evolution of Apple's iOS platform also enabled smaller-format tablets like the iPad mini after Apple initially settled on 9.7 inches as the optimal format for tablet apps. Apple also expanded to a larger iPad Pro, but only after it had developed sufficient technology to support selling a larger format and sustaining third-party development. Apple did the same with larger iPhones in 2014, making the move only after high-quality screens, faster application processors and development tools were in place to support a 5.5-inch iPhone.
The continuing development of the platform technologies in iOS is also enabling ecosystem expansions with CarPlay and HomeKit, and new platforms for Apple TV and Apple Watch, both of which expand the demand for iOS apps and related services.
Unsurprisingly, Apple is still getting considerable flak for making bold No decisions, such as the loss of the analog headphone jack on iPhone 7—a casualty of the decision to deliver robust IP67 weatherproofing, to expand upon haptics and 3D Touch, and to double down on wireless audio distribution through enhanced Bluetooth and AirPlay.
Every year since Jobs' passing in 2011, Apple has held a tenacious grasp of that overriding principle: say No in engineering decisions wherever the advantages of doing so outweigh the drawbacks of abandoning the status quo. The result has been incremental progress unimpeded by the boat anchors of legacy, while the albatrosses of rivals' Yes-Yes decision making have brought them no similar commercial success.
Steve Jobs may not have made all the same decisions currently being made by today’s Apple, but he’d no doubt be impressed to see the results of others applying his experience gained across decades of learning how to build “insanely great” products that aren’t afraid to say No when necessary.
In the tech world, five years feels like centuries. But for Apple, the legacy of Steve Jobs lives on.
The charismatic co-founder of the world’s most profitable company died October 5, 2011, after a long battle with pancreatic cancer. He was 56.
While Jobs has been gone for five years, he’s remained in the public eye through books and films that portray his life, career and personality. Jobs was always a charismatic and controversial leader, beloved by many (including millions of Apple fanboys throughout the world) but hated by those who faced his wrath.
On Wednesday, Apple CEO Tim Cook tweeted in remembrance of his predecessor and his impact on the world at large.
Attention to detail
Apple hired designer Clement Mok in 1982 to work on branding for 1984’s Mac launch. He became co-manager of Apple Creative Services in 1985 and served as creative director for corporate and education marketing. He’s one of the people responsible for the iconic imagery of Apple in its marketing and packaging, including the squiggly line drawings gracing early Mac promotional materials.
Another of Mok’s duties was to redesign Jobs’ business cards when Apple updated its brand identity and logo.
"I was to go over to Steve's office and say, 'This is your new business card. I want you to take a look at it before we send it out to the printers,'" Mok said.
Jobs examined the business card closely. “At that point, no one knew he took calligraphy,” Mok said. “Jobs was a fanatic about different typefaces. But we had no idea, at least many of us had no idea, that he had an appreciation of typography at that depth we now understand.
“He looked at the card and said, ‘Shouldn’t the kerning [the space between the letters] be tighter here and here? And here is too tight.’ I was flabbergasted that he would be so into the weeds on that one little detail,” Mok said. “That’s how obsessed Steve was with details. I think I gained an incredible respect for him at that point. I thought I could say, ‘Hey Steve, here’s your card. FYI.’ But he took time.”
By the way, Mok said he agreed with Jobs — the kerning was off on his original business card design.
Jobs’ obsession with detail went well beyond his business cards. He also cared deeply about the packaging used to sell Apple’s products. The sleek white iPhone and Mac boxes, now iconic, wouldn’t have happened if not for Jobs, said Tom Suiter.
Suiter served as Apple’s first director of creative services and helped launch the Mac in 1984. He was also part of a revamp of product packaging.
“When you think about [Apple packaging] and go into an Apple Store today and buy that package, it’s such a delightful experience…It’s so gorgeous. Apple’s known for that. But I was lucky enough to be around when it was really bad.”
When Apple launched its products in the early 1980s, “packaging was fragmented,” Suiter said. Different divisions had different designers who made their packaging distinct from other groups. That “was costing us a lot of money,” he said. Suiter’s team was tasked with making a new, universal Apple package design in 1984.
They came up with two versions. One was "very cost-effective," the other "at least" triple that price, he recalled.
The cheaper version had two colors on corrugated paper stock. “It was very practical,” Suiter said. “There was another version that was absolutely gorgeous. It used all of the six colors of the Apple logo. It had the Apple logo on one side and a black-and-white photo on the box.”
Suiter’s team presented the packaging options to the different groups at Apple. “The difference was dramatic in terms of cost,” he said. “[We figured] there was no way we could pay that kind of money, and we’d have to go with that [cheaper] version.”
But Jobs surprised Suiter. “Steve stopped everybody and said, ‘No, here’s how we’re going to pay for it. We’re going to take money from the advertising budget. I believe packages are like billboards. When people are carrying boxes around and putting [them] in their cars, it’s a moving billboard for Apple so that’s what we’re going to do.'”
Apple still uses a similar design for its packaging today.
Standing up to Steve
Jobs was considered a genius by many, but he also had a temperamental side, which his employees knew all too well.
“[Steve] would come marching down the hall or skipping down the hall, calling…’What an idiot. I can’t believe you did this stupid thing,'” said Debi Coleman, who joined Apple in 1981 as finance controller for the Macintosh.
It took her a year to learn how to confront Jobs. Coleman credits Joanna Hoffman, the executive in charge of Mac marketing, as her teacher. “Joanna said, ‘Look him in the eye. You’ve got to stand up.’ From that point on — I’m not saying he wasn’t tough, totally demanding and totally critical — but he was totally wonderful to me.”
Coleman became head of Mac manufacturing in 1984 and was one of the highest-ranking women in the tech industry. She took over the role of Apple chief financial officer in 1986. At a November 2015 reunion of women on the Mac team, Coleman attributed a big part of Apple’s success to Jobs, saying he made people at Apple believe they could change the world. And even if he was intimidating, he had a softer side, she said.
One Sunday morning in the early days of the Macintosh computer, Coleman got a call from Jobs, asking that she meet him at the Mac factory. He wanted to give a tour to his father, Jobs said.
“That was a real wonderful experience to see how Steve loved and respected his adopted father,” Coleman said. “I never saw anything like it before or since.”
Dancing Pepsi cans
Jobs recruited John Sculley in the early 1980s to help him grow Apple’s business. At the time, Sculley was CEO of Pepsi and had helped it overtake Coca-Cola as the top beverage maker. Jobs famously convinced Sculley to take the CEO role at Apple in 1983 by asking if he wanted to “sell sugar water for the rest of his life” or if he wanted to “come with me and change the world.” Sculley, who was close with Jobs before helping to oust him in 1985, served as Apple’s chief executive for a decade until being forced out himself.
Sculley still remembers the first time he visited Apple’s Silicon Valley offices in 1982.
"I show up at this address and think I'm at the wrong place because there are no buildings, just houses," Sculley said. He met Jobs in the house used as Apple's executive staff offices, and then the two headed to the Mac building a couple of blocks away.
“It was a beautiful blue-sky day, and there was a Jolly Roger pirate flag flying from the roof,” Sculley said. “Steve was in great competition with the Lisa [computer] group. Lisa was the Navy so Steve wanted to be the pirates.”
Inside the Mac building was an expensive piano for some of the team engineers, as well as a motorcycle. When Sculley walked into the engineering lab, Andy Hertzfeld, an original member of the Macintosh team who designed the system's software, had set up a demo.
“Steve had used the ruse that I was not interviewing for a job but I was there as the CEO of Pepsi and interested in Macs for Pepsi,” Sculley said. “Andy had put together dancing Pepsi cans on the screen of the Mac. I didn’t know that this was really pretty hard to do, was pretty novel. … I was wondering why Andy was smiling with his Cheshire cat grin. That was the first introduction I had to what Apple was like. It was totally a startup.”
Jobs turned the product launch into an art form. He also leaves a legacy from which entrepreneurs can learn to dazzle their audiences. The following five keynotes will help anyone give the presentation of a lifetime.
1. The Mac launch
Every Steve Jobs presentation had one moment that people would be talking about the next day. These “moments” were tightly scripted and relentlessly rehearsed. Remarkably, Jobs’ flair for the dramatic started before PowerPoint or Apple Keynote were available as slide design tools, which proves you don’t need slides to leave your audience breathless.
On Jan. 24, 1984, Steve Jobs introduced the first Macintosh with a magician's flair for the big reveal. He showed a series of images and said, "Everything you just saw was created by what's in that bag." And with that, Jobs walked to the center of a darkened stage where a table held a canvas bag. He slowly pulled the Mac from the bag, inserted a floppy disk, and walked away as the theme from Chariots of Fire began to play and images filled the screen.
The lesson: A presentation doesn’t always need slides to wow an audience.
2. The iPhone
The rule of three is one of the most powerful concepts in writing. The human mind can retain only three or four "chunks" of information. Jobs was well aware of this principle and divided many of his presentations into three parts. Sometimes he even had fun with it.
For example, on Jan. 9, 2007, Jobs told the audience to expect three new products: a new iPod, a phone and an "Internet communications device." After repeating the three products several times, he made the big reveal: all three were wrapped in one new device, the iPhone.
The lesson: Introduce three benefits or features of a product, not 23.
3. The first MacBook Air
When Jobs introduced the "world's thinnest notebook," the MacBook Air, he walked to the side of the stage, pulled out a manila envelope that had been hidden behind the podium and said, "It's so thin it even fits inside one of those envelopes you see floating around the office." With a beaming smile, he slowly pulled the computer out of the envelope for all to see.
Most presenters would have shown photographs of the product. Jobs took it one step further. He knew what would grab people’s attention. This did. Most of the blogs, magazines and newspapers that covered the launch ran a photograph of Steve Jobs pulling the computer out of the envelope.
The lesson: Don’t just tell us about a product, show it to us, and do it with pizzazz.
4. The iTunes Store
Every great drama has a hero and a villain, and Steve Jobs was a master at introducing both in the same presentation. On April 28, 2003, Jobs convinced consumers to pay 99 cents per song. He began with a brief discussion of Napster and Kazaa, sites that offered "near instant gratification" and, from the user's perspective, free downloads. On the next slide he listed the "dark side":
- Unreliable downloads
- Unreliable quality (“a lot of these songs are encoded by 7-year-olds and they don’t do a great job.”)
- No previews
- No album cover art
- It’s stealing (“It’s best not to mess with karma.”)
In the next section of the presentation Jobs replaced each of the drawbacks with the benefits of paying for music.
- Fast, reliable downloads
- Pristine encoding
- Previews of every song
- Album cover art
- Good Karma
The lesson: Great presentations have an antagonist — a problem — followed by a hero — the solution.
5. The genius in their craziness
In 1997, Jobs returned to Apple after a 12-year absence. Apple was close to bankruptcy at the time and was quickly running out of cash.
Near the end of Jobs’ keynote at Macworld in August 1997, he slowed the pace, lowered his voice, and said: “I think you always had to be a little different to buy an Apple computer. I think the people who do buy them are the creative spirits in the world. They are the people who are not out just to get a job done, they’re out to change the world. We make tools for those kind of people. A lot of times, people think they’re crazy. But in that craziness, we see genius. And those are the people we’re making tools for.”
The lesson: Don’t forget to motivate your internal audience — your team, employees and partners. Give them a purpose to rally around.
When I wrote The Presentation Secrets of Steve Jobs, I argued that Jobs was the world’s greatest brand storyteller. When I watch these presentations over again, I’m convinced he’s still the best role model for entrepreneurs who will pitch the next generation of ideas that will change the world.
Google CEO Sundar Pichai thinks we are now living in an “artificial intelligence-first world.” He’s probably right. Artificial intelligence is all the rage in Silicon Valley these days, as technology companies race to build the first killer app that utilizes machine learning and image recognition. Today, Google announced an AI-powered assistant built into its new Pixel phones. But there’s a significant downside to the company’s latest creation: Because of the very nature of artificial intelligence, our data is less secure than ever before, and technology companies are now collecting even more personal information about each one of us.
Google’s new assistant, which debuted in the company’s new messaging app Allo, works like this: Simply ask the assistant a question about the weather, nearby restaurants, or for directions, and it responds with detailed information right there in the chat interface. It is undoubtedly neat and useful. Pichai stressed at today’s Google event that this is just the beginning for artificial intelligence. Google’s artificial intelligence will only become smarter, faster, and more accurate. It will learn things about your habits and preferences to better serve you personalized results and to answer more specific questions.
But this is where the problems start.
Because Google’s assistant recommends things that are innately personal to you, like where to eat tonight or how to get from point A to B, it is amassing a huge collection of your most personal thoughts, visited places, and preferences. Google is pretty vague about what exactly the assistant is collecting. It can access information on your devices like contacts or storage (read: literally anything stored to your device), and it can also access “content on your screen.” In order for the AI to “learn,” it will have to collect and analyze as much data about you as possible so it can serve you more accurate recommendations and suggestions.
In order for artificial intelligence to function, your messages have to be unencrypted. Computer scientists are trying to figure out a way to make “searchable encryption,” but that’s a ways off. Besides, even standard encryption still has problems. Google offers state-of-the-art encryption within its Allo messaging app, but if you turn it on, say goodbye to your fancy AI assistant.
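To see why the assistant and end-to-end encryption are at odds, consider a minimal sketch in Python. A toy XOR cipher stands in here for a real protocol (Allo’s Incognito mode actually uses the Signal Protocol); the point is only that when the key lives solely on the endpoints, the server — and therefore any server-side AI — sees nothing but ciphertext:

```python
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR each byte with the key; applying the same key twice
    # restores the original bytes.
    return bytes(d ^ k for d, k in zip(data, key))

# The key exists only on the two users' devices, never on the server.
key = os.urandom(64)
message = b"Dinner at 7 tonight?"

ciphertext = xor_cipher(key, message)    # all the server ever stores
plaintext = xor_cipher(key, ciphertext)  # what each endpoint recovers

assert plaintext == message  # endpoints can read it; the server cannot
```

The XOR cipher is purely illustrative and not secure for real use; a production protocol uses authenticated encryption with per-message keys. But the structural problem is the same: a Smart Reply model running on Google’s servers has no plaintext to learn from.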
That means Google’s stuck between a rock and a hard place here. The security engineers at Google know, and cryptography experts agree, that automatic encryption is the best way to defend personal data and conversations from hackers and government surveillance. But in order to stay competitive against all the other technology companies that have (or will eventually have) AI-powered assistants, Google has no choice but to leave encryption off by default. Kudos to Google for offering users the choice of encrypting their messages, but I wish we lived in a world where people could use Google’s cool new feature while keeping their messages secure at the same time.
Google isn’t alone in this push-and-pull. In fact, Facebook has pretty much the exact same problem. Facebook Messenger also has opt-in encryption, and uses what is widely regarded as the gold standard for encrypting messages, just like Google does. But in order for users to do things like call an Uber from the app or use a fun bot, their messages have to be unencrypted.
These new assistants are really cool, and the reality is that tons of people will probably use them and enjoy the experience. But at the end of the day, we’re sacrificing the security and privacy of our data so that Google can develop what will eventually become a new revenue stream. Lest we forget: Google and Facebook have a responsibility to investors, and an assistant that offers up a sponsored result when you ask it what to grab for dinner tonight could be a huge moneymaker.
Google is betting that people care more about convenience and ease than they do about a seemingly abstract notion of privacy, and it is increasingly correct in that assumption. Google’s job is to innovate and make money, and if nothing else, be glad the company is offering you a robust option to protect your data. But, you know, an option that means sacrificing some very helpful AI-powered assistance.
Quote from Richard Branson before Disruptors
Ahead of our Virgin Disruptors event, Virgin.com has been asking each speaker to name their ultimate disruptor. Here’s mine…
It’s impossible to name just one ultimate disruptor (which is why I’ve been stretching mine out with a series of dream dinner parties!), but if I had to name one, it would be Steve Jobs. He was a truly great businessman, but more than that he was an inspiration to young people, entrepreneurs, inventors, designers, early adopters, budding musicians, and people with disabilities, who through his vision discovered a better way to engage with the world.
So many people drew courage from Steve and related to his life story: college drop-outs, struggling entrepreneurs, ousted business leaders figuring out how to make a difference in the world, and people fighting debilitating illness. We have all been there in some way and can see a bit of ourselves in his personal and professional successes and struggles.
He lived and worked by the message “Your time is limited, so don’t waste it living someone else’s life… have the courage to follow your heart and intuition.” This is something I wholeheartedly subscribe to.
In Apple’s 1997 “Think Different” marketing campaign, he said: “Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes. The ones who see things differently.” I am proud to say that, in the accompanying montage, he counted me as one of them. I think it’s an attitude that’s shared by all leaders who seek to make a real difference. Even if you have been around for a long time and have learned all the ins and outs of the game, the best way to be disruptive is to forget the rules.
Although he was completely different from me – he used to shout at employees who made mistakes and didn’t delegate much – I admired Steve so much. He was innovative, determined and, above all, passionate. He was, at all times, disruptive. Today, more than ever, you’ve got to do something radically different to make a mark. Under Steve’s guidance, Apple became the world’s most successful change-maker brand.
By leaving it up to you to delete your messages, Google says its AI-driven Allo chat app can do a better job. But your privacy is a trade-off.
Google’s Allo offers users a messaging app with Google Assistant built in, featuring automatically generated responses called Smart Replies and other computer-generated suggestions for your everyday life.
To make those features as useful as possible, Google made a trade-off with your privacy, the company confirmed Wednesday. Instead of keeping your messages on company servers for a short period of time, the company will keep them indefinitely, or at least until you manually delete them.
That, Google acknowledged Wednesday, is a change from what the company told some reporters before its annual developer conference, I/O, in May. While the company initially considered keeping messages in a “transient” fashion, testing of Allo revealed that its Smart Reply technology worked better if it had a longer history of user messages to draw from.
The change sets Allo apart from other messaging apps that have built-in privacy settings by default rather than leaving it up to the user to make sure messages don’t hang around on company servers. It also means Allo is less likely to cause Google any grief with governments around the world that have struck back at companies that don’t keep copies of their users’ messages.
In a statement, Google framed its decision as one that empowers users.
“We’ve given users transparency and control over their data in Google Allo. And our approach is simple — your chat history is saved for you until you choose to delete it. You can delete single messages or entire conversations in Allo,” a Google spokesperson said.
What’s more, Allo offers Incognito Mode, which provides end-to-end encryption, meaning messages remain scrambled and unreadable as they pass through company servers.
But all this user choice isn’t necessarily a good thing, said Eva Galperin, a global policy analyst at the Electronic Frontier Foundation, which advocates for online privacy. In apps that let users switch between private and less-private modes, users either choose the wrong mode or mistakenly believe the whole app is safe.
“When people have those kinds of choices, it’s too easy to mess up,” she said.
Other messaging services, such as WhatsApp, owned by Facebook, and Apple’s iMessage, have taken a different approach. It’s one that doesn’t leave it up to users to delete messages or choose an encrypted setting. Instead, users’ messages are encrypted end to end by default.
Google wasn’t trying to offer a messaging service with default end-to-end encryption, because it needs to read messages for its Smart Reply technology to work. Keeping messages on its servers for as short a time as possible would have been a concession to privacy in its new, artificial intelligence-based app.
According to Google, that compromise detracted too much from the Smart Reply feature.