Before iPlot comments on the inevitable iPhone (yes, June 29 is upon us!), here are some other remarkable user experiences, based on Scotch Tape:
More images on Gather.com
Browsing through some recent writings on sustainability, I re-discovered "Slow Design", an initiative that attempts to apply the principles of the "Slow Food" movement to the realm of design:
“Above all, the slow approach means the simple, but in current times revolutionary, affirmation that it is not possible to produce and appreciate quality if we do not allow ourselves the time to do so, in other words, if we do not activate some kind of slowdown. However, slow does not only mean this. It also means a concrete way of actually putting this idea into practice. It means cultivating quality: linking products and their producers to their places of production and to their end-users who, by taking part in the production chain in different ways, become themselves co-producers.”
The latter part seems a little forced, and I also wonder how slow design, food, home, or love go together with BlackBerrying, instant messaging, and...yes...blogging. Instant gratification and long-term thinking, Twitter and "Long Now," "digital disorder" and "green order," "the short form" and "the long form" - aren't these opposing forces that can hardly be tamed in the -- pun intended -- long run? How can you expect me to think about the consequences of my actions in ten or twenty years if I can't even make it through the day without checking my inbox every five minutes (you can, of course, write yourself an email that you will receive in the future)? How can history matter to me if my attention span ranges from Paris Hilton to the iPhone? Green is good, but attention is faster. Whoever manages to reconcile both forces will have a very sustainable business.
The classic antagonism between excellence and efficiency (read Apple vs. Dell) is enjoying a renaissance these days, and along with it, the old conundrum of creativity versus process in fostering innovation has re-emerged. It is thus not too surprising that, in a recent issue, Business Week questions the value of Six Sigma, the Holy Grail of process (and performance) optimization in which many corporate heavyweights fervently believe. The Six Sigma methodology, with its emphasis on quality management, was once understood to be a strong innovation enabler. However, as competitive advantages are increasingly built upon top-line growth, in other words, explorative rather than exploitative strategies, it suddenly seems to be a little dusty. Six Sigma was developed to minimize risk and variation, but for free-spirited innovation to blossom, allowing risk and variation is paramount. The magazine illustrates the ramifications of ill-directed Six Sigma rigor by juxtaposing the leadership philosophies of James McNerney and George Buckley at 3M and showing how an over-emphasis on process-quality led to “sameness” and the gradual waning of the company’s innovative power. Process-obsession may in fact be opposed to “the new age of creativity” that propels hyper-customization and attempts to save brands from the “death spiral of commoditization.” If you measure everything you manage, risky ideas will not spark. Employees need to be able to dream, and they need to know that their company not only encourages risk but in fact exhibits a high tolerance for failure.
Fittingly, the 3M critique coincides with the eulogies that Apple is receiving upon the release of the iPhone. The Economist, for example, praises Apple as “the company that epitomizes the art of innovation,” deriving four principles of innovation from Apple’s success. In short, they are:
1. Network innovation – “very welcome” vs. “not invented here”
2. User first! – design around the needs of the user not the demands of the technology.
3. Discover untapped markets – listen to what your customers do not say rather than relying solely on the requirements they articulate
4. Fail wisely – allow failure but don’t make the same mistake twice
None of these four principles appears to be very original on paper – in fact, they’ve been recurring themes throughout the business media in the past five years or so. However, applying them is a whole different ballgame. There are too many companies attempting to follow the Apple way and still not getting anywhere close to Apple’s mastery of the “art of innovation.” Which is why perhaps the most important lesson of all is: Study Apple, but don’t imitate it! Do innovation your own way: Follow your gut and make sure your organization is following you. The next Apple will not be a believer; it will be a heathen.
Let’s stay with Apple though for a moment and take a closer look at the other Holy Grail of innovation – convergence. There are many pundits who herald Apple for its convergence strategy by simply referring to the facts: iTunes is on over 300 million computers, iTunes ranks number 3 in all music sales, analysts are predicting iPod sales of 10 million units this quarter, Apple TV has been launched, and the iPhone should emerge as the most talked-about new consumer electronic device in history and is expected to fuel the launch of more all-in-one gadgets from competing consumer-electronics makers. Experts foresee that Apple “could well be the big winner when our home computing environments and our home entertainment environments merge into one integrated environment that makes moving between media sources, data sources and applications seamless and easy.” Convergence is – once again – all the rage, not just with regard to Apple. Reuters reports that top industry executives forecast that wireless, TV, and Internet services will increasingly converge. And just two days ago, for example, in a meditated step toward converged video and phone communications, AT&T unveiled at the industry trade show NXTcomm a video-sharing service for its wireless customers. With this Video Share service, users are able to send live videos to each other during phone conversations.
But what exactly does convergence mean? Let’s try a very simplified overview. First of all, there is the media convergence between the worlds of telecoms, TV, Internet, and computing, including fixed-mobile convergence, voice-and-data convergence, and -- most importantly -- three-screen convergence. Then there is what you may call messaging convergence: email, chat, video-conferencing, and other messaging tools are becoming more and more integrated. And then there is device convergence: Everything from a laptop to a mobile phone to a television to a games console is now, arguably, the same kind of device: each consists of a microprocessor, a screen, some storage, an input device, and a network connection. You can make phone calls on your laptop, play games on your mobile phone, and watch videos on your games console. Finally, there is the convergence of the user experience, which media scholar Norbert Bolz approaches from a philosophical, design-thinking angle: “Shaping the interface between telecommunications, new media, and computer technologies is the most important task of the future. There is a computer in an increasing number of things around us. And an increasing number of things that surround us are the products of design.” Indeed, there is a gradual convergence of things with the Internet and the Internet of things. This is a reality for all of us and a daunting challenge for designers: Ubiquitous computing requires ubiquitous design.
Yet this ubiquitous design does not necessarily need to be convergent. According to Al Ries of Ad Age, “In the high-tech world, divergence devices have been spectacular successes. But convergence devices, for the most part, have been spectacular failures.” Ries provides some examples: “The first MP3 players (the Diamond Rio, for example) were flash-memory units capable of holding only 20 or 30 songs. The first iPod, on the other hand, had a hard drive and could hold thousands of songs. Now there were two types of MP3 players, a classic example of divergence at work. Every high-tech device has followed a similar pattern. The first computer was a mainframe computer, followed by the minicomputer, the desktop computer, the laptop computer, the handheld computer, the server and other specialty computers. The computer didn't converge with another device. It diverged. When the cellphone was first introduced, it was called a ‘car phone’ because it was too big and heavy to lug around. You might have thought it would eventually converge with the automobile. It did not. Instead it diverged and today we have many types of cellphones. Every Best Buy and Circuit City is filled with a host of other divergence devices that have been enormously successful: the digital camera, the plasma TV, the wireless e-mail device, the personal video recorder, the GPS navigation device. What convergence device has been a big success? Not many, although there have been a lot of convergence failures.
• The computer/phone. AT&T, Motorola and others introduced combination products. Few were ever purchased.
• The computer/TV. Apple, Gateway, Toshiba, Philips and others tried to market combination products with little success.
• Interactive TV. Microsoft spent $425 million to buy WebTV and then poured more than half a billion dollars into the venture. That didn't work, so it moved on to Ultimate TV, which didn't work either.
• Cellevision. Everybody is talking about the third screen, watching TV on your cellphone, but relatively few people do. (The real action in TV is the booming market for divergence products such as big-screen plasma and LCD sets.)
• Media-center PCs. Everybody was going to run everything in their homes from personal computers. It never happened.”
Ries’ theses are backed by the findings of a recent research project conducted by Swisscom’s R&D division. The study examines how users really use their cell phones, and it unearths some surprising insights that run counter to the widespread gospel of convergence: “People are in fact using different communications technologies in distinct and divergent ways. The fixed-line phone is the collective channel, a shared organisational tool, with most calls made ‘in public’ because they are relevant to the other members of the household… (…) Mobile calls are for last-minute planning or to co-ordinate travel and meetings. Texting is for ‘intimacy, emotions and efficiency.’ E-mail is for administration and to exchange pictures, documents and music. Instant-messaging (IM) and voice-over-internet calls are ‘continuous channels,’ open in the background while people do other things. Each communication channel is performing an increasingly different function.”
So what do innovators not only in the mobile sector but also in the broader media/communications/entertainment space learn from the Swisscom research? Is convergence a myth that users don’t really care about, or is the current state of convergence just not satisfying enough to lure users out of their divergent use patterns? Is the answer to divergence more divergence or better convergence? “We have to be extremely careful that we don't go in the Swiss army knife kind of direction where we lose focus on what the consumer wants,” says Olli-Pekka Kallasvuo, the boss of Nokia.
I think he’s right. What will be critical is that companies have a “convergent view” of product ecosystems and know when to design the space between the devices and platforms and when not to. This will result in a greater diversity of hybrid devices that orchestrate convergence and divergence to the benefit of a more satisfying user experience. Convergence must not equal “sameness.” Convergence and divergence are not fixed attributes; they are adverbs that describe the modality of an experience. A convergence/divergence combination will enable variety, which will in turn enable true personalization. The Holy Grail is not Six Sigma; it is not convergence. It is a slight variation of a line by Malcolm Gladwell: “There is no perfect product. There are only perfect products.”
Posted at 07:56 PM in Apple, Consumer Electronics, Design, Digital, Entertainment, Hollywood, Innovation, Instant Messaging, Internet Television, iPhone, iPod, Leadership, Marketing, Netwar, Online Music, Online Video, Strategy, Video-Sharing, Voice over IP, Web 2.0
“As an able person or the owner of a competence, the individual subject must become a bearer of an abstract willingness to achieve and factually possess the ability to perform. His social pride and private dignity are based on the awareness of his ability to contribute to a totality of interesting achievements.”
“None of the bloggers called us to fact check the information.”
-- Spokesperson of the German Grimme awards, after this year’s winners were accidentally published on the organization’s web site and bloggers picked them up
I had the pleasure of visiting Vanity Fair’s offices in Berlin last week, a fancy loft that resembles an Apple store and is full of devils wearing Prada. Conde Nast is obviously putting a lot of money behind its latest national franchise (a weekly in Germany), but both the journalistic quality and the sales numbers of the German issue, which appears to be a mixed bag of tabloid, news magazine, and fashion guide, are disappointing so far.
I realized once again on my flight back to the US why Vanity Fair is still a household name in magazine journalism. I bought this month’s special Africa issue, which was co-edited by Bono and is essentially the (Magazine) Red. Although the magazine’s editor, Graydon Carter, ultimately shied away from giving in to Bono’s plea to rename it “Fair Vanity,” the special edition – despite its serious cause -- is fun to read. From Bono’s “Message 2U” editorial (in which he calls U2 an “Irish post-punk combo”) to Chris Rock’s blasphemous “My trip to South Africa” travelogue, the contributions radiate a pleasant sense of self-irony and depict the world’s “forgotten continent” in bright colors and with an optimistic tone.
Naturally, this approach is controversial and has caused some harsh criticism. Ethan Zuckerman, a fellow at the Berkman Center for Internet and Society writes: “The message of the cover is that Africa is important and sexy because important and sexy people care about it and are willing to lend their ‘talent’ and celebrity appeal to the ‘cause.’ This tends to piss off my friends who are begging the world to think of Africa less as a cause and more as a continent, particularly as a continent open for business. How hard would it have been for Vanity Fair to pair some of these well-meaning celebrities with actual Africans working to build businesses, repair hospitals and save forests?.... It’s possible - and quite likely - that some of these stories are excellent and worth reading. But the overall picture is the one that so many Africans find themselves fighting - Africa as basket case.”
To me, these accusations seem unfair. At the end of the day, it’s Vanity Fair, and with its Africa issue the magazine does what it does best: celebrity-studded people journalism. It’s a diverse, riveting view of Africa that takes full advantage of what Bono calls “judo strategy” – winning a cause by leveraging the power of the enemy (in this case, the blunt hedonism and consumerism of the Western world). He says that they were trying to “bring some sex appeal to wanting to change the world.” Well done.
The iPhone is not the only phone making headlines these days. Other phones toy with the notion of the phone being our main gateway to meaning. Twenty-year-old Ryan Fitzgerald, for instance, became famous for his “phone experiment.” He posted a video on YouTube in which he suggested that anyone who wanted to chat should call him. He gave out his number, and it didn’t take long until his cell phone rang. In the following days, he received more than 5,000 calls from all over the world (mostly from Nordic countries, though…).
Speaking of Nordic countries: Forget about Björk - here’s the real soundtrack of our lives. Glaswegian artist Katie Paterson submerged a phone underneath a melting glacier in Iceland. With the help of sponsors, Paterson was able to drop a hydrophone into the icy lagoon where the glacier is disappearing and pick up the sounds. The waterproof microphone was linked to a phone and amplifier housed in a tent on land. You could call up and listen to the ice die: "It is really poetic: a river of ice slowly disintegrating, quite discreetly, quite invisibly. Sheets of ice are constantly slipping off, huge bits cracking, moving very slowly. It is sad to see a vanishing world." Only one caller at a time could get through, which was deliberate so people could have a "one-to-one beautiful and intimate moment" with the glacier, she said. The work, entitled "Vatnajokull (the sound of)", ended on June 13.
As I’m trying to devote my attention exclusively to one single task – writing an article – for a whole week (no TV, no Internet, no phone calls, etc.), I suddenly realize that the absence of distraction equals some kind of mental imprisonment. If you’re not allowed to desert a given task for at least a few moments of aimless meandering through competing attention-grabbing events, you’re deprived of the fundamental dignity of responding to social stimuli. I want to be able to think and look left and right of the "center of attention." I want to be a floating topic and browse through a pool of ever-present options. Distraction is a human right!
A Dutch KLM 747 carried my global soul* to Amsterdam-Schiphol, where I’m sitting now in a café. I didn’t sleep at all on the plane, but I was reading and writing and curiously scanning the polar ice far below me from time to time to see whether it was true what my friend Veronica said the other day – that you can literally see the ice melt these days, due to global warming. Well, I didn’t see it, sorry. But I did see a lot of ice. In fact, between the States and Europe there seems to be nothing but ice (which is just as concerning as the proposition above)! With the ice behind me, I’m having a global breakfast now and will soon board my flight to my very local hometown of Esslingen in Germany.
Visiting my former hometown always makes me nervous, because it makes my global soul shrink. Sounds stupid, I know, but on the eve of every departure to Europe, I tend to lose my English and fall back to thinking in my native tongue. The self-proclaimed world citizen becomes a shy, not-so-worldly Swabian who washes his Mercedes on a Saturday morning, wearing a wife-beater. But then again, isn’t that what people do in L.A. as well? “My home is not where I’m coming from; my home is where I’m going to,” I sang when I was 24 and a member of a pop group. That was cheesy – and it's why I don’t want to go back. Ever.
(* gratefully borrowed from Pico Iyer’s book)