IT IS a familiar-sounding tale: after decades of simmering discontent a new form of media gives opponents of an authoritarian regime a way to express their views, register their solidarity and co-ordinate their actions. The protesters’ message spreads virally through social networks, making it impossible to suppress and highlighting the extent of public support for revolution. The combination of improved publishing technology and social networks is a catalyst for social change where previous efforts had failed.
That’s what happened in the Arab spring. It’s also what happened during the Reformation, nearly 500 years ago, when Martin Luther and his allies took the new media of their day—pamphlets, ballads and woodcuts—and circulated them through social networks to promote their message of religious reform.
LOOKING for needles in haystacks is boring. But computers do not get bored. Contracting out to machines the tedious business of assessing the dangerousness of cancer cells in histological microscope slides ought thus to be an obvious thing to do. Cervical-cancer smear tests aside, however, such electronic intrusions into the pathology laboratory are limited. Grading cancer cells into “indolent” and “aggressive”, and hazarding an opinion about whether they spell a treatable condition or an untreatable one, has remained the realm of the human expert.
But not for much longer, if Daphne Koller, a computer scientist at Stanford University, and her colleagues have their way. They recently reported in Science Translational Medicine that they have written a program which can distinguish between grades of breast-cancer cell—and in a way that provides a more accurate prognosis than a human pathologist can.
Over the next three years, according to Cole, the tablet will become the primary tool for personal computing. Use of a desktop PC will dwindle to only 4-6 percent of computer users – writers, gamers, programmers, analysts, scientists, and financial planners – and laptop use will decline as well.
“The tablet is such an inviting gadget,” said Cole. “The desktop PC is a ‘lean forward’ device – a tool that sits on a desk and forces users to come to it. The tablet has a ‘lean-back’ allure — more convenient and accessible than laptops and much more engaging to use. For the vast majority of Americans, the tablet will be the computer tool of choice by the middle of the decade, while the desktop PC fades away.
“We don’t see a negative consequence in the move to tablets,” said Cole, “but the coming dominance of tablets will create major shifts in how, when, and why Americans go online – changes even more significant than the emergence of the laptop.”
“Circulation of print newspapers continues to plummet, and we believe that the only print newspapers that will survive will be at the extremes of the medium – the largest and the smallest,” said Cole. It’s likely that only four major daily newspapers will continue in print form: The New York Times, USA Today, the Washington Post, and the Wall Street Journal. At the other extreme, local weekly newspapers may still survive.
“The impending death of the American print newspaper continues to raise many questions,” Cole said. “Will media organizations survive and thrive when they move exclusively to online availability? How will the changing delivery of content affect the quality and depth of journalism?”
Outside Tapachula, Chiapas, Mexico—10 miles from Guatemala. To reach the cages, we follow the main highway out of town, driving past soy, cocoa, banana and lustrous dark-green mango plantations thriving in the rich volcanic soil. Past the tiny village of Rio Florido the road degenerates into an undulating dirt track. We bump along on waves of baked mud until we reach a security checkpoint, guard at the ready. A sign posted on the barbed wire–enclosed compound pictures a mosquito flanked by a man and woman: Estos mosquitos genéticamente modificados requieren un manejo especial, it reads (these genetically modified mosquitoes require special handling). We play by the rules.
Inside, cashew trees frame a cluster of gauzy mesh cages perched on a platform. The cages hold thousands of Aedes aegypti mosquitoes—the local species, smaller and quieter than the typical buzzing specimens found in the U.S. At 7 a.m., the scene looks ethereal: rays of sunlight filter through layers of mesh creating a glowing, yellow hue. Inside the cages, however, genetically modified mosquitoes are waging a death match against the locals, an attempted genocide-by-mating that has the potential to wipe out dengue fever, one of the world’s most troublesome, aggressive diseases.
Throughout a swath of subtropical and tropical countries, four closely related dengue viruses infect about 100 million people annually, causing a spectrum of illness—from flu-like aches to internal hemorrhaging, shock and death. No vaccine or cure exists. As with other mosquito-borne diseases, the primary public health strategy is to prevent people from being bitten. To that end, authorities attempt to rid neighborhoods of standing water where the insects breed, spray with insecticides, and distribute bed nets and other low-tech mosquito blockers. They pursue containment, not conquest.
I first started reading biographies of men of great accomplishment in high school; the first was that of Eddie Rickenbacker. I haven’t stopped, either; the most recent was that of Steve Jobs. Sometime after I’d started my career in the automotive industry, I took to reading books about the men who had created that industry. One thing you learn quickly about these individuals is that most suffered serious financial setbacks before they finally succeeded. In fact, the setbacks they encountered would have stopped the average individual in his tracks; but those who rose to greatness seemed to brush off defeat even faster than they accepted their ultimate success.
The other fact one notices in reading great car guys’ biographies is that many of the greatest names in business history actually started in the absolute worst of economic times. Others, such as GM’s Alfred Sloan, made their reputations in periods of horrendous economic activity.
From the uprisings across the Arab world to the devastating earthquake, tsunami and nuclear disaster in Japan, there was no lack of news in 2011. Reuters photographers covered the breaking news events as well as captured more intimate, personal stories. In this showcase, the photographers offer a behind the scenes account of the images that helped define the year.
You can always tell a Burt Rutan airplane, just as you can always tell a Dr. Seuss drawing or a Beatles song. It’s not only the configurations — though canards, winglets, or twin booms sometimes give them away. It’s not just the materials, though composites have been key to Rutan’s achievements and helped make him the hero of the homebuilder. And it’s not just the futurism, though Rutan designs always look like they flew in from a decade off in the distance. There’s some other quality rolled up with those three that makes you know it’s a Rutan. We think of it as playfulness.
Consider SpaceShipOne, Rutan’s best-known creation, which made history in 2004 as the world’s first private spaceship. It looks the way it does for sound engineering reasons: Its famous tail feathers were deployed to slow and control its atmospheric reentry, its tubby fuselage has a diameter of five feet to accommodate an oxidizer tank of similar dimension and a comfortable cabin, and its pointy little nose is sprinkled with small round windows so that the pilot could see the horizon at all times during the flight up to 60 miles and back. But SpaceShipOne is also toy-like. Can anyone doubt kids would be delighted by a small model of it?
The European: A computer “is a simple mind having a will but capable of only two ideas”, you have said. Does it make sense to think of a technical apparatus in biological terms?
Dyson: The quote comes from an illustration of a circuit diagram that Lewis Fry Richardson produced in 1930. It was a very prophetic idea, like most of the stuff that Richardson did. He had drawn this diagram of an indeterminate circuit, so it was impossible to predict which state the circuit would be in. Maybe those are the origins of mind: a simple and indeterminate circuit. The significance of Richardson’s idea was that he broke with the assumption that computation had to be deterministic, because so few other things in the universe are deterministic. Alan Turing was very explicit that computers will never be intelligent unless they are allowed to make mistakes. The human mind is not deterministic, it is not flawless. So why would we want computers to be flawless?
The European: The ultimate indeterminate process on Earth is evolution. Yet evolution doesn’t really require input and commands, it sustains and develops itself. That seems fundamentally different from the way we think about technological evolution…
Dyson: Biological evolution is a bottom-up process. There are differences between the two realms, but there are also similarities: In both biology and technology, things develop into structures of increasing complexity. That’s what Nils Aall Barricelli saw right away. He tried to understand the origins of the genetic code and apply that to the development of computers. The question was whether you could run computer experiments that allowed increases in systemic complexity to happen. And very quickly that stopped being an experiment and codes began evolving in the wild—not by random mutation, but by crossing and symbiosis, exactly as Barricelli prescribed.
Some hobbyist hackers have rigged up an iPhone 4S to collect brain wave patterns from some simple ECG pads and translate them into synthesized speech, which is in turn pumped through the 3.5 mm headphone jack and recognized by Siri as a usable command. Beyond pressing the home key to initiate Siri, all you have to do is think your command, and your iPhone 4S will hop to it. The engineers expect that they’ll even be able to eliminate the need to press the home key, making it fully automatic. So far, the guys at Project Black Mirror have been able to link 25 brain wave patterns to specific Siri commands. Of course, right now the project is a bulky Arduino test board hooked up to a MacBook, which also occupies the headphone jack, and makes the user look like he belongs in A Clockwork Orange, but these guys are putting up a Kickstarter page shortly to get funding and turn this thing into a real product.
Consumer technology has come a long way since that day. Digital gadgets—then too often designed by techies for techies—have become essential to our lives, and much easier to use, even if we still need the Geek Squad and the Genius Bar more than we should. And the pace of change has been mind-boggling.
In 1991, most consumer computers didn’t have built-in audio beyond just the ability to beep. Most lacked any way to communicate with the outside world—even via a slow, dial-up modem. The Internet wasn’t available to most people. Search engines and social networks didn’t exist.
Mobile phones were huge bricks. Digital cameras for consumers cost a fortune and took monochrome pictures. Digital music players and video recorders, e-readers and tablets were nowhere to be found.
So, this week, I decided to take a look back at some of the game-changing products that appeared in this column over the past two decades and propelled us from that primitive landscape to today’s interconnected digital world. This list of milestones is just a sampling; yours might differ. Also, since I write for average consumers, the list is weighted toward consumer products, not gadgets for geeks or corporate use.