The chief executive of Apple, Timothy D. Cook, has a prediction:
the day will come when tablet devices like the Apple iPad outsell traditional
personal computers.
His forecast has backing from a growing number of analysts and veteran
technology industry executives, who contend that the torrid growth rates of the
iPad, combined with tablet competition from the likes of Amazon.com and
Microsoft, make a changing of the guard a question of when, not if.
Tablet sales are likely to get another jolt this week when Apple introduces its
newest version of the iPad, which is expected to have a higher-resolution
screen. With past iterations of the iPad and iPhone, Apple has made an art of
refining the devices with better screens, faster processors and speedier network
connections, as well as other bells and whistles — steadily broadening their
audiences.
An Apple spokeswoman, Trudy Muller, declined to comment on an event the company
is holding Wednesday in San Francisco that is expected to feature the new
product.
Any surpassing of personal computers by tablets will be a case of the computer
industry’s tail wagging the dog. The iPad, which seemed like a nice side
business for Apple when it was introduced in 2010, has become a franchise for
the company, accounting for $9.15 billion in revenue in the holiday quarter, or
about 20 percent of Apple’s total revenue. The roughly 15 million iPads Apple
sold in that period were more than twice the number it sold a year earlier.
In the fall, Amazon introduced the iPad’s first credible competitor in the $199
Kindle Fire. Although Amazon does not release sales figures for the device, some
analysts estimate it sold about four million in the holiday quarter. Later this
year, tablets from a variety of hardware manufacturers based on Windows 8, a
new, touch-screen-friendly operating system from Microsoft, could further propel
the market.
“Tablets are on fire, there’s no question about that,” said Brad Silverberg, a
venture capitalist in Seattle at Ignition Partners and a former Microsoft
executive, who hastened to add that he was speaking mainly of the iPad, which
dominates current sales.
Tablets are not there yet. In 2011, PCs outsold tablets almost six to one,
estimates Canalys, a technology research company. But that is still a
significant change from 2010, the iPad’s first year on the market, when PCs
outsold tablets 20 to one, according to Canalys. For the last two years, PC
sales were flat, while iPad sales were booming. The Kindle Fire and Barnes &
Noble’s Nook gave the market an additional lift over the holidays. Apple is
banking on the tablet market. Its iPad brought in nearly 40 percent more revenue
during the holidays than Apple’s own computer business, the Macintosh, did.
“From the first day it shipped, we thought — not just me, many of us thought at
Apple — that the tablet market would become larger than the PC market, and it
was just a matter of the time that it took for that to occur,” Mr. Cook of Apple
said recently at a Goldman Sachs investor conference.
Gene Munster, an analyst at Piper Jaffray, estimated that Mr. Cook’s prediction
would come true in 2017, but others contend tablets will be on top sooner than
that.
For example, in a blog post on Friday, Horace Dediu, an analyst with Asymco in
Finland, made a detailed argument that tablet sales would pass traditional PC
sales in the fall of 2013. His projections rest heavily on an assumption that
Apple will face more serious competition in the tablet market from Amazon’s
Kindle Fire, Windows 8 and a wave of other devices based on Google’s Android, an
operating system that has so far found most of its success in the smartphone market.
Tim Bucher, an entrepreneur who has held senior positions at Apple, Microsoft
and Dell, said tablet sales would “absolutely” pass those of PCs, a trend he
argued would become even more pronounced as a younger, tablet-savvy generation
ages.
“I think the older generation does not pick up on the way of interacting with
the new devices,” Mr. Bucher said, contrasting older people with the next
generation. “I don’t know how many YouTube videos there are out there showing
everyone from babies to animals interacting with iPads.”
Where does that change leave the PC, the lowly machine that defined computing
for decades?
At a technology conference in 2010, Steven P. Jobs, then Apple’s chief
executive, heralded what he called the post-PC era and compared personal
computers to the trucks that prevailed in the automobile industry until society
began moving away from its agrarian roots. PCs are “still going to be around and
have a lot of value,” said Mr. Jobs, who died in October. “But they’re going to
be used by one out of X people.”
Even Mr. Cook in his recent speech said he was not predicting the demise of the
PC industry, although he did say the iPad was cannibalizing some computer sales, taking more of them from Windows PCs than from the much smaller market for Macs. One category of PCs
where that is especially true is netbooks, the inexpensive notebook computers
that have had a steep decline in shipments in the last couple of years. “What
the iPad is doing is taking growth away from the PC market that would have gone
to a secondary or tertiary device,” said Mr. Dediu. “It’s not so much people are
going to drop PCs. They’re going to add this additional device.”
Traditional PCs are not standing still. Boxy desktop computers are an
ever-diminishing part of the PC business, while Apple’s MacBook Air and a
category of Windows laptops with Intel processors called ultrabooks have
reinvented traditional clamshell notebooks as superthin devices that turn on
instantly like tablets.
Microsoft’s introduction of Windows 8 promises to shake up computer designs
further. Microsoft and its hardware partners have shown laptops with keyboards
that can be swiveled around or removed altogether, turning them into tablets.
“The tablet and PC markets are all going to blur,” said Tim Coulling, an analyst
at Canalys. “We’re going to see a lot of form-factor innovation. We’ll be
asking, What is a tablet and what is a traditional PC?”
SAN FRANCISCO — It’s 1 p.m. on a Thursday and Dianne Bates, 40,
juggles three screens. She listens to a few songs on her iPod, then taps out a
quick e-mail on her iPhone and turns her attention to the high-definition
television.
Just another day at the gym.
As Ms. Bates multitasks, she is also churning her legs in fast loops on an
elliptical machine in a downtown fitness center. She is in good company. In gyms
and elsewhere, people use phones and other electronic devices to get work done —
and as a reliable antidote to boredom.
Cellphones, which in the last few years have become full-fledged computers with
high-speed Internet connections, let people relieve the tedium of exercising,
the grocery store line, stoplights or lulls in the dinner conversation.
The technology makes the tiniest windows of time entertaining, and potentially
productive. But scientists point to an unanticipated side effect: when people
keep their brains busy with digital input, they are forfeiting downtime that
could allow them to better learn and remember information, or come up with new
ideas.
Ms. Bates, for example, might be clearer-headed if she went for a run outside,
away from her devices, research suggests.
At the University of California, San Francisco, scientists have found that when
rats have a new experience, like exploring an unfamiliar area, their brains show
new patterns of activity. But only when the rats take a break from their
exploration do they process those patterns in a way that seems to create a
persistent memory of the experience.
The researchers suspect that the findings also apply to how humans learn.
“Almost certainly, downtime lets the brain go over experiences it’s had,
solidify them and turn them into permanent long-term memories,” said Loren
Frank, assistant professor in the department of physiology at the university,
where he specializes in learning and memory. He said he believed that when the
brain was constantly stimulated, “you prevent this learning process.”
At the University of Michigan, a study found that people learned significantly
better after a walk in nature than after a walk in a dense urban environment,
suggesting that processing a barrage of information leaves people fatigued.
Even though people feel entertained, even relaxed, when they multitask while
exercising, or pass a moment at the bus stop by catching a quick video clip,
they might be taxing their brains, scientists say.
“People think they’re refreshing themselves, but they’re fatiguing themselves,”
said Marc Berman, a University of Michigan neuroscientist.
Regardless, there is now a whole industry of mobile software developers
competing to help people scratch the entertainment itch. Flurry, a company that
tracks the use of apps, has found that mobile games are typically played for 6.3
minutes, but that many are played for much shorter intervals. One popular game
that involves stacking blocks gets played for 2.2 minutes on average.
Today’s game makers are trying to fill small bits of free time, said Sebastien
de Halleux, a co-founder of PlayFish, a game company owned by the industry giant
Electronic Arts.
“Instead of having long relaxing breaks, like taking two hours for lunch, we
have a lot of these micro-moments,” he said. Game makers like Electronic Arts,
he added, “have reinvented the game experience to fit into micro-moments.”
Many business people, of course, have good reason to be constantly checking
their phones. But this can take a mental toll. Henry Chen, 26, a self-employed
auto mechanic in San Francisco, has mixed feelings about his BlackBerry habits.
“I check it a lot, whenever there is downtime,” Mr. Chen said. Moments earlier,
he was texting with a friend while he stood in line at a bagel shop; he stopped
only when the woman behind the counter interrupted him to ask for his order.
Mr. Chen, who recently started his business, doesn’t want to miss a potential
customer. Yet he says that since he upgraded his phone a year ago to a
feature-rich BlackBerry, he can feel stressed out by what he described as
internal pressure to constantly stay in contact.
“It’s become a demand. Not necessarily a demand of the customer, but a demand of
my head,” he said. “I told my girlfriend that I’m more tired since I got this
thing.”
In the parking lot outside the bagel shop, others were filling up moments with
their phones. While Eddie Umadhay, 59, a construction inspector, sat in his car
waiting for his wife to grocery shop, he deleted old e-mail while listening to
news on the radio. On a bench outside a coffee house, Ossie Gabriel, 44, a nurse
practitioner, waited for a friend and checked e-mail “to kill time.”
Crossing the street from the grocery store to his car, David Alvarado pushed his
2-year-old daughter in a cart filled with shopping bags, his phone pressed to
his ear.
He was talking to a colleague about work scheduling, noting that he wanted to
steal a moment to make the call between paying for the groceries and driving.
“I wanted to take advantage of the little gap,” said Mr. Alvarado, 30, a
facilities manager at a community center.
For many such people, the little digital asides come on top of heavy use of
computers during the day. Take Ms. Bates, the exercising multitasker at the
expansive Bakar Fitness and Recreation Center. She wakes up and peeks at her
iPhone before she gets out of bed. At her job in advertising, she spends all day
in front of her laptop.
But, far from wanting a break from screens when she exercises, she says she
couldn’t possibly spend 55 minutes on the elliptical machine without “lots of
things to do.” This includes relentless channel surfing.
“I switch constantly,” she said. “I can’t stand commercials. I have to flip
around unless I’m watching ‘Project Runway’ or something I’m really into.”
Some researchers say that whatever downside there is to not resting the brain,
it pales in comparison to the benefits technology can bring in motivating people
to sweat.
“Exercise needs to be part of our lives in the sedentary world we’re immersed
in. Anything that helps us move is beneficial,” said John J. Ratey, associate
clinical professor of psychiatry at the Harvard Medical School and author of
“Spark: The Revolutionary New Science of Exercise and the Brain.”
But all things being equal, Dr. Ratey said, he would prefer to see people do
their workouts away from their devices: “There is more bang for your buck doing
it outside, for your mood and working memory.”
Of the 70 cardio machines on the main floor at Bakar Fitness, 67 have
televisions attached. Most of them also have iPod docks and displays showing
workout performance, and a few have games, like a rope-climbing machine that
shows an animated character climbing the rope while the live human does so too.
A few months ago, the cable TV went out and some patrons were apoplectic. “It
was an uproar. People said: ‘That’s what we’re paying for,’ ” said Leeane
Jensen, 28, the fitness manager.
At least one exerciser has a different take. Two stories up from the main floor,
Peter Colley, 23, churns away on one of the several dozen elliptical machines
without a TV. Instead, these machines are bathed in sunlight, looking out onto the pool
and palm trees.
“I look at the wind on the trees. I watch the swimmers go back and forth,” Mr.
Colley said. “I usually come here to clear my head.”
SHENZHEN, China — Struggling to cope with a rash of suicides at his company’s
electronics factories here, the chairman of an electronics maker that supplies
Apple, Dell and Hewlett-Packard said Wednesday that he was doing everything
possible to find a solution.
“We are reviewing everything,” Terry Gou, the chairman of the Hon Hai Precision
Industry Group of Taiwan and one of Asia’s richest men, said after traveling
here from the company’s headquarters in Taiwan. He said the company was
reviewing labor practices, hiring psychiatrists and putting up safety nets on
the buildings.
“We will leave no stone unturned,” Mr. Gou said, “and we will make sure to find
a way to reduce these suicide tendencies.”
Mr. Gou spoke at a hastily organized news conference and media tour on the
campus of Foxconn Technology, the Hon Hai subsidiary that operates some of the
world’s biggest factories and produces a wide range of electronics for global
brands, including American computer makers.
Foxconn, which has about 420,000 employees on two campuses in Shenzhen, is known
for its military-style efficiency, the awesome scale of its production
operations and for manufacturing popular products like the Apple iPhone. But
this year the company has come under intense scrutiny because of a string of
suicides by distressed workers between the ages of 18 and 24.
The most recent took place early Tuesday, when a 19-year-old employee fell to
his death here. The police have already ruled the death a suicide.
It was the ninth suicide this year by an employee at one of Foxconn’s two
Shenzhen campuses, police said. Two additional workers survived suicide attempts
with serious injuries.
Apple, Dell and Hewlett-Packard said they were now investigating conditions at Foxconn amid growing concern about the suicides. The companies said that all
their manufacturers are required to comply with international labor standards.
But several labor rights groups have called for an independent investigation
into the suicides and labor conditions at Foxconn, saying some deaths appear to
be suspicious. Some advocates have also accused the company of running huge
sweatshops that regularly violate Chinese labor laws and treat workers harshly.
Those assertions have been bolstered in recent weeks by China’s state-run
newspapers, which have published a series of sensational reports about the
suicides alongside exposés detailing the harsh conditions inside Foxconn
factories.
Some articles describe the heavy burdens workers face in trying to meet
Foxconn’s production quotas, cramped dormitories that sometimes house 10 to a
room and meager salaries of about $150 a month before overtime.
Foxconn executives, though, strongly defend the company’s labor practices and
the conditions on its huge campuses, which they say have modern dormitories,
swimming pools and shopping and recreational facilities.
While company executives acknowledge a sharp rise in the rate of suicides on the
Shenzhen campuses this year, they say the causes lie largely in China's social ills and in the personal problems that arise when migrant workers travel long
distances to find jobs.
Foxconn is still investigating the circumstances surrounding the suicides, but
company executives say they have no evidence they were caused by poor labor
conditions.
“There is a fine line between productivity and regimentation and inhumane
treatment,” said Louis Woo, an aide to Mr. Gou at Hon Hai. “I hope we treat our
workers with dignity and respect.”
To help ease the crisis, Foxconn says, it has invited university scholars and
mental health experts to its campuses in recent weeks. At the news conference at
one of the campuses on Wednesday, some of those experts said the rising number of suicides
may be the result of complex social factors, including the nation’s rising
income gap and even something known as suicide contagion — a tendency for
copycat suicides to occur after reports of other suicides.
Health experts say the suicide figures from Foxconn are troubling but far below
the national rate of about 14 per 100,000 in China, according to the World
Health Organization.
Still, Mr. Gou, who rarely grants interviews and almost never allows journalists
to visit the campuses of Foxconn, made an unusual show of concern and openness
in Shenzhen on Wednesday, bowing several times at the news conference,
apologizing for the tragedies and asking mental health experts to help find a
solution. He even led dozens of journalists on a tour of Foxconn’s campus,
visiting dormitories, a campus hospital, a production line and an employee care
center.
And he appealed to the media to stop sensationalizing the suicides at Foxconn,
which he said could fuel even more suicide attempts.
“I’m appealing to the press to take social responsibility,” he said. “Do not
sensationalize this.” But later, he said Foxconn was re-examining the way it
operated. “We can be a better company,” he said.
SAN FRANCISCO — When one of the most important e-mail messages of
his life landed in his in-box a few years ago, Kord Campbell overlooked it.
Not just for a day or two, but 12 days. He finally saw it while sifting through
old messages: a big company wanted to buy his Internet start-up.
“I stood up from my desk and said, ‘Oh my God, oh my God, oh my God,’ ” Mr.
Campbell said. “It’s kind of hard to miss an e-mail like that, but I did.”
The message had slipped by him amid an electronic flood: two computer screens
alive with e-mail, instant messages, online chats, a Web browser and the
computer code he was writing.
While he managed to salvage the $1.3 million deal after apologizing to his
suitor, Mr. Campbell continues to struggle with the effects of the deluge of
data. Even after he unplugs, he craves the stimulation he gets from his
electronic gadgets. He forgets things like dinner plans, and he has trouble
focusing on his family.
His wife, Brenda, complains, “It seems like he can no longer be fully in the
moment.”
This is your brain on computers.
Scientists say juggling e-mail, phone calls and other incoming information can
change how people think and behave. They say our ability to focus is being
undermined by bursts of information.
These play to a primitive impulse to respond to immediate opportunities and
threats. The stimulation provokes excitement — a dopamine squirt — that
researchers say can be addictive. In its absence, people feel bored.
The resulting distractions can have deadly consequences, as when
cellphone-wielding drivers and train engineers cause wrecks. And for millions of
people like Mr. Campbell, these urges can inflict nicks and cuts on creativity
and deep thought, interrupting work and family life.
While many people say multitasking makes them more productive, research shows
otherwise. Heavy multitaskers actually have more trouble focusing and shutting
out irrelevant information, scientists say, and they experience more stress.
And scientists are discovering that even after the multitasking ends, fractured
thinking and lack of focus persist. In other words, this is also your brain off
computers.
“The technology is rewiring our brains,” said Nora Volkow, director of the
National Institute on Drug Abuse and one of the world’s leading brain
scientists. She and other researchers compare the lure of digital stimulation
less to that of drugs and alcohol than to food and sex, which are essential but
counterproductive in excess.
Technology use can benefit the brain in some ways, researchers say. Imaging
studies show the brains of Internet users become more efficient at finding
information. And players of some video games develop better visual acuity.
More broadly, cellphones and computers have transformed life. They let people
escape their cubicles and work anywhere. They shrink distances and handle
countless mundane tasks, freeing up time for more exciting pursuits.
For better or worse, the consumption of media, as varied as e-mail and TV, has
exploded. In 2008, people consumed three times as much information each day as
they did in 1960. And they are constantly shifting their attention. Computer
users at work change windows or check e-mail or other programs nearly 37 times
an hour, new research shows.
The nonstop interactivity is one of the most significant shifts ever in the
human environment, said Adam Gazzaley, a neuroscientist at the University of
California, San Francisco.
“We are exposing our brains to an environment and asking them to do things we
weren’t necessarily evolved to do,” he said. “We know already there are
consequences.”
Mr. Campbell, 43, came of age with the personal computer, and he is a heavier
user of technology than most. But researchers say the habits and struggles of
Mr. Campbell and his family typify what many experience — and what many more
will, if trends continue.
For him, the tensions feel increasingly acute, and the effects harder to shake.
The Campbells recently moved to California from Oklahoma to start a software
venture. Mr. Campbell’s life revolves around computers.
He goes to sleep with a laptop or iPhone on his chest, and when he wakes, he
goes online. He and Mrs. Campbell, 39, head to the tidy kitchen in their
four-bedroom hillside rental in Orinda, an affluent suburb of San Francisco,
where she makes breakfast and watches a TV news feed in the corner of the
computer screen while he uses the rest of the monitor to check his e-mail.
Major spats have arisen because Mr. Campbell escapes into video games during
tough emotional stretches. On family vacations, he has trouble putting down his
devices. When he rides the subway to San Francisco, he knows he will be offline
for 221 seconds as the train goes through a tunnel.
Their 16-year-old son, Connor, tall and polite like his father, recently
received his first C’s, which his family blames on distraction from his gadgets.
Their 8-year-old daughter, Lily, like her mother, playfully tells her father
that he favors technology over family.
“I would love for him to totally unplug, to be totally engaged,” says Mrs.
Campbell, who adds that he becomes “crotchety until he gets his fix.” But she
would not try to force a change.
“He loves it. Technology is part of the fabric of who he is,” she says. “If I
hated technology, I’d be hating him, and a part of who my son is too.”
Always On
Mr. Campbell, whose given name is Thomas, had an early start with technology in
Oklahoma City. When he was in third grade, his parents bought him Pong, a video
game. Then came a string of game consoles and PCs, which he learned to program.
In high school, he balanced computers, basketball and a romance with Brenda, a
cheerleader with a gorgeous singing voice. He studied too, with focus,
uninterrupted by e-mail. “I did my homework because I needed to get it done,” he
said. “I didn’t have anything else to do.”
He left college to help with a family business, then set up a lawn mowing
service. At night he would read, play video games, hang out with Brenda and, as
she remembers it, “talk a lot more.”
In 1996, he started a successful Internet provider. Then he built the start-up
that he sold for $1.3 million in 2003 to LookSmart, a search engine.
Mr. Campbell loves the rush of modern life and keeping up with the latest
information. “I want to be the first to hear when the aliens land,” he said,
laughing. But other times, he fantasizes about living in pioneer days when
things moved more slowly: “I can’t keep everything in my head.”
No wonder. As he came of age, so did a new era of data and communication.
At home, people consume 12 hours of media a day on average, when an hour spent
with, say, the Internet and TV simultaneously counts as two hours. That compares
with five hours in 1960, say researchers at the University of California, San
Diego. Computer users visit an average of 40 Web sites a day, according to
research by RescueTime, which offers time-management tools.
As computers have changed, so has the understanding of the human brain. Until 15
years ago, scientists thought the brain stopped developing after childhood. Now
they understand that its neural networks continue to develop, influenced by
things like learning skills.
So not long after Eyal Ophir arrived at Stanford in 2004, he wondered whether
heavy multitasking might be leading to changes in a characteristic of the brain
long thought immutable: that humans can process only a single stream of
information at a time.
Going back a half-century, tests had shown that the brain could barely process
two streams, and could not simultaneously make decisions about them. But Mr.
Ophir, a student-turned-researcher, thought multitaskers might be rewiring
themselves to handle the load.
His passion was personal. He had spent seven years in Israeli intelligence after
being weeded out of the air force — partly, he felt, because he was not a good
multitasker. Could his brain be retrained?
Mr. Ophir, like others around the country studying how technology bent the
brain, was startled by what he discovered.
The Myth of Multitasking
The test subjects were divided into two groups: those classified as heavy
multitaskers based on their answers to questions about how they used technology,
and those who were not.
In a test created by Mr. Ophir and his colleagues, subjects at a computer were
briefly shown an image of red rectangles. Then they saw a similar image and were
asked whether any of the rectangles had moved. It was a simple task until the
addition of a twist: blue rectangles were added, and the subjects were told to
ignore them.
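For a concrete sense of the task, here is a minimal sketch, in Python, of how such a change-detection trial might be generated and scored. The grid size, the counts of red targets and blue distractors, and the helper names are illustrative assumptions, not the parameters the Stanford team actually used; the point is simply that a subject who ignores the blue rectangles and compares only the red ones gets the answer right every time.

import random

def make_trial(num_red=2, num_blue=4, grid=8, move_prob=0.5):
    """Build one change-detection trial: two displays of rectangles.

    Red rectangles are the targets; blue ones are distractors to ignore.
    With probability move_prob, one red rectangle shifts position
    between the first and second display.
    """
    cells = random.sample(range(grid * grid), num_red + num_blue)
    first = {"red": cells[:num_red], "blue": cells[num_red:]}
    second = {"red": list(first["red"]), "blue": list(first["blue"])}
    moved = random.random() < move_prob
    if moved:
        # Move one red target to a previously empty cell.
        free = [c for c in range(grid * grid) if c not in cells]
        second["red"][random.randrange(num_red)] = random.choice(free)
    return first, second, moved

def red_changed(first, second):
    """A perfect filterer compares only the red targets."""
    return set(first["red"]) != set(second["red"])

if __name__ == "__main__":
    trials, correct = 1000, 0
    for _ in range(trials):
        first, second, moved = make_trial()
        if red_changed(first, second) == moved:
            correct += 1
    # A subject who filters out the blue distractors scores 100 percent;
    # trouble ignoring them is what pulls accuracy down.
    print(f"Accuracy when blue distractors are ignored: {correct / trials:.2%}")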
The multitaskers then did a significantly worse job than the non-multitaskers at
recognizing whether red rectangles had changed position. In other words, they
had trouble filtering out the blue ones — the irrelevant information.
So, too, the multitaskers took longer than non-multitaskers to switch among
tasks, like differentiating vowels from consonants and then odd from even
numbers. The multitaskers were shown to be less efficient at juggling problems.
Other tests at Stanford, an important center for research in this fast-growing
field, showed multitaskers tended to search for new information rather than
accept a reward for putting older, more valuable information to work.
Researchers say these findings point to an interesting dynamic: multitaskers
seem more sensitive than non-multitaskers to incoming information.
The results also illustrate an age-old conflict in the brain, one that
technology may be intensifying. A portion of the brain acts as a control tower,
helping a person focus and set priorities. More primitive parts of the brain,
like those that process sight and sound, demand that it pay attention to new
information, bombarding the control tower when they are stimulated.
Researchers say there is an evolutionary rationale for the pressure this barrage
puts on the brain. The lower-brain functions alert humans to danger, like a
nearby lion, overriding goals like building a hut. In the modern world, the
chime of incoming e-mail can override the goal of writing a business plan or
playing catch with the children.
“Throughout evolutionary history, a big surprise would get everyone’s brain
thinking,” said Clifford Nass, a communications professor at Stanford. “But
we’ve got a large and growing group of people who think the slightest hint that
something interesting might be going on is like catnip. They can’t ignore it.”
Mr. Nass says the Stanford studies are important because they show
multitasking’s lingering effects: “The scary part for guys like Kord is, they
can’t shut off their multitasking tendencies when they’re not multitasking.”
Melina Uncapher, a neurobiologist on the Stanford team, said she and other
researchers were unsure whether the muddied multitaskers were simply prone to
distraction and would have had trouble focusing in any era. But she added that
the idea that information overload causes distraction was supported by more and
more research.
A study at the University of California, Irvine, found that people interrupted
by e-mail reported significantly increased stress compared with those left to
focus. Stress hormones have been shown to reduce short-term memory, said Gary
Small, a psychiatrist at the University of California, Los Angeles.
Preliminary research shows some people can more easily juggle multiple
information streams. These “supertaskers” represent less than 3 percent of the
population, according to scientists at the University of Utah.
Other research shows computer use has neurological advantages. In imaging
studies, Dr. Small observed that Internet users showed greater brain activity
than nonusers, suggesting they were growing their neural circuitry.
At the University of Rochester, researchers found that players of some
fast-paced video games can track the movement of a third more objects on a
screen than nonplayers. They say the games can improve reaction times and the ability
to pick out details amid clutter.
“In a sense, those games have a very strong both rehabilitative and educational
power,” said the lead researcher, Daphne Bavelier, who is working with others in
the field to channel these changes into real-world benefits like safer driving.
There is a vibrant debate among scientists over whether technology’s influence
on behavior and the brain is good or bad, and how significant it is.
“The bottom line is, the brain is wired to adapt,” said Steven Yantis, a
professor of brain sciences at Johns Hopkins University. “There’s no question
that rewiring goes on all the time,” he added. But he said it was too early to
say whether the changes caused by technology were materially different from
others in the past.
Mr. Ophir is loath to call the cognitive changes bad or good, though the impact
on analysis and creativity worries him.
He is not just worried about other people. Shortly after he came to Stanford, a
professor thanked him for being the one student in class paying full attention
and not using a computer or phone. But he recently began using an iPhone and
noticed a change; he felt its pull, even when playing with his daughter.
“The media is changing me,” he said. “I hear this internal ping that says: check
e-mail and voice mail.”
“I have to work to suppress it.”
Kord Campbell does not bother to suppress it, or no longer can.
Interrupted by a Corpse
It is a Wednesday in April, and in 10 minutes, Mr. Campbell has an online
conference call that could determine the fate of his new venture, called Loggly.
It makes software that helps companies understand the clicking and buying
patterns of their online customers.
Mr. Campbell and his colleagues, each working from a home office, are
frantically trying to set up a program that will let them share images with
executives at their prospective partner.
But at the moment when Mr. Campbell most needs to focus on that urgent task,
something else competes for his attention: “Man Found Dead Inside His Business.”
That is the tweet that appears on the left-most of Mr. Campbell’s array of
monitors, which he has expanded to three screens, at times adding a laptop and
an iPad.
On the left screen, Mr. Campbell follows the tweets of 1,100 people, along with
instant messages and group chats. The middle monitor displays a dark field
filled with computer code, along with Skype, a service that allows Mr. Campbell
to talk to his colleagues, sometimes using video. The monitor on the right keeps
e-mail, a calendar, a Web browser and a music player.
Even with the meeting fast approaching, Mr. Campbell cannot resist the tweet
about the corpse. He clicks on the link in it, glances at the article and
dismisses it. “It’s some article about something somewhere,” he says, annoyed by
the ads for jeans popping up.
The program gets fixed, and the meeting turns out to be fruitful: the partners
are ready to do business. A colleague says via instant message: “YES.”
Other times, Mr. Campbell’s information juggling has taken a more serious toll.
A few weeks earlier, he once again overlooked an e-mail message from a
prospective investor. Another time, Mr. Campbell signed the company up for the
wrong type of business account on Amazon.com, costing $300 a month for six
months before he got around to correcting it. He has burned hamburgers on the
grill, forgotten to pick up the children and lingered in the bathroom playing
video games on an iPhone.
Mr. Campbell can be unaware of his own habits. In a two-and-a-half hour stretch
one recent morning, he switched rapidly between e-mail and several other
programs, according to data from RescueTime, which monitored his computer use
with his permission. But when asked later what he was doing in that period, Mr.
Campbell said he had been on a long Skype call, and “may have pulled up an
e-mail or two.”
The kind of disconnection Mr. Campbell experiences is not an entirely new
problem, of course. As they did in earlier eras, people can become so lost in
work, hobbies or TV that they fail to pay attention to family.
Mr. Campbell concedes that, even without technology, he may work or play
obsessively, just as his father immersed himself in crossword puzzles. But he
says this era is different because he can multitask anyplace, anytime.
“It’s a mixed blessing,” he said. “If you’re not careful, your marriage can fall
apart or your kids can be ready to play and you’ll get distracted.”
The Toll on Children
Father and son sit in armchairs. Controllers in hand, they engage in a fierce
video game battle, displayed on the nearby flat-panel TV, as Lily watches.
They are playing Super Smash Bros. Brawl, a cartoonish animated fight between
characters that battle using anvils, explosives and other weapons.
“Kill him, Dad,” Lily screams. To no avail. Connor regularly beats his father,
prompting expletives and, once, a thrown pillow. But there is bonding and mutual
respect.
“He’s a lot more tactical,” says Connor. “But I’m really good at quick
reflexes.”
Screens big and small are central to the Campbell family’s leisure time. Connor
and his mother relax while watching TV shows like “Heroes.” Lily has an iPod
Touch, a portable DVD player and her own laptop, which she uses to watch videos,
listen to music and play games.
Lily, a second-grader, is allowed only an hour a day of unstructured time, which
she often spends with her devices. The laptop can consume her.
“When she’s on it, you can holler her name all day and she won’t hear,” Mrs.
Campbell said.
Researchers worry that constant digital stimulation like this creates attention
problems for children with brains that are still developing, who already
struggle to set priorities and resist impulses.
Connor’s troubles started late last year. He could not focus on homework. No
wonder, perhaps. On his bedroom desk sit two monitors, one with his music
collection, one with Facebook and Reddit, a social site with news links that he
and his father love. His iPhone allowed him to text relentlessly with his girlfriend.
When he studied, “a little voice would be saying, ‘Look up’ at the computer, and
I’d look up,” Connor said. “Normally, I’d say I want to only read for a few
minutes, but I’d search every corner of Reddit and then check Facebook.”
His Web browsing informs him. “He’s a fact hound,” Mr. Campbell brags. “Connor
is, other than programming, extremely technical. He’s 100 percent Internet
savvy.”
But the parents worry too. “Connor is obsessed,” his mother said. “Kord says we
have to teach him balance.”
So in January, they held a family meeting. Study time now takes place in a group
setting at the dinner table after everyone has finished eating. It feels, Mr.
Campbell says, like togetherness.
No Vacations
For spring break, the family rented a cottage in Carmel, Calif. Mrs. Campbell
hoped everyone would unplug.
But the day before they left, the iPad from Apple came out, and Mr. Campbell
snapped one up. The next night, their first on vacation, “We didn’t go out to
dinner,” Mrs. Campbell mourned. “We just sat there on our devices.”
She rallied the troops the next day to the aquarium. Her husband joined them for
a bit but then begged off to do e-mail on his phone.
Later she found him playing video games.
The trip came as Mr. Campbell was trying to raise several million dollars for
his new venture, a goal that he achieved. Brenda said she understood that his
pursuit required intensity but was less understanding of the accompanying surge
in video game playing.
His behavior brought about a discussion between them. Mrs. Campbell said he told
her that he was capable of logging off, citing a trip to Hawaii several years
ago that they called their second honeymoon.
“What trip are you thinking about?” she said she asked him. She recalled that he
had spent two hours a day online in the hotel’s business center.
On Thursday, their fourth day in Carmel, Mr. Campbell spent the day at the beach
with his family. They flew a kite and played whiffle ball.
Connor unplugged too. “It changes the mood of everything when everybody is
present,” Mrs. Campbell said.
The next day, the family drove home, and Mr. Campbell disappeared into his
office.
Technology use is growing for Mrs. Campbell as well. She divides her time
between keeping the books of her husband’s company, homemaking and working at
the school library. She checks e-mail 25 times a day, sends texts and uses
Facebook.
Recently, she was baking peanut butter cookies for Teacher Appreciation Day when
her phone chimed in the living room. She answered a text, then became lost in
Facebook, forgot about the cookies and burned them. She started a new batch, but
heard the phone again, got lost in messaging, and burned those too. Out of
ingredients and shamed, she bought cookies at the store.
She feels less focused and has trouble completing projects. Some days, she
promises herself she will ignore her device. “It’s like a diet — you have good
intentions in the morning and then you’re like, ‘There went that,’ ” she said.
Mr. Nass at Stanford thinks the ultimate risk of heavy technology use is that it
diminishes empathy by limiting how much people engage with one another, even in
the same room.
“The way we become more human is by paying attention to each other,” he said.
“It shows how much you care.”
That empathy, Mr. Nass said, is essential to the human condition. “We are at an
inflection point,” he said. “A significant fraction of people’s experiences are
now fragmented.”
October 5, 2009
The New York Times
By BRAD STONE and ASHLEE VANCE
SAN FRANCISCO — The high-tech industry has been working itself into paroxysms of
excitement lately over an idea that is not exactly new: tablet computers.
Quietly, several high-tech companies are lining up to deliver versions of these
keyboard-free, touch-screen portable machines in the next few months. Industry
watchers have their eye on Apple in particular to sell such a device by early
next year.
Tablets have been around in various forms for two decades, thus far delivering
little other than memorable failure. Nonetheless, the new batch of devices has
gripped the imagination of tech executives, bloggers and gadget hounds, who are
projecting their wildest dreams onto these literal blank slates.
In these visions, tablets will save the newspaper and book publishing
industries, present another way to watch television and movies, play video
games, and offer a visually rich way to enjoy the Web and the expanding world of
mobile applications.
“Desktops, laptops — we already know how those work,” said Brian Lam, editorial
director of the popular gadget site Gizmodo, which reports and hypothesizes
almost daily about these devices. Tablets, he said, “are one of the last few
mysteries left.”
Tablet computers were first conceived as a way to supplant plain old paper, in
the same way that PCs replaced the typewriter.
In 1993, Apple’s Newton MessagePad, with its expansive screen and stylus pen,
became known less for its innovative features than for being lampooned in
“Doonesbury,” which ridiculed the device for its flawed handwriting recognition.
Steven P. Jobs killed the Newton when he returned to Apple in 1997.
Then in 2001, at Comdex, the industry trade show, Bill Gates introduced new
Windows software for tablets with a bold prediction: within five years, he said,
tablets “will be the most popular form of PC sold in America.” It didn’t happen,
of course. Tablets running Windows sell only a few hundred thousand units a
year, mostly in business fields like health care and financial services.
There were basic problems with these early tablets: they cost too much and did
not do enough.
“Software engineers got ahead of the hardware capabilities,” said Paul Jackson,
a consumer product analyst at Forrester Research. “But we may be finally getting
to the point where the dreams and aspirations of those designers are actually
meeting capable and reasonably priced technology.”
You can thank Moore’s Law and the inexorable advance of technology for that.
Integrated microchips now combine wireless connectivity and support for features
like multimedia, GPS functions and rich graphics. They are also more
energy-efficient.
At the same time, the iPhone and its imitators have demonstrated that new
tactile touch screens work and that people are comfortable with them, in a way
they never got accustomed to using earlier tablets and stylus pens.
“We darn well should be about ready to take advantage of this stuff. It’s time,”
said Bill Buxton, a researcher at Microsoft who has been working on multitouch
systems for 20 years, and has a comprehensive collection of tablets and touch
screens he keeps in his office in Toronto.
The drumbeat of tablet product introductions has already begun. In June, Archos,
a French consumer electronics company, began selling a small touch-screen tablet
running Google’s Android software. Later this month, it will introduce another
tablet that runs on Microsoft’s Windows 7, which has built-in support for touch
screens.
“A road warrior doesn’t want to take a big clamshell netbook with him,” said
Frédéric Balaÿ, vice president for marketing at Archos.
The industry blog TechCrunch has also commissioned its own Web tablet, called
the CrunchPad, which it has said it will start selling later this year.
Despite its past bruises in the tablet business, Microsoft appears ready to try
again. In September, images of a booklike Microsoft device called Courier, with
two 7-inch color screens, surfaced on Gizmodo.
In an interview, Steven A. Ballmer, Microsoft’s chief executive, would not
discuss that product in particular, but said the company devises such prototypes
all the time, so it can take them to its hardware partners. Still, rumors of a
Microsoft tablet computer sparked interest. “I got an e-mail from some customer
who said, ‘I want that,’ ” Mr. Ballmer said.
Apple’s rumored tablet is the most highly anticipated of the lot. Analysts
expect Apple to introduce it early next year — a sort of expanded, souped-up
version of the iPod Touch, priced at around $700.
Last week, Apple rehired the original chief marketer of its old Newton, Michael
Tchao, who was working at Nike. Mr. Tchao’s former Apple colleagues believe he
will help market this new device.
Colin Smith, an Apple spokesman, declined to comment on the company’s
recruitment or product plans. But Apple’s tablet will most likely have little in
common with the Newton, which was essentially a personal digital assistant. The
new crop of tablets is being viewed as more flexible — gadgets that combine
elements of the iPhone, e-book readers like the Kindle and laptops.
Apple has been working on such a Swiss Army knife tablet since at least 2003,
according to several former employees. One prototype, developed in 2003, used
PowerPC microchips made by I.B.M., which were so power-hungry that they quickly
drained the battery.
“It couldn’t be built. The battery life wasn’t long enough, the graphics
performance was not enough to do anything and the components themselves cost
more than $500,” said Joshua A. Strickland, a former Apple engineer whose name
is on several of the company’s patents for multitouch technology.
Another former Apple executive who was there at the time said the tablets kept
getting shelved at Apple because Mr. Jobs, whose incisive critiques are often
memorable, asked, in essence, what they were good for besides surfing the Web in
the bathroom.
The success of the iPhone may have partially helped to answer that question. As
of last month, developers had created 85,000 applications for the iPhone and
iPod Touch — video games, social networking software, restaurant finders and
more. Analysts believe that all those programs will immediately work on the new
tablet while developers begin to tailor new software for the larger screen.
Despite the preponderance of apps, there is still the persistent question of
whether regular people will really find a use for tablet computers. Smaller
cellphones are increasingly multipurpose and fit nicely in a jacket pocket. And
low-end laptops are inexpensive, run a full-fledged operating system and offer
the luxury of a keyboard.
“I can imagine something like the iPhone with a much bigger screen being a
gorgeous device with great capacity, but I don’t know where I would fit that
into my life,” said a former Apple executive, who declined to be named because
of Apple’s secrecy policies, but who anticipates an Apple tablet next year.
“Those are the debates that have been happening inside Apple for quite some
time.”
MOUNTAIN VIEW, Calif. — The computer industry has a lot riding on your fingers.
For years, companies have dabbled with the touch-screen technology that lets
people poke icons on a display to accomplish tasks like picking a seat at an
airport check-in kiosk. Apple elevated such technology from a novelty to a
must-have feature on mobile devices with its iPhone. People can flip through
pictures with a flick of a finger or make a document larger by pressing two
fingers against the screen and stretching them out.
Now both personal computer manufacturers and software makers hope to do more
with touch on larger devices by giving people a 10-fingered go at their screens.
“You don’t even operate your TV with two fingers,” said Amichai Ben-David, the
chief executive officer of N-trig, which produces touch-screen technology for PC
makers. “In order for this to feel really natural, you need more than two
fingers for sure.”
The PC industry hopes the feature spurs sales. PC makers like Hewlett-Packard
and Dell have been clobbered during the recession as struggling businesses drop
computer upgrades to the bottom of their to-do lists. Consumers have shown more
interest in new machines, but they are buying cheap, tiny laptops rather than
decked-out goliaths.
H.P., Dell, Intel and Microsoft expect that when companies and consumers
increase their spending, touch technology will be one of the things that nudge
them to upgrade. Computers with the special screens will probably cost consumers
about $100 more than standard machines.
H.P. has been selling a PC with an early version of touch technology. The $1,150
TouchSmart PC has been popular, H.P. says, particularly in kitchens as a family
computer. But outside of science-fiction films, touch computers have been met
with lukewarm reactions. Tabletlike computers that ship with plastic pens for
marking on screens remain a niche in the overall PC market, as do pure touch
machines. Mr. Ben-David said that about two million of about 300 million PCs
sold last year were touch computers.
H.P. has already been pushing touch technology to large businesses. It sells a
custom touch interface for both desktops and laptops. Customers can turn these
machines into bespoke kiosks for, say, ordering merchandise at a sporting event
or flipping through a menu while waiting at a restaurant.
The PC industry wants to make touch functions more sophisticated and widespread.
On-screen objects could be twisted and turned with several fingers, mimicking
the action used in real life. The next version of Windows from Microsoft,
Windows 7, will usher in a new era of touch technology when it appears on PCs
later this year, according to Mr. Ben-David. Backed by Microsoft, Israel-based
N-trig uses a combination of software and sensors to create a special type of
computer screen that can interact with pens and fingers. N-trig’s technology
works by pumping an electrical signal through the screen. When a finger hits the
screen, the electricity is discharged. Software interprets that to move graphics
on the screen. The company claims that its technology works better on the larger
displays of laptops and PCs since it handles many inputs at once.
Working together, Microsoft and N-trig have created a type of software interface
that lets other companies add touch functions to their programs. Such touch
software can handle lots of fingers hitting a screen at once rather than just
relying on one or two digits, as most of today’s touch screens do.
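Neither N-trig nor Microsoft has published that interface here, so the following Python sketch is only a rough illustration of what handling many fingers at once can mean in software: each frame of touch points is reduced to a centroid and a spread, and the change in spread between frames becomes a zoom factor, whether two fingers are down or ten. The event format and function names are assumptions made for the example, not N-trig's actual API.

import math
from dataclasses import dataclass

@dataclass
class TouchPoint:
    """One finger contact reported by the touch hardware (illustrative format)."""
    touch_id: int
    x: float
    y: float

def spread(points):
    """Average distance of the touch points from their centroid."""
    cx = sum(p.x for p in points) / len(points)
    cy = sum(p.y for p in points) / len(points)
    return sum(math.hypot(p.x - cx, p.y - cy) for p in points) / len(points)

def pinch_scale(previous_frame, current_frame):
    """Zoom factor implied by how far apart the fingers have moved.

    Works with two fingers or ten: the scale is the ratio of the current
    spread of the contact points to the previous spread.
    """
    if len(previous_frame) < 2 or len(current_frame) < 2:
        return 1.0  # A single finger pans; it does not zoom.
    return spread(current_frame) / spread(previous_frame)

if __name__ == "__main__":
    before = [TouchPoint(1, 100, 100), TouchPoint(2, 200, 100), TouchPoint(3, 150, 200)]
    after = [TouchPoint(1, 80, 90), TouchPoint(2, 220, 110), TouchPoint(3, 150, 230)]
    print(f"Apply zoom factor: {pinch_scale(before, after):.2f}")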
N-trig hopes to build more momentum later this year, when three more PC makers
are set to join H.P. and Dell as backers of the touch technology. It did not
disclose the names of those companies.
The big question is whether companies can create software that makes touch
useful rather than a mere curiosity.
Corel, which makes document and photo editing software, also plans touch
products that rely on N-trig’s technology for Windows 7.
SpaceClaim, which makes software for designing objects in 3-D, has taken a
business-oriented approach to touch. Its software, which will work with Windows
7, creates 3-D models that can be turned, pinched and altered via two-handed
touches. Frank DeSimone, the head of development, urges other software makers to
try something new and stick with the technology rather than just replicating the
functions of a mouse.
“A lot of people say they will support touch, but they do a disservice to
everyone by not doing anything interesting,” he said.
SANTA CLARA, Calif. — Intel has worked hard and spent a lot of
money over the years to shape its image: It is the company that celebrates its
quest to make computer chips ever smaller, faster and cheaper with a quick
five-note jingle at the end of its commercials.
But as Intel tries to expand beyond the personal computer chip business, it is
changing in subtle ways. For the first time, its long unheralded software
developers, more than 3,000 of them, have stolen some of the spotlight from its
hardware engineers. These programmers find themselves at the center of Intel’s
forays into areas like mobile phones and video games.
The most attention-grabbing element of Intel’s software push is a version of the
open-source Linux operating system called Moblin. It represents a direct assault
on the Windows franchise of Microsoft, Intel’s longtime partner.
“This is a very determined, risky effort on Intel’s part,” said Mark
Shuttleworth, the chief executive of Canonical, which makes another version of
Linux called Ubuntu.
The Moblin software resembles Windows or Apple’s Mac OS X to a degree, handling
the basic functions of running a computer. But it has a few twists as well that
Intel says make it better suited for small mobile devices.
For example, Moblin fires up and reaches the Internet in about seven seconds,
then displays a novel type of start-up screen. People will find their
appointments listed on one side of the screen, along with their favorite
programs. But the bulk of the screen is taken up by cartoonish icons that show
things like social networking updates from friends, photos and recently used
documents.
With animated icons and other quirky bits and pieces, Moblin looks like a fresh
take on the operating system. Some companies hope it will give Microsoft a
strong challenge in the market for the small, cheap laptops commonly known as
netbooks. A polished second version of the software, which is in trials, should
start appearing on a variety of netbooks this summer.
“We really view this as an opportunity and a game changer,” said Ronald W.
Hovsepian, the chief executive of Novell, which plans to offer a customized
version of Moblin to computer makers. Novell views Moblin as a way to extend its
business selling software and services related to Linux.
While Moblin fits netbooks well today, it was built with smartphones in mind.
Those smartphones explain why Intel was willing to needle Microsoft.
Intel has previously tried and failed to carve out a prominent stake in the
market for chips used in smaller computing devices like phones. But the company
says one of its newer chips, called Atom, will solve this riddle and help it
compete against the likes of Texas Instruments and Qualcomm.
The low-power, low-cost Atom chip sits inside most of the netbooks sold today,
and smartphones using the chip could start arriving in the next couple of years.
To make Atom a success, Intel plans to use software for leverage. It needs
Moblin because most of the cellphone software available today runs on chips
whose architecture is different from Atom’s. To make Atom a worthwhile choice
for phone makers, there must be a supply of good software that runs on it.
“The smartphone is certainly the end goal,” said Doug Fisher, a vice president
in Intel’s software group. “It’s absolutely critical for the success of this
product.”
Though large, Intel’s software group has remained out of the spotlight for
years. Intel considers its software work a silent helping hand for computer
makers.
Mostly, the group sells tools that help other software developers take advantage
of features in Intel’s chips. It also offers free consulting services to help
large companies wring the most performance out of their code, in a bid to sell
more chips.
Renee J. James, Intel’s vice president in charge of software, explained, “You
can’t just throw hardware out there into the world.”
Intel declines to disclose its revenue from these tools, but it is a tiny
fraction of the close to $40 billion in sales Intel racks up every year.
Still, the software group is one of the largest at Intel and one of the largest
such organizations at any company.
In the last few years, Intel’s investment in Linux, the main rival to Windows,
has increased. Intel has hired some of the top Linux developers, including, last year, Alan Cox from Red Hat, the leading Linux seller. Intel pays these
developers to improve Linux as a whole and to further the company’s own projects
like Moblin.
“Intel definitely ranks pretty highly when it comes to meaningful
contributions,” Linus Torvalds, who created the core of Linux and maintains the
software, wrote in an e-mail message. “They went from apparently not having much
of a strategy at all to having a rather wide team.”
Intel has also bought software companies. Last year, it acquired OpenedHand, a
company whose work has turned into the base of the new Moblin user interface.
It has also bought a handful of software companies with expertise in gaming and
graphics technology. Such software is meant to create a foundation to support
Intel’s release of new high-powered graphics chips next year. Intel hopes the
graphics products will let it compete better against Nvidia and Advanced Micro
Devices and open up another new business.
Intel tries to play down its competition with Microsoft. Since Moblin is open
source, anyone can pick it up and use it. Companies like Novell will be the ones
actually offering the software to PC makers, while Intel will stay in the
background. Still, Ms. James says that Intel’s relationship with Microsoft has
turned more prickly.
“It is not without its tense days,” she said.
Microsoft says Intel faces serious hurdles as it tries to stake a claim in the
operating system market.
“I think it will introduce some challenges for them just based on our experience
of having built operating systems for 25 years or so,” said James DeBragga, the
general manager of Microsoft’s Windows consumer team.
While Linux started out as a popular choice on netbooks, Microsoft now dominates
the market. Microsoft doubts whether something like Moblin’s glossy interface
will be enough to woo consumers who are used to Windows.
Intel says people are ready for something new on mobile devices, which are
geared more to the Internet than to running desktop-style programs.
“I am a risk taker,” Ms. James of Intel said. “I have that outlook that if
there’s a possibility of doing something different, we should explore trying
it.”
New software from I.B.M. can suck up huge volumes of data from many sources and
quickly identify correlations within it. The company says it expects the
software to be useful in analyzing finance, health care and even space weather.
Bo Thidé, a scientist at the Swedish Institute of Space Physics, has been
testing an early version of the software as he studies the ways in which things
like gas clouds and particles cast off by the sun can disrupt communications
networks on Earth. The new software, which I.B.M. calls stream processing, makes
it possible for Mr. Thidé and his team of researchers to gather and analyze vast
amounts of information at a record pace.
“For us, there is no chance in the world that you can think about storing data
and analyzing it tomorrow,” Mr. Thidé said. “There is no tomorrow. We need a
smart system that can give you hints about what is happening out there right
now.”
I.B.M., based in Armonk, N.Y., spent close to six years working on the software
and has just begun selling a product based on it, called System S. The
company expects it to encourage breakthroughs in fields like finance and city
management by helping people better understand patterns in data.
Steven A. Mills, I.B.M.’s senior vice president for software, notes that
financial companies have spent years trying to gain trading edges by sorting
through various sets of information. “The challenge in that industry has not
been ‘Could you collect all the data?’ but ‘Could you collect it all together
and analyze it in real time?’ ” Mr. Mills said.
To that end, the new software harnesses advances in computing and networking
horsepower in a fashion that analysts and customers describe as unprecedented.
Instead of creating separate large databases to track things like currency
movements, stock trading patterns and housing data, the System S software can
meld all of that information together. In addition, it could theoretically then
layer on databases that tracked current events, like news headlines on the
Internet or weather fluctuations, to try to gauge how such factors interplay
with the financial data.
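The general technique can be illustrated with a short sketch: keep a sliding window of the most recent observations from each feed and continuously measure how strongly the feeds move together as new data arrives. The code below is only an illustration of that idea, not I.B.M.'s System S interface; the feed names, the shared "shock" term, the window size and the alert threshold are all invented for the example.

    # Illustrative sketch of windowed, on-the-fly correlation across live feeds.
    # Not I.B.M.'s System S; all names and parameters here are invented.
    import random
    from collections import deque
    from statistics import correlation  # requires Python 3.10+

    random.seed(0)
    WINDOW = 50  # number of most recent observations kept per feed

    def tick():
        """Produce one simulated observation per feed. A common 'shock' term
        makes the two feeds genuinely correlated, so the alert has something
        to find."""
        shock = random.gauss(0, 1)
        return {
            "currency": shock + random.gauss(0, 0.5),
            "stocks": shock + random.gauss(0, 0.5),
        }

    windows = {"currency": deque(maxlen=WINDOW), "stocks": deque(maxlen=WINDOW)}

    for step in range(200):
        for name, value in tick().items():   # new observations "arrive"
            windows[name].append(value)

        # Once the windows are full, periodically check how the feeds co-move.
        if len(windows["currency"]) == WINDOW and step % 25 == 0:
            r = correlation(list(windows["currency"]), list(windows["stocks"]))
            if abs(r) > 0.7:  # arbitrary alert threshold for the sketch
                print(f"step {step}: currency/stocks correlation {r:+.2f}")

The point of the pattern is that the analysis runs while the data is still in flight; nothing is parked in a database to be queried tomorrow.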
Most computers, of course, can digest large stores of information if given
enough time. But I.B.M. has succeeded in performing very quick analyses on
larger hunks of combined data than most companies are used to handling.
“It’s that combination of size and speed that had yet to be solved,” said Gordon
Haff, an analyst at Illuminata, a technology industry research firm.
Conveniently for I.B.M., the System S software matured in time to match up with
the company’s “Smarter Planet” campaign. I.B.M. has flooded the airwaves with
commercials about using technology to run things like power grids and hospitals
more efficiently.
The company suggests, for example, that a hospital could tap the System S
technology to monitor not only individual patients but also entire patient
databases, as well as medication and diagnostics systems. If all goes according
to plan, the computing systems could alert nurses and doctors to emerging
problems.
Analysts say the technology could also provide companies with a new edge as they
grapple with doing business on a global scale.
“With globalization, more and more markets are heading closer to perfect
competition models,” said Dan Olds, an analyst with Gabriel Consulting. “This
means that companies have to get smarter about how they use their data and find
previously unseen opportunities.”
Buying such an advantage from I.B.M. has its price. The company will charge at
least hundreds of thousands of dollars for the software, Mr. Mills said.
October 12, 2008
The New York Times
By RICHARD DOOLING
Omaha
“BEWARE of geeks bearing formulas.” So saith Warren Buffett, the Wizard of
Omaha. Words to bear in mind as we bail out banks and buy up mortgages and tweak
interest rates and nothing, nothing seems to make any difference on Wall Street
or Main Street. Years ago, Mr. Buffett called derivatives “weapons of financial
mass destruction” — an apt metaphor considering that the Manhattan Project’s
math and physics geeks bearing formulas brought us the original weapon of mass
destruction, at Trinity in New Mexico on July 16, 1945.
In a 1981 documentary called “The Day After Trinity,” Freeman Dyson, a reigning
gray eminence of math and theoretical physics, as well as an ardent proponent of
nuclear disarmament, described the seductive power that brought us the ability
to create atomic energy out of nothing.
“I have felt it myself,” he warned. “The glitter of nuclear weapons. It is
irresistible if you come to them as a scientist. To feel it’s there in your
hands, to release this energy that fuels the stars, to let it do your bidding.
To perform these miracles, to lift a million tons of rock into the sky. It is
something that gives people an illusion of illimitable power, and it is, in some
ways, responsible for all our troubles — this, what you might call technical
arrogance, that overcomes people when they see what they can do with their
minds.”
The Wall Street geeks, the quantitative analysts (“quants”) and masters of “algo
trading” probably felt the same irresistible lure of “illimitable power” when
they discovered “evolutionary algorithms” that allowed them to create vast
empires of wealth by deriving the dependence structures of portfolio credit
derivatives.
What does that mean? You’ll never know. Over and over again, financial experts
and wonkish talking heads endeavor to explain these mysterious, “toxic”
financial instruments to us lay folk. Over and over, they ignobly fail, because
we all know that no one understands credit default obligations and derivatives,
except perhaps Mr. Buffett and the computers who created them.
Somehow the genius quants — the best and brightest geeks Wall Street firms could
buy — fed $1 trillion in subprime mortgage debt into their supercomputers, added
some derivatives, massaged the arrangements with computer algorithms and — poof!
— created $62 trillion in imaginary wealth. It’s not much of a stretch to
imagine that all of that imaginary wealth is locked up somewhere inside the
computers, and that we humans, led by the silverback males of the financial
world, Ben Bernanke and Henry Paulson, are frantically beseeching the monolith
for answers. Or maybe we are lost in space, with Dave the astronaut pleading,
“Open the bank vault doors, Hal.”
As the current financial crisis spreads (like a computer virus) on the earth’s
nervous system (the Internet), it’s worth asking if we have somehow managed to
colossally outsmart ourselves using computers. After all, the Wall Street titans
loved swaps and derivatives because they were totally unregulated by humans.
That left nobody but the machines in charge.
How fitting then, that almost 30 years after Freeman Dyson described the almost
unspeakable urges of the nuclear geeks creating illimitable energy out of
equations, his son, George Dyson, has written an essay (published at Edge.org)
warning about a different strain of technical arrogance that has brought the
entire planet to the brink of financial destruction. George Dyson is an
historian of technology and the author of “Darwin Among the Machines,” a book
that warned us a decade ago that it was only a matter of time before technology
out-evolves us and takes over.
His new essay — “Economic Dis-Equilibrium: Can You Have Your House and Spend It
Too?” — begins with a history of “stock,” originally a stick of hazel, willow or
alder wood, inscribed with notches indicating monetary amounts and dates. When
funds were transferred, the stick was split into identical halves — with one
side going to the depositor and the other to the party safeguarding the money —
and represented proof positive that gold had been deposited somewhere to back it
up. That was good enough for 600 years, until we decided that we needed more
speed and efficiency.
Making money, it seems, is all about the velocity of moving it around, so that
it can exist in Hong Kong one moment and Wall Street a split second later. “The
unlimited replication of information is generally a public good,” George Dyson
writes. “The problem starts, as the current crisis demonstrates, when
unregulated replication is applied to money itself. Highly complex
computer-generated financial instruments (known as derivatives) are being
produced, not from natural factors of production or other goods, but purely from
other financial instruments.”
It was easy enough for us humans to understand a stick or a dollar bill when it
was backed by something tangible somewhere, but only computers can understand
and derive a correlation structure from observed collateralized debt obligation
tranche spreads. Which leads us to the next question: Just how much of the
world’s financial stability now lies in the “hands” of computerized trading
algorithms?
•
Here’s a frightening party trick that I learned from the futurist Ray Kurzweil.
Read this excerpt and then I’ll tell you who wrote it:
But we are suggesting neither that the human race would voluntarily turn power
over to the machines nor that the machines would willfully seize power. What we
do suggest is that the human race might easily permit itself to drift into a
position of such dependence on the machines that it would have no practical
choice but to accept all of the machines’ decisions. ... Eventually a stage may
be reached at which the decisions necessary to keep the system running will be
so complex that human beings will be incapable of making them intelligently. At
that stage the machines will be in effective control. People won’t be able to
just turn the machines off, because they will be so dependent on them that
turning them off would amount to suicide.
Brace yourself. It comes from the Unabomber’s manifesto.
Yes, Theodore Kaczynski was a homicidal psychopath and a paranoid kook, but he
was also a bloodhound when it came to scenting all of the horrors technology
holds in store for us. Hence his mission to kill technologists before machines
commenced what he believed would be their inevitable reign of terror.
•
We are living, we have long been told, in the Information Age. Yet now we are
faced with the sickening suspicion that technology has run ahead of us. Man is a
fire-stealing animal, and we can’t help building machines and machine
intelligences, even if, from time to time, we use them not only to outsmart
ourselves but to bring us right up to the doorstep of Doom.
We are still fearful, superstitious and all-too-human creatures. At times, we
forget the magnitude of the havoc we can wreak by off-loading our minds onto
super-intelligent machines, that is, until they run away from us, like mad
sorcerers’ apprentices, and drag us up to the precipice for a look down into the
abyss.
As the financial experts all over the world use machines to unwind Gordian knots
of financial arrangements so complex that only machines can make — “derive” —
and trade them, we have to wonder: Are we living in a bad sci-fi movie? Is the
Matrix made of credit default swaps?
When Treasury Secretary Paulson (looking very much like a frightened primate)
came to Congress seeking an emergency loan, Senator Jon Tester of Montana, a
Democrat still living on his family homestead, asked him: “I’m a dirt farmer.
Why do we have one week to determine that $700 billion has to be appropriated or
this country’s financial system goes down the pipes?”
“Well, sir,” Mr. Paulson could well have responded, “the computers have demanded
it.”
September 13, 2008
The New York Times
By ROBERT PEAR
WASHINGTON — Countless federal records are being lost to posterity because
federal employees, grappling with a staggering growth in electronic records, do
not regularly preserve the documents they create on government computers, send
by e-mail and post on the Web.
Federal agencies have rushed to embrace the Internet and new information
technology, but their record-keeping efforts lag far behind. Moreover, federal
investigators have found widespread violations of federal record-keeping
requirements.
Many federal officials admit to a haphazard approach to preserving e-mail and
other electronic records of their work. Indeed, many say they are unsure what
materials they are supposed to preserve.
This confusion is causing alarm among historians, archivists, librarians,
Congressional investigators and watchdog groups that want to trace the
decision-making process and hold federal officials accountable. With the
imminent change in administrations, the concern about lost records has become
more acute.
“We expect to see the wholesale disappearance of materials on federal agency Web
sites,” said Mary Alice Baish, the Washington representative of the American
Association of Law Libraries, whose members are heavy users of government
records. “When new officials take office, they have new programs and policies,
and they want to make a fresh start.”
Richard Pearce-Moses, a former president of the Society of American Archivists,
said, “My biggest worry is that even with the best and brightest minds working
on this problem, the risks are so great that we may lose significant portions of
our history.”
The Web site of the Environmental Protection Agency lists more than 50 “broken
links” that once connected readers to documents on depletion of the ozone layer
of the atmosphere.
At least 20 documents have been removed from the Web site of the United States
Commission on Civil Rights. They include a draft report highly critical of the
civil rights policies of the Bush administration.
Problems in the White House e-mail system have been well publicized in court
cases and Congressional hearings. Officials at other federal agencies
acknowledge that their record-keeping systems are not much more advanced or
reliable.
Businesses and state and local governments face similar problems, on a smaller
scale.
“We are overwhelmed by the challenge of preserving digital information,” said
Robert P. Spindler, the chief archivist at the Arizona State University
Libraries.
For the federal government, the challenge of preserving records grows each
month, as employees create billions of e-mail messages. E-mail often replaces
telephone conversations and meetings that would not have been recorded in the
past.
In an effort to save money, federal agencies are publishing fewer reports on
paper and posting more on the Web. Increasingly, federal officials use blogs,
podcasts and videos to announce and defend their policies. Growing numbers of
federal employees do government business outside the office on personal
computers, using portable “flash drives” and e-mail services like Google Gmail
and Microsoft Hotmail.
In the past, clerks put most important government records in central agency
files. But record-keeping has become decentralized, and the government has fewer
clerical employees. Federal employees say they store many official records on
desktop computers, so the records are not managed in a consistent way.
“The Achilles’ heel of record-keeping is people,” said Jason R. Baron, the
director of litigation at the National Archives. “We used to have secretaries.
Now each of us with a desktop computer is his or her own record-keeper. That
creates some very difficult problems.”
Experts worry that items preserved in digital form may not be readily accessible
in the future because the equipment and software needed to read them will become
obsolete.
“All of us have stored personal memories or favorite music on eight-track tapes,
floppy disks or 8-millimeter film,” said Allen Weinstein, the archivist of the
United States. “In many cases, these technologies are now relics, and we have no
way to access the stored information. Imagine this problem multiplied millions
and millions of times. That’s what the federal government is facing.”
The National Archives is in the early stages of creating a permanent electronic
record-keeping system, seeking help from the San Diego Supercomputer Center at
the University of California, and from some of the nation’s best computer
scientists.
The electronic archive is behind schedule and over budget. But officials say
they hope that the project, being developed with Lockheed Martin, will be able
to take in huge quantities of White House records when President Bush leaves
office in January.
Kenneth Thibodeau, director of the electronic records archives program at the
National Archives, said that 32 million White House e-mail messages had been
preserved as records of the Clinton administration. He expects to receive
hundreds of millions from the Bush White House.
Disputes over White House records occurred at the end of the last three
administrations, and federal officials are bracing for more litigation in
January.
Courts have imposed severe penalties on companies that failed to provide
electronic records sought in litigation, and the government is subject to
similar penalties. A federal district judge found the Environmental Protection
Agency in contempt of court for destroying certain electronic records at the end
of the Clinton administration.
Warnings about the possible loss of electronic records come from many quarters.
In a recent report, the Government Accountability Office, an investigative arm
of Congress, described widespread violations of federal record-keeping
requirements. At several large agencies, the report said, “e-mail records of
senior officials were not consistently preserved.” Some officials keep tens of
thousands of messages in their e-mail accounts, where they “cannot be
efficiently searched,” and are not accessible to others.
The inspector general of the National Aeronautics and Space Administration found
similar problems. He surveyed 40 top officials and found that 93 percent of them
were violating federal requirements for preserving e-mail correspondence.
He reported that NASA might lose some of its “institutional memory” and might
have already lost records needed to protect the legal and financial rights of
the government.
The same federal laws apply to electronic and paper records, defined as
materials — in any form — that document government activities, policies or
decisions. A formal schedule defines how long each type of record must be kept.
In general, records cannot be deleted or destroyed without prior authority from
the National Archives, which permanently preserves records judged to be of
historical value.
Melanie Sloan, executive director of Citizens for Responsibility and Ethics in
Washington, a watchdog group, said: “Agency employees do not understand their
record-keeping obligations. At the most basic level, many agency employees do
not even understand what a federal record is, much less how it must be
preserved.”
In interviews, employees agreed.
“I don’t have a very good understanding of what the rules are — what we are
supposed to keep and what we don’t have to keep,” said Christina Pearson, an
assistant secretary of health and human services. “We are trying to clarify how
our policies apply to new electronic media like Web sites and e-mail.”
At federal agencies, the most common method of preserving important e-mail
messages and attachments is to print them on paper and store them in paper
files. Officials confirmed this at the Labor Department, the Transportation
Department and the Justice Department.
Thomas A. Scully, former administrator of the Centers for Medicare and Medicaid
Services, had job discussions with prospective employers while he was a federal
official in 2003. When questions were raised about the propriety of those
discussions, he tried to find some of his old e-mail messages. But he said:
“They were gone. I could not find anything. I was told that all my e-mails had
been deleted.”
When President Bill Clinton left office, the National Archives preserved
snapshots of agency Web sites as they existed on or just before Jan. 20, 2001.
The Archives decided recently that it would not take such snapshots at the end
of the Bush administration. “Most Web records do not warrant permanent
retention,” because they do not have “long-term historical value,” the Archives
said.
Many historians disagree. Several university libraries and the Internet Archive,
a nonprofit digital library based in San Francisco, are starting to do what the
federal government refuses to do: copy government Web sites, so they remain
available after Mr. Bush leaves office.
Alarmed at the possible loss of White House e-mail messages, the House passed a
bill in July that would require agencies to preserve more electronic records.
The vote was 286 to 137. Republican opponents said the requirements would be
onerous and costly. Mr. Bush has threatened to veto the bill, saying it could
“interfere with a president’s ability to carry out his or her constitutional and
statutory responsibilities.”
February 25, 2008
The New York Times
By JOHN MARKOFF
SAN FRANCISCO — On sabbatical in 2001 from Macromedia, Kevin Lynch, a software
developer, was frustrated that he could not get to his Web data when he was off
the Internet and annoyed that he could not get to his PC data when he was
traveling.
Why couldn’t he have access to all his information, like movie schedules and
word processing documents, in one place?
He hit upon an idea that he called “Kevincloud” and mocked up a quick
demonstration of the idea for executives at Macromedia, a software development
tools company. It took data stored on the Internet and used it interchangeably
with information on a PC’s hard drive. Kevincloud also blurred the line between
Internet and PC applications.
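The pattern is simple to sketch in general terms. The fragment below is a hypothetical illustration of that blurring, not Adobe's software; the URL, cache location and function name are invented. It prefers a fresh copy of some Web data but falls back to the copy on the local hard drive when the network is unavailable.

    # Hypothetical sketch of treating Web data and local data interchangeably.
    # The feed URL and cache path are invented for illustration.
    import json
    import os
    import urllib.request

    CACHE = os.path.expanduser("~/.kevincloud_cache.json")
    FEED_URL = "https://example.com/movie-schedule.json"  # hypothetical source

    def load_schedule():
        """Return the movie schedule, preferring a fresh copy from the network
        but falling back to the last locally cached copy when offline."""
        try:
            with urllib.request.urlopen(FEED_URL, timeout=5) as resp:
                data = json.load(resp)
            with open(CACHE, "w") as f:        # refresh the local copy
                json.dump(data, f)
            return data
        except (OSError, ValueError):          # no network or bad response
            if os.path.exists(CACHE):
                with open(CACHE) as f:         # use the hard drive instead
                    return json.load(f)
            raise RuntimeError("no cached data available offline")

To the person using it, the application behaves the same either way; whether the answer came from a server or from the disk is an implementation detail.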
Seven years later, his brainchild is about to come into focus on millions of
PCs. On Monday, Mr. Lynch, who was recently named the chief technology officer
at Adobe Systems, which bought Macromedia in 2005, will release the official
version of AIR, a software development system that will power potentially tens
of thousands of applications that merge the Internet and the PC, as well as blur
the distinctions between PCs and new computing devices like smartphones.
Adobe sees AIR as a major advance that builds on its Flash multimedia software.
Flash is the engine behind Web animations, e-commerce sites and many streaming
videos. It is, the company says, the most ubiquitous software on earth, residing
on almost all Internet-connected personal computers.
But most people may never know AIR is there. Applications will look and run the
same whether the user is at his desk or his portable computer, and soon when
using a mobile device or at an Internet kiosk. Applications will increasingly be
built with routine access to all the Web’s information, and a user’s files will
be accessible whether at home or traveling.
AIR is intended to help software developers create applications that exist in
part on a user’s PC or smartphone and in part on servers reachable through the
Internet.
To computer users, the applications will look like any others on their device,
represented by an icon. The AIR applications can mimic the functions of a Web
browser but do not require a Web browser to run.
The first commercial release of AIR takes place on Monday, but dozens of
applications have been built around a test or beta version.
EBay offers an AIR-based application called eBay Desktop that gives its
customers the power to buy wherever they are. Adobe uses AIR for Buzzword, an
online word processing program. At Monday’s introduction event in San Francisco,
new hybrid applications from companies including Salesforce, FedEx, eBay,
Nickelodeon, Nasdaq, AOL and The New York Times Company will be demonstrated.
Like Adobe’s Flash software, AIR will be given away. The company makes its money
selling software development kits to programmers.
Mr. Lynch and a rapidly growing number of industry executives and technologists
believe that the model represents the future of computing.
Moreover, the move away from PC-based applications is likely to get a
significant jump start in the coming weeks when Intel introduces its low-cost
“Netbook” computer strategy, which is intended to unleash a new wave of
inexpensive wireless connected mobile computers.
The new machines will have a relatively small amount of solid state disk storage
capacity and will increasingly rely on data stored on Internet servers.
“There is a big cloud movement that is building an infrastructure that speaks
directly to this kind of software and experience,” said Sean M. Maloney, Intel’s
executive vice president.
Adobe faces stiff competition from a number of big and small companies with the
same idea. Many small developers like OpenLaszlo and Xcerion are creating
“Web-top” or “Web operating systems” intended to move applications and data off
the PC desktop and into the Internet through the Web browser.
Mozilla, the developer of the Firefox Web browser, has created a system known as
Prism. Sun Microsystems introduced JavaFX this year, which is also aimed at
blurring the Web-desktop line. Google is testing a system called Gears, which is
intended to allow some Web services to work on computers that are not connected
to the Internet.
Finally, there is Microsoft. It is pushing its competitor to Flash, called
Silverlight. Three years ago, Microsoft hired one of Mr. Lynch’s crucial
software developers at Macromedia, Brad Becker, to help create it. Mr. Becker
was a leading designer of the Flash programming language.
The blurring of Web and desktop applications and PC and phone applications is
further encouraged by the cellphone industry’s race to catch up with Apple’s
iPhone. The industry is focusing on smartphones, or what Sanjay K. Jha, the
chief operating officer of Qualcomm, calls “pocketable computing.”
“We need to deliver an experience that is like the PC desktop,” he said. “At the
same time, people are used to the Internet and you can’t shortchange them.”
Much software will have to be rewritten for the new devices, in what Mr. Lynch
said is the most significant change for the software industry since the
introduction in the 1980s of software that can be run through clicking icons
rather than typing in codes. This upheaval pits the world’s largest software
developer groups against one another in a battle for the new hybrid software
applications. Industry analysts say there are now about 1.2 billion
Internet-connected personal computers. Market researchers peg the number of
smartphones sold in 2007 at 123 million, but that market is growing rapidly.
“There is a proliferation of platforms,” Mr. Lynch said. “This is a battle for
the hearts and minds of people who are building things.”
The battle will largely pit Microsoft’s 2.2 million .Net software developers
against the more than one million Adobe Flash developers, who have until now
developed principally for the Web, as well as a vast number of other
Web-oriented designers who use open-source software development tools that are
referred to as AJAX.
Microsoft executives said they thought the company would have an advantage
because Silverlight has a more sophisticated security model. “Desktop
integration is a mixed blessing. There is potentially a gaping security hole,”
said Microsoft’s Mr. Becker. “We’ve learned at the school of hard knocks about
security.”
Microsoft’s competitors challenge its intent and assert that its goal is
retaining its desktop monopoly. “Microsoft is taking their desktop franchise and
trying to move that franchise to the Web,” said John Lilly, chief executive of
Mozilla. He faults the design of Silverlight for being an island that is not
truly integrated with the Internet.
“You get this rectangle in a Web browser and it can’t interact with the rest of
the Web,” he said.
He said Mozilla’s Prism offers a simple alternative to capitalize on the
explosion of creative software development taking place on the Internet. “There
are jillions of applications. A million more got launched today. The whole world
is collaborating on this.”
Up to now, it has been a low-level war between Microsoft and Adobe. Silverlight,
for instance, got high marks from developers for its ability to handle high
resolution video, but Adobe quickly upgraded Flash last year in response.
“We said, ‘Let’s put this in right now,’ ” Mr. Lynch said. With revenue last
year of $3.16 billion, Adobe is large enough to fight Microsoft.
Adobe, the maker of Photoshop, Acrobat and other software, also has a strong
reputation as a maker of tools for the creative class. “We’re one of the best
tool makers in the world,” said Mr. Lynch, who worked on software design at
MicroPro, the publishers of the Wordstar word processor, and at General Magic,
an ill-fated effort to create what could be called a predecessor to today’s
smartphones, before joining Macromedia.
“Adobe’s known for its designer tools, but they realize that development — for
the browser, for the desktop, and for devices such as cellphones — is a huge
growth market,” said Steve Weiss, executive editor at O’Reilly Media, a
technology publishing firm.
September 9, 2007
By THE ASSOCIATED PRESS
Filed at 12:45 a.m. ET
The New York Times
SAN FRANCISCO (AP) -- At the center of a black hole there lies a point called a
singularity where the laws of physics no longer make sense. In a similar way,
according to futurists gathered Saturday for a weekend conference, information
technology is hurtling toward a point where machines will become smarter than
their makers. If that happens, it will alter what it means to be human in ways
almost impossible to conceive, they say.
''The Singularity Summit: AI and the Future of Humanity'' brought together
hundreds of Silicon Valley techies and scientists to imagine a future of
self-programming computers and brain implants that would allow humans to think
at speeds nearing today's microprocessors.
Artificial intelligence researchers at the summit warned that now is the time to
develop ethical guidelines for ensuring these advances help rather than harm.
''We and our world won't be us anymore,'' Rodney Brooks, a robotics professor at
the Massachusetts Institute of Technology, told the audience. When it comes to
computers, he said, ''who is us and who is them is going to become a different
sort of question.''
Eliezer Yudkowsky, co-founder of the Palo Alto-based Singularity Institute for
Artificial Intelligence, which organized the summit, researches the
development of so-called ''friendly artificial intelligence.'' His greatest
fear, he said, is that a brilliant inventor creates a self-improving but amoral
artificial intelligence that turns hostile.
The first use of the term ''singularity'' to describe this kind of fundamental
technological transformation is credited to Vernor Vinge, a California
mathematician and science-fiction author.
High-tech entrepreneur Ray Kurzweil raised the profile of the singularity
concept in his 2005 book ''The Singularity is Near,'' in which he argues that
the exponential pace of technological progress makes the emergence of
smarter-than-human intelligence the future's only logical outcome.
Kurzweil, director of the Singularity Institute, is so confident in his
predictions of the singularity that he has even set a date: 2029.
Most ''singularists'' feel they have strong evidence to support their claims,
citing the dramatic advances in computing technology that have already occurred
over the last 50 years.
In 1965, Intel co-founder Gordon Moore accurately predicted that the number of
transistors on a chip would double about every two years. By comparison,
according to Singularity Institute researchers, the entire evolution of modern
humans from primates has resulted in only a threefold increase in brain
capacity.
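The article does not spell out the arithmetic behind that contrast, but it is easy to make concrete: doubling every two years over roughly 50 years compounds to about 2 to the 25th power, a gain of more than 33 million times, set against the threefold gain cited for brain capacity. The calculation below assumes only the figures already given above.

    # Rough arithmetic behind the doubling claim; the 50-year span and
    # two-year doubling period are taken from the article.
    years = 50
    doubling_period = 2
    growth = 2 ** (years / doubling_period)
    print(f"{growth:,.0f}x")  # about 33,554,432x, versus a ~3x gain in brain capacity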
With advances in biotechnology and information technology, they say, there's no
scientific reason that human thinking couldn't be pushed to speeds up to a
million times faster.
Some critics have mocked singularists for their obsession with
''techno-salvation'' and ''techno-holocaust'' -- or what some wags have called
the coming ''nerdocalypse.'' Their predictions are grounded as much in science
fiction as science, the detractors claim, and may never come to pass.
But advocates argue it would be irresponsible to ignore the possibility of dire
outcomes.
''Technology is heading here. It will predictably get to the point of making
artificial intelligence,'' Yudkowsky said. ''The mere fact that you cannot
predict exactly when it will happen down to the day is no excuse for closing
your eyes and refusing to think about it.''
------
On the Web:
The Singularity Institute for Artificial Intelligence,
www.singinst.org
RALEIGH, N.C. (AP) -- Red Hat Inc. has unveiled the latest version of its Linux
operating system as the open-source software company continues to combat
Microsoft's market-dominating Windows platform.
Developers for the Raleigh-based company touted Red Hat Enterprise Linux 5 as
more flexible and more manageable than its prior versions, and said they worked
for two years on the product.
''Our customers are an integral part of the development process,'' said Paul
Cormier, Red Hat's executive vice president for engineering, echoing the
open-source tenet that users be allowed to view and edit the software's code.
Resoundingly, Cormier said, customers wanted less complexity.
The new operating system supports ''virtualization,'' which Red Hat said will
help companies consolidate their technology workload onto one server -- saving
energy, space and money.
''Customers have figured out that they've got rooms full of racks and servers,''
said Nick Carr, the marketing director for the operating system. ''They're
taking up heat and power and space, but they're only 15 percent loaded. They
want to know how they can use what they have more efficiently.''
For desktop computers, Red Hat touted its advances in security to protect
systems from external and internal attacks.
Redmond, Wash.-based Microsoft, which recently launched its long-awaited Windows
Vista operating system, still dominates the software market. Red Hat says Linux
can be found in the majority of Fortune 500 companies, where savvy tech
departments have switched to Linux to cut down on costs.
Along with the new Linux product, Red Hat launched several new service programs
to help companies migrate their data centers to Linux and to help customers get
support for a variety of different open-source programs.
Red Hat's business model is based around service. Unlike Microsoft's proprietary
software, Red Hat delivers its products for free but makes money by selling
subscription packages for service.
Shares of Red Hat fell 19 cents Wednesday to close at $22.52 on the New York
Stock Exchange.
REDMOND, Wash., Oct. 5 — On a whiteboard in a windowless Microsoft conference
room here, an elegant curve drawn by a software-testing engineer captures both
five years of frustration and more recent progress.
The principle behind the curve — that 80 percent of the consequences come from
20 percent of the causes — is rooted in a 19th-century observation about the
distribution of wealth. But it also illustrates the challenge for the builders
of the next generation of Windows and Office, the world’s largest-selling
software packages.
As they scramble to get the programs to users by the end of the year, the
equation is a simple one: making software reliable for most personal computer
users is relatively easy; it is another matter, in a PC universe with tens of
thousands of peripherals and software applications, to defeat the remaining bugs
that cause significant problems for some users.
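A curve of that shape can be reproduced with a toy calculation: tally crash reports by bug signature and see how few distinct bugs account for the bulk of the crashes. The bug names, the report count and the Zipf-like distribution in the sketch below are invented; the exact split it prints depends on those assumptions, not on Microsoft's data.

    # Toy illustration of the whiteboard curve: a small share of bugs
    # accounts for a large share of crashes. All inputs are simulated.
    from collections import Counter
    import random

    random.seed(0)
    bugs = [f"bug_{i}" for i in range(100)]
    weights = [1.0 / (i + 1) for i in range(100)]  # early bugs far more common
    reports = random.choices(bugs, weights=weights, k=10_000)

    counts = Counter(reports).most_common()  # bugs ranked by crash count
    total = sum(c for _, c in counts)

    covered = 0
    for rank, (bug, count) in enumerate(counts, start=1):
        covered += count
        if covered / total >= 0.8:
            print(f"top {rank} of {len(counts)} bugs ({rank / len(counts):.0%}) "
                  f"account for 80% of crashes")
            break

Fixing the bugs at the head of such a ranking helps most users quickly; the long, thin tail is where the remaining engineering effort goes.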
The effort to overhaul the Windows operating system, originally code-named
Longhorn and since renamed Vista, was meant to offer a transformation to a new
software foundation. But several ambitious initiatives failed to materialize in
time, and the project started over from scratch three years ago. The result is
more an evolutionary shift, focusing on visual modernization and ease of use.
Still, the company is within a month of completing work on new versions of both
Windows and Office, having apparently overcome technical hurdles that as
recently as August seemed to signal a quagmire.
“It looked bleak; it was a slog, but in the end this was a technical problem,
and there was a turning point,” said Bharat Shyam, 37, a computer scientist who
is director of Windows program management. “We’ve confounded the analysts and
the press.”
As October arrived, a vote of confidence came from Wall Street when a Goldman
Sachs analyst, Richard G. Sherlund, wrote that he expected the product to be
introduced on time. “The Vista development organization has made rapid progress
delivering improvements to Vista’s performance, reliability, and compatibility,”
he said.
[On Friday, the company released what it said would be the final test version of
Vista, named Release Candidate 2. If the response from testers is positive, the
software will go into production by the end of the month.]
The debugging process has been urgent, with Microsoft scheduled to introduce
Windows Vista and Office 2007 to corporate customers by the end of the year, and
to home users early next year.
This coordinated introduction is a multibillion-dollar proposition for
Microsoft, which has Windows running on some 845 million computers worldwide and
Office on more than 450 million, according to the market research firm Gartner.
Indeed, it was the vast scale of the Windows testing program that saved the
software development projects. Over the summer, the company began an
extraordinary bug-tracking effort, abetted by volunteers and corporate partners
who ran free copies of both Windows and Office designed to send data detailing
each crash back to Microsoft computers.
The Office package, for example, has been tested by more than 3.5 million users;
last month alone, more than 700,000 PC’s were running the software, generating
more than 46 million separate work sessions. At Microsoft, 53,000 employee
computers are running test versions.
Vista has also been tested extensively. More than half a million computer users
have installed Vista test software, and 450,000 of the systems have sent crash
data back to Microsoft.
Such data supplements the company’s own testing in a center for Office referred
to as the Big Button Room, for the array of switches, lights and other apparatus
that fill the space. (A similar Vista room has a less interesting name — Windows
Test Technologies.)
This is where special software automatically exercises programs rapidly while
looking for errors.
The testing effort for Windows Vista has been led by Mario Garzia, Microsoft’s
director of Windows reliability. A former Bell Labs software engineer, Mr.
Garzia says the complexity of the Vista and Office effort dwarfs anything he
undertook for the nation’s telephone network.
“Everything is easy if you do it for a limited number of things,” he said. “When
I was at Bell Labs, the problems were complex, but nothing compared to this.”
The test data from the second beta release of Vista alone generated 5.5
petabytes of information — the equivalent of the storage capacity of 690,000
home PC’s.
The resulting complexity can be seen in the dance that has gone on in recent
months between Microsoft’s designers and its partners, who have been tailoring
software and hardware to work with Vista.
On Sept. 1, for example, Microsoft released a version of Vista called Release
Candidate 1 to a large group of outside testers, hoping to take advantage of
their free time over the Labor Day weekend.
Immediately, Mr. Garzia recalled, a wave of crash data fed back to Microsoft
disclosed a newly introduced bug that had been created by incompatibility with a
software module (referred to as a device driver) written by a partner company.
That company was alerted to the problem, and a remedy was transmitted directly
to the testers’ computers over the Internet within four days — a vast
improvement in the gap between detection and repair, he said.
Despite the impending commercial arrival of the two software projects — which
between them have involved the labors of more than 5,000 programmers and testers
here — there is still uncertainty in the industry about how long it will take
for Vista in particular to gain acceptance.
“We’ve been impressed with the progress, and they deserve a lot of credit,” said
David Smith, a Gartner vice president, but that does not mean that Windows Vista
will soon be in standard workplace use. Its deployment on a significant scale
will not begin at most companies until 2008, Mr. Smith said.
Microsoft executives contend that such calculations are overly conservative, and
they have been making the case that the use of Vista could pay for itself in
saved labor and related costs in less than a year.
A more fundamental question for the industry is whether Vista will represent a
new era for computing or be the last great push of the current epoch.
While Microsoft’s co-founder and chairman, Bill Gates, was able to turn his
company abruptly in the mid-1990’s to respond to the challenge posed by
Netscape, Microsoft has proved less effective in blunting a similar challenge to
its dominance from Google.
Moreover, the rise of Google and other companies moving toward Internet-based
software development raises doubts about the value of giant efforts like Windows
and Office, which can take more than five years.
Eric E. Schmidt, chief executive of Google, has said he believes that the rise
of advertising-supported Web services will increasingly undercut Microsoft’s
software development model — using a proprietary software development system and
selling shrink-wrapped applications.
In an internal company memo titled “Don’t Bet Against the Internet,” he wrote
recently, “Almost no pure PC software companies are left (all is on the
Internet), most proprietary standards (I’m thinking of Exchange e-mail and file
systems protocols from Microsoft) are under attack from open protocols gaining
share rapidly on the Internet.”
The larger struggle has had little influence on Ben Canning, who began his
career at Microsoft testing software nine years ago after getting a graduate
degree in philosophy from Reed College.
Rather, his days are consumed with working his way down that whiteboard curve.
Mr. Canning acknowledges that his degree prepared him for little beyond teaching
philosophy — with the possible exception of finding and killing bugs in
software, because philosophers are trained to analyze and solve particularly
hard logical problems. For the last few months, his mind has been focused on the
hard problems at the end of the curve.
“If you look at the mean time to crash for most Office customers, it’s very
high,” he said. “There is a small minority that crash all the time, and they
hate us, and we want to help.”