Friday, August 28, 2009

The Ten-Year Century - The Pace of Change Accelerates

In computer jargon, when your hard drive becomes overwhelmed with too much information, it is said to be fragmented—or “fragged.”

Today, the rapid and unsettling pace of change has left us all more than a little, well, fragged.
We watch 60-second television commercials that have been sped up to fit into 30-second spots, even as we multitask our way through emails, text messages and tweets.

We assume that these small time compressions are part of the price of modern living. But it is more profound than that.
Changes that used to take generations—economic cycles, cultural shifts, mass migrations, changes in the structures of families and institutions—now unfurl in a span of years.

Since 2000, we have experienced three economic bubbles (dot-com, real estate, and credit), three market crashes, a devastating terrorist attack, two wars and a global influenza pandemic.
Game-changing consumer products and services (iPod, smart phones, YouTube, Twitter, blogs) that historically might have appeared once every five or more years roll out within months. In what seems like the blink of an eye, one giant industry (recorded music) has been utterly transformed, another (the 250-year-old newspaper business) is facing oblivion, and a half-dozen more (magazines, network television, book publishing) are apparently headed to meet one of those two fates.

Call it the advent of “the 10-year century”: a fast shuffle in which events that once took place in the course of a lifetime are compressed into the duration of a childhood. To understand how this is happening—and what it will take to cope—take a look at the underlying forces:

• Faster computation. “Moore’s Law”—the doubling of semiconductor chip performance every 18-24 months first observed by Intel co-founder Gordon E. Moore—has become the metronome of modern times. Yet the extraordinary changes we have seen since the invention of the transistor in 1947—all the way to broadband Internet, smart phones, iPods and supercomputers—are only a prelude to the emerging world of single-molecule silicon gates, nanotechnology and advanced bioinformatics (which uses information processing in molecular biology).


• Quicker access. “Metcalfe’s Law,” named for electrical engineer Robert Metcalfe, says that a network’s value grows in proportion to the square of its number of users—so each new user adds more value than the last. The biggest network in the world is the Internet; and thanks to the advent of cheap, Web-enabled cellphones, the Internet is about to see its “jump point”: the arrival of two billion new users from the developing world, nearly tripling its size. Now consider what may happen with faster computation speeds and global broadband wireless coverage, which means full access from anywhere on the planet, anytime. What counts here is not the sheer size of the Internet, or the richness of the experience, but the life-altering access to any information we need, delivered with unprecedented sophistication, almost instantly.

• Shorter decision cycles. Think about what quicker access to vast caches of information, available instantly almost anywhere, to be crunched and analyzed using ubiquitous and powerful processors—all with the knowledge that competitors are doing the same thing—means for business enterprise. The emerging environment is not one for reflection, or “letting things play out for a while.” It means bold, impetuous moves, all while betting that the information is not just complete, but accurate.
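The two laws above reduce to simple arithmetic. A minimal sketch of that arithmetic (the two-year doubling period and the round user counts are illustrative assumptions, not figures from this article):

```python
# Stylized versions of Moore's Law and Metcalfe's Law, for illustration only.

def moore_factor(years, doubling_period_years=2.0):
    """Performance growth over a span, assuming a fixed doubling period."""
    return 2 ** (years / doubling_period_years)

def metcalfe_value(users):
    """Metcalfe's Law: network value scales with the square of the user count."""
    return users ** 2

# Two decades at a two-year doubling period is ten doublings: a 1,024x gain.
print(moore_factor(20))  # 1024.0

# The "jump point": tripling the user base multiplies network value ninefold.
before = metcalfe_value(1_000_000_000)  # roughly one billion users
after = metcalfe_value(3_000_000_000)   # roughly three billion users
print(after / before)  # 9.0
```

This is why the arrival of two billion new users matters more than a simple head count suggests: under Metcalfe's square law, nearly tripling the network's size multiplies its value by roughly nine.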


True, when a computer chip goes through as many computations in a single second as there are human heartbeats in 10 lifetimes, a 10-year century seems positively pokey. But we humans have a slower metabolism, which will make this rapid fire of events ever more difficult to comprehend, much less manage.


More disturbing, we have few safeguards—software shut-off switches, virus protections, firewalls, etc.—in place to check or repair our new global über-system when it misfires or goes completely off the rails. When felons with lousy credit histories can sign up for inflated mortgages in a matter of seconds over a computer; when nervous shareholders can panic over a fake blog and dump millions of shares online in a matter of minutes; and when an Internet rumor can provoke a virtual “run” on a bank, then bubbles and cascades and crashes become inevitable.


So how do we control this increasingly out-of-control, interlinked world? Venture capitalist Bill Davidow has proposed the equivalent of online “surge protectors” to stop run-ups and panics on the Internet, the same way stock markets stop runaway trading. At the least we need better analytics to predict where change is taking us next.
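Davidow’s online “surge protector” works like the circuit breakers that stock markets use to halt runaway trading. A toy sketch of that mechanism, with the threshold and time window purely invented for illustration:

```python
# Toy "surge protector": halt activity when the rate of events inside a
# sliding time window exceeds a threshold, the way a market circuit breaker
# pauses trading after an outsized move. All thresholds here are invented.
from collections import deque

class SurgeProtector:
    def __init__(self, max_events, window_seconds):
        self.max_events = max_events
        self.window = window_seconds
        self.events = deque()   # timestamps of recent events
        self.tripped = False

    def record(self, timestamp):
        """Record one event; returns False once the breaker has tripped."""
        if self.tripped:
            return False
        self.events.append(timestamp)
        # Drop events that have aged out of the sliding window.
        while self.events and timestamp - self.events[0] > self.window:
            self.events.popleft()
        if len(self.events) > self.max_events:
            self.tripped = True  # halt until some reset mechanism intervenes
            return False
        return True

# A burst of four events inside one second trips the breaker on the fourth;
# everything after that is refused until a reset.
breaker = SurgeProtector(max_events=3, window_seconds=1.0)
print([breaker.record(t) for t in [0.0, 0.1, 0.2, 0.3, 5.0]])
# [True, True, True, False, False]
```

The design choice mirrors the essay’s point: the breaker does not try to judge whether a surge is rational, it simply buys humans time to look.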
Most importantly, trust will become the critical factor. Without the luxury of time, trust will be the new currency of our times, whether in news sources, economic systems, political figures, even spiritual leaders. As change accelerates, it will remain one true constant.


By Tom Hayes and Michael S. Malone, WSJ


Tuesday, August 18, 2009

Steve Jobs: Inside the Apple Culture


We need more productive narcissists like Steve Jobs.

In June, The Wall Street Journal revealed that Steve Jobs, chief executive of Apple Inc, had had a liver transplant at the Methodist University Hospital in Memphis, Tennessee, in April. He’d taken a house in Memphis to be nearby if a liver became available. He had chosen Tennessee because of its short transplant waiting list. But, even there, to get to the top of the list means you have to be close to death. He was, the hospital confirmed, “the sickest patient on the waiting list at the time”.

Philip Elmer-DeWitt, author of the Apple 2.0 blog at CNNmoney.com, e-mails me the grim details of his operation: “He’s lost his gall-bladder, part of his stomach, part of his pancreas, the upper end of his small intestine and now has someone else’s liver, which probably means he’ll be on immunosuppressant drugs for the rest of his life. That can’t be fun.”

On January 5, Jobs had written to the “Apple Community” explaining that he was ill and taking six months off work. “Fortunately, after further testing,” he wrote, “my doctors think they have found the cause — a hormone imbalance that has been ‘robbing’ me of the proteins my body needs to be healthy. Sophisticated blood tests have confirmed this diagnosis.”

Apple Inc is worth around $140 billion. But is it worth anything without Jobs? It is a company formed around his personality and inspiration. It is also the most watched, envied, admired and adored company in the world. So how, you may wonder, was it possible for Jobs to put out such a statement four months before a liver transplant? And how was it possible for consumer capitalism’s greatest hero to pull off the Memphis Liver Caper in absolute secrecy?

The answer is that, along with computers, iPhones and iPods, secrecy is one of Apple’s signature products. A cult of corporate omerta — the mafia code of silence — is ruthlessly enforced, with employees sacked for leaks and careless talk. Executives feed deliberate misinformation into one part of the company so that any leak can be traced back to its source. Workers on sensitive projects have to pass through many layers of security. Once at their desks or benches, they are monitored by cameras and they must cover up devices with black cloaks and turn on red warning lights when they are uncovered. “The secrecy is beyond fastidious and is in fact insultingly petty and political,” says one employee on the anonymous corporate reporting site Glassdoor.com, “and often is an impediment to actually getting one’s work done.”

But employees are one thing; shareholders are another. Should Jobs (who, as far as the world is concerned, is Apple) have been allowed to conceal the seriousness of his illness? Warren Buffett, the greatest investor alive, doesn’t think so. “Whether [Steve Jobs] is facing serious surgery or not is a material fact.”

Some say another sign that Apple omerta has gone too far was the death of Sun Danyong, a 25-year-old employee of Foxconn, a Chinese manufacturer of Apple machines. He was given 16 prototypes of new iPhones. One disappeared. Facts beyond that get hazy, but it is clear that Sun committed suicide by jumping from a 12th-storey apartment. Internet babble says he killed himself because of the vanished prototype and, therefore, because of Apple’s obsessive secrecy.

Then there is the recent case of the exploding iPod in Liverpool. Ellie Stanborough’s iPod touch went up in a puff of smoke. Her father, Ken, complained, but Apple said he could only have a refund if he promised not to talk. He refused. “They’re putting a restriction on myself, my daughter and Ellie’s mum not to say anything to anyone,” said Ken. “If we inadvertently did say anything … they could take litigation against us. I thought that was absolutely appalling.” This isn’t the freewheeling, good-times California lifestyle image the company likes to project. It is, rather, that of a much tougher and paranoid operation.

Yet secrecy is Apple’s core marketing tool. Jobs’s specialities are 90-minute to two-hour-long presentations to prayer meetings of the faithful. These always end with the words “and one last thing”, at which point he unveils the latest gizmo to geek hallelujahs. Rumours suggest he is, in spite of the transplant, about to do it again in the next few weeks. It will be a dual sensation: the sight of a walking, talking Jobs and of a new tablet computer, a sort of giant iPhone, which, some say, will yet again change the world. Excitement intensified early this month when an unnamed “analyst” was reported as having actually held the tablet. He said it was “better than your average movie experience”.

The secrecy is all about preserving the magic of each new product. Apple hates personality stuff and press intrusion. “We want to discourage profiles,” an Apple PR tells me stiffly, apparently unaware she is waving a sackful of red rags at a herd of bulls. Another PR rings the editor of this magazine to try to halt publication of this piece.

Jobs doesn’t like being questioned. Despite his attempts to find serenity through Zen Buddhism, the agony of interviews can get to him. “Imagine what he’d be like,” said a reporter after emerging from a Jobs drubbing, “if he hadn’t studied Zen.”

“He’s a tough, prickly interview,” says Elmer-DeWitt, “and he’s always selling. Hard.” In fact, any interview situation with Jobs can turn nasty. One excessively strait-laced candidate for a job at Apple bored him so much that he sprang questions like “How old were you when you lost your virginity?” and “How many times have you taken LSD?” on the poor sap. (Jobs has said that taking LSD was one of the most important things in his own life.) Then he lapsed into a chant of “Gobble, gobble, gobble, gobble”. “I guess I’m not the right guy for this job,” said the candidate finally.

The transplant was not Jobs’s first near-death experience. In 2004 he was found to have pancreatic cancer. This usually means certain death. He was told to go home and put his affairs in order. Then he got a call. His tumour was rare and operable. He returned to work, and in 2007 launched the iPhone, the latest of what he calls Apple’s “insanely great” products. The iPhone joined the Mac computer, the iPod and the films of Pixar Animation Studios, all vastly successful, influential products brought to market by Jobs. Well, “products” is perhaps a bit weak: “agents of global transformation” might be better. “My God!” says Andrea Cunningham, a PR hired and fired four times by Jobs. “He’s single-handedly changed the world, like, at least three times!”

But, even as the faithful queued overnight to get their hands on the first iPhones, new rumours were circulating about his health. These were given almost comical credence when his obituary was accidentally published by the Bloomberg news service last August. Then, in January this year, Jobs made his announcement. Then came news of the transplant. This indicated the cancer had spread to his liver. The signs are not good. On the other hand, he seems to be up and about. He has gone back to work, and Elmer-DeWitt has reported that he’s been seen at a Coldplay concert. Cunningham has seen him going into the Fraiche yoghurt cafe in Palo Alto near her office. “He walks by occasionally. He looks pretty good, actually, and they do make great yoghurt.”

The drama of it all is intense, important — not least for Apple shareholders — and strangely thrilling. Jobs, in business, has died before and risen from the grave. For the past 12 years he has been the risen God of Silicon Valley, the Sun King of Palo Alto. Yet it won’t be until squadrons of pigs are flying over the frozen wastes of hell that he will appear on Oprah Winfrey or Larry King to tell the world how he feels about all this.

Jobs can be a cold, hard boss. In fact, judged simply as an office politician, he can seem pretty hopeless. He blew it in 1985. Having launched the Macintosh, he was driven out of Apple by John Sculley, the CEO he had lured from Pepsi-Cola with the hubristic and diet-conscious words “Do you want to spend the rest of your life selling sugared water, or do you want a chance to change the world?” Many at Apple were happy to see Jobs go. They would be sad soon afterwards.

That’s Bad Steve. But then there’s Good Steve. Abused employees, if they survive, often find themselves praised to the heavens. They ride on what is known as the “hero-asshole rollercoaster” and they live inside the “reality distortion field”, Jobs’s uncanny ability to convince people that the utterly impossible is, in fact, entirely possible.

Good Steve is the only businessman to be accorded rock-god status by millions. Apple nuts queue overnight to hear him speak. They buy Macs, iPods and iPhones not just because they want them, but also because they want to support this company as if it were some kind of charity or cult. The nuts aren’t wrong for one crucial reason. Though personally worth $3.4 billion, Jobs is one of them, the great consumer of his own products.

“Jobs is not an engineer,” says the writer Dan Lyons, “he can’t really design anything and he doesn’t know anything about circuits. But he is the ultimate end-user, the guy who is on our side.” Lyons created the Secret Diary of Steve Jobs blog with a motto that captures the strange Jobs mix of geek fantasy and power: “I will restore your sense of childlike wonder. There is nothing you can do to stop me.” And so, amid the secrecy and geekery, Good and Bad Steves blend to form one, gigantic, mesmerising personality. “He would have made,” said Jef Raskin, the brain behind the first Mac, “an excellent king of France.”

To call Jobs a control freak is to call rain wet. When building the first Mac, engineers wanted to include “expansion slots” into which people could slide kit to customise their machines. Jobs resisted. The machine was his and it had to be closed and perfect. And he’s still at it: he has made it impossible for buyers even to change the batteries on his latest laptops.

But he has eased off on this with the iPhone. He has allowed outside companies to develop applications — “apps” — that can be downloaded to the phone. These range from Grindr — a gay cruising tool that helps you find nearby gays — to Shakespeare, which stores all the plays on your phone. The success of apps has stunned Apple. By the end of the year there will be 100,000 apps available. There have already been 1.5 billion downloads. Some have speculated that the app cult will supplant the internet. Certainly the new tablet Mac will be based on this phenomenon.

Jobs also has a bizarre obsession with the insides of his machines. He drives his engineers mad by insisting that insides look beautiful, even though his customers won’t see this. This code of impenetrable perfection even extends to Jobs’s view of his own body. He has always been a fussy eater, and health problems have intensified this. His favourite dish was once said to be shredded raw carrots without dressing.

Jobs is, in the words of the psychiatrist and scholar of leadership Michael Maccoby, “a productive narcissist”. To Jobs, the world is an epiphenomenon, a side effect of the existence of Steve. Or rather, it is a pyramid with Jobs at the top, a few bright people just beneath him, and then the rest of us — the “bozos”. The customer bozo is not, to him, always right. In the early days it was said the Apple marketing department consisted of Jobs looking in his mirror and asking himself what he wanted. His customer-relations motto is from Henry Ford: “If I’d asked my customers what they wanted, they’d have said a faster horse.” In a world driven by technology, only the technocrats know what we want and need.

Silicon Valley, south of San Francisco, was once simply the Santa Clara Valley, a land of orchards. Now it’s a land of smart, rich people who eat breakfast daily suffused with the conviction that today is the day they will make billions and change the world. It was here, in the town of Mountain View, that Jobs spent his childhood. He was born to Joanne Schieble and Abdulfattah Jandali in San Francisco. They were young and unmarried and, as a result, he was adopted by Paul and Clara Jobs. They seem to have provided a good home, but everybody is convinced that the mere fact of the adoption did much to form Jobs’s character. Michael Maccoby thinks the key might be the idea of the absent or lost father.

“The very striking thing about productive narcissists, particularly men, is that they grow up in families where there is an absent or weak father figure. You can see this in narcissistic presidents like Obama, Clinton, Reagan and Nixon. They struggle with their identity and view of the world. So they tend to come up with a very original view of things and are then driven to find followers.”

Later, Jobs dropped out of college. Again, this seems to have been crucial. Alan Deutschman, author of The Second Coming of Steve Jobs, says his lack of a proper education in a world of highly educated people left him permanently insecure, especially in matters of taste. “I think his choice of a minimalist aesthetic comes from his fear of making the wrong aesthetic choice. He was someone who had great wealth from his early twenties. He was worried about not being seen as a brilliant sophisticate, so he had gurus to help him. There was this anxiety about being judged, combined with a natural instinct about the tremendous importance of design.”

There was another sense in which Jobs seemed to miss out. Deutschman says he “lagged the zeitgeist”. He was too young — 12 in 1967 — to enjoy the full hippie, summer-of-love experience. Yet he seemed to want to catch up, travelling, like the Beatles, to India to find enlightenment, and returning, unlike the Beatles, a Buddhist.

His first business persona was that of counter-cultural guerrilla, a silicon Che Guevara. The Mac was launched with the most famous TV ad ever made, a tour de force of ad-art directed by Ridley Scott. It portrayed IBM as George Orwell’s Big Brother and Apple as a blonde, athletic Californian-type freedom fighter, smashing Big Brother’s screen with a sledgehammer.

Finally, he even dated Joan Baez, the folk-singing goddess of the counterculture. Some said it was because she had been the lover of Bob Dylan, and Jobs is crazy about Bob. According to Deutschman’s book, he later said gracelessly: “I would have married Joan Baez but she was too old to have my children.”

Which brings me to the matter of Jobs and women. This has been a rocky road. When his first serious girlfriend, Chris-Ann, became pregnant, he refused to accept it was anything to do with him. Lisa, his daughter, was born in a commune in Oregon in 1978. They have since been reconciled. That would be that but for the fact that, in the early 1980s, Jobs rediscovered his biological parents. They had married and had a daughter, Mona Simpson, his sister. She was a highly regarded novelist, who in 1996 published A Regular Guy, about a driven, narcissistic superstar businessman and his relations with the daughter he had abandoned. At every turn, Jobs’s story seems to grow into fiction and then myth.

Jobs seems to go for the blonde, athletic Californian look of the girl in the Mac ad. It may be one more aspect of his pursuit of belonging in the pampered groves of the Valley. In 1991, at a Zen Buddhist ceremony, he married a woman — Laurene Powell — with precisely that look. They are still together and have three children.

His eviction from Apple in 1985 was a death and he did not go gently into that good night. One day he called Andrea Cunningham to the Jackling House to talk about his new company. She found him in the almost entirely unfurnished house haranguing journalists about the iniquities of his usurper, John Sculley. “He was pretty much ranting. I was quite shocked that someone of his abilities and intelligence and all of that would attempt what he was trying to attempt. It was just amazing.”

Then came the wilderness years. Apple lost its way, and by the mid-1990s it was on the verge of collapse. Its computers were dull and the Apple operating system was buggy and awful. I reluctantly abandoned them at this point. Jobs’s new company, NeXT, meanwhile, went nowhere. It made beautiful-looking computers for education. But they were expensive and impossible to sell.

In 1986 he bought — from the creator of Star Wars, George Lucas — a strange commune of brilliant men who were convinced that movies could be made on computers. It was called Pixar. They were inventing the technology as they went along. That, too, seemed to be going nowhere.

Saddled with these increasingly implausible projects, Jobs saw his own massive wealth begin to dwindle. His press coverage, adoring at the time of the Mac, became sceptical. But vengeance is his and he will repay. Pixar went into partnership with Disney to produce Toy Story, and Apple, crippled and loss-making, took over NeXT and brought Jobs back into the fold. Within months he was God again. Pixar grossed millions, then billions, and Apple brushed the dirt off its face and leapt out of the grave. First came the iMac, a toy-like, one-box desktop computer that can still be seen in groovy offices. Then in 2002 came the real payoff for the grim NeXT years. Mac OS X, the new operating system, was based on NeXT software. It was superb, infinitely better than Microsoft Windows and infinitely more beautiful. I, and millions of others, returned to Apple.

Jobs couldn’t hope to conquer Microsoft’s dominance in the market, but he could easily make it look desperately clunky. Silicon Che Guevara, having defeated IBM, returned to outcool Microsoft. But world domination was still to be had. He took it with the iPod in 2001 and the iPhone in 2007. The first stole almost the whole of the MP3-player market, and the second is doing the same to the mobile-phone market. Apple is now the consumer-electronics company by which all others are judged and found wanting.

Inevitably, with his health hanging by a thread, this raises the question: can they do without him? Of course they can, says Andy Hertzfeld, one of the original Mac-makers, who is now at Google. “It’s ludicrous,” he e-mails, “to think that Apple is a one-man company; there are hundreds if not thousands of exceptionally talented individuals who work there. Much of their post-Steve fate will depend on the leadership that eventually replaces him. The company disintegrated after Steve left in the mid-1980s; hopefully, they can do much better this time around.”

Others are not so sure Apple can do without his burning product perfectionism. “A lot of companies can do without that,” says Cunningham. “There’s probably a lot of business they can do with long-term incremental improvements to their products. But are they ever going to have another breakthrough product? I don’t know.”

“Apple will keep executing its current business plan,” says Philip Elmer-DeWitt, “which could go on for years. But it will be different in one key respect: with Jobs there was a guy at the beginning and end of every project who had the authority to say, ‘This sucks. Start over.’ Whoever replaces him may share his vision and job title, but he or she will not be the co-founder of Apple and won’t have the same authority.”

My own view is that a Jobsless Apple will seek a merger with Google. The two companies are rapidly converging, a fact that recently led to the resignation of the Apple director Eric Schmidt, the chairman and chief executive of Google. He had been on the Apple board for three years, and was forced out because of suspicions that links between the two companies could endanger competition. One other director of both companies remains: Arthur Levinson, former chief executive of Genentech. The key areas of convergence are, first, mobile phones. There is Apple’s iPhone and there is Google’s Android, not a phone in itself, but an operating system that can be used by other companies. Google also produces a web browser called Chrome, which competes with Apple’s Safari. And, most importantly, Google is working on a computer operating system, also called Chrome, which may well be a very serious competitor for Mac OS X. Apple’s iPhone “apps” also compete with many free Google applications. The point is that both companies are aiming to seize dominance of the world market from Microsoft, whose Windows still dominates world computing in spite of its failure to innovate. If Apple lost Jobs’s genius for products, Google’s innovation and Apple’s design and market sense would be a very good fit, although antitrust regulators might disagree.

Then there is the mighty, epic question of Jobs himself. Can the Valley do without him? Can we? Opinions of his career swing between the Bad Steve/Good Steve poles. Those who focus on the former think he could have done it all without the tantrums and brutality. Gifted people have been damaged horribly by his behaviour. Jobs took against Alvy Ray Smith at Pixar and cut him out of the company history. “He has failed many times,” says Smith, “but the press and the public overlook that in their rush to glorify him … Steve and I don’t like one another.” Deutschman’s book is a cool look at Bad Steve and asks the very good question once asked by a college friend of Jobs: “How much of an asshole do you have to be to be highly successful?” One Hollywood boss compared Jobs to Citizen Kane, adding: “I hope there’s a sled called Rosebud.”

“Rosebud” was Kane’s mysterious last word. It turns out to be the sledge he lost when a banker took him from his childhood home. The implication is that Jobs nurses a wound that cannot be healed. Such ambivalence infuriates those who focus on Good Steve. “I think Deutschman’s book was a hatchet job,” says Hertzfeld. “Steve is a complicated individual. Like many of us, the good and the bad aspects of his personality are inextricably linked.”

“I think we need productive narcissists like Jobs,” says Maccoby, “but there are always quirks. You may get an Abraham Lincoln or you may get an Adolf Hitler; you may get a Winston Churchill or you may get a Joseph Stalin.”

The strength and relative stability of the company make it clear that Jobs learnt something from his first fall and his second coming. He learnt, says Maccoby, that a narcissistic personality like his, with extremely dodgy people skills, needs a more consensual character to keep him in check. He found one in Tim Cook, Apple’s comparatively serene chief operating officer, who is the likeliest successor. He’s not Jobs but he’s a rarity in the Valley — a “safe pair of hands”.

All agree that Jobs made Apple into more than a company. To the believers it is a great cause; to the sceptics it is more sinister. “Apple is less of a company and more like a cult,” says Dan Lyons. “If the Church of Scientology went into consumer electronics it would be Apple.” The status of the company is beyond argument. It is watched by bloggers who trawl through its patent applications and analyse its every move. “I swim through Apple newsfeeds like a whale swims through krill,” says Elmer-DeWitt. Yet the company continues to surprise and amaze. I don’t want Jobs to die because my computers and iPhone are, indeed, “insanely great” compared with the dismal competition but, more importantly, because he is an extraordinary figure. I don’t use the word “genius” about businesspeople, but in Steve Jobs’s case I’m prepared to make an exception.

Geniuses tend to see their own lives as universally significant, embodying the great currents of their age. They may not know they are doing this, but it is evident in their work. Everything about Jobs tells me this is how he sees his life, as the distillation of the high-tech revolution and of affluent, aspirational consumerism. He is, as Dan Lyons says, “the ultimate end-user”, both consumer and maker. He is one with the bozos and their gizmos. That’s who he is.

Surviving his health crisis may require more than a transplant in Memphis. Perhaps he will need a free download or upgrade from the other god that watches over Silicon Valley. If he gets it, then he can knock down that ghostly house in Woodside and build the minimalist mansion that will set his consuming mind to rest. He just wants a home really, like the rest of us bozos, because home is everybody’s Rosebud.

From Sunday Times U.K. 8-16-09

Friday, August 14, 2009

Can We Say Oligopoly?

Major Internet carriers are shunning broadband stimulus funds because the money comes with tighter rules. The Obama administration made a national priority of spreading high-speed Internet access to every American home and offered stimulus money to help companies pay for it, but the biggest network operators are staying away from the program.

As the Aug. 20 deadline nears to apply for $4.7 billion in broadband grants, AT&T, Verizon and Comcast are unlikely to go for the stimulus money, sources close to the companies said.

Their reasons are varied. All three say they are flush with cash, enough to upgrade and expand their broadband networks on their own. Some say taking money could draw unwanted scrutiny of business practices and compensation, as seen with automakers and banks that have taken government bailouts. And privately, some companies are griping about conditions attached to the money, including a net-neutrality rule that they say would prevent them from managing traffic on their networks in the way they want.

"We are concerned that some new mandates seem to go well beyond current laws and [Federal Communications Commission] rules, and may lead to the kind of continuing uncertainty and delay that is antithetical to the president's primary goals of economic stimulus and job creation," said Walter B. McCormick Jr., president of USTelecom, a trade group that represents telecoms including AT&T and Verizon.

Yet those firms might be the best positioned to achieve the goal of spreading Internet access to underserved areas, some experts say.

"If you want to get broadband out, you have to do it with [those] who brought you to the dance in the first place, and in this case it is the incumbent cable and telephone carriers who have 85 percent of lines in the country," said Robert Atkinson, president of the Information Technology and Innovation Foundation, a Washington tech policy think tank. "This is not basket weaving. This is really complex and intensive technical stuff that takes a fair amount of sophistication and scale to be able to do right and to continue to upgrade."

Obama has pushed for universal access to broadband since his presidential campaign, saying it would underpin the country's economic future. The stimulus funds target homes and businesses in the hinterlands that have largely been overlooked by broadband providers because of the hefty costs to lay down fiber-optic and other broadband pipes to small communities.

At the same time, the government has promised more scrutiny of industry practices that seem to limit consumer access to services, such as Comcast blocking the peer-to-peer file sharing service BitTorrent in 2007 and Apple's recent decision to block Google's voice service and the free Internet calling service Skype on the iPhone.

Those efforts have alarmed the major carriers. Specifically, some of the biggest firms fear that a clause in the stimulus plan that says recipients of the grants cannot "favor any lawful Internet applications and content over others" -- the concept known as net neutrality -- could lead to more rules down the road.

This condition goes beyond guidelines at the FCC that have been criticized by consumer advocacy groups as too vague. Carriers have pushed to keep current rules in place and see the condition on the stimulus grants as a potential precursor for additional rules at the FCC on how carriers can manage content over the Web.

The companies paint dire scenarios where new rules would lead to networks getting clogged with spam and too much video content, slowing down service for all users.

"It's not cost-effective for the big network operators to play in rural [markets] in the first place, and if they take federal money that comes with all these strings attached to it, they are opening themselves up to being regulated even further," said Roger Entner, head of communications research for Nielsen IAG.

McCormick said net-neutrality conditions on the grants are fuzzy and may give network operators pause before investing in long and expensive projects that could end up in a tangle of technical and legal hang-ups over how the firms oversee their networks.

"Clearly, it's causing potential applicants to reflect upon the uncertainties," McCormick said.

Verizon said it decided not to apply before conditions were announced. Comcast, which mainly serves urban and suburban areas, said it would also abstain. AT&T said that it likely would not apply but that it is open to partnership with state and local governments who win the grants.

Corporate officials have also said it would look bad for a company like AT&T or Comcast to come to the government with hat in hand when they are among the few companies in the economy flush with billions of dollars in cash reserves.

One official at a large network operator said on the condition of anonymity that once taken, government funds incite a "mob mentality" that could preclude sponsoring golf tournaments or giving executives bonuses, for fear of political backlash.

Some public advocates and analysts say the carriers never had a compelling reason to seek the grants.

"They weren't going to apply," said Ben Scott, head of policy at public advocacy group Free Press. "They are using this as an opportunity to grandstand against net neutrality."

Rebecca Arbogast, head of tech-policy research at Stifel Nicolaus, notes that the biggest carriers would be less inclined to deploy networks in rural areas because there is not enough demand to justify the ongoing financial investments. She said the companies should have expected stronger net-neutrality conditions because such conditions were mandated by Congress in the stimulus act.

"With a few exceptions, the net-neutrality provisions were not a great departure from what I think was already out there and is consistent with the path that most recognize we were already headed down," Arbogast said.

The Commerce and Agriculture departments, which are handing out a total of $7.2 billion in broadband stimulus grants through 2011, say the plan to bring high-speed Internet to the hinterlands and urban poor can be accomplished without the big carriers. Companies like wireless broadband provider Clearwire and small cable and telecom operators may introduce more competition into the industry by using the funds to build networks that could compete with AT&T, Verizon and Comcast, analysts and government officials say.

"I think if the big carriers want to participate and play by the rules, great. If not, I'm not that concerned," said Mark Seifert, a senior adviser for the National Telecommunications and Information Administration, which is overseeing grants for the Commerce Department.

Seifert said the rules for broadband grants were not written to favor any size or kind of network operator. Further, the $7.2 billion is not intended to complete Obama's goal of spreading broadband to every home; rather, it is a "down payment" on a larger plan being crafted by the FCC, he said.

Washington Post, 8/14/09, thanks Steve for this article!

Netscape Founder Backs New Browser

It has been 15 years since Marc Andreessen developed the Netscape Internet browser that introduced millions of people to the Internet.

After its early success, Netscape was roundly defeated by Microsoft in the so-called browser wars of the 1990s that dominated the Web’s first chapter.

Mr. Andreessen appears to want a rematch. Now a prominent Silicon Valley financier, Mr. Andreessen is backing a start-up called RockMelt, staffed with some of his close associates, that is building a new Internet browser, according to people with knowledge of his investment.

“We have backed a really good team,” Mr. Andreessen said in an interview earlier this summer. A moment later, Mr. Andreessen appeared to regret his comment, saying he was not ready to talk about any aspect of the company.

But Mr. Andreessen suggested the new browser would be different, saying that most other browsers had not kept pace with the evolution of the Web, which had grown from an array of static Web pages into a network of complex Web sites and applications. “There are all kinds of things that you would do differently if you are building a browser from scratch,” Mr. Andreessen said.

RockMelt was co-founded by Eric Vishria and Tim Howes, both former executives at Opsware, a company that Mr. Andreessen co-founded and then sold to Hewlett-Packard in 2007 for about $1.6 billion. Mr. Howes also worked at Netscape with Mr. Andreessen.

Little else is known about RockMelt, and Mr. Vishria was unwilling to discuss it. “We are at very early stages of development,” Mr. Vishria said. “Talking about it at this stage is not useful.”

After Microsoft defeated Netscape, it controlled more than 90 percent of the browser market. Interest in browsers among technology companies waned and innovation ground to a halt. But in the last 18 months, the Internet browser has become a battleground again with giants like Google, Apple and Microsoft fighting one another.

The renewed interest in browsers is partly a result of the success of Mozilla, a nonprofit. The speedier, safer and more innovative Mozilla Firefox browser, introduced in 2004, has grabbed 23 percent of the market, and Microsoft’s share has dropped to 68 percent.

But the latest battle was also prompted by a giant shift in computing that is increasingly making the Web, not the PC, the place where people interact with complex software applications. Technology giants now see the browser as a control point to what users do online, and they want a say in shaping it.

In the last 18 months, Microsoft and Apple introduced greatly improved versions of their browsers, Internet Explorer and Safari. And Google entered the fray last fall when it released its Chrome browser. Last month, Google said it would build an operating system, also called Chrome, with its principal function being to support its browser.

“The days of working in isolation on your computer are mostly gone,” said John Lilly, the chief executive of Mozilla. “Because the Web has become so central to what we do, and the browser is the technology that mediates our interaction with the Web, the way the browser works is really important. There is a lot of room for innovation.”

Mr. Andreessen’s backing is certain to make RockMelt the focus of intense attention. For now, the company is keeping a lid on its plans. On the company’s Web site, the corporate name and the words “coming soon” are topped by a logo of the earth, with cracks exposing what seems to be molten lava from the planet’s core. A privacy policy on the site, which was removed after a reporter made inquiries to Mr. Vishria, indicates the browser is intended to be coupled somehow with Facebook. Mr. Andreessen serves as a director of Facebook.

The policy says that a person could use a Facebook ID to log into RockMelt, suggesting that the browser may be tailored to display Facebook updates and other features as users browse the Web. Another browser, Flock, based on Firefox, already incorporates feeds from social networking sites.

But RockMelt is not currently working with Facebook. “We are not aware of any details about RockMelt and its product,” said Brandee Barker, a Facebook spokeswoman.

In the interview this summer, Mr. Andreessen credited Mozilla with coming up with an economic model to support Web browsers. The organization has an agreement with Google that makes Google the standard home page when people start Firefox, and sends them to Google when they type something into the search box at the top of the browser. In 2007, Google paid Mozilla about $75 million for the alliance.

“Browsers today have a great business model,” Mr. Andreessen said.

But experts say a big challenge for any new Web browser could be distribution. Despite Google’s heavy promotion of Chrome, the browser has gained just 2 percent of the market.

“If anybody could do it today, one would imagine Google would be best positioned, and it is obvious they have made only meager gains,” said David B. Yoffie, a professor at the Harvard Business School, and the co-author of “Competing on Internet Time: Lessons From Netscape and Its Battle With Microsoft.” Professor Yoffie said that aiming the browser at Facebook users could be a good strategy.

“If you can get Facebook’s millions of users to think that this is a better way to do what they do on Facebook, that would be an opportunity to take advantage of,” he said.

By MIGUEL HELFT, New York Times, 8/14/09

Les Paul, Guitar Innovator, Dies at 94


Les Paul, the virtuoso guitarist and inventor whose solid-body electric guitar and recording studio innovations changed the course of 20th-century popular music, died Thursday in White Plains, N.Y. He was 94.

The cause was complications of pneumonia, the Gibson Guitar Corporation and his family announced.

Mr. Paul was a remarkable musician as well as a tireless tinkerer. He played guitar alongside leading prewar jazz and pop musicians from Louis Armstrong to Bing Crosby. In the 1930s he began experimenting with guitar amplification, and by 1941 he had built what was probably the first solid-body electric guitar, although there are other claimants. With his guitar and the vocals of his wife, Mary Ford, he used overdubbing, multitrack recording and new electronic effects to create a string of hits in the 1950s.

Mr. Paul’s style encompassed the twang of country music, the harmonic richness of jazz and, later, the bite of rock ’n’ roll. For all his technological impact, though, he remained a down-home performer whose main goal, he often said, was to make people happy.

Mr. Paul, whose original name was Lester William Polsfuss, was born on June 9, 1915, in Waukesha, Wis. His childhood piano teacher wrote to his mother, “Your boy, Lester, will never learn music.” But he picked up harmonica, guitar and banjo by the time he was a teenager and started playing with country bands in the Midwest. In Chicago he performed for radio broadcasts on WLS and led the house band at WJJD; he billed himself as the Wizard of Waukesha, Hot Rod Red and Rhubarb Red.

His interest in gadgets came early. At the age of 10 he devised a harmonica holder from a coat hanger. Soon afterward he made his first amplified guitar by opening the back of a Sears acoustic model and inserting, behind the strings, the pickup from a dismantled Victrola. With the record player on, the acoustic guitar became an electric one. Later, he built his own pickup from ham radio earphone parts and assembled a recording machine using a Cadillac flywheel and the belt from a dentist’s drill.

From country music Mr. Paul moved into jazz, influenced by players like Django Reinhardt and Eddie Lang, who were using amplified hollow-body guitars to play hornlike single-note solo lines. He formed the Les Paul Trio in 1936 and moved to New York, where he was heard regularly on Fred Waring’s radio show from 1938 to 1941.

In 1940 or 1941 — the exact date is unknown — Mr. Paul made his guitar breakthrough. Seeking to create electronically sustained notes on the guitar, he attached strings and two pickups to a wooden board with a guitar neck. “The log,” as he called it, if not the first solid-body electric guitar, became the most influential one.

“You could go out and eat and come back and the note would still be sounding,” Mr. Paul once said.

The odd-looking instrument drew derision when he first played it in public, so he hid the works inside a conventional-looking guitar. But the log was a conceptual turning point. With no acoustic resonance of its own, it was designed to generate an electronic signal that could be amplified and processed — the beginning of a sonic transformation of the world’s music.

Mr. Paul was drafted in 1942 and worked in California for the Armed Forces Radio Service, accompanying Rudy Vallee, Kate Smith and others. When he was discharged in 1943, he was hired as a staff musician for NBC radio in Los Angeles. His trio toured with the Andrews Sisters and backed Nat King Cole and Bing Crosby, with whom he recorded the hit “It’s Been a Long, Long Time” in 1945. Crosby encouraged Mr. Paul to build his own recording studio, and so he did, in his garage in Los Angeles.

There he experimented with recording techniques, using them to create not realistic replicas of a performance but electronically enhanced fabrications. Toying with his mother’s old Victrola had shown him that changing the speed of a recording could alter both pitch and timbre. He could record at half-speed and replay the results at normal speed, creating the illusion of superhuman agility. He altered instrumental textures through microphone positioning and reverberation. Technology and studio effects, he realized, were instruments themselves.
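
The half-speed trick has a simple digital analogue: playing samples back at twice the rate they were captured at halves the duration and raises every pitch by an octave. A minimal sketch, with an illustrative tone and sample rate (hypothetical numbers, not Mr. Paul’s actual setup):

```python
import math

RATE = 8000  # samples per second

# A 220 Hz tone "recorded" for one second, standing in for a
# passage played at half speed.
low = [math.sin(2 * math.pi * 220 * n / RATE) for n in range(RATE)]

# Replaying the recording at double speed is equivalent to keeping
# every other sample: half the duration, pitch up one octave.
sped_up = low[::2]

def dominant_freq(samples, rate):
    """Estimate pitch by counting zero crossings (two per cycle)."""
    crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    duration = len(samples) / rate
    return crossings / (2 * duration)

print(dominant_freq(low, RATE))      # ~220 Hz as recorded
print(dominant_freq(sped_up, RATE))  # ~440 Hz on double-speed playback
```

Recording at half speed and replaying at normal speed doubles apparent playing speed along with the pitch, which is the source of the “superhuman agility” on those records.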

He also noticed that by playing along with previous recordings, he could become a one-man ensemble. As early as his 1948 hit “Lover,” he made elaborate, multilayered recordings, using two acetate disc machines, which demanded that each layer of music be captured in a single take. From discs he moved to magnetic tape, and in the late 1950s he built the first eight-track multitrack recorder. Each track could be recorded and altered separately, without affecting the others. The machine ushered in the modern recording era.

In 1947 Mr. Paul teamed up with Colleen Summers, who had been singing with Gene Autry’s band. He changed her name to Mary Ford, a name found in a telephone book.

They were touring in 1948 when Mr. Paul’s car skidded off an icy bridge. Among his many injuries, his right elbow was shattered; once set, it would be immovable for life. Mr. Paul had it set at an angle, slightly less than 90 degrees, so that he could continue to play guitar.

Mr. Paul, whose first marriage, to Virginia, had ended in divorce, married Ms. Ford in 1949. They had a television show, “Les Paul and Mary Ford at Home,” which was broadcast from their living room until 1958. They began recording together, mixing multiple layers of Ms. Ford’s vocals with Mr. Paul’s guitars and effects, and the dizzying results became hits in the early 1950s. Among their more than three dozen hits, “Mockingbird Hill,” “How High the Moon” and “The World Is Waiting for the Sunrise” in 1951 and “Vaya Con Dios” in 1953 were million-sellers.

Some of their music was recorded with microphones hanging in various rooms of the house, including one over the kitchen sink, so that Ms. Ford could record vocals while washing dishes. Mr. Paul also recorded instrumentals on his own, including the hits “Whispering,” “Tiger Rag” and “Meet Mister Callaghan” in 1951 and 1952.

The Gibson company hired Mr. Paul to design a Les Paul model guitar in the early 1950s, and variations of the first 1952 model have sold steadily ever since, accounting at one point for half of the privately held company’s total sales. Built with Mr. Paul’s patented pickups, his design is prized for its clarity and sustained tone. It has been used by musicians like Led Zeppelin’s Jimmy Page and Slash of Guns N’ Roses. The Les Paul Standard version is unchanged since 1958, the company says.

In the mid-1950s, Mr. Paul and Ms. Ford moved to a house in Mahwah, N.J., where Mr. Paul eventually installed both film and recording studios and amassed a collection of hundreds of guitars.

The couple’s string of hits ended in 1961, and they were divorced in 1964. Ms. Ford died in 1977. Mr. Paul is survived by three sons, Lester (Rus) G. Paul, Gene W. Paul and Robert (Bobby) R. Paul; a daughter, Colleen Wess; his companion, Arlene Palmer; five grandchildren; and five great-grandchildren.

In 1964, Mr. Paul underwent surgery for a broken eardrum, and he began suffering from arthritis in 1965. Through the 1960s he concentrated on designing guitars for Gibson. He invented and patented various pickups and transducers, as well as devices like the Les Paulverizer, an echo-repeat device, which he introduced in 1974. In the late 1970s he made two albums with the dean of country guitarists, Chet Atkins.

In 1981 Mr. Paul underwent a quintuple-bypass heart operation. After recuperating, he returned to performing, though the progress of his arthritis forced him to relearn the guitar. In 1983 he started to play weekly performances at Fat Tuesday’s, an intimate Manhattan jazz club. “I was always happiest playing in a club,” he said in a 1987 interview. “So I decided to find a nice little club in New York that I would be happy to play in.”

After Fat Tuesday’s closed in 1995, he moved his Monday-night residency to Iridium. He performed there until early June; guest stars have been appearing with his trio since then and will continue to do so indefinitely, a spokesman for the club said.

At his shows he used one of his own customized guitars, which included a microphone on a gooseneck pointing toward his mouth so that he could talk through the guitar. In his sets he would mix reminiscences, wisecracks and comments with versions of jazz standards. Guests — famous and unknown — showed up to pay homage or test themselves against him. Despite paralysis in some fingers on both hands, he retained some of his remarkable speed and fluency. Mr. Paul also performed regularly at jazz festivals through the 1980s.

He recorded a final album, “American Made, World Played” (Capitol), to celebrate his 90th birthday in 2005. It featured guest appearances by Eric Clapton, Keith Richards, Jeff Beck, Sting, Joe Perry of Aerosmith and Billy Gibbons of ZZ Top. The album brought him two Grammy Awards: for best pop instrumental performance and best rock instrumental performance. He had already won recognition from the Grammy trustees for technical achievements and another performance Grammy in 1976, for the album “Chester and Lester,” made with Chet Atkins.

In recent years, he said he was working on another major invention but would not reveal what it was.

“Honestly, I never strove to be an Edison,” he said in a 1991 interview in The New York Times. “The only reason I invented these things was because I didn’t have them and neither did anyone else. I had no choice, really.”

By JON PARELES, New York Times, 8/14/09

Tuesday, August 11, 2009

Getting A Good Night's Sleep With Online Therapy



You can do almost anything on the Internet these days. What about getting a good night’s sleep?

It might be possible, some researchers say. Web-based programs to treat insomnia are proliferating, and two small but rigorous studies suggest that online applications based on cognitive behavioral therapy can be effective.

“Fifteen years ago, people would have thought it was crazy to get therapy remotely,” said Bruce Wampold, a professor of counseling psychology at the University of Wisconsin. “But as we do more and more things electronically, including have social relationships, more therapists have come to believe that this can be an effective way to deliver services to some people.”

The first controlled study of an online program for insomnia was published in 2004. But the results were hard to interpret, because they showed similar benefits for those who used the program and those in the control group. The two new studies, from researchers in Virginia and in Canada, advance the evidence that such programs can work.

In the Virginia study, called SHUTi, patients enter several weeks of sleep diaries, and the program calculates a window of time during which they are allowed to sleep. Patients limit the time they spend in bed to roughly the hours that they have actually been sleeping.

The goal is to consolidate sleep, then gradually expand its duration — the same technique that would be used in face-to-face therapy, said Lee Ritterband, a psychologist at the University of Virginia, who developed the program.

Stella Parolisi, 65, a registered nurse in Virginia and a patient in the study, said sticking to the restricted sleep schedule was hard, “but toward the end, it started to pay off.”

“Before, if I was exhausted, I would try to get to bed earlier and earlier, which was the wrong thing,” she said. “It just gave me more time to toss and turn.”

But after using the program, she began to sleep for at least one four-hour stretch a night.

The SHUTi program, which spans nine weeks, advises patients to get out of bed if they wake and are unable to return to sleep for more than 15 minutes. It also uses readings, vignettes, animation and interactive exercises to help patients deal with factors that interfere with sleep. For example, the program helps patients manage anxious thoughts, like the idea that they cannot function without eight solid hours of sleep. It also reinforces the message that they should not do work or watch TV in bed, should limit the light in the bedroom and should avoid stimulants like caffeine late in the day.

In a small, randomized, controlled study, which included 45 adults, those who were assigned to try the online program reported significantly greater increases in sleep efficiency and decreases in nighttime wakefulness than those who remained on the waiting lists.

Specifically, participants’ sleep efficiency, a measure of the proportion of time spent asleep relative to the total time in bed, improved by 16 percent and their nighttime wakefulness (minutes awake during the night) decreased by 55 percent; neither measure changed significantly for the control group. The findings appeared last month in The Archives of General Psychiatry.
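
Sleep efficiency, as the study defines it, is simple arithmetic: minutes asleep divided by minutes in bed. A brief illustration with hypothetical numbers (not the study’s raw data):

```python
def sleep_efficiency(minutes_asleep, minutes_in_bed):
    """Proportion of time in bed actually spent asleep, as a percentage."""
    return 100.0 * minutes_asleep / minutes_in_bed

# Hypothetical patient: 8 hours in bed but only 6 asleep before treatment.
before = sleep_efficiency(360, 480)   # 75.0%

# After sleep restriction, time in bed shrinks and wakefulness falls.
after = sleep_efficiency(390, 450)    # ~86.7%

print(f"before: {before:.1f}%, after: {after:.1f}%")
print(f"gain: {after - before:.1f} percentage points")
```

This is why restricting time in bed can raise the measure even before total sleep time grows: the denominator shrinks faster than the numerator.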

“The outcomes were very impressive, almost unbelievable,” said Jack Edinger, a psychologist at Duke University Medical Center.

The Canadian study tested a five-week program that also emphasized sleep restriction, controlling negative thoughts and avoiding stimuli like light and noise in the bedroom. It also included readings, and audio and video clips to teach and reinforce its messages.

Led by Norah Vincent, a psychologist at the University of Manitoba, the study included 118 adults who were randomly assigned to complete the program or remain on a waiting list.

“I liked that it was over the Internet,” said one participant, Kelly Lawrence, 51, of Winnipeg, “because when you don’t get your sleep you don’t want to have to get up and go to appointments. You don’t want to be out there on the roads.”

The online format made it easier to work around child care and other responsibilities, and to “pause the program and go back to something any time I needed to,” she added.

Thirty-five percent of those who completed the program described their insomnia as “much improved” or “very much improved,” compared with just 4 percent of those who remained on the waiting list. The findings were published in June in the journal Sleep.

Dr. Ritterband says he plans to make the online program publicly available, though not until after further study. Dr. Vincent also said she planned to commercialize her program, charging participants roughly $20 to $30.

Other online programs offering cognitive behavioral therapy for sleeplessness include CBTforinsomnia.com, developed and run by Gregg Jacobs, an insomnia specialist at University of Massachusetts Medical School, and “Overcoming Insomnia,” created by HealthMedia, a company based in Michigan.

In-person cognitive behavioral therapy is not readily available to many of the sleepless, whether because they do not have access to a trained therapist or because their schedules make it hard to keep the appointments.

“The sleep community recognizes that if everyone with insomnia showed up on our doorstep today, we wouldn’t be able to help them all,” said Lawrence Epstein, an instructor at Harvard Medical School and medical director of Sleep HealthCenters in Boston.

Still, Dr. Wampold, of Wisconsin, said some people were bound to be skeptical of online therapy. Therapists who tend to see “the interpersonal relationship between patient and clinician as a key source of motivation and change are likely to be suspicious,” he said.

For many insomniacs, he said, “the actual sleep disturbance is just an indication of more or other problems that need to be addressed.”

“And you can’t do that,” he added, “without more clinician contact and flexibility.”

New York Times 8/11/09

Friday, August 7, 2009

Inspired Genius John Hughes, Director of ’80s Comedies, Passes Away at 59


John Hughes, the prolific filmmaker whose sweet and sassy comedies like “Sixteen Candles” and “The Breakfast Club” plumbed the lives of teenagers in the 1980s, died Thursday on a morning walk while visiting Manhattan. He was 59.

The cause was a heart attack, according to a statement from the publicists Paul Bloch and Michelle Bega.

Mr. Hughes turned out a series of hits that captured audiences and touched popular culture — and then flummoxed both Hollywood and his fans by suddenly fading from the scene in the early 1990s.

His seeming disappearance inspired a 2009 documentary, “Don’t You Forget About Me,” by four young filmmakers who went in search of a man who was by then being compared to J. D. Salinger because of his reclusiveness. It became a tribute to Mr. Hughes’s influence on youth culture.

Mr. Hughes, who began his career as an advertising copywriter in Chicago, had been living quietly on a farm in northern Illinois. He is survived by his wife, the former Nancy Ludwig, whom he met in high school; two sons, John and James; and four grandchildren.

John Wilden Hughes Jr. was born on Feb. 18, 1950, in the suburbs of Detroit before moving, at 13, to the Chicago area. His father worked in sales, and he lived in a middle-class, all-American reality that became the mainstay of his films.

“I didn’t have this tortured childhood,” he told The New York Times in a 1991 interview. “I liked it.”

While visiting New York during his advertising days, Mr. Hughes hung around the offices of National Lampoon magazine and was published when he showed a gift for comedy. Once having begun work as a screenwriter, he pursued the craft relentlessly.

In the 1991 interview, he said: “If I’m on a roll, and I finish a script at 3:00, I’ll start another at 3:02.”

Mr. Hughes’s biggest success, in box-office terms, was the “Home Alone” series, of which he was the writer and a producer. The first film, released by 20th Century Fox in 1990, turned the simple tale of a young boy, played by Macaulay Culkin, who was forgotten by his vacationing family, into a monster hit. The film took in more than $285 million at the domestic box office and spawned two sequels.

He had a reputation for discovering and bringing out the best in young actors. In a statement on Thursday, Mr. Culkin said: “I was a fan of both his work and a fan of him as a person. The world has lost not only a quintessential filmmaker whose influence will be felt for generations, but a great and decent man.”

Mr. Hughes’s greatest professional effect came from a series of teen-oriented films he directed in the 1980s, beginning with “Sixteen Candles” in 1984. It was a whip-smart but tender look at coming of age, with Molly Ringwald as a girl whose 16th birthday is forgotten in the whirlwind of her sister’s wedding; it featured emerging actors like Anthony Michael Hall, John Cusack, Joan Cusack and Jami Gertz, among others.

“The Breakfast Club” followed in 1985, with “Weird Science” close behind in the same year. By then, the troupe of young actors who showed up in films by Mr. Hughes and others who worked in the same vein had expanded to include Emilio Estevez, Judd Nelson and Ally Sheedy; they were tagged “The Brat Pack.”

Probably no film so completely captured the arch and almost noxious, yet somehow loveable, quality of Mr. Hughes’s characters as “Ferris Bueller’s Day Off.” The movie, released by Paramount Pictures in 1986, starred Matthew Broderick as a ne’er-do-well high-schooler who spends more energy avoiding the classroom than he might have used inside.

“He can lie, manipulate and con people with inspired genius, especially in the service of a noble cause such as playing hooky,” Nina Darnton wrote of the Bueller character in a less-than-admiring New York Times review.

But the movie took in $70 million at the box office, and wound up 20 years later on an Entertainment Weekly list of the 50 best high school movies of all time, alongside others from Mr. Hughes.

If the magic seemed to fade — Mr. Hughes’s last movie as a director, “Curly Sue,” fell flat in 1991 — he continued to write for the screen. As recently as last year, working as Edmond Dantès, he shared a story credit with Seth Rogen and Kristofor Brown on “Drillbit Taylor,” in which Owen Wilson played a low-budget bodyguard hired to keep a couple of kids from getting pushed around.

Some in Hollywood surmised that he had stepped away simply because, for all his successes, he did not particularly like the film business and its ways. He was known as a stickler for control who often tangled with executives even as he made their companies a fortune.

Yet Mr. Hughes ultimately marked the business so indelibly that his name has become identified with an entire genre: comedies about disaffected youth.


By Michael Cieply, New York Times, 8/7/09

Thursday, August 6, 2009

35th Anniversary - World Trade Center Wire Walker

"Don't worry about what you can't control" was what Philippe Petit told himself about the wind at the World Trade Center towers.

Today is the 35th anniversary of Philippe Petit's world-famous 1974 World Trade Center tightrope walk.

Philippe Petit (born August 13, 1949) is a French high wire artist who gained fame for his high-wire walk between the Twin Towers (WTC) in New York City on August 7, 1974.

For his feat (which he referred to as "le coup"), he used a 450-pound cable and a custom-made 26-foot-long, 55-pound balancing pole.

Petit was first inspired to attempt what he called his "coup" on the Twin Towers while he sat in his dentist's office in Paris in 1968. In a magazine, he came upon an article about the as-yet-unconstructed buildings, along with an illustration of the model. He became obsessed with the towers, collecting articles on them whenever possible.

The 'artistic crime of the century' took six years of planning, during which Petit learned everything he could about the buildings, taking into account such problems as the swaying of the towers because of wind, and how to rig the steel cable across the 140-foot (43 m) gap between the towers (at a height of 1,368 ft (417.0 m)). He traveled to New York on several occasions to make first-hand observations. Since the towers were still under construction, Philippe and a NY-based photographer went up in a helicopter to make aerial photographs of the WTC.

Petit sneaked into the towers several times, hiding on the roof and other areas in the unfinished towers, in order to get a sense of what type of security measures were in place. Using his own observations and photographs, Petit was able to make a scale model of the towers to help him design the rigging he needed to prepare for the wirewalk.

He made fake identification cards for himself and his collaborators (claiming that they were contractors who were installing an electrified fence on the roof) to gain access to the towers. Prior to this, to make it easier to get into the buildings, Petit carefully observed the clothes worn by construction workers and the kinds of tools they carried. He also took note of the clothing of businessmen so that he could blend in with them when he tried to enter the buildings. He observed what time the workers arrived and left, so he could determine when he would have roof access. As the target date of his "coup" approached, he claimed to be a journalist with a French architecture magazine so that he could gain permission to interview the workers on the roof. The Port Authority allowed Petit to conduct the interviews, which he used as a pretext to make more observations. He was once caught by a police officer on the roof, and his hopes to do the high wire walk were dampened, but he eventually regained the confidence to proceed.

On the night of August 6, 1974, Petit and his crew were able to ride in a freight elevator to the 104th floor with their equipment, and to store this equipment just nineteen steps from the roof. In order to pass the cable across the void, Petit and his crew had settled on using a bow and arrow. They first shot across a fishing line, and then passed larger and larger ropes across the space between the towers until they were able to pass the 450-pound steel cable across. Two cavalettis (guy lines) anchored to other points on the roof were used to stabilize the cable and keep the swaying of the wire to a minimum. For the first time in the history of the Twin Towers, they were joined.

On August 7, 1974, shortly after 7:15 a.m., Petit stepped off the South Tower and onto his 3/4" 6×19 IWRC (independent wire rope core) steel cable. He walked the wire for 45 minutes, making eight crossings between the towers, a quarter mile above the sidewalks of Manhattan. In addition to walking, he sat on the wire, gave knee salutes and, while lying on the wire, spoke with a gull circling above his head.

As soon as Petit was observed by witnesses on the ground, the Port Authority Police Department dispatched officers to the roof to take him into custody. One of the officers, Sgt. Charles Daniels, later reported his experience:

I observed the tightrope 'dancer'—because you couldn't call him a 'walker'—approximately halfway between the two towers. And upon seeing us he started to smile and laugh and he started going into a dancing routine on the high wire....And when he got to the building we asked him to get off the high wire but instead he turned around and ran back out into the middle....He was bouncing up and down. His feet were actually leaving the wire and then he would resettle back on the wire again....Unbelievable really....Everybody was spellbound in the watching of it.

Petit was warned by his friend on the South Tower that a police helicopter would come to pick him off the wire unless he got off. Rain had begun to fall, and Petit, deciding he had taken enough risks, gave himself up to the police waiting for him on the South Tower. He was arrested once he stepped off the wire. Provoked by his taunting behavior while on the wire, police handcuffed him behind his back and roughly pushed him down a flight of stairs. This he later described as the most dangerous part of the stunt.

His audacious high wire performance made headlines around the world. When asked why he did the stunt, Petit would say "When I see three oranges, I juggle; when I see two towers, I walk."

The immense news coverage and public appreciation of Petit's high wire walk resulted in all formal charges relating to his walk being dropped. The court did, however, "sentence" Petit to perform a show for the children of New York City, which he transformed into another high-wire walk, in Central Park above Belvedere Lake (which has now become Turtle Pond). Petit was also presented with a lifetime pass to the Twin Towers' Observation Deck by the Port Authority of New York and New Jersey. He signed a steel beam close to the point where he began his walk.

Petit's high-wire walk is credited with bringing the then rather unpopular Twin Towers much needed attention and even affection. Up to that point, critics such as technology historian Lewis Mumford had regarded them as ugly and utilitarian. The landlords were having trouble renting out all of their office space.

The documentary film Man on Wire by UK director James Marsh, about Petit's 1974 WTC performance, won both the World Cinema Jury and Audience awards at the Sundance Film Festival 2008. The film also won awards at the 2008 Full Frame Documentary Film Festival in Durham, N.C. and won the Academy Award for Best Documentary.

Petit has made dozens of public high-wire performances in his career; in 1986 he re-enacted the crossing of the Niagara River by Blondin for an IMAX film. In 1989, to celebrate the 200th anniversary of the French Revolution, Jacques Chirac, then the mayor of Paris, permitted him to walk a wire strung from the ground, at the Place du Trocadéro, to the second stage of the Eiffel Tower.

He is one of the Artists-in-Residence at the Cathedral of St. John the Divine in New York City. He currently lives in Woodstock, New York.

From "New York, the Unknown City"

After groups on each tower assembled the support structure, they had to find a way of getting the heavy walk wire across the 140-foot gap separating the towers. The answer fit the romantic poetry of the entire stunt. An arrow attached to a fishing line was shot from a bow from one building to the other, and then they simply towed the cable across.

Despite his years of planning, Petit was resigned to the dictates of physics. To be walked upon, the cable had to be stretched to a tension of 3.5 tons. But because the buildings were designed to sway with the wind, Petit knew that a bad breeze could cause his wire to rip apart and send him to his death. He decided not to worry about what he couldn't control.