Group Think

Big business and the myth of the lone inventor

1.

Philo T. Farnsworth was born in 1906, and he looked the way an inventor of that era was supposed to look: slight and gaunt, with bright-blue exhausted eyes, and a mane of brown hair swept back from his forehead. He was nervous and tightly wound. He rarely slept. He veered between fits of exuberance and depression. At the age of three, he was making precise drawings of the internal mechanisms of locomotives. At six, he declared his intention to follow in the footsteps of Thomas Edison and Alexander Graham Bell. At fourteen, while tilling a potato field on his family’s farm in Idaho, he saw the neat, parallel lines of furrows in front of him, and it occurred to him–in a single, blinding moment–that a picture could be sent electronically through the airwaves in the same way, broken down into easily transmitted lines and then reassembled into a complete picture at the other end. He went to see his high-school science teacher, and covered the blackboard with drawings and equations. At nineteen, after dropping out of college, he impressed two local investors with his brilliance and his conviction. He moved to California and set up shop in a tiny laboratory. He got married on an impulse. On his wedding night, he seized his bride by the shoulders and looked at her with those bright-blue eyes. “Pemmie,” he said. “I have to tell you. There is another woman in my life–and her name is Television.”

Philo T. Farnsworth was the inventor of television. Through the nineteen-thirties and forties, he engaged in a heroic battle to perfect and commercialize his discovery, fending off creditors and predators, and working himself to the point of emotional and physical exhaustion. His nemesis was David Sarnoff, the head of RCA, then one of the most powerful American electronics companies. Sarnoff lived in an enormous Upper East Side mansion and smoked fat cigars and travelled by chauffeured limousine. His top television researcher was Vladimir Zworykin, the scion of a wealthy Russian family, who wore elegant three-piece suits and round spectacles, had a Ph.D. in physics, and apprenticed with the legendary Boris Rosing at the St. Petersburg Institute of Technology. Zworykin was never more than half a step behind Farnsworth: he filed for a patent on his own version of electronic television two years after Farnsworth had his potato-field vision. At one point, Sarnoff sent Zworykin to Farnsworth’s tiny laboratory, on Green Street in San Francisco, and he stayed for three days, asking suspiciously detailed questions. He had one of Farnsworth’s engineers build the heart of Farnsworth’s television system–the so-called image dissector–before his eyes, and then picked the tube up and turned it over in his hands and said, ominously, “This is a beautiful instrument. I wish I had invented it myself.” Soon Sarnoff himself came out to Green Street, swept imperially through the laboratory, and declared, “There’s nothing here we’ll need.” It was, of course, a lie. In the nineteen-thirties, television was not possible without Philo Farnsworth’s work. But in the end it didn’t much matter. Farnsworth’s company was forced out of the TV business. Farnsworth had a nervous breakdown, and Sarnoff used his wealth and power to declare himself the father of television.

The life of Philo Farnsworth is the subject of two new books, “The Last Lone Inventor,” by Evan I. Schwartz (HarperCollins; $24.95), and “The Boy Genius and the Mogul,” by Daniel Stashower (Broadway; $24.95). It is a wonderful tale, riveting and bittersweet. But its lessons, on closer examination, are less straightforward than the clichés of the doomed inventor and the villainous mogul might suggest. Philo Farnsworth’s travails make a rather strong case for big corporations, not against them.

2.

The idea of television arose from two fundamental discoveries. The first was photoconductivity. In 1872, Joseph May and Willoughby Smith discovered that the electrical resistance of certain metals varied according to their exposure to light. And, since everyone knew how to transmit electricity from one place to another, it made sense that images could be transmitted as well. The second discovery was what is called visual persistence. In 1880, the French engineer Maurice LeBlanc pointed out that, because the human eye retains an image for about a tenth of a second, if you wanted to transmit a picture you didn’t have to send it all at once. You could scan it, one line at a time, and, as long as you put all those lines back together at the other end within that fraction of a second, the human eye would be fooled into thinking that it was seeing a complete picture.

The hard part was figuring out how to do the scanning. In 1883, the German engineer Paul Nipkow devised an elaborate and ultimately unworkable system using a spinning metal disk. The disk was punctured with a spiral of small holes, and, as it spun, one line of light after another was projected through the holes onto a photocell. In 1908, a British electrical engineer named A. A. Campbell Swinton suggested that it would make more sense to scan images electronically, using a cathode ray. Philo Farnsworth was the first to work out how to do that. His image dissector was a vacuum tube with a lens at one end, a photoelectric plate right in front of the lens to convert the image from light to electricity, and then an “anode finger” to scan the electrical image line by line. After setting up his laboratory, Farnsworth tinkered with his makeshift television camera day and night for months. Finally, on September 7, 1927, he was ready. His wife, Pem, was by his side. His tiny television screen was in front of him. His brother-in-law, Cliff Gardner, was manning the television camera in a room at the other end of the lab. Stashower writes:

Squaring his shoulders, Farnsworth took his place at the controls and flicked a series of switches. A small, bluish patch of light appeared at the end of the receiving tube. Farnsworth lifted his head and began calling out instructions to Gardner in the next room.

“Put in the slide, Cliff,” Farnsworth said.

“Okay, it’s in,” Gardner answered. “Can you see it?”

A faint but unmistakable line appeared across the receiving end of the tube. As Farnsworth made some adjustments, the line became more distinct.

“Turn the slide a quarter turn, Cliff,” Farnsworth called. Seconds later, the line on the receiving tube rotated ninety degrees. Farnsworth looked up from the tube. “That’s it, folks,” he announced with a tremor in his voice. “We’ve done it–there you have electronic television.”

Both Stashower and Schwartz talk about how much meaning Farnsworth attached to this moment. He was a romantic, and in the romance of invention the creative process consists of two discrete, euphoric episodes, linked by long years of grit and hard work. First is the magic moment of conception: Farnsworth in the potato field. Second is the moment of execution: the day in the lab. If you had the first of those moments and not the second, you were a visionary. But if you had both you were in a wholly different category. Farnsworth must have known the story of King Gillette, the bottle-cap salesman, who woke up one morning in the summer of 1895 to find his razor dull. Gillette had a sudden vision: if all he wanted was a sharp edge, then why should he have to refashion the whole razor? Gillette later recalled:

As I stood there with the razor in my hand, my eyes resting on it as lightly as a bird settling down on its nest, the Gillette razor was born–more with the rapidity of a dream than by a process of reasoning. In a moment I saw it all: the way the blade could be held in a holder; the idea of sharpening the two opposite edges on the thin piece of steel; the clamping plates for the blade, with a handle half-way between the two edges of the blade…I stood there before the mirror in a trance of joy. My wife was visiting Ohio and I hurriedly wrote to her: “I’ve got it! Our fortune is made!”

If you had the vision and you made the vision work, then the invention was yours–that was what Farnsworth believed. It belonged to you, just as the safety razor belonged to King Gillette.

But this was Farnsworth’s mistake, because television wasn’t at all like the safety razor. It didn’t belong to one person. May and Smith stumbled across photoconductivity, and inspired LeBlanc, who, in turn, inspired Swinton, and Swinton’s idea inspired inventors around the world. Then there was Zworykin, of course, and his mentor Boris Rosing, and the team of Max Dieckmann and Rudolf Hell, in Germany, who tried to patent something in the mid-twenties that was virtually identical to the image dissector. In 1931, when Zworykin perfected his own version of the television camera, called the Iconoscope, RCA did a worldwide patent search and found very similar patent applications from a Hungarian named Kálmán Tihanyi, a Canadian named François Henroteau, a Japanese inventor named Kenjiro Takayanagi, two Englishmen, and a Russian. Everyone was working on television and everyone was reading everyone else’s patent applications, and, because television was such a complex technology, nearly everyone had something new to add. Farnsworth came up with the first camera. Zworykin had the best early picture tube. And when Zworykin finally came up with his own camera it was not as good as Farnsworth’s camera in some respects, but it was better in others. In September of 1939, when RCA finally licensed the rights to Farnsworth’s essential patents, it didn’t replace the Iconoscope with Farnsworth’s image dissector. It took the best parts of both.

It is instructive to compare the early history of television with the development, some seventy-five years earlier, of the sewing machine. As the historian Grace Rogers Cooper points out, a sewing machine is really six different mechanisms in one–a means of supporting the cloth, a needle and a combining device to form the stitch, a feeding mechanism to allow one stitch to follow another, a means of insuring the even delivery of thread, and a governing mechanism to insure that each of the previous five steps is performed in sequence. Cooper writes in her book “The Sewing Machine”:

Weisenthal had added a point to the eye-end of the needle. Saint supported the fabric by placing it in a horizontal position with a needle entering vertically, Duncan successfully completed a chainstitch for embroidery purposes, Chapman used a needle with an eye at its point and did not pass it completely through the fabric, Krems stitched circular caps with an eye-pointed needle used with a hook to form a chainstitch, Thimmonier used the hooked needle to form a chainstitch on a fabric laid horizontally, and Hunt created a new stitch that was more readily adapted to sewing by machine than the hand stitches had been.

The man generally credited with combining and perfecting these elements is Elias Howe, a machinist from Boston. But even Howe’s patents were quickly superseded by a new round of patents, each taking one of the principles of his design and either augmenting it or replacing it. The result was legal and commercial gridlock, broken only when, in 1856, Howe and three of the leading sewing-machine manufacturers (among them Isaac Merritt Singer, who gave the world the sewing-machine foot pedal) agreed to pool their patents and form a trust. It was then that the sewing-machine business took off. For the sewing machine to succeed, in other words, those who saw themselves as sewing-machine inventors had to swallow their pride and concede that the machine was larger than they were–that groups, not individuals, invent complex technologies. That was what Farnsworth could not do, and it explains the terrible turn that his life took.

3.

David Sarnoff’s RCA had a very strict policy on patents. If you worked for RCA and you invented something patentable, it belonged to RCA. Your name was on the patent, and you got credit for your work. But you had to sign over your rights for one dollar. In “The Last Lone Inventor,” Schwartz tells the story of an RCA engineer who thought the system was so absurd that he would paste his one-dollar checks to the wall of his office–until the accounting department, upset with the unresolved balance on its books, steamed them off and forced him to cash them. At the same time, Sarnoff was a patient and generous benefactor. When Zworykin and Sarnoff discussed television for the first time, in 1929, Zworykin promised the RCA chief that he would create a working system in two years, at a cost of a hundred thousand dollars. In fact, it took more than ten years and fifty million dollars, and through all those years–which just happened to coincide with the Depression–Sarnoff’s support never wavered. Sarnoff “hired the best engineers out of the best universities,” Schwartz writes. “He paid them competitive salaries, provided them with ample research budgets, and offered them a chance to join his crusade to change the world, working in the most dynamic industry the world had ever seen.” What Sarnoff presented was a compromise. In exchange for control over the fruits of invention, he gave his engineers the freedom to invent.

Farnsworth didn’t want to relinquish that control. Both RCA and General Electric offered him a chance to work on television in their laboratories. He turned them both down. He wanted to go it alone. This was the practical consequence of his conviction that television was his, and it was, in retrospect, a grievous error. It meant that Farnsworth was forced to work in a state of chronic insecurity. He never had enough money. He feuded constantly with his major investor, a man named Jesse McCargar, who didn’t have the resources to play the television game. At the time of what should have been one of Farnsworth’s greatest triumphs–the granting of his principal patents–McCargar showed up at the lab complaining about costs, and made Farnsworth fire his three star engineers. When, in 1928, the Green Street building burned down, a panicked Farnsworth didn’t know whether or not his laboratory was insured. It was, as it happened, but a second laboratory, in Maine, wasn’t, and when it burned down, years later, he lost everything. Twice, he testified before Congress. The first time, he rambled off on a tangent about transmission bandwidth that left people scratching their heads. The second time, he passed up a perfect opportunity to register his complaints about RCA, and launched, instead, into a sentimental account of his humble origins. He simply did not understand how to play politics, just as he did not understand how to raise money or run a business or organize his life. All he really knew how to do was invent, which was something that, as a solo operator, he too seldom had time for.

This is the reason that so many of us work for big companies, of course: in a big company, there is always someone to do what we do not want to do or do not do well–someone to answer the phone, and set up our computer, and arrange our health insurance, and clean our office at night, and make sure the building is insured. In a famous 1937 essay, “The Nature of the Firm,” the economist Ronald Coase said that the reason we have corporations is to reduce the everyday transaction costs of doing business: a company puts an accountant on the staff so that if a staffer needs to check the books all he has to do is walk down the hall. It’s an obvious point, but one that is consistently overlooked, particularly by those who periodically rail, in the name of efficiency, against corporate bloat and superfluous middle managers. Yes, the middle manager does not always contribute directly to the bottom line. But he does contribute to those who contribute to the bottom line, and only an absurdly truncated account of human productivity–one that assumes real work to be somehow possible when phones are ringing, computers are crashing, and health insurance is expiring–does not see that secondary contribution as valuable.

In April, 1931, Sarnoff showed up at the Green Street laboratory to review Farnsworth’s work. This was, by any measure, an extraordinary event. Farnsworth was twenty-four, and working out of a ramshackle building. Sarnoff was one of the leading industrialists of his day. It was as if Bill Gates were to get in his private jet and visit a software startup in a garage across the country. But Farnsworth wasn’t there. He was in New York, trapped there by a court order resulting from a frivolous lawsuit filed by a shady would-be investor. Stashower calls this one of the great missed opportunities of Farnsworth’s career, because he almost certainly would have awed Sarnoff with his passion and brilliance, winning a lucrative licensing deal. Instead, an unimpressed Sarnoff made a token offer of a hundred thousand dollars for Farnsworth’s patents, and Farnsworth dismissed the offer out of hand. This, too, is a reason that inventors ought to work for big corporations: big corporations have legal departments to protect their employees against being kept away from their laboratories by frivolous lawsuits. A genius is a terrible thing to waste.

4.

In 1939, at the World’s Fair in New York City, David Sarnoff set up a nine-thousand-square-foot pavilion to showcase the new technology of television. The pavilion, shaped like a giant radio tube, was covered with RCA logos, and stood next to the Perisphere Theatre, the centerpiece of the fairgrounds. On opening day, thirty thousand people gathered to hear from President Roosevelt and Albert Einstein. The gala was televised by RCA, beamed across the New York City area from the top of the Empire State Building. As it happened, Farnsworth was in New York City that day, and he caught the opening ceremonies on a television in a department-store window. He saw Sarnoff introducing both Roosevelt and Einstein, and effectively claiming this wondrous new technology as his own. “Farnsworth’s entire existence seemed to be annulled in this moment,” Schwartz writes:

The dreams of a farm boy, the eureka moment in a potato field, the confession to a teacher, the confidence in him shown by businessmen and bankers and investors, the breakthroughs in the laboratory, all the years of work, the decisions of the official patent examiners, those hard-fought victories, all of those demonstrations that had come and gone, the entire vision of the future. All of it was being negated by Sarnoff’s performance at the World’s Fair. Would the public ever know the truth?… The agony of it set off sharp pains in his stomach.

Finally, later that summer, RCA settled with Farnsworth. It agreed to pay him a million dollars for the rights to his main patents, plus royalties on every television set sold. But it was too late. Something had died in him. “It’s come to the point of choosing whether I want to be a drunk or go crazy,” he told his wife. One doctor prescribed chloral hydrate, which destroyed his appetite and left him dangerously thin. Another doctor prescribed cigarettes, to soothe his nerves. A third prescribed uppers. He became addicted to the painkiller Pantopon. He committed himself to a sanitarium in Massachusetts, where he was given a course of shock therapy. After the war, his brother died in a plane crash. His patents expired, drying up his chief source of income. His company, unable to compete with RCA, was forced out of the television business. He convinced himself that he could unlock the secrets of nuclear fusion, and launched another private research project, mortgaging his home, selling his stock, and cashing in his life insurance to fund it. But nothing came of it. He died in 1971–addicted to alcohol, deeply depressed, and all but forgotten. He was sixty-four.

In “Tube,” a history of television, David E. Fisher and Marshall Jon Fisher point out that Farnsworth was not the only television pioneer to die in misery. So did two others–John Logie Baird and Charles Francis Jenkins–who had tried and failed to produce mechanical television. This should not come as a surprise. The creative enterprise is a hazardous journey, and those who venture on it alone do so at their peril. Baird and Jenkins and Farnsworth risked their psychological and financial well-being on the romantic notion of the solitary inventor, and when that idea failed them what resources did they have left? Zworykin had his share of setbacks as well. He took on Farnsworth in court, and lost. He promised television in two years for a hundred thousand dollars and he came in eight years and fifty million dollars over budget. But he ended his life a prosperous and contented man, lauded and laurelled with awards and honorary degrees. He had the cocoon of RCA to protect him: a desk and a paycheck and a pension and a secretary and a boss with the means to rewrite history in his favor. This is perhaps a more important reason that we have companies–or, for that matter, that we have universities and tenure. Institutions are not just the best environment for success; they are also the safest environment for failure–and, much of the time, failure is what lies in store for innovators and visionaries. Philo Farnsworth should have gone to work for RCA. He would still have been the father of television, and he might have died a happy man.

Are smart people overrated?

1.

Five years ago, several executives at McKinsey & Company, America’s largest and most prestigious management-consulting firm, launched what they called the War for Talent. Thousands of questionnaires were sent to managers across the country. Eighteen companies were singled out for special attention, and the consultants spent up to three days at each firm, interviewing everyone from the C.E.O. down to the human-resources staff. McKinsey wanted to document how the top-performing companies in America differed from other firms in the way they handle matters like hiring and promotion. But, as the consultants sifted through the piles of reports and questionnaires and interview transcripts, they grew convinced that the difference between winners and losers was more profound than they had realized. “We looked at one another and suddenly the light bulb blinked on,” the three consultants who headed the project–Ed Michaels, Helen Handfield-Jones, and Beth Axelrod–write in their new book, also called “The War for Talent.” The very best companies, they concluded, had leaders who were obsessed with the talent issue. They recruited ceaselessly, finding and hiring as many top performers as possible. They singled out and segregated their stars, rewarding them disproportionately, and pushing them into ever more senior positions. “Bet on the natural athletes, the ones with the strongest intrinsic skills,” the authors approvingly quote one senior General Electric executive as saying. “Don’t be afraid to promote stars without specifically relevant experience, seemingly over their heads.” Success in the modern economy, according to Michaels, Handfield-Jones, and Axelrod, requires “the talent mind-set”: the “deep-seated belief that having better talent at all levels is how you outperform your competitors.”

This “talent mind-set” is the new orthodoxy of American management. It is the intellectual justification for why such a high premium is placed on degrees from first-tier business schools, and why the compensation packages for top executives have become so lavish. In the modern corporation, the system is considered only as strong as its stars, and, in the past few years, this message has been preached by consultants and management gurus all over the world. None, however, have spread the word quite so ardently as McKinsey, and, of all its clients, one firm took the talent mind-set closest to heart. It was a company where McKinsey conducted twenty separate projects, where McKinsey’s billings topped ten million dollars a year, where a McKinsey director regularly attended board meetings, and where the C.E.O. himself was a former McKinsey partner. The company, of course, was Enron.

The Enron scandal is now almost a year old. The reputations of Jeffrey Skilling and Kenneth Lay, the company’s two top executives, have been destroyed. Arthur Andersen, Enron’s auditor, has been driven out of business, and now investigators have turned their attention to Enron’s investment bankers. The one Enron partner that has escaped largely unscathed is McKinsey, which is odd, given that it essentially created the blueprint for the Enron culture. Enron was the ultimate “talent” company. When Skilling started the corporate division known as Enron Capital and Trade, in 1990, he “decided to bring in a steady stream of the very best college and M.B.A. graduates he could find to stock the company with talent,” Michaels, Handfield-Jones, and Axelrod tell us. During the nineties, Enron was bringing in two hundred and fifty newly minted M.B.A.s a year. “We had these things called Super Saturdays,” one former Enron manager recalls. “I’d interview some of these guys who were fresh out of Harvard, and these kids could blow me out of the water. They knew things I’d never heard of.” Once at Enron, the top performers were rewarded inordinately, and promoted without regard for seniority or experience. Enron was a star system. “The only thing that differentiates Enron from our competitors is our people, our talent,” Lay, Enron’s former chairman and C.E.O., told the McKinsey consultants when they came to the company’s headquarters, in Houston. Or, as another senior Enron executive put it to Richard Foster, a McKinsey partner who celebrated Enron in his 2001 book, “Creative Destruction,” “We hire very smart people and we pay them more than they think they are worth.”

The management of Enron, in other words, did exactly what the consultants at McKinsey said that companies ought to do in order to succeed in the modern economy. It hired and rewarded the very best and the very brightest–and it is now in bankruptcy. The reasons for its collapse are complex, needless to say. But what if Enron failed not in spite of its talent mind-set but because of it? What if smart people are overrated?

2.

At the heart of the McKinsey vision is a process that the War for Talent advocates refer to as “differentiation and affirmation.” Employers, they argue, need to sit down once or twice a year and hold a “candid, probing, no-holds-barred debate about each individual,” sorting employees into A, B, and C groups. The A’s must be challenged and disproportionately rewarded. The B’s need to be encouraged and affirmed. The C’s need to shape up or be shipped out. Enron followed this advice almost to the letter, setting up internal Performance Review Committees. The members got together twice a year, and graded each person in their section on ten separate criteria, using a scale of one to five. The process was called “rank and yank.” Those graded at the top of their unit received bonuses two-thirds higher than those in the next thirty per cent; those who ranked at the bottom received no bonuses and no extra stock options–and in some cases were pushed out.

How should that ranking be done? Unfortunately, the McKinsey consultants spend very little time discussing the matter. One possibility is simply to hire and reward the smartest people. But the link between, say, I.Q. and job performance is distinctly underwhelming. On a scale where 0.1 or below means virtually no correlation and 0.7 or above implies a strong correlation (your height, for example, has a 0.7 correlation with your parents’ height), the correlation between I.Q. and occupational success is between 0.2 and 0.3. “What I.Q. doesn’t pick up is effectiveness at common-sense sorts of things, especially working with people,” Richard Wagner, a psychologist at Florida State University, says. “In terms of how we evaluate schooling, everything is about working by yourself. If you work with someone else, it’s called cheating. Once you get out in the real world, everything you do involves working with other people.”

Wagner and Robert Sternberg, a psychologist at Yale University, have developed tests of this practical component, which they call “tacit knowledge.” Tacit knowledge involves things like knowing how to manage yourself and others, and how to navigate complicated social situations. Here is a question from one of their tests:

You have just been promoted to head of an important department in your organization. The previous head has been transferred to an equivalent position in a less important department. Your understanding of the reason for the move is that the performance of the department as a whole has been mediocre. There have not been any glaring deficiencies, just a perception of the department as so-so rather than very good. Your charge is to shape up the department. Results are expected quickly. Rate the quality of the following strategies for succeeding at your new position.

a) Always delegate to the most junior person who can be trusted with the task.
b) Give your superiors frequent progress reports.
c) Announce a major reorganization of the department that includes getting rid of whomever you believe to be “dead wood.”
d) Concentrate more on your people than on the tasks to be done.
e) Make people feel completely responsible for their work.

Wagner finds that how well people do on a test like this predicts how well they will do in the workplace: good managers pick (b) and (e); bad managers tend to pick (c). Yet there’s no clear connection between such tacit knowledge and other forms of knowledge and experience. The process of assessing ability in the workplace is a lot messier than it appears.

An employer really wants to assess not potential but performance. Yet that’s just as tricky. In “The War for Talent,” the authors talk about how the Royal Air Force used the A, B, and C ranking system for its pilots during the Battle of Britain. But ranking fighter pilots–for whom there is a limited and relatively objective set of performance criteria (enemy kills, for example, and the ability to get their formations safely home)–is a lot easier than assessing how the manager of a new unit is doing at, say, marketing or business development. And whom do you ask to rate the manager’s performance? Studies show that there is very little correlation between how someone’s peers rate him and how his boss rates him. The only rigorous way to assess performance, according to human-resources specialists, is to use criteria that are as specific as possible. Managers are supposed to take detailed notes on their employees throughout the year, in order to remove subjective personal reactions from the process of assessment. You can grade someone’s performance only if you know their performance. And, in the freewheeling culture of Enron, this was all but impossible. People deemed “talented” were constantly being pushed into new jobs and given new challenges. Annual turnover from promotions was close to twenty per cent. Lynda Clemmons, the so-called “weather babe” who started Enron’s weather derivatives business, jumped, in seven quick years, from trader to associate to manager to director and, finally, to head of her own business unit. How do you evaluate someone’s performance in a system where no one is in a job long enough to allow such evaluation?

The answer is that you end up doing performance evaluations that aren’t based on performance. Among the many glowing books about Enron written before its fall was the best-seller “Leading the Revolution,” by the management consultant Gary Hamel, which tells the story of Lou Pai, who launched Enron’s power-trading business. Pai’s group began with a disaster: it lost tens of millions of dollars trying to sell electricity to residential consumers in newly deregulated markets. The problem, Hamel explains, is that the markets weren’t truly deregulated: “The states that were opening their markets to competition were still setting rules designed to give their traditional utilities big advantages.” It doesn’t seem to have occurred to anyone that Pai ought to have looked into those rules more carefully before risking millions of dollars. He was promptly given the chance to build the commercial electricity-outsourcing business, where he ran up several more years of heavy losses before cashing out of Enron last year with two hundred and seventy million dollars. Because Pai had “talent,” he was given new opportunities, and when he failed at those new opportunities he was given still more opportunities . . . because he had “talent.” “At Enron, failure–even of the type that ends up on the front page of the Wall Street Journal–doesn’t necessarily sink a career,” Hamel writes, as if that were a good thing. Presumably, companies that want to encourage risk-taking must be willing to tolerate mistakes. Yet if talent is defined as something separate from an employee’s actual performance, what use is it, exactly?

3.

What the War for Talent amounts to is an argument for indulging A employees, for fawning over them. “You need to do everything you can to keep them engaged and satisfied–even delighted,” Michaels, Handfield-Jones, and Axelrod write. “Find out what they would most like to be doing, and shape their career and responsibilities in that direction. Solve any issues that might be pushing them out the door, such as a boss that frustrates them or travel demands that burden them.” No company was better at this than Enron. In one oft-told story, Louise Kitchin, a twenty-nine-year-old gas trader in Europe, became convinced that the company ought to develop an online-trading business. She told her boss, and she began working in her spare time on the project, until she had two hundred and fifty people throughout Enron helping her. After six months, Skilling was finally informed. “I was never asked for any capital,” Skilling said later. “I was never asked for any people. They had already purchased the servers. They had already started ripping apart the building. They had started legal reviews in twenty-two countries by the time I heard about it.” It was, Skilling went on approvingly, “exactly the kind of behavior that will continue to drive this company forward.”

Kitchin’s qualification for running EnronOnline, it should be pointed out, was not that she was good at it. It was that she wanted to do it, and Enron was a place where stars did whatever they wanted. “Fluid movement is absolutely necessary in our company. And the type of people we hire enforces that,” Skilling told the team from McKinsey. “Not only does this system help the excitement level for each manager, it shapes Enron’s business in the direction that its managers find most exciting.” Here is Skilling again: “If lots of [employees] are flocking to a new business unit, that’s a good sign that the opportunity is a good one. . . . If a business unit can’t attract people very easily, that’s a good sign that it’s a business Enron shouldn’t be in.” You might expect a C.E.O. to say that if a business unit can’t attract customers very easily that’s a good sign it’s a business the company shouldn’t be in. A company’s business is supposed to be shaped in the direction that its managers find most profitable. But at Enron the needs of the customers and the shareholders were secondary to the needs of its stars.

A dozen years ago, the psychologists Robert Hogan, Robert Raskin, and Dan Fazzini wrote a brilliant essay called “The Dark Side of Charisma.” It argued that flawed managers fall into three types. One is the High Likability Floater, who rises effortlessly in an organization because he never takes any difficult decisions or makes any enemies. Another is the Homme de Ressentiment, who seethes below the surface and plots against his enemies. The most interesting of the three is the Narcissist, whose energy and self-confidence and charm lead him inexorably up the corporate ladder. Narcissists are terrible managers. They resist accepting suggestions, thinking it will make them appear weak, and they don’t believe that others have anything useful to tell them. “Narcissists are biased to take more credit for success than is legitimate,” Hogan and his co-authors write, and “biased to avoid acknowledging responsibility for their failures and shortcomings for the same reasons that they claim more success than is their due.” Moreover:

Narcissists typically make judgments with greater confidence than other people . . . and, because their judgments are rendered with such conviction, other people tend to believe them and the narcissists become disproportionately more influential in group situations. Finally, because of their self-confidence and strong need for recognition, narcissists tend to “self-nominate”; consequently, when a leadership gap appears in a group or organization, the narcissists rush to fill it.

Tyco Corporation and WorldCom were the Greedy Corporations: they were purely interested in short-term financial gain. Enron was the Narcissistic Corporation–a company that took more credit for success than was legitimate, that did not acknowledge responsibility for its failures, that shrewdly sold the rest of us on its genius, and that substituted self-nomination for disciplined management. At one point in “Leading the Revolution,” Hamel tracks down a senior Enron executive, and what he breathlessly recounts–the braggadocio, the self-satisfaction–could be an epitaph for the talent mind-set:

“You cannot control the atoms within a nuclear fusion reaction,” said Ken Rice when he was head of Enron Capital and Trade Resources (ECT), America’s largest marketer of natural gas and largest buyer and seller of electricity. Adorned in a black T-shirt, blue jeans, and cowboy boots, Rice drew a box on an office whiteboard that pictured his business unit as a nuclear reactor. Little circles in the box represented its “contract originators,” the gunslingers charged with doing deals and creating new businesses. Attached to each circle was an arrow. In Rice’s diagram the arrows were pointing in all different directions. “We allow people to go in whichever direction that they want to go.”

The distinction between the Greedy Corporation and the Narcissistic Corporation matters, because the way we conceive our attainments helps determine how we behave. Carol Dweck, a psychologist at Columbia University, has found that people generally hold one of two fairly firm beliefs about their intelligence: they consider it either a fixed trait or something that is malleable and can be developed over time. Five years ago, Dweck did a study at the University of Hong Kong, where all classes are conducted in English. She and her colleagues approached a large group of social-sciences students, told them their English-proficiency scores, and asked them if they wanted to take a course to improve their language skills. One would expect all those who scored poorly to sign up for the remedial course. The University of Hong Kong is a demanding institution, and it is hard to do well in the social sciences without strong English skills. Curiously, however, only the ones who believed in malleable intelligence expressed interest in the class. The students who believed that their intelligence was a fixed trait were so concerned about appearing to be deficient that they preferred to stay home. “Students who hold a fixed view of their intelligence care so much about looking smart that they act dumb,” Dweck writes, “for what could be dumber than giving up a chance to learn something that is essential for your own success?”

In a similar experiment, Dweck gave a class of preadolescent students a test filled with challenging problems. After they were finished, one group was praised for its effort and another group was praised for its intelligence. Those praised for their intelligence were reluctant to tackle difficult tasks, and their performance on subsequent tests soon began to suffer. Then Dweck asked the children to write a letter to students at another school, describing their experience in the study. She discovered something remarkable: forty per cent of those students who were praised for their intelligence lied about how they had scored on the test, adjusting their grade upward. They weren’t naturally deceptive people, and they weren’t any less intelligent or self-confident than anyone else. They simply did what people do when they are immersed in an environment that celebrates them solely for their innate “talent.” They begin to define themselves by that description, and when times get tough and that self-image is threatened they have difficulty with the consequences. They will not take the remedial course. They will not stand up to investors and the public and admit that they were wrong. They’d sooner lie.

4.

The broader failing of McKinsey and its acolytes at Enron is their assumption that an organization’s intelligence is simply a function of the intelligence of its employees. They believe in stars, because they don’t believe in systems. In a way, that’s understandable, because our lives are so obviously enriched by individual brilliance. Groups don’t write great novels, and a committee didn’t come up with the theory of relativity. But companies work by different rules. They don’t just create; they execute and compete and coördinate the efforts of many different people, and the organizations that are most successful at that task are the ones where the system is the star.

There is a wonderful example of this in the story of the so-called Eastern Pearl Harbor of the Second World War. During the first nine months of 1942, the United States Navy suffered a catastrophe. German U-boats, operating just off the Atlantic coast and in the Caribbean, were sinking our merchant ships almost at will. U-boat captains marvelled at their good fortune. “Before this sea of light, against this footlight glare of a carefree new world were passing the silhouettes of ships recognizable in every detail and sharp as the outlines in a sales catalogue,” one U-boat commander wrote. “All we had to do was press the button.”

What made this such a puzzle is that, on the other side of the Atlantic, the British had much less trouble defending their ships against U-boat attacks. The British, furthermore, eagerly passed on to the Americans everything they knew about sonar and depth-charge throwers and the construction of destroyers. And still the Germans managed to paralyze America’s coastal zones.

You can imagine what the consultants at McKinsey would have concluded: they would have said that the Navy did not have a talent mind-set, that President Roosevelt needed to recruit and promote top performers into key positions in the Atlantic command. In fact, he had already done that. At the beginning of the war, he had pushed out the solid and unspectacular Admiral Harold R. Stark as Chief of Naval Operations and replaced him with the legendary Ernest Joseph King. “He was a supreme realist with the arrogance of genius,” Ladislas Farago writes in “The Tenth Fleet,” a history of the Navy’s U-boat battles in the Second World War. “He had unbounded faith in himself, in his vast knowledge of naval matters and in the soundness of his ideas. Unlike Stark, who tolerated incompetence all around him, King had no patience with fools.”

The Navy had plenty of talent at the top, in other words. What it didn’t have was the right kind of organization. As Eliot A. Cohen, a scholar of military strategy at Johns Hopkins, writes in his brilliant book “Military Misfortunes: The Anatomy of Failure in War”:

To wage the antisubmarine war well, analysts had to bring together fragments of information, direction-finding fixes, visual sightings, decrypts, and the “flaming datum” of a U-boat attack–for use by a commander to coordinate the efforts of warships, aircraft, and convoy commanders. Such synthesis had to occur in near “real time”–within hours, even minutes in some cases.

The British excelled at the task because they had a centralized operational system. The controllers moved the British ships around the Atlantic like chess pieces, in order to outsmart U-boat “wolf packs.” By contrast, Admiral King believed strongly in a decentralized management structure: he held that managers should never tell their subordinates “‘how’ as well as ‘what’ to do.” In today’s jargon, we would say he was a believer in “loose-tight” management, of the kind celebrated by the McKinsey consultants Thomas J. Peters and Robert H. Waterman in their 1982 best-seller, “In Search of Excellence.” But “loose-tight” doesn’t help you find U-boats. Throughout most of 1942, the Navy kept trying to act smart by relying on technical know-how, and stubbornly refused to take operational lessons from the British. The Navy also lacked the organizational structure necessary to apply the technical knowledge it did have to the field. Only when the Navy set up the Tenth Fleet–a single unit to coördinate all anti-submarine warfare in the Atlantic–did the situation change. In the year and a half before the Tenth Fleet was formed, in May of 1943, the Navy sank thirty-six U-boats. In the six months afterward, it sank seventy-five. “The creation of the Tenth Fleet did not bring more talented individuals into the field of ASW”–anti-submarine warfare–“than had previous organizations,” Cohen writes. “What Tenth Fleet did allow, by virtue of its organization and mandate, was for these individuals to become far more effective than previously.” The talent myth assumes that people make organizations smart. More often than not, it’s the other way around.

5.

There is ample evidence of this principle among America’s most successful companies. Southwest Airlines hires very few M.B.A.s, pays its managers modestly, and gives raises according to seniority, not “rank and yank.” Yet it is by far the most successful of all United States airlines, because it has created a vastly more efficient organization than its competitors have. At Southwest, the time it takes to get a plane that has just landed ready for takeoff–a key index of productivity–is, on average, twenty minutes, and requires a ground crew of four, and two people at the gate. (At United Airlines, by contrast, turnaround time is closer to thirty-five minutes, and requires a ground crew of twelve and three agents at the gate.)

In the case of the giant retailer Wal-Mart, one of the most critical periods in its history came in 1976, when Sam Walton “unretired,” pushing out his handpicked successor, Ron Mayer. Mayer was just over forty. He was ambitious. He was charismatic. He was, in the words of one Walton biographer, “the boy-genius financial officer.” But Walton was convinced that Mayer was, as people at McKinsey would say, “differentiating and affirming” in the corporate suite, in defiance of Wal-Mart’s inclusive culture. Mayer left, and Wal-Mart survived. After all, Wal-Mart is an organization, not an all-star team. Walton brought in David Glass, late of the Army and Southwest Missouri State University, as C.E.O.; the company is now ranked No. 1 on the Fortune 500 list.

Procter & Gamble doesn’t have a star system, either. How could it? Would the top M.B.A. graduates of Harvard and Stanford move to Cincinnati to work on detergent when they could make three times as much reinventing the world in Houston? Procter & Gamble isn’t glamorous. Its C.E.O. is a lifer–a former Navy officer who began his corporate career as an assistant brand manager for Joy dishwashing liquid–and, if Procter & Gamble’s best played Enron’s best at Trivial Pursuit, no doubt the team from Houston would win handily. But Procter & Gamble has dominated the consumer-products field for close to a century, because it has a carefully conceived managerial system, and a rigorous marketing methodology that has allowed it to win battles for brands like Crest and Tide decade after decade. In Procter & Gamble’s Navy, Admiral Stark would have stayed. But a cross-divisional management committee would have set the Tenth Fleet in place before the war ever started.

6.

Among the most damning facts about Enron, in the end, was something its managers were proudest of. They had what, in McKinsey terminology, is called an “open market” for hiring. In the open-market system–McKinsey’s assault on the very idea of a fixed organization–anyone could apply for any job that he or she wanted, and no manager was allowed to hold anyone back. Poaching was encouraged. When an Enron executive named Kevin Hannon started the company’s global broadband unit, he launched what he called Project Quick Hire. A hundred top performers from around the company were invited to the Houston Hyatt to hear Hannon give his pitch. Recruiting booths were set up outside the meeting room. “Hannon had his fifty top performers for the broadband unit by the end of the week,” Michaels, Handfield-Jones, and Axelrod write, “and his peers had fifty holes to fill.” Nobody, not even the consultants who were paid to think about the Enron culture, seemed worried that those fifty holes might disrupt the functioning of the affected departments, that stability in a firm’s existing businesses might be a good thing, that the self-fulfillment of Enron’s star employees might possibly be in conflict with the best interests of the firm as a whole.

These are the sort of concerns that management consultants ought to raise. But Enron’s management consultant was McKinsey, and McKinsey was as much a prisoner of the talent myth as its clients were. In 1998, Enron hired ten Wharton M.B.A.s; that same year, McKinsey hired forty. In 1999, Enron hired twelve from Wharton; McKinsey hired sixty-one. The consultants at McKinsey were preaching at Enron what they believed about themselves. “When we would hire them, it wouldn’t just be for a week,” one former Enron manager recalls, of the brilliant young men and women from McKinsey who wandered the hallways at the company’s headquarters. “It would be for two to four months. They were always around.” They were there looking for people who had the talent to think outside the box. It never occurred to them that, if everyone had to think outside the box, maybe it was the box that needed fixing.

Can you read people’s thoughts just by looking at them?

1.

Some years ago, John Yarbrough was working patrol for the Los Angeles County Sheriff’s Department. It was about two in the morning. He and his partner were in the Willowbrook section of South Central Los Angeles, and they pulled over a sports car. “Dark, nighttime, average stop,” Yarbrough recalls. “Patrol for me was like going hunting. At that time of night in the area I was working, there was a lot of criminal activity, and hardly anyone had a driver’s license. Almost everyone had something intoxicating in the car. We stopped drunk drivers all the time. You’re hunting for guns or lots of dope, or suspects wanted for major things. You look at someone and you get an instinctive reaction. And the longer you’ve been working the stronger that instinctive reaction is.”

Yarbrough was driving, and in a two-man patrol car the procedure is for the driver to make the approach and the officer on the passenger side to provide backup. He opened the door and stepped out onto the street, walking toward the vehicle with his weapon drawn. Suddenly, a man jumped out of the passenger side and pointed a gun directly at him. The two of them froze, separated by no more than a few yards. “There was a tree behind him, to his right,” Yarbrough recalls. “He was about seventeen. He had the gun in his right hand. He was on the curb side. I was on the other side, facing him. It was just a matter of who was going to shoot first. I remember it clear as day. But for some reason I didn’t shoot him.” Yarbrough is an ex-marine with close-cropped graying hair and a small mustache, and he speaks in measured tones. “Is he a danger? Sure. He’s standing there with a gun, and what person in his right mind does that facing a uniformed armed policeman? If you looked at it logically, I should have shot him. But logic had nothing to do with it. Something just didn’t feel right. It was a gut reaction not to shoot–a hunch that at that exact moment he was not an imminent threat to me.” So Yarbrough stopped, and, sure enough, so did the kid. He pointed a gun at an armed policeman on a dark street in South Central L.A., and then backed down.

Yarbrough retired last year from the sheriff’s department after almost thirty years, sixteen of which were in homicide. He now lives in western Arizona, in a small, immaculate house overlooking the Colorado River, with pictures of John Wayne, Charles Bronson, Clint Eastwood, and Dale Earnhardt on the wall. He has a policeman’s watchfulness: while he listens to you, his eyes alight on your face, and then they follow your hands, if you move them, and the areas to your immediate left and right–and then back again, in a steady cycle. He grew up in an affluent household in the San Fernando Valley, the son of two doctors, and he is intensely analytical: he is the sort to take a problem and break it down, working it over slowly and patiently in his mind, and the incident in Willowbrook is one of those problems. Policemen shoot people who point guns directly at them at two in the morning. But something he saw held him back, something that ninety-nine people out of a hundred wouldn’t have seen.

Many years later, Yarbrough met with a team of psychologists who were conducting training sessions for law enforcement. They sat beside him in a darkened room and showed him a series of videotapes of people who were either lying or telling the truth. He had to say who was doing what. One tape showed people talking about their views on the death penalty and on smoking in public. Another featured a series of nurses who were all talking about a nature film they were supposedly watching, even though some of them were actually watching grisly documentary footage about burn victims and amputees. It may sound as if the tests should have been easy, because we all think we can tell whether someone is lying. But these were not the obvious fibs of a child, or the prevarications of people whose habits and tendencies we know well. These were strangers who were motivated to deceive, and the task of spotting the liars turns out to be fantastically difficult. There is just too much information–words, intonation, gestures, eyes, mouth–and it is impossible to know how the various cues should be weighted, or how to put them all together, and in any case it’s all happening so quickly that you can’t even follow what you think you ought to follow. The tests have been given to policemen, customs officers, judges, trial lawyers, and psychotherapists, as well as to officers from the F.B.I., the C.I.A., the D.E.A., and the Bureau of Alcohol, Tobacco, and Firearms–people one would have thought would be good at spotting lies. On average, they score fifty per cent, which is to say that they would have done just as well if they hadn’t watched the tapes at all and just guessed. But every now and again–roughly one time in a thousand–someone scores off the charts. A Texas Ranger named David Maxwell did extremely well, for example, as did an ex-A.T.F. agent named J.J. Newberry, a few therapists, an arbitrator, a vice cop–and John Yarbrough, which suggests that what happened in Willowbrook may have been more than a fluke or a lucky guess. Something in our faces signals whether we’re going to shoot, say, or whether we’re lying about the film we just saw. Most of us aren’t very good at spotting it. But a handful of people are virtuosos. What do they see that we miss?

2.

All of us, a thousand times a day, read faces. When someone says “I love you,” we look into that person’s eyes to judge his or her sincerity. When we meet someone new, we often pick up on subtle signals, so that, even though he or she may have talked in a normal and friendly manner, afterward we say, “I don’t think he liked me,” or “I don’t think she’s very happy.” We easily parse complex distinctions in facial expression. If you saw me grinning, for example, with my eyes twinkling, you’d say I was amused. But that’s not the only way we interpret a smile. If you saw me nod and smile exaggeratedly, with the corners of my lips tightened, you would take it that I had been teased and was responding sarcastically. If I made eye contact with someone, gave a small smile and then looked down and averted my gaze, you would think I was flirting. If I followed a remark with an abrupt smile and then nodded, or tilted my head sideways, you might conclude that I had just said something a little harsh, and wanted to take the edge off it. You wouldn’t need to hear anything I was saying in order to reach these conclusions. The face is such an extraordinarily efficient instrument of communication that there must be rules that govern the way we interpret facial expressions. But what are those rules? And are they the same for everyone?

In the nineteen-sixties, a young San Francisco psychologist named Paul Ekman began to study facial expression, and he discovered that no one knew the answers to those questions. Ekman went to see Margaret Mead, climbing the stairs to her tower office at the American Museum of Natural History. He had an idea. What if he travelled around the world to find out whether people from different cultures agreed on the meaning of different facial expressions? Mead, he recalls, “looked at me as if I were crazy.” Like most social scientists of her day, she believed that expression was culturally determined– that we simply used our faces according to a set of learned social conventions. Charles Darwin had discussed the face in his later writings; in his 1872 book, “The Expression of the Emotions in Man and Animals,” he argued that all mammals show emotion reliably in their faces. But in the nineteen-sixties academic psychologists were more interested in motivation and cognition than in emotion or its expression. Ekman was undaunted; he began travelling to places like Japan, Brazil, and Argentina, carrying photographs of men and women making a variety of distinctive faces. Everywhere he went, people agreed on what those expressions meant. But what if people in the developed world had all picked up the same cultural rules from watching the same movies and television shows? So Ekman set out again, this time making his way through the jungles of Papua New Guinea, to the most remote villages, and he found that the tribesmen there had no problem interpreting the expressions, either. This may not sound like much of a breakthrough. But in the scientific climate of the time it was a revelation. Ekman had established that expressions were the universal products of evolution. There were fundamental lessons to be learned from the face, if you knew where to look.

Paul Ekman is now in his sixties. He is clean-shaven, with closely set eyes and thick, prominent eyebrows, and although he is of medium build, he seems much larger than he is: there is something stubborn and substantial in his demeanor. He grew up in Newark, the son of a pediatrician, and entered the University of Chicago at fifteen. He speaks deliberately: before he laughs, he pauses slightly, as if waiting for permission. He is the sort to make lists, and number his arguments. His academic writing has an orderly logic to it; by the end of an Ekman essay, each stray objection and problem has been gathered up and catalogued. In the mid-sixties, Ekman set up a lab in a ramshackle Victorian house at the University of California at San Francisco, where he holds a professorship. If the face was part of a physiological system, he reasoned, the system could be learned. He set out to teach himself. He treated the face as an adventurer would a foreign land, exploring its every crevice and contour. He assembled a videotape library of people’s facial expressions, which soon filled three rooms in his lab, and studied them to the point where he could look at a face and pick up a flicker of emotion that might last no more than a fraction of a second. Ekman created the lying tests. He filmed the nurses talking about the movie they were watching and the movie they weren’t watching. Working with Maureen O’Sullivan, a psychologist from the University of San Francisco, and other colleagues, he located people who had a reputation for being uncannily perceptive, and put them to the test, and that’s how Yarbrough and the other high-scorers were identified. O’Sullivan and Ekman call this study of gifted face readers the Diogenes Project, after the Greek philosopher of antiquity who used to wander around Athens with a lantern, peering into people’s faces as he searched for an honest man. Ekman has taken the most vaporous of sensations– the hunch you have about someone else– and sought to give them definition. Most of us don’t trust our hunches, because we don’t know where they came from. We think they can’t be explained. But what if they can?

3.

Paul Ekman got his start in the face-reading business because of a man named Silvan Tomkins, and Silvan Tomkins may have been the best face reader there ever was. Tomkins was from Philadelphia, the son of a dentist from Russia. He was short, and slightly thick around the middle, with a wild mane of white hair and huge black plastic-rimmed glasses. He taught psychology at Princeton and Rutgers, and was the author of “Affect, Imagery, Consciousness,” a four-volume work so dense that its readers were evenly divided between those who understood it and thought it was brilliant and those who did not understand it and thought it was brilliant. He was a legendary talker. At the end of a cocktail party, fifteen people would sit, rapt, at Tomkins’s feet, and someone would say, “One more question!” and they would all sit there for another hour and a half, as Tomkins held forth on, say, comic books, a television sitcom, the biology of emotion, his problem with Kant, and his enthusiasm for the latest fad diets, all enfolded into one extended riff. During the Depression, in the midst of his doctoral studies at Harvard, he worked as a handicapper for a horse-racing syndicate, and was so successful that he lived lavishly on Manhattan’s Upper East Side. At the track, where he sat in the stands for hours, staring at the horses through binoculars, he was known as the Professor. “He had a system for predicting how a horse would do based on what horse was on either side of him, based on their emotional relationship,” Ekman said. If a male horse, for instance, had lost to a mare in his first or second year, he would be ruined if he went to the gate with a mare next to him in the lineup. (Or something like that– no one really knew for certain.) Tomkins felt that emotion was the code to life, and that with enough attention to particulars the code could be cracked. He thought this about the horses, and, more important, he thought this about the human face.

Tomkins, it was said, could walk into a post office, go over to the “Wanted” posters, and, just by looking at mug shots, tell you what crimes the various fugitives had committed. “He would watch the show ‘To Tell the Truth,’ and without fail he could always pick the person who was lying and who his confederates were,” his son, Mark, recalls. “He actually wrote the producer at one point to say it was too easy, and the man invited him to come to New York, go backstage, and show his stuff.” Virginia Demos, who teaches psychology at Harvard, recalls having long conversations with Tomkins. “We would sit and talk on the phone, and he would turn the sound down as Jesse Jackson was talking to Michael Dukakis, at the Democratic National Convention. And he would read the faces and give his predictions on what would happen. It was profound.”

Ekman’s most memorable encounter with Tomkins took place in the late sixties. Ekman had just tracked down a hundred thousand feet of film that had been shot by the virologist Carleton Gajdusek in the remote jungles of Papua New Guinea. Some of the footage was of a tribe called the South Fore, who were a peaceful and friendly people. The rest was of the Kukukuku, who were hostile and murderous and who had a homosexual ritual where pre-adolescent boys were required to serve as courtesans for the male elders of the tribe. Ekman was still working on the problem of whether human facial expressions were universal, and the Gajdusek film was invaluable. For six months, Ekman and his collaborator, Wallace Friesen, sorted through the footage. They cut extraneous scenes, focussing just on closeups of the faces of the tribesmen, and when the editing was finished Ekman called in Tomkins.

The two men, protégé and mentor, sat at the back of the room, as faces flickered across the screen. Ekman had told Tomkins nothing about the tribes involved; all identifying context had been edited out. Tomkins looked on intently, peering through his glasses. At the end, he went up to the screen and pointed to the faces of the South Fore. “These are a sweet, gentle people, very indulgent, very peaceful,” he said. Then he pointed to the faces of the Kukukuku. “This other group is violent, and there is lots of evidence to suggest homosexuality.” Even today, a third of a century later, Ekman cannot get over what Tomkins did. “My God! I vividly remember saying, ‘Silvan, how on earth are you doing that?’ ” Ekman recalls. “And he went up to the screen and, while we played the film backward, in slow motion, he pointed out the particular bulges and wrinkles in the face that he was using to make his judgment. That’s when I realized, ‘I’ve got to unpack the face.’ It was a gold mine of information that everyone had ignored. This guy could see it, and if he could see it, maybe everyone else could, too.”

Ekman and Friesen decided that they needed to create a taxonomy of facial expressions, so day after day they sat across from each other and began to make every conceivable face they could. Soon, though, they realized that their efforts weren’t enough. “I met an anthropologist, Wade Seaford, told him what I was doing, and he said, ‘Do you have this movement?’ ”–and here Ekman contracted what’s called the triangularis, which is the muscle that depresses the corners of the lips, forming an arc of distaste–“and it wasn’t in my system, because I had never seen it before. I had built a system not on what the face can do but on what I had seen. I was devastated. So I came back and said, ‘I’ve got to learn the anatomy.’ ” Friesen and Ekman then combed through medical textbooks that outlined each of the facial muscles, and identified every distinct muscular movement that the face could make. There were forty-three such movements. Ekman and Friesen called them “action units.” Then they sat across from each other again, and began manipulating each action unit in turn, first locating the muscle in their mind and then concentrating on isolating it, watching each other closely as they did, checking their movements in a mirror, making notes of how the wrinkle patterns on their faces would change with each muscle movement, and videotaping the movement for their records. On the few occasions when they couldn’t make a particular movement, they went next door to the U.C.S.F. anatomy department, where a surgeon they knew would stick them with a needle and electrically stimulate the recalcitrant muscle. “That wasn’t pleasant at all,” Ekman recalls. When each of those action units had been mastered, Ekman and Friesen began working action units in combination, layering one movement on top of another. The entire process took seven years. “There are three hundred combinations of two muscles,” Ekman says. “If you add in a third, you get over four thousand. We took it up to five muscles, which is over ten thousand visible facial configurations.” Most of those ten thousand facial expressions don’t mean anything, of course. They are the kind of nonsense faces that children make. But, by working through each action-unit combination, Ekman and Friesen identified about three thousand that did seem to mean something, until they had catalogued the essential repertoire of human emotion.
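
Ekman’s arithmetic can be checked, and it is worth noticing that his figures are conservative. The rough Python sketch below prints the raw binomial counts for choosing two, three, four, or five of the forty-three action units; those counts run well above Ekman’s quoted three hundred, four thousand, and ten thousand, presumably because only some combinations are anatomically achievable and visibly distinct–an inference of mine, not anything Ekman states.

```python
from math import comb

ACTION_UNITS = 43  # the distinct facial movements Ekman and Friesen catalogued

# Raw counts of ways to choose k of the 43 action units. These are upper
# bounds: Ekman's quoted figures (300 two-muscle combinations, "over four
# thousand" with a third, "over ten thousand" up to five) are smaller,
# presumably because only some combinations are anatomically achievable
# and visibly distinct from one another.
for k in range(2, 6):
    print(f"choose {k} of {ACTION_UNITS}: {comb(ACTION_UNITS, k):,}")

# Everything expressible with up to five action units at once
print(sum(comb(ACTION_UNITS, k) for k in range(1, 6)))  # 1,099,295
```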

4.

On a recent afternoon, Ekman sat in his office at U.C.S.F., in what is known as the Human Interaction Laboratory, a standard academic’s lair of books and files, with photographs of his two heroes, Tomkins and Darwin, on the wall. He leaned forward slightly, placing his hands on his knees, and began running through the action-unit configurations he had learned so long ago. “Everybody can do action unit four,” he began. He lowered his brow, using his depressor glabellae, depressor supercilii, and corrugator. “Almost everyone can do A.U. nine.” He wrinkled his nose, using his levator labii superioris, alaeque nasi. “Everybody can do five.” He contracted his levator palpebrae superioris, raising his upper eyelid.

I was trying to follow along with him, and he looked up at me. “You’ve got a very good five,” he said generously. “The more deeply set your eyes are, the harder it is to see the five. Then there’s seven.” He squinted. “Twelve.” He flashed a smile, activating the zygomatic major. The inner parts of his eyebrows shot up. “That’s A.U. one–distress, anguish.” Then he used his frontalis, pars lateralis, to raise the outer half of his eyebrows. “That’s A.U. two. It’s also very hard, but it’s worthless. It’s not part of anything except Kabuki theatre. Twenty-three is one of my favorites. It’s the narrowing of the red margin of the lips. Very reliable anger sign. It’s very hard to do voluntarily.” He narrowed his lips. “Moving one ear at a time is still the hardest thing to do. I have to really concentrate. It takes everything I’ve got.” He laughed. “This is something my daughter always wanted me to do for her friends. Here we go.” He wiggled his left ear, then his right ear. Ekman does not appear to have a particularly expressive face. He has the demeanor of a psychoanalyst, watchful and impassive, and his ability to transform his face so easily and quickly was astonishing. “There is one I can’t do,” he went on. “It’s A.U. thirty-nine. Fortunately, one of my postdocs can do it. A.U. thirty-eight is dilating the nostrils. Thirty-nine is the opposite. It’s the muscle that pulls them down.” He shook his head and looked at me again. “Oooh! You’ve got a fantastic thirty-nine. That’s one of the best I’ve ever seen. It’s genetic. There should be other members of your family who have this heretofore unknown talent. You’ve got it, you’ve got it.” He laughed again. “You’re in a position to flash it at people. See, you should try that in a singles bar!”

Ekman then began to layer one action unit on top of another, in order to compose the more complicated facial expressions that we generally recognize as emotions. Happiness, for instance, is essentially A.U. six and twelve–contracting the muscles that raise the cheek (orbicularis oculi, pars orbitalis) in combination with the zygomatic major, which pulls up the corners of the lips. Fear is A.U. one, two, and four, or, more fully, one, two, four, five, and twenty, with or without action units twenty-five, twenty-six, or twenty-seven. That is: the inner brow raiser (frontalis, pars medialis) plus the outer brow raiser (frontalis, pars lateralis) plus the brow-lowering depressor supercilii plus the levator palpebrae superioris (which raises the upper lid), plus the risorius (which stretches the lips), the parting of the lips (depressor labii), and the masseter (which drops the jaw). Disgust? That’s mostly A.U. nine, the wrinkling of the nose (levator labii superioris, alaeque nasi), but it can sometimes be ten, and in either case may be combined with A.U. fifteen or sixteen or seventeen.
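
Read as data, these recipes amount to a lookup from sets of action units to emotion labels. The toy Python sketch below encodes only the combinations named in the paragraph above; real FACS scoring also records intensity, timing, and asymmetry, none of which this captures.

```python
# Toy emotion lookup built only from the action-unit recipes quoted above.
EMOTION_SIGNATURES = {
    "happiness": {6, 12},           # cheek raiser + lip-corner puller
    "fear":      {1, 2, 4, 5, 20},  # full recipe; 25, 26, or 27 optional
    "disgust":   {9},               # nose wrinkler (sometimes 10 instead)
}

def match_emotions(observed):
    """Return every emotion whose required action units all appear."""
    return [name for name, required in EMOTION_SIGNATURES.items()
            if required <= observed]

print(match_emotions({6, 12}))               # ['happiness']
print(match_emotions({1, 2, 4, 5, 20, 26}))  # ['fear'], with the jaw drop
```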

Ekman and Friesen ultimately assembled all these combinations–and the rules for reading and interpreting them–into the Facial Action Coding System, or FACS, and wrote them up in a five-hundred-page binder. It is a strangely riveting document, full of details like the possible movements of the lips (elongate, de-elongate, narrow, widen, flatten, protrude, tighten and stretch); the four different changes of the skin between the eyes and the cheeks (bulges, bags, pouches, and lines); or the critical distinctions between infraorbital furrows and the nasolabial furrow. Researchers have employed the system to study everything from schizophrenia to heart disease; it has even been put to use by computer animators at Pixar (“Toy Story”), and at DreamWorks (“Shrek”). FACS takes weeks to master in its entirety, and only five hundred people around the world have been certified to use it in research. But for those who have, the experience of looking at others is forever changed. They learn to read the face the way that people like John Yarbrough did intuitively. Ekman compares it to the way you start to hear a symphony once you’ve been trained to read music: an experience that used to wash over you becomes particularized and nuanced.

Ekman recalls the first time he saw Bill Clinton, during the 1992 Democratic primaries. “I was watching his facial expressions, and I said to my wife, ‘This is Peck’s Bad Boy,’ ” Ekman says. “This is a guy who wants to be caught with his hand in the cookie jar, and have us love him for it anyway. There was this expression that’s one of his favorites. It’s that hand-in-the-cookie-jar, love-me-Mommy-because-I’m-a-rascal look. It’s A.U. twelve, fifteen, seventeen, and twenty-four, with an eye roll.” Ekman paused, then reconstructed that particular sequence of expressions on his face. He contracted his zygomatic major, A.U. twelve, in a classic smile, then tugged the corners of his lips down with his triangularis, A.U. fifteen. He flexed the mentalis, A.U. seventeen, which raises the chin, slightly pressed his lips together in A.U. twenty-four, and finally rolled his eyes–and it was as if Slick Willie himself were suddenly in the room. “I knew someone who was on his communications staff. So I contacted him. I said, ‘Look, Clinton’s got this way of rolling his eyes along with a certain expression, and what it conveys is “I’m a bad boy.” I don’t think it’s a good thing. I could teach him how not to do that in two to three hours.’ And he said, ‘Well, we can’t take the risk that he’s known to be seeing an expert on lying.’ I think it’s a great tragedy, because . . .” Ekman’s voice trailed off. It was clear that he rather liked Clinton, and that he wanted Clinton’s trademark expression to have been no more than a meaningless facial tic. Ekman shrugged. “Unfortunately, I guess, he needed to get caught–and he got caught.”

5.

Early in his career, Paul Ekman filmed forty psychiatric patients, including a woman named Mary, a forty-two-year-old housewife. She had attempted suicide three times, and survived the last attempt–an overdose of pills–only because someone found her in time and rushed her to the hospital. Her children had left home and her husband was inattentive, and she was depressed. When she first went to the hospital, she simply sat and cried, but she seemed to respond well to therapy. After three weeks, she told her doctor that she was feeling much better and wanted a weekend pass to see her family. The doctor agreed, but just before Mary was to leave the hospital she confessed that the real reason she wanted to go on weekend leave was so that she could make another suicide attempt. Several years later, a group of young psychiatrists asked Ekman how they could tell when suicidal patients were lying. He didn’t know, but, remembering Mary, he decided to try to find out. If the face really was a reliable guide to emotion, shouldn’t he be able to look back on the film and tell that she was lying? Ekman and Friesen began to analyze the film for clues. They played it over and over for dozens of hours, examining in slow motion every gesture and expression. Finally, they saw it. As Mary’s doctor asked her about her plans for the future, a look of utter despair flashed across her face so quickly that it was almost imperceptible.

Ekman calls that kind of fleeting look a “microexpression,” and one cannot understand why John Yarbrough did what he did on that night in South Central without also understanding the particular role and significance of microexpressions. Many facial expressions can be made voluntarily. If I’m trying to look stern as I give you a tongue-lashing, I’ll have no difficulty doing so, and you’ll have no difficulty interpreting my glare. But our faces are also governed by a separate, involuntary system. We know this because stroke victims who suffer damage to what is known as the pyramidal neural system will laugh at a joke, but they cannot smile if you ask them to. At the same time, patients with damage to another part of the brain have the opposite problem. They can smile on demand, but if you tell them a joke they can’t laugh. Similarly, few of us can voluntarily do A.U. one, the sadness sign. (A notable exception, Ekman points out, is Woody Allen, who uses his frontalis, pars medialis, to create his trademark look of comic distress.) Yet we raise our inner eyebrows all the time, without thinking, when we are unhappy. Watch a baby just as he or she starts to cry, and you’ll often see the frontalis, pars medialis, shoot up, as if it were on a string.

Perhaps the most famous involuntary expression is what Ekman has dubbed the Duchenne smile, in honor of the nineteenth-century French neurologist Guillaume Duchenne, who first attempted to document the workings of the muscles of the face with the camera. If I ask you to smile, you’ll flex your zygomatic major. By contrast, if you smile spontaneously, in the presence of genuine emotion, you’ll not only flex your zygomatic but also tighten the orbicularis oculi, pars orbitalis, which is the muscle that encircles the eye. It is almost impossible to tighten the orbicularis oculi, pars orbitalis, on demand, and it is equally difficult to stop it from tightening when we smile at something genuinely pleasurable. This kind of smile “does not obey the will,” Duchenne wrote. “Its absence unmasks the false friend.” When we experience a basic emotion, a corresponding message is automatically sent to the muscles of the face. That message may linger on the face for just a fraction of a second, or be detectable only if electrical sensors were attached to the face, but it’s always there. Silvan Tomkins once began a lecture by bellowing, “The face is like the penis!” and this is what he meant–that the face has, to a large extent, a mind of its own. This doesn’t mean we have no control over our faces. We can use our voluntary muscular system to try to suppress those involuntary responses. But, often, some little part of that suppressed emotion–the sense that I’m really unhappy, even though I deny it–leaks out. Our voluntary expressive system is the way we intentionally signal our emotions. But our involuntary expressive system is in many ways even more important: it is the way we have been equipped by evolution to signal our authentic feelings.

“You must have had the experience where somebody comments on your expression and you didn’t know you were making it,” Ekman says. “Somebody tells you, ‘What are you getting upset about?’ ‘Why are you smirking?’ You can hear your voice, but you can’t see your face. If we knew what was on our face, we would be better at concealing it. But that wouldn’t necessarily be a good thing. Imagine if there were a switch that all of us had, to turn off the expressions on our face at will. If babies had that switch, we wouldn’t know what they were feeling. They’d be in trouble. You could make an argument, if you wanted to, that the system evolved so that parents would be able to take care of kids. Or imagine if you were married to someone with a switch? It would be impossible. I don’t think mating and infatuation and friendships and closeness would occur if our faces didn’t work that way.”

Ekman slipped a tape taken from the O.J. Simpson trial into the VCR. It was of Kato Kaelin, Simpson’s shaggy-haired house guest, being examined by Marcia Clark, one of the prosecutors in the case. Kaelin sits in the witness box, with his trademark vacant look. Clark asks a hostile question. Kaelin leans forward and answers softly. “Did you see that?” Ekman asked me. I saw nothing, just Kato being Kato– harmless and passive. Ekman stopped the tape, rewound it, and played it back in slow motion. On the screen, Kaelin moved forward to answer the question, and in that fraction of a second his face was utterly transformed. His nose wrinkled, as he flexed his levator labii superioris, alaeque nasi. His teeth were bared, his brows lowered. “It was almost totally A.U. nine,” Ekman said. “It’s disgust, with anger there as well, and the clue to that is that when your eyebrows go down, typically your eyes are not as open as they are here. The raised upper eyelid is a component of anger, not disgust. It’s very quick.” Ekman stopped the tape and played it again, peering at the screen. “You know, he looks like a snarling dog.”

Ekman said that there was nothing magical about his ability to pick up an emotion that fleeting. It was simply a matter of practice. “I could show you forty examples, and you could pick it up. I have a training tape, and people love it. They start it, and they can’t see any of these expressions. Thirty-five minutes later, they can see them all. What that says is that this is an accessible skill.”

Ekman showed another clip, this one from a press conference given by Kim Philby in 1955. Philby had not yet been revealed as a Soviet spy, but two of his colleagues, Donald Maclean and Guy Burgess, had just defected to the Soviet Union. Philby is wearing a dark suit and a white shirt. His hair is straight and parted to the left. His face has the hauteur of privilege.

“Mr. Philby,” he is asked. “Mr. Macmillan, the foreign secretary, said there was no evidence that you were the so-called third man who allegedly tipped off Burgess and Maclean. Are you satisfied with that clearance that he gave you?”

Philby answers confidently, in the plummy tones of the English upper class. “Yes, I am.”

“Well, if there was a third man, were you in fact the third man?”

“No,” Philby says, just as forcefully. “I was not.”

Ekman rewound the tape, and replayed it in slow motion. “Look at this,” he said, pointing to the screen. “Twice, after being asked serious questions about whether he’s committed treason, he’s going to smirk. He looks like the cat who ate the canary.” The expression was too brief to see normally. But at quarter speed it was painted on his face–the lips pressed together in a look of pure smugness. “He’s enjoying himself, isn’t he?” Ekman went on. “I call this ‘duping delight’–the thrill you get from fooling other people.” Ekman started the VCR up again. “There’s another thing he does.” On the screen, Philby was answering another question. “In the second place, the Burgess-Maclean affair has raised issues of great”–he pauses–“delicacy.” Ekman went back to the pause, and froze the tape. “Here it is,” he said. “A very subtle microexpression of distress or unhappiness. It’s only in the eyebrows–in fact, just in one eyebrow.” Sure enough, Philby’s right inner eyebrow was raised in an unmistakable A.U. one. “It’s very brief,” Ekman said. “He’s not doing it voluntarily. And it totally contradicts all his confidence and assertiveness. It comes when he’s talking about Burgess and Maclean, whom he had tipped off. It’s a hot spot that suggests, ‘You shouldn’t trust what you hear.’ ”

A decade ago, Ekman joined forces with J. J. Newberry–the ex-A.T.F. agent who is one of the high-scorers in the Diogenes Project– to put together a program for educating law-enforcement officials around the world in the techniques of interviewing and lie detection. In recent months, they have flown to Washington, D.C., to assist the C.I.A. and the F.B.I. in counter-terrorism training. At the same time, the Defense Advanced Research Projects Agency (DARPA) has asked Ekman and his former student Mark Frank, now at Rutgers, to develop experimental scenarios for studying deception that would be relevant to counter-terrorism. The objective is to teach people to look for discrepancies between what is said and what is signalled–to pick up on the difference between Philby’s crisp denials and his fleeting anguish. It’s a completely different approach from the shouting cop we see on TV and in the movies, who threatens the suspect and sweeps all of the papers and coffee cups off the battered desk. The Hollywood interrogation is an exercise in intimidation, and its point is to force the suspect to tell you what you need to know. It does not take much to see the limitations of this strategy. It depends for its success on the coöperation of the suspect–when, of course, the suspect’s involuntary communication may be just as critical. And it privileges the voice over the face, when the voice and the face are equally significant channels in the same system.

Ekman received his most memorable lesson in this truth when he and Friesen first began working on expressions of anger and distress. “It was weeks before one of us finally admitted feeling terrible after a session where we’d been making one of those faces all day,” Friesen says. “Then the other realized that he’d been feeling poorly, too, so we began to keep track.” They then went back and began monitoring their bodies during particular facial movements. “Say you do A.U. one, raising the inner eyebrows, and six, raising the cheeks, and fifteen, the lowering of the corner of the lips,” Ekman said, and then did all three. “What we discovered is that that expression alone is sufficient to create marked changes in the autonomic nervous system. When this first occurred, we were stunned. We weren’t expecting this at all. And it happened to both of us. We felt terrible. What we were generating was sadness, anguish. And when I lower my brows, which is four, and raise the upper eyelid, which is five, and narrow the eyelids, which is seven, and press the lips together, which is twenty-four, I’m generating anger. My heartbeat will go up ten to twelve beats. My hands will get hot. As I do it, I can’t disconnect from the system. It’s very unpleasant, very unpleasant.”

Ekman, Friesen, and another colleague, Robert Levenson, who teaches at Berkeley, published a study of this effect in Science. They monitored the bodily indices of anger, sadness, and fear–heart rate and body temperature–in two groups. The first group was instructed to remember and relive a particularly stressful experience. The other was told to simply produce a series of facial movements, as instructed by Ekman– to “assume the position,” as they say in acting class. The second group, the people who were pretending, showed the same physiological responses as the first. A few years later, a German team of psychologists published a similar study. They had a group of subjects look at cartoons, either while holding a pen between their lips–an action that made it impossible to contract either of the two major smiling muscles, the risorius and the zygomatic major– or while holding a pen clenched between their teeth, which had the opposite effect and forced them to smile. The people with the pen between their teeth found the cartoons much funnier. Emotion doesn’t just go from the inside out. It goes from the outside in. What’s more, neither the subjects “assuming the position” nor the people with pens in their teeth knew they were making expressions of emotion. In the facial-feedback system, an expression you do not even know that you have can create an emotion you did not choose to feel.

It is hard to talk to anyone who knows FACS without this point coming up again and again. Face-reading depends not just on seeing facial expressions but also on taking them seriously. One reason most of us–like the TV cop– do not closely attend to the face is that we view its evidence as secondary, as an adjunct to what we believe to be real emotion. But there’s nothing secondary about the face, and surely this realization is what set John Yarbrough apart on the night that the boy in the sports car came at him with a gun. It’s not just that he saw a microexpression that the rest of us would have missed. It’s that he took what he saw so seriously that he was able to overcome every self-protective instinct in his body, and hold his fire.

6.

Yarbrough has a friend in the L.A. County Sheriff’s Department, Sergeant Bob Harms, who works in narcotics in Palmdale. Harms is a member of the Diogenes Project as well, but the two men come across very differently. Harms is bigger than Yarbrough, taller and broader in the chest, with soft brown eyes and dark, thick hair. Yarbrough is restoring a Corvette and wears Rush Limbaugh ties, and he says that if he hadn’t been a cop he would have liked to stay in the Marines. Harms came out of college wanting to be a commercial artist; now he plans to open a bed-and-breakfast in Vermont with his wife when he retires. On the day we met, Harms was wearing a pair of jean shorts and a short-sleeved patterned shirt. His badge was hidden inside his shirt. He takes notes not on a yellow legal pad, which he considers unnecessarily intimidating to witnesses, but on a powder-blue one. “I always get teased because I’m the touchy-feely one,” Harms said. “John Yarbrough is very analytical. He thinks before he speaks. There is a lot going on inside his head. He’s constantly thinking four or five steps ahead, then formulating whatever his answers are going to be. That’s not how I do my interviews. I have a conversation. It’s not ‘Where were you on Friday night?’ Because that’s the way we normally communicate. I never say, ‘I’m Sergeant Harms.’ I always start by saying, ‘I’m Bob Harms, and I’m here to talk to you about your case,’ and the first thing I do is smile.”

The sensation of talking to the two men, however, is surprisingly similar. Normal conversation is like a game of tennis: you talk and I listen, you listen and I talk, and we feel scrutinized by our conversational partner only when the ball is in our court. But Yarbrough and Harms never stop watching, even when they’re doing the talking. Yarbrough would comment on my conversational style, noting where I held my hands as I talked, or how long I would wait out a lull in the conversation. At one point, he stood up and soundlessly moved to the door–which he could have seen only in his peripheral vision–opening it just before a visitor rang the doorbell. Harms gave the impression that he was deeply interested in me. It wasn’t empathy. It was a kind of powerful curiosity. “I remember once, when I was working prison custody, I used to shake prisoners’ hands,” Harms said. “The deputies thought I was crazy. But I wanted to see what happened, because that’s what these men are starving for, some dignity and respect.”

Some of what sets Yarbrough and Harms and the other face readers apart is no doubt innate. But the fact that people can be taught so easily to recognize microexpressions, and can learn FACS, suggests that we all have at least the potential capacity for this kind of perception. Among those who do very well at face-reading, tellingly, are some aphasics, such as stroke victims who have lost the ability to understand language. Collaborating with Ekman on a paper that was recently published in Nature, the psychologist Nancy Etcoff, of Massachusetts General Hospital, described how a group of aphasics trounced a group of undergraduates at M.I.T. on the nurses tape. Robbed of the power to understand speech, the stroke victims had apparently been forced to become far more sensitive to the information written on people’s faces. “They are compensating for the loss in one channel through these other channels,” Etcoff says. “We could hypothesize that there is some kind of rewiring in the brain, but I don’t think we need that explanation. They simply exercise these skills much more than we do.” Ekman has also done work showing that some abused children are particularly good at reading faces: like the aphasics in the study, they developed “interpretive strategies”–in their case, so they could predict the behavior of their volatile parents.

What appears to be a kind of magical, effortless intuition about faces, then, may not really be effortless and magical at all. This kind of intuition is a product of desire and effort. Silvan Tomkins took a sabbatical from Princeton when his son Mark was born, and stayed in his house on the Jersey Shore, staring into his son’s face, long and hard, picking up the patterns of emotion–the cycles of interest, joy, sadness, and anger–that flash across an infant’s face in the first few months of life. He taught himself the logic of the furrows and the wrinkles and the creases, the subtle differences between the pre-smile and the pre-cry face. Later, he put together a library of thousands of photographs of human faces, in every conceivable expression. He developed something called the Picture Arrangement Test, which was his version of the Rorschach blot: a patient would look at a series of pictures and be asked to arrange them in a sequence and then tell a story based on what he saw. The psychologist was supposed to interpret the meaning of the story, but Tomkins would watch a videotape of the patient with the sound off, and by studying the expressions on the patient’s face teach himself to predict what the story was. Face-reading, for those who have mastered it, becomes a kind of compulsion; it becomes hard to be satisfied with the level and quality of information that most of us glean from normal social encounters. “Whenever we get together,” Harms says of spending time with other face readers, “we debrief each other. We’re constantly talking about cases, or some of these videotapes of Ekman’s, and we say, ‘I missed that, did you get that?’ Maybe there’s an emotion attached there. We’re always trying to place things, and replaying interviews in our head.”

This is surely why the majority of us don’t do well at reading faces: we feel no need to make that extra effort. People fail at the nurses tape, Ekman says, because they end up just listening to the words. That’s why, when Tomkins was starting out in his quest to understand the face, he always watched television with the sound turned off. “We are such creatures of language that what we hear takes precedence over what is supposed to be our primary channel of communication, the visual channel,” he once said. “Even though the visual channel provides such enormous information, the fact is that the voice preëmpts the individual’s attention, so that he cannot really see the face while he listens.” We prefer that way of dealing with the world because it does not challenge the ordinary boundaries of human relationships. Ekman, in one of his essays, writes of what he learned from the legendary sociologist Erving Goffman. Goffman said that part of what it means to be civilized is not to “steal” information that is not freely given to us. When someone picks his nose or cleans his ears, out of unthinking habit, we look away. Ekman writes that for Goffman the spoken word is “the acknowledged information, the information for which the person who states it is willing to take responsibility,” and he goes on:

When the secretary who is miserable about a fight with her husband the previous night answers, “Just fine,” when her boss asks, “How are you this morning?”–that false message may be the one relevant to the boss’s interactions with her. It tells him that she is going to do her job. The true message–that she is miserable–he may not care to know about at all as long as she does not intend to let it impair her job performance.

What would the boss gain by reading the subtle and contradictory microexpressions on his secretary’s face? It would be an invasion of her privacy and an act of disrespect. More than that, it would entail an obligation. He would be obliged to do something, or say something, or feel something that might otherwise be avoided entirely. To see what is intended to be hidden, or, at least, what is usually missed, opens up a world of uncomfortable possibilities. This is the hard part of being a face reader. People like that have more faith in their hunches than the rest of us do. But faith is not certainty. Sometimes, on a routine traffic stop late at night, you end up finding out that your hunch was right. But at other times you’ll never know. And you can’t even explain it properly, because what can you say? You did something the rest of us would never have done, based on something the rest of us would never have seen.

“I was working in West Hollywood once, in the nineteen-eighties,” Harms said. “I was with a partner, Scott. I was driving. I had just recently come off the prostitution team, and we spotted a man in drag. He was on Sunset, and I didn’t recognize him. At that time, Sunset was normally for females. So it was kind of odd. It was a cold night in January. There was an all-night restaurant on Sunset called Ben Franks, so I asked my partner to roll down the window and ask the guy if he was going to Ben Franks–just to get a reaction. And the guy immediately keys on Scott, and he’s got an overcoat on, and he’s all bundled up, and he starts walking over to the car. It had been raining so much that the sewers in West Hollywood had backed up, and one of the manhole covers had been cordoned off because it was pumping out water. The guy comes over to the squad car, and he’s walking right through that. He’s fixated on Scott. So we asked him what he was doing. He says, ‘I was out for a walk.’ And then he says, ‘I have something to show you.’ ”

Later, after the incident was over, Harms and his partner learned that the man had been going around Hollywood making serious threats, that he was unstable and had just attempted suicide, that he was in all likelihood about to erupt. A departmental inquiry into the incident would affirm that Harms and his partner had been in danger: the man was armed with a makeshift flamethrower, and what he had in mind, evidently, was to turn the inside of the squad car into an inferno. But at the time all Harms had was a hunch, a sense from the situation and the man’s behavior and what he glimpsed inside the man’s coat and on the man’s face–something that was the opposite of whatever John Yarbrough saw in the face of the boy in Willowbrook. Harms pulled out his gun and shot the man through the open window. “Scott looked at me and was, like, ‘What did you do?’ because he didn’t perceive any danger,” Harms said. “But I did.”

The great Chicago heat wave, and other unnatural disasters.

1.

In the first week of July, 1995, a strong high-pressure air mass developed over the plains of the Southwest and began moving slowly eastward toward Chicago. Illinois usually gets its warm summer air from the Gulf of Mexico, and the air coming off the ocean is relatively temperate. But this was a blast of western air that had been baked in the desert ovens of West Texas and New Mexico. It was hot, bringing temperatures in excess of a hundred degrees, and, because the preceding two months had been very wet in the Midwest and the ground was damp, the air steadily picked up moisture as it moved across the farmlands east of the Rockies. Ordinarily, this would not have been a problem, since humid air tends to become diluted as it mixes with the drier air higher up in the atmosphere. But it was Chicago’s misfortune, in mid-July, to be in the grip of an unusually strong temperature inversion: the air in the first thousand feet above the city surface was cooler than the air at two and three thousand feet. The humid air could not rise and be diluted. It was trapped by the warmer air above. The United States has cities that are often humid–like Houston and New Orleans–without being tremendously hot. And it has very hot cities–like Las Vegas and Phoenix–that are almost never humid. But for one long week, beginning on Thursday, July 13, 1995, Chicago was both. Meteorologists measure humidity with what is called the dew point–the point at which the air is so saturated with moisture that it cannot cool without forming dew. On a typical Chicago summer day, the dew point is in the low sixties, and on a very warm, humid day it is in the low seventies. At Chicago’s Midway Airport, during the heat wave of 1995, the dew point hit the low eighties–a figure reached regularly only in places like the coastal regions of the Middle East. In July of 1995, Chicago effectively turned into Dubai.
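
Those dew-point numbers can be sanity-checked with the Magnus approximation, a standard formula relating air temperature and relative humidity to dew point. The inputs below–hundred-degree air at fifty-five-per-cent relative humidity–are illustrative assumptions of mine, not readings from Midway, but they land squarely in the low eighties:

```python
import math

def dew_point_f(temp_f, rel_humidity_pct):
    """Dew point via the Magnus approximation (constants in Celsius)."""
    a, b = 17.27, 237.7
    t_c = (temp_f - 32) * 5 / 9
    gamma = math.log(rel_humidity_pct / 100) + a * t_c / (b + t_c)
    return (b * gamma / (a - gamma)) * 9 / 5 + 32

# Hundred-degree air at 55% relative humidity -- assumed values, chosen
# only to show how readily a hot, damp air mass reaches Midway's range.
print(round(dew_point_f(100, 55)))  # ~81
```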

As the air mass settled on the city, cars began to overheat and stall in the streets. Roads buckled. Hundreds of children developed heat exhaustion when school buses were stuck in traffic. More than three thousand fire hydrants were opened in poorer neighborhoods around the city, by people looking for relief from the heat, and this caused pressure to drop so precipitately that entire buildings were left without water. So many air-conditioners were turned on that the city’s electrical infrastructure was overwhelmed. A series of rolling blackouts left thousands without power. As the heat took its toll, the city ran out of ambulances. More than twenty hospitals, mostly on Chicago’s poorer South Side, shut their doors to new admissions. Callers to 911 were put on hold, and as the police and paramedics raced from one home to another it became clear that the heat was killing people in unprecedented numbers. The police took the bodies to the Cook County Medical Examiner’s office, and a line of cruisers stretched outside the building. Students from a nearby mortuary school, and then ex-convicts looking to earn probation points, were brought in to help. The morgue ran out of bays in which to put the bodies. Office space was cleared. It wasn’t enough. The owner of a local meatpacking firm offered the city his refrigerated trucks to help store the bodies. The first set wasn’t enough. He sent another. It wasn’t enough. In the end, there were nine forty-eight-foot meatpacking trailers in the morgue’s parking lot. When the final statistics were tallied, the city calculated that in the seven days between July 14th and July 20th, the heat wave had resulted in the deaths of seven hundred and thirty-nine Chicagoans; on Saturday, July 15th, alone, three hundred and sixty-five people died from the heat. The chance intersection of a strong high-pressure ridge, a wet spring, and an intense temperature inversion claimed more lives than Hurricane Andrew, the crash of T.W.A. Flight 800, the Oklahoma City bombing, and the Northridge, California, earthquake combined.

2.

In “Heat Wave: A Social Autopsy of Disaster in Chicago” (Chicago; $27.50), the New York University sociologist Eric Klinenberg sets out to understand what happened during those seven days in July. He looks at who died, and where they died, and why they died. He goes to the county morgue and sifts through the dozens of boxes of unclaimed personal effects of heat-wave victims–“watches, wallets, letters, tax returns, photographs, and record books”–and reads the police reports on the victims, with their dry recitations of the circumstances of death. Here is one for a seventy-three-year-old white woman who was found on Monday, July 17th:

A recluse for 10 yrs, never left apartment, found today by son, apparently DOA. Conditions in apartment when R/O’s [responding officers] arrived thermostat was registering over 90 degrees f. with no air circulation except for windows opened by son (after death).

Here is another, for a seventy-nine-year-old black man found on Wednesday the 19th:

Victim did not respond to phone calls or knocks on victim’s door since Sunday, 16 July 1995. Victim was known as quiet, [kept] to himself and at times, not to answer the door. Landlord . . . does not have any information to any relatives to victim. . . . Chain was on door. R/O was able to see victim on sofa with flies on victim and a very strong odor decay.

The city’s response to the crisis, Klinenberg argues, was to look at people like those two victims–the recluse who did not open her windows and the man who would not answer his door–and conclude that their deaths were inevitable, the result of an unavoidable collision between their own infirmity and an extreme environmental event. As one Health Department official put it at the time, “Government can’t guarantee there won’t be a heat wave.” On the Friday, the human-services commissioner, Daniel Alvarez, told the press, “We’re talking about people who die because they neglect themselves. We did everything possible. But some people didn’t want to open their doors to us.” In its official postmortem four months later, the city sounded the same fatalistic note: the disaster had been a “unique meteorological event” that proved that the “government alone cannot do it all.”

Klinenberg finds that conclusion unacceptably superficial. The disaster may look inevitable, but beneath the surface he sees numerous explanations for why it took the shape it did. One chapter of the book is devoted to a comparison of two adjoining low-income neighborhoods in Chicago, Little Village and North Lawndale. Statistically, the two are almost identical, each with heavy concentrations of poor, elderly people living alone, so it would seem that the heat wave should have taken a similar toll in both neighborhoods. But North Lawndale had ten times the fatality rate of Little Village. Why? Because Little Village is a bustling, relatively safe, close-knit Hispanic community; the elderly had family and friends nearby who could look in on them, and streets and stores where they could go to escape their stifling apartments. North Lawndale, by contrast, is a sprawling, underpopulated, drug-infested neighborhood. The elderly there were afraid to go outside, and had no one close by to visit them. The heat was deadly only in combination with particular social and physical circumstances.

Klinenberg takes an equally close look at the city’s ambulance shortage. The city could have nearly tripled the number of available ambulances by calling in reserves from the suburbs, but it was slow to realize that it had a disaster on its hands. “It’s hot. It’s very hot. But let’s not blow it out of proportion”: this was Mayor Richard Daley’s assessment of the situation on Friday, July 14th. The streamlining of city governments like Chicago’s, Klinenberg explains, isolated city officials. Social-services departments had been professionalized as if they were corporations. Responsibilities had been outsourced. “Police officers replace aldermen and precinct captains as the community sentries,” he writes, and as a result political organizations began to lose contact with the needs of their constituents.

Problem solving, in our day and age, brings with it the requirement of compression: we are urged to distill the most pertinent lessons from any experience. Klinenberg suggests that such distillation only obscures the truth, and by the end of “Heat Wave” he has traced the lines of culpability in dozens of directions, drawing a dense and subtle portrait of exactly what happened during that week in July. It is an approach that resembles, most of all, the way the heat wave was analyzed by meteorologists. They took hourly surface-airways observations of temperature, wind speed, and humidity, estimated radiation from cloud cover, and performed complex calculations using the Penman-Monteith formula to factor in soil-heat flux, latent heat of vaporization, stomatal resistance, and the von Kármán constant. Why, Klinenberg asks, can’t we bring the same rigor to our study of the social causes of disaster?

3.

Take the question of air-conditioning. The Centers for Disease Control, in their Chicago investigation, concluded that the use of air-conditioners could have prevented more than half of the deaths. But many low-income people in Chicago couldn’t afford to turn on an air-conditioner even if they had been given one for free. Many of those who did have air-conditioners, meanwhile, were hit by the power failures that week. Chicago had a problem with a vulnerable population: a lot of very old and very sick people. But it also, quite apart from this, had an air-conditioning problem. What was the cause of that problem?

As it turns out, this is a particularly timely question, since there is a debate going on now in Washington over air-conditioners which bears directly on what happens during heat waves. All air-conditioners consist of a motor and a long coil that acts as a heat exchanger, taking hot air out of the room and replacing it with cold air. If you use a relatively unsophisticated motor and a small coil, an air-conditioner will be cheap to make but will use a lot of electricity. If you use a better motor and a larger heat exchanger, the air-conditioner will cost more to buy but far less to run. Rationally, consumers should buy the more expensive, energy-efficient units, because their slightly higher purchase price is dwarfed by the amount of money the owner pays over time in electric bills. But fifteen years ago Congress realized that this wasn’t happening. The people who generally bought air-conditioners–builders and landlords–weren’t the people who paid the utility bills to run them. Their incentive was to buy the cheapest unit. So Congress passed a minimum standard for air-conditioning efficiency. Residential central air-conditioning units now had to score at least 10 on a scale known as SEER–the seasonal energy-efficiency ratio. One of Bill Clinton’s last acts as President was to raise that standard to 13. This spring, however, the Bush Administration cut the efficiency increase by a third, making SEER 12 the law.
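
The underlying economics are easy to sketch. SEER is, roughly, the B.T.U.s of cooling delivered per watt-hour of electricity consumed, so a unit’s running cost scales inversely with its rating. The Python sketch below uses illustrative assumptions–a three-ton unit, a thousand cooling hours a year, ten cents per kilowatt-hour, a fifteen-year service life–none of which come from the article:

```python
# Rough lifetime electricity cost of a central air-conditioner at a given
# SEER rating. kWh consumed = BTU of cooling / SEER / 1000.
COOLING_BTU_PER_HOUR = 36_000  # a typical three-ton unit (assumed)
HOURS_PER_YEAR = 1_000         # cooling hours per season (assumed)
DOLLARS_PER_KWH = 0.10         # electricity price (assumed)
LIFETIME_YEARS = 15            # service life (assumed)

def lifetime_cost(seer):
    kwh_per_year = COOLING_BTU_PER_HOUR * HOURS_PER_YEAR / seer / 1000
    return kwh_per_year * DOLLARS_PER_KWH * LIFETIME_YEARS

for seer in (10, 12, 13):
    print(f"SEER {seer}: ${lifetime_cost(seer):,.0f}")
# SEER 10: $5,400 / SEER 12: $4,500 / SEER 13: $4,154 -- a gap that
# dwarfs the difference in purchase price, which is the point Congress saw.
```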

It should be said that SEER 13 is no more technologically difficult than SEER 12. SEER 12 is simply a bit cheaper to make, and SEER 13 is simply cheaper to operate. Nor is this a classic regulatory battle that pits corporate against consumer interests. The nation’s largest air-conditioner manufacturer, Carrier, is in favor of 12. But the second-largest manufacturer, Goodman (which makes Amana air-conditioners), is in favor of 13. The Bush decision is really about politics, and the White House felt free to roll back the Clinton standard because most of the time the difference between the two standards is negligible. There is one exception, however: heat waves.

Air-conditioning is, of course, the reason that electrical consumption soars on very hot days. On the worst day in August, electricity consumption in, say, Manhattan might be three or four times what it is on a cool spring day. For most of the year, a local utility can use the electricity from its own power plants, or sign stable, long-term contracts with other power companies. But the extra electricity a city needs on that handful of very hot days presents a problem. You can’t build a power plant just to supply this surge–what would you do with it during the rest of the year? So, at peak periods, utilities buy the power they need on the “spot” market, and power bought on the spot market can cost fifty times as much as the power used on normal days. The amount of power that a utility has to buy for that handful of hot days every summer, in other words, is a huge factor in the size of our electric bills.

For anyone wanting to make electricity cheaper, then, the crucial issue is not how to reduce average electrical consumption but how to reduce peak consumption. A recent study estimates that moving the SEER standard from 10 to 13 would have the effect of cutting peak demand by the equivalent of more than a hundred and fifty power plants. The Bush Administration’s decision to cut the SEER upgrade by a third means that by 2020 demand will be fourteen thousand megawatts higher than it would have been, and that we’ll have to build about fifty more power plants. The cost of those extra power plants–and of running a less efficient air-conditioner on hot days–is part of what will make air-conditioning less affordable for people who will someday desperately need it.
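
The article’s own figures imply what a “power plant” means here: fourteen thousand megawatts spread over about fifty plants works out to roughly two hundred and eighty megawatts apiece. A quick check, with the extension to the full SEER 10-to-13 estimate resting on my assumption that the study had a similar plant size in mind:

```python
extra_demand_mw = 14_000  # added peak demand by 2020 under SEER 12
extra_plants = 50         # plants the article says that demand requires

mw_per_plant = extra_demand_mw / extra_plants
print(mw_per_plant)  # 280.0 MW per "plant"

# If the "more than 150 plants" saved by going from SEER 10 to 13 are of
# similar size (an assumption), that is upward of 42,000 MW of peak demand.
print(150 * mw_per_plant)  # 42,000.0
```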

The sheer volume of electricity required on a very hot day also puts enormous strain on a city’s power-distribution system. On the Friday of the Chicago heat wave, when power demand peaked, one of the main problem areas was the transmission substation (TSS) at California Avenue and Addison Street, in the city’s northwest corner. TSS 114 consists of a series of giant transformers–twenty feet high and fifteen feet across–that help convert the high-voltage electricity that comes into Chicago along power lines into the low-voltage power that is used in offices and homes. Throughout that Friday afternoon, the four transformers in the second terminal at TSS 114 were running at a hundred and eighteen per cent of capacity–that is, they were handling roughly a fifth more electricity than they were designed to carry. The chief side effect of overloading is heat. The more current you run through a transformer the hotter it gets, and, combined with the ambient temperature that afternoon, which averaged a hundred and eleven degrees, the heat turned the inside of terminal two into an oven.

At 4:56 P.M., the heat overwhelmed a monitoring device known as a CT–a gauge almost small enough to fit in the palm of one’s hand–on the first of the transformers. It tripped and shut down. The current that had been shared by four transformers had to be carried by just three, making them still hotter. The second transformer was now carrying a hundred and twenty-four per cent of its rated capacity. Fifty-one minutes later, a circuit breaker on the second transformer burst into flames. Transformers are engineered to handle extra loads for short periods of time, but there was just a little too much current and a little too much heat. At 6:19, two more CTs tripped on the third transformer and, as workmen struggled to get the terminal up and running, a CT failed on the fourth transformer. In all, forty-nine thousand customers and all of the people in those customers’ houses and apartments and offices were without air-conditioning for close to twenty hours–and this is merely what happened at TSS 114.

All around the city that week, between Wednesday and Sunday, there were 1,327 separate equipment failures that left an additional hundred and forty-nine thousand customers without power. Those are staggering numbers. But what is really staggering is how easy it would have been to avoid these power outages. Commonwealth Edison, the city’s utility, had forecast a year earlier that electricity use in the summer of 1995 would peak at 18,600 megawatts. The actual high, on the Friday of the heat wave, was 19,201. The difference, in other words, between the demand that the utility was prepared to handle and the demand that brought the city to its knees was six hundred and one megawatts, or 3.2 per cent of the total–which is just about what a place like Chicago might save by having a city full of SEER 13 air-conditioners instead of SEER 12 air-conditioners.
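The numbers in that last sentence are easy to check–nothing here beyond the paragraph’s own figures:

```python
forecast_mw = 18_600   # Commonwealth Edison's forecast peak for summer 1995
actual_mw = 19_201     # the actual peak, on the Friday of the heat wave

shortfall_mw = actual_mw - forecast_mw
print(shortfall_mw)                          # 601
print(f"{shortfall_mw / forecast_mw:.1%}")   # 3.2%
```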

4.

In 1928, a storm near Palm Beach, Florida, killed almost two thousand people, most of them black migrant workers on the shores of Lake Okeechobee. This was, the state comptroller declared, “an act of God.” In 1935, the most severe hurricane in American history hit the Florida Keys, sending a storm surge fifteen to twenty feet high through a low-lying encampment of war veterans working on the highway. About four hundred people died. “The catastrophe must be characterized as an act of God and was by its very nature beyond the power of man,” the Veterans Administration and Federal Emergency Relief Administration declared in an official report. In 1972, an earthen dam put up by a mining company in Logan County, West Virginia, collapsed in heavy rains, killing a hundred and thirty-nine people. It was an “act of God,” a mining-company official said, disavowing any culpability. In 1974, a series of twisters swept across ten states, killing three hundred and fifteen people. Senator Thomas Eagleton, of Missouri, said at the time that his colleagues in Washington viewed the tornado “as an act of God where even the Congress can’t intervene,” explaining why the government would not fund an early-warning system. This is the way we have thought of catastrophes in the United States. The idea of an “act of God” suggests that any search for causes is unnecessary. It encourages us to see disasters, as the environmental historian Ted Steinberg writes in “Acts of God: The Unnatural History of Natural Disaster in America” (2000), simply as things that happen “from time to time.” It suggests, too, that systems or institutions ought to be judged on the basis of how they perform most of the time, under “normal” conditions, rather than by how they perform under those rare moments of extreme stress. But this idea, as “Heat Wave” makes clear, is a grave mistake. Political systems and social institutions ought to be judged the way utilities are judged. The true test is how they perform on a blistering day in July.

Klinenberg tells the story of Pauline Jankowitz, an elderly woman living alone in a third-floor apartment in a transitional neighborhood. Her air-conditioner was old and didn’t work well. She had a bladder problem that left her incontinent, and she had to walk with a crutch because she had a weak leg. That made it difficult for her to get down the stairs, and once she was outside she was terrified of being mugged. “Chicago is just a shooting gallery,” she said to Klinenberg. She left her apartment only about six times a year. Jankowitz was the prototypical heat-wave victim, and, as she told Klinenberg, that week in July was “the closest I’ve ever come to death.” But she survived. A friend had told her to leave her apartment if it got too hot; so, early on what would turn out to be the worst of the seven days, she rose and crept down the stairs. She caught a city bus to a nearby store, which was air-conditioned, and there she bought fresh cherries and leaned on the shopping cart until she recovered her strength. On the trip home, she recalled, “climbing the stairs was almost impossible.” Back in her apartment, she felt her body begin to swell and go numb. She telephoned a friend. She turned a fan on high, lay down on the floor, covered herself with wet towels, and dreamed that she was on a Caribbean cruise. She was poor and old and infirm, but she lived, and one of the many lessons of her story is that in order to survive that week in July she suddenly depended on services and supports that previously she had barely needed at all. Her old air-conditioner was useless most of the time. But that week it helped to keep her apartment at least habitable. She rarely travelled. But on that day the fact that there was a city bus, and that it came promptly and that it was air-conditioned, was of the greatest importance. She rarely went to the store; she had her groceries delivered. But now the proximity of a supermarket, where she could lean on the shopping cart and breathe in the cool air, was critical. Pauline Jankowitz’s life depended not on the ordinary workings of the social institutions in her world but on their ability to perform at one critical moment of peak demand. On the hottest of all days, her neighborhood substation did not fail. Her bus came. Her grocery store was open. She was one of the lucky ones.

What does “Saturday Night Live” have in common with German philosophy?

1.

Lorne Michaels, the creator of “Saturday Night Live,” was married to one of the show’s writers, Rosie Shuster. One day when the show was still young, an assistant named Paula Davis went to Shuster’s apartment in New York and found Dan Aykroyd getting out of her bed–which was puzzling, not just because Shuster was married to Michaels but because Aykroyd was supposedly seeing another member of the original “S.N.L.” cast, Laraine Newman. Aykroyd and Gilda Radner had also been an item, back when the two of them worked for the Second City comedy troupe in Toronto, although by the time they got to New York they were just friends, in the way that everyone was friends with Radner. Second City was also where Aykroyd met John Belushi, because Belushi, who was a product of the Second City troupe in Chicago, came to Toronto to recruit for the “National Lampoon Radio Hour,” which he starred in along with Radner and Bill Murray (who were also an item for a while). The writer Michael O’Donoghue (who famously voiced his aversion to the appearance of the Muppets on “S.N.L.” by saying, “I don’t write for felt”) also came from The National Lampoon, as did another of the original writers, Anne Beatts (who was, in the impeccably ingrown logic of “S.N.L.,” living with O’Donoghue). Chevy Chase came from a National Lampoon spinoff called “Lemmings,” which also starred Belushi, doing his legendary Joe Cocker impersonation. Lorne Michaels hired Belushi after Radner, among others, insisted on it, and he hired Newman because he had worked with her on a Lily Tomlin special, and he hired Aykroyd because Michaels was also from Canada and knew him from the comedy scene there. When Aykroyd got the word, he came down from Toronto on his Harley.

In the early days of “S.N.L.,” as Tom Shales and James Andrew Miller tell us in “Live from New York” (Little, Brown; $25.95), everyone knew everyone and everyone was always in everyone else’s business, and that fact goes a long way toward explaining the extraordinary chemistry among the show’s cast. Belushi would stay overnight at people’s apartments, and he was notorious for getting hungry in the middle of the night and leaving spaghetti-sauce imprints all over the kitchen, or setting fires by falling asleep with a lit joint. Radner would go to Jane Curtin’s house and sit and watch Curtin and her husband, as if they were some strange species of mammal, and say things like “Oh, now you are going to turn the TV on together. How will you decide what to watch?” Newman would hang out at Radner’s house, and Radner would be eating a gallon of ice cream and Newman would be snorting heroin. Then Radner would go to the bathroom to make herself vomit, and say, “I’m so full, I can’t hear.” And they would laugh. “There we were,” Newman recalls, “practicing our illnesses together.”

The place where they all really lived, though, was the “S.N.L.” office, on the seventeenth floor of NBC headquarters, at Rockefeller Center. The staff turned it into a giant dormitory, installing bunk beds and fooling around in the dressing rooms and staying up all night. Monday night was the first meeting, where ideas were pitched. On Tuesday, the writing started after dinner and continued straight through the night. The first read-through took place on Wednesday at three in the afternoon. And then came blocking and rehearsals and revisions. “It was emotional,” the writer Alan Zweibel tells Shales and Miller. “We were a colony. I don’t mean this in a bad way, but we were Guyana on the seventeenth floor. We didn’t go out. We stayed there. It was a stalag of some sort.” Rosie Shuster remembers waking up at the office and then going outside with Aykroyd, to “walk each other like dogs around 30 Rock just to get a little fresh air.” On Saturdays, after the taping was finished, the cast would head downtown to a storefront that Belushi and Aykroyd had rented and dubbed the Blues Bar. It was a cheerless dive, with rats and crumbling walls and peeling paint and the filthiest toilets in all of New York. But did anyone care? “It was the end of the week and, well, you were psyched,” Shuster recalls. “It was like you were buzzing, you’d get turbocharged from the intense effort of it, and then there’s like adrenal burnout later. I remember sleeping at the Blues Bar, you know, as the light broke.” Sometimes it went even later. “I remember rolling down the armor at the Blues Bar and closing the building at eleven o’clock Sunday morning–you know, when it was at its height–and saying good morning to the cops and firemen,” Aykroyd said. “S.N.L.” was a television show, but it was also an adult fraternity house, united by bonds of drugs and sex and long hours and emotion and affection that went back years. “The only entrée to that boys’ club was basically by fucking somebody in the club,” Anne Beatts tells Shales and Miller. “Which wasn’t the reason you were fucking them necessarily. I mean, you didn’t go, ‘Oh, I want to get into this, I think I’ll have to have sex with this person.’ It was just that if you were drawn to funny people who were doing interesting things, then the only real way to get to do those things yourself was to make that connection.”

2.

We are inclined to think that genuine innovators are loners, that they do not need the social reinforcement the rest of us crave. But that’s not how it works, whether it’s television comedy or, for that matter, the more exalted realms of art and politics and ideas. In his book “The Sociology of Philosophies,” Randall Collins finds in all of known history only three major thinkers who appeared on the scene by themselves: the first-century Taoist metaphysician Wang Ch’ung, the fourteenth-century Zen mystic Bassui Tokusho, and the fourteenth-century Arabic philosopher Ibn Khaldun. Everyone else who mattered was part of a movement, a school, a band of followers and disciples and mentors and rivals and friends who saw each other all the time and had long arguments over coffee and slept with one another’s spouses. Freud may have been the founder of psychoanalysis, but it really began to take shape in 1902, when Alfred Adler, Wilhelm Stekel, Max Kahane, and Rudolf Reitler would gather in Freud’s waiting room on Wednesdays, to eat strudel and talk about the unconscious. The neo-Confucian movement of the Sung dynasty in China revolved around the brothers Ch’eng Hao and Ch’eng I, their teacher Chou Tun-i, their father’s cousin Chang Tsai, and, of course, their neighbor Shao Yung. Pissarro and Degas enrolled in the École des Beaux-Arts at the same time, then Pissarro met Monet and, later, Cézanne at the Académie Suisse, Manet met Degas at the Louvre, Monet befriended Renoir at Charles Gleyre’s studio, and Renoir, in turn, met Pissarro and Cézanne and soon enough everyone was hanging out at the Café Guerbois on the Rue des Batignolles. Collins’s point is not that innovation attracts groups but that innovation is found in groups: that it tends to arise out of social interaction–conversation, validation, the intimacy of proximity, and the look in your listener’s eye that tells you you’re onto something. German Idealism, he notes, centered on Fichte, Schelling, and Hegel. Why? Because they all lived together in the same house. “Fichte takes the early lead,” Collins writes,

inspiring the others on a visit while they are young students at Tübingen in the 1790s, then turning Jena into a center for the philosophical movement to which a stream of the soon-to-be-eminent congregate; then on to Dresden in the heady years 1799-1800 to live with the Romantic circle of the Schlegel brothers (where August Schlegel’s wife, Caroline, has an affair with Schelling, followed later by a scandalous divorce and remarriage). Fichte moves on to Berlin, allying with Schleiermacher (also of the Romantic circle) and with Humboldt to establish the new-style university; here Hegel eventually comes and founds his school, and Schopenhauer lectures fruitlessly in competition.

There is a wonderful illustration of this social dimension of innovation in Jenny Uglow’s new book, “The Lunar Men” (Farrar, Straus & Giroux; $30), which is the story of a remarkable group of friends in Birmingham in the mid-eighteenth century. Their leader was Erasmus Darwin, a physician, inventor, and scientist, who began thinking about evolution a full fifty years before his grandson Charles. Darwin met, through his medical practice, an industrialist named Matthew Boulton and, later, his partner James Watt, the steam-engine pioneer. They, in turn, got to know Josiah Wedgwood, he of the famous pottery, and Joseph Priestley, the preacher who isolated oxygen and became known as one of history’s great chemists, and the industrialist Samuel Galton (whose son married Darwin’s daughter and produced the legendary nineteenth-century polymath Francis Galton), and the innovative glass-and-chemicals entrepreneur James Keir, and on and on. They called themselves the Lunar Society because they arranged to meet at each full moon, when they would get together in the early afternoon to eat, piling the table high, Uglow tells us, with wine and “fish and capons, Cheddar and Stilton, pies and syllabubs.” Their children played underfoot. Their wives chatted in the other room, and the Lunar men talked well into the night, clearing the table to make room for their models and plans and instruments. “They developed their own cryptic, playful language and Darwin, in particular, liked to phrase things as puzzles–like the charades and poetic word games people used to play,” Uglow writes. “Even though they were down-to-earth champions of reason, a part of the delight was to feel they were unlocking esoteric secrets, exploring transmutations like alchemists of old.”

When they were not meeting, they were writing to each other with words of encouragement or advice or excitement. This was truly–in a phrase that is invariably and unthinkingly used in the pejorative–a mutual-admiration society. “Their inquiries ranged over the whole spectrum, from astronomy and optics to fossils and ferns,” Uglow tells us, and she goes on:

One person’s passion–be it carriages, steam, minerals, chemistry, clocks–fired all the others. There was no neat separation of subjects. Letters between [William] Small and Watt were a kaleidoscope of invention and ideas, touching on steam-engines and cylinders; cobalt as a semi-metal; how to boil down copal, the resin of tropical trees, for varnish; lenses and clocks and colours for enamels; alkali and canals; acids and vapours–as well as the boil on Watt’s nose.

What were they doing? Darwin, in a lovely phrase, called it “philosophical laughing,” which was his way of saying that those who depart from cultural or intellectual consensus need people to walk beside them and laugh with them to give them confidence. But there’s more to it than that. One of the peculiar features of group dynamics is that clusters of people will come to decisions that are far more extreme than any individual member would have come to on his own. People compete with each other and egg each other on, showboat and grandstand; and along the way they often lose sight of what they truly believed when the meeting began. Typically, this is considered a bad thing, because it means that groups formed explicitly to find middle ground often end up someplace far away. But at times this quality turns out to be tremendously productive, because, after all, losing sight of what you truly believed when the meeting began is one way of defining innovation.

Uglow tells us, for instance, that the Lunar men were active in the campaign against slavery. Wedgwood, Watt, and Darwin pushed for the building of canals, to improve transportation. Priestley came up with soda water and the rubber eraser, and James Keir was the man who figured out how to mass-produce soap, eventually building a twenty-acre soapworks in Tipton that produced a million pounds of soap a year. Here, surely, are all the hallmarks of group distortion. Somebody comes up with an ambitious plan for canals, and someone else tries to top that by building a really big soap factory, and in that feverish atmosphere someone else decides to top them all with the idea that what they should really be doing is fighting slavery.

Uglow’s book reveals how simplistic our view of groups really is. We divide them into cults and clubs, and dismiss the former for their insularity and the latter for their banality. The cult is the place where, cut off from your peers, you become crazy. The club is the place where, surrounded by your peers, you become boring. Yet if you can combine the best of those two–the right kind of insularity with the right kind of homogeneity–you create an environment both safe enough and stimulating enough to make great thoughts possible. You get Fichte, Schelling, and Hegel, and a revolution in Western philosophy. You get Darwin, Watt, Wedgwood, and Priestley, and the beginnings of the Industrial Revolution. And sometimes, on a more modest level, you get a bunch of people goofing around and bringing a new kind of comedy to network television.

3.

One of “S.N.L.”’s forerunners was a comedy troupe based in San Francisco called the Committee. The Committee’s heyday was in the nineteen-sixties, and its humor had the distinctive political bite of that period. In one of the group’s memorable sketches, the actor Larry Hankin played a condemned prisoner being led to the electric chair by a warden, a priest, and a prison guard. Hankin was strapped in and the switch was thrown–and nothing happened. Hankin started to become abusive, and the three men huddled briefly together. Then, as Tony Hendra recounts, in “Going Too Far,” his history of “boomer humor”:

They confer and throw the switch again. Still nothing. Hankin starts cackling with glee, doubly abusive. They throw it yet again. Nothing yet again. Hankin then demands to be set free–he can’t be executed more than once, they’re a bunch of assholes, double jeopardy, nyah-nyah, etc., etc. Totally desperate, the three confer once more, check that they’re alone in the cell, and kick Hankin to death.

Is that sketch funny? Some people thought so. When the Committee performed it at a benefit at the Vacaville prison, in California, the inmates laughed so hard they rioted. But others didn’t, and even today it’s clear that this humor is funny only to those who can appreciate the particular social and political sensibility of the Committee. We call new cultural or intellectual movements “circles” for a reason: the circle is a closed loop. You are either inside or outside. In “Live from New York,” Lorne Michaels describes going to the White House to tape President Ford saying, “Live from New York, it’s Saturday Night,” the “S.N.L.” intro: “We’d done two or three takes, and to relax him, I said to him–my sense of humor at the time–‘Mr. President, if this works out, who knows where it will lead?’ Which was completely lost on him.” In another comic era, the fact that Ford did not laugh would be evidence of the joke’s failure. But when Michaels says the joke “was completely lost on him” it isn’t a disclaimer–it’s the punch line. He said what he said because he knew Ford would not get it. As the writers of “Saturday Night Live” worked on sketches deep into the night, they were sustained by something like what sustained the Lunar men and the idealists in Tübingen–the feeling that they all spoke a private language.

To those on the inside, of course, nothing is funnier than an inside joke. But the real significance of inside jokes is what they mean for those who aren’t on the inside. Laughing at a joke creates an incentive to join the joke-teller. But not laughing–not getting the joke–creates an even greater incentive. We all want to know what we’re missing, and this is one of the ways that revolutions spread from the small groups that spawn them.

“One of Michaels’s rules was, no groveling to the audience either in the studio or at home,” Shales and Miller write. “The collective approach of the show’s creators could be seen as a kind of arrogance, a stance of defiance that said in effect, ‘We think this is funny, and if you don’t, you’re wrong.’ . . . To viewers raised on TV that was forever cajoling, importuning, and talking down to them, the blunt and gutsy approach was refreshing, a virtual reinvention of the medium.”

The successful inside joke, however, can never last. In “A Great Silly Grin” (Public Affairs; $27.50), a history of nineteen-sixties British satire, Humphrey Carpenter relates a routine done at the comedy club the Establishment early in the decade. The sketch was about the rebuilt Coventry Cathedral, which had been destroyed in the war, and the speaker was supposed to be the Cathedral’s architect, Sir Basil Spence:

First of all, of course, we owe an enormous debt of gratitude to the German people for making this whole project possible in the first place. Second, we owe a debt of gratitude to the people of Coventry itself, who when asked to choose between having a cathedral and having hospitals, schools and houses, plumped immediately (I’m glad to say) for the cathedral, recognizing, I think, the need of any community to have a place where the whole community can gather together and pray for such things as hospitals, schools and houses.

When that bit was first performed, many Englishmen would have found it offensive. Now, of course, hardly anyone would. Mocking British establishment pieties is no longer an act of rebellion. It is the norm. Successful revolutions contain the seeds of their demise: they attract so many followers, eager to be in on the joke as well, that the circle breaks down. The inside becomes indistinguishable from the outside. The allure of exclusivity is gone.

At the same time, the special bonds that created the circle cannot last forever. Sooner or later, the people who slept together in every combination start to pair off. Those doing drugs together sober up (or die). Everyone starts going to bed at eleven o’clock, and bit by bit the intimacy that fuels innovation slips away. “I was involved with Gilda, yeah. I was in love with her,” Aykroyd tells Shales and Miller. “We were friends, lovers, then friends again,” and in a way that’s the simplest and best explanation for the genius of the original “S.N.L.” Today’s cast is not less talented. It is simply more professional. “I think some people in the cast have fun crushes on other people, but nothing serious,” Cheri Oteri, a cast member from the late nineteen-nineties, tells Shales and Miller, in what might well serve as the show’s creative epitaph. “I guess we’re kind of boring–no romances, no drugs. I had an audition once with somebody who used to work here. He’s very, very big in the business now. And as soon as I went in for the audition, he went, ‘Hey, you guys still doing coke over at SNL?’ Because back when he was here, they were doing it. What are we doing, for crying out loud? Oh yeah. Thinking up characters.”