A Sister’s Eulogy for Steve Jobs

I grew up as an only child, with a single mother. Because we were poor and because I knew my father had emigrated from Syria, I imagined he looked like Omar Sharif. I hoped he would be rich and kind and would come into our lives (and our not yet furnished apartment) and help us. Later, after I’d met my father, I tried to believe he’d changed his number and left no forwarding address because he was an idealistic revolutionary, plotting a new world for the Arab people.

Even as a feminist, my whole life I’d been waiting for a man to love, who could love me. For decades, I’d thought that man would be my father. When I was 25, I met that man and he was my brother.

By then, I lived in New York, where I was trying to write my first novel. I had a job at a small magazine in an office the size of a closet, with three other aspiring writers. When one day a lawyer called me — me, the middle-class girl from California who hassled the boss to buy us health insurance — and said his client was rich and famous and was my long-lost brother, the young editors went wild. This was 1985 and we worked at a cutting-edge literary magazine, but I’d fallen into the plot of a Dickens novel and really, we all loved those best. The lawyer refused to tell me my brother’s name and my colleagues started a betting pool. The leading candidate: John Travolta. I secretly hoped for a literary descendant of Henry James — someone more talented than I, someone brilliant without even trying.

When I met Steve, he was a guy my age in jeans, Arab- or Jewish-looking and handsomer than Omar Sharif.

We took a long walk — something, it happened, that we both liked to do. I don’t remember much of what we said that first day, only that he felt like someone I’d pick to be a friend. He explained that he worked in computers.

I didn’t know much about computers. I still worked on a manual Olivetti typewriter.

I told Steve I’d recently considered my first purchase of a computer: something called the Cromemco.

Steve told me it was a good thing I’d waited. He said he was making something that was going to be insanely beautiful.

I want to tell you a few things I learned from Steve, during three distinct periods, over the 27 years I knew him. They’re not periods of years, but of states of being. His full life. His illness. His dying.

Steve worked at what he loved. He worked really hard. Every day.

That’s incredibly simple, but true.

He was the opposite of absent-minded.

He was never embarrassed about working hard, even if the results were failures. If someone as smart as Steve wasn’t ashamed to admit trying, maybe I didn’t have to be.

When he got kicked out of Apple, things were painful. He told me about a dinner at which 500 Silicon Valley leaders met the then-sitting president. Steve hadn’t been invited.

He was hurt but he still went to work at Next. Every single day.

Novelty was not Steve’s highest value. Beauty was.

For an innovator, Steve was remarkably loyal. If he loved a shirt, he’d order 10 or 100 of them. In the Palo Alto house, there are probably enough black cotton turtlenecks for everyone in this church.

He didn’t favor trends or gimmicks. He liked people his own age.

His philosophy of aesthetics reminds me of a quote that went something like this: “Fashion is what seems beautiful now but looks ugly later; art can be ugly at first but it becomes beautiful later.”

Steve always aspired to make beautiful later.

He was willing to be misunderstood.

Uninvited to the ball, he drove the third or fourth iteration of his same black sports car to Next, where he and his team were quietly inventing the platform on which Tim Berners-Lee would write the program for the World Wide Web.

Steve was like a girl in the amount of time he spent talking about love. Love was his supreme virtue, his god of gods. He tracked and worried about the romantic lives of the people working with him.

Whenever he saw a man he thought a woman might find dashing, he called out, “Hey are you single? Do you wanna come to dinner with my sister?”

I remember when he phoned the day he met Laurene. “There’s this beautiful woman and she’s really smart and she has this dog and I’m going to marry her.”

When Reed was born, he began gushing and never stopped. He was a physical dad, with each of his children. He fretted over Lisa’s boyfriends and Erin’s travel and skirt lengths and Eve’s safety around the horses she adored.

None of us who attended Reed’s graduation party will ever forget the scene of Reed and Steve slow dancing.

His abiding love for Laurene sustained him. He believed that love happened all the time, everywhere. In that most important way, Steve was never ironic, never cynical, never pessimistic. I try to learn from that, still.

Steve had been successful at a young age, and he felt that had isolated him. Most of the choices he made from the time I knew him were designed to dissolve the walls around him. A middle-class boy from Los Altos, he fell in love with a middle-class girl from New Jersey. It was important to both of them to raise Lisa, Reed, Erin and Eve as grounded, normal children. Their house didn’t intimidate with art or polish; in fact, for many of the first years I knew Steve and Lo together, dinner was served on the grass, and sometimes consisted of just one vegetable. Lots of that one vegetable. But one. Broccoli. In season. Simply prepared. With just the right, recently snipped, herb.

Even as a young millionaire, Steve always picked me up at the airport. He’d be standing there in his jeans.

When a family member called him at work, his secretary Linetta answered, “Your dad’s in a meeting. Would you like me to interrupt him?”

When Reed insisted on dressing up as a witch every Halloween, Steve, Laurene, Erin and Eve all went wiccan.

They once embarked on a kitchen remodel; it took years. They cooked on a hotplate in the garage. The Pixar building, under construction during the same period, finished in half the time. And that was it for the Palo Alto house. The bathrooms stayed old. But — and this was a crucial distinction — it had been a great house to start with; Steve saw to that.

This is not to say that he didn’t enjoy his success: he enjoyed his success a lot, just minus a few zeros. He told me how much he loved going to the Palo Alto bike store and gleefully realizing he could afford to buy the best bike there.

And he did.

Steve was humble. Steve liked to keep learning.

Once, he told me if he’d grown up differently, he might have become a mathematician. He spoke reverently about colleges and loved walking around the Stanford campus. In the last year of his life, he studied a book of paintings by Mark Rothko, an artist he hadn’t known about before, thinking of what could inspire people on the walls of a future Apple campus.

Steve cultivated whimsy. What other C.E.O. knows the history of English and Chinese tea roses and has a favorite David Austin rose?

He had surprises tucked in all his pockets. I’ll venture that Laurene will discover treats — songs he loved, a poem he cut out and put in a drawer — even after 20 years of an exceptionally close marriage. I spoke to him every other day or so, but when I opened The New York Times and saw a feature on the company’s patents, I was still surprised and delighted to see a sketch for a perfect staircase.

With his four children, with his wife, with all of us, Steve had a lot of fun.

He treasured happiness.

Then, Steve became ill and we watched his life compress into a smaller circle. Once, he’d loved walking through Paris. He’d discovered a small handmade soba shop in Kyoto. He downhill skied gracefully. He cross-country skied clumsily. No more.

Eventually, even ordinary pleasures, like a good peach, no longer appealed to him.

Yet, what amazed me, and what I learned from his illness, was how much was still left after so much had been taken away.

I remember my brother learning to walk again, with a chair. After his liver transplant, once a day he would get up on legs that seemed too thin to bear him, arms pitched to the chair back. He’d push that chair down the Memphis hospital corridor towards the nursing station and then he’d sit down on the chair, rest, turn around and walk back again. He counted his steps and, each day, pressed a little farther.

Laurene got down on her knees and looked into his eyes.

“You can do this, Steve,” she said. His eyes widened. His lips pressed into each other.

He tried. He always, always tried, and always with love at the core of that effort. He was an intensely emotional man.

I realized during that terrifying time that Steve was not enduring the pain for himself. He set destinations: his son Reed’s graduation from high school, his daughter Erin’s trip to Kyoto, the launching of a boat he was building on which he planned to take his family around the world and where he hoped he and Laurene would someday retire.

Even ill, his taste, his discrimination and his judgment held. He went through 67 nurses before finding kindred spirits and then he completely trusted the three who stayed with him to the end. Tracy. Arturo. Elham.

One time when Steve had contracted a tenacious pneumonia, his doctor forbade everything — even ice. We were in a standard I.C.U. unit. Steve, who generally disliked cutting in line or dropping his own name, confessed that this once, he’d like to be treated a little specially.

I told him: Steve, this is special treatment.

He leaned over to me, and said: “I want it to be a little more special.”

Intubated, when he couldn’t talk, he asked for a notepad. He sketched devices to hold an iPad in a hospital bed. He designed new fluid monitors and x-ray equipment. He redrew that not-quite-special-enough hospital unit. And every time his wife walked into the room, I watched his smile remake itself on his face.

For the really big, big things, you have to trust me, he wrote on his sketchpad. He looked up. You have to.

By that, he meant that we should disobey the doctors and give him a piece of ice.

None of us knows for certain how long we’ll be here. On Steve’s better days, even in the last year, he embarked upon projects and elicited promises from his friends at Apple to finish them. Some boat builders in the Netherlands have a gorgeous stainless steel hull ready to be covered with the finishing wood. His three daughters remain unmarried, his two youngest still girls, and he’d wanted to walk them down the aisle as he’d walked me the day of my wedding.

We all — in the end — die in medias res. In the middle of a story. Of many stories.

I suppose it’s not quite accurate to call the death of someone who lived with cancer for years unexpected, but Steve’s death was unexpected for us.

What I learned from my brother’s death was that character is essential: What he was, was how he died.

Tuesday morning, he called me to ask me to hurry up to Palo Alto. His tone was affectionate, dear, loving, but like someone whose luggage was already strapped onto the vehicle, who was already on the beginning of his journey, even as he was sorry, truly deeply sorry, to be leaving us.

He started his farewell and I stopped him. I said, “Wait. I’m coming. I’m in a taxi to the airport. I’ll be there.”

“I’m telling you now because I’m afraid you won’t make it on time, honey.”

When I arrived, he and his Laurene were joking together like partners who’d lived and worked together every day of their lives. He looked into his children’s eyes as if he couldn’t unlock his gaze.

Until about 2 in the afternoon, his wife could rouse him, to talk to his friends from Apple.

Then, after a while, it was clear that he would no longer wake to us.

His breathing changed. It became severe, deliberate, purposeful. I could feel him counting his steps again, pushing farther than before.

This is what I learned: he was working at this, too. Death didn’t happen to Steve, he achieved it.

He told me, when he was saying goodbye and telling me he was sorry, so sorry we wouldn’t be able to be old together as we’d always planned, that he was going to a better place.

Dr. Fischer gave him a 50/50 chance of making it through the night.

He made it through the night. Laurene, next to him on the bed, sometimes jerked up when there was a longer pause between his breaths. She and I looked at each other; then he would heave a deep breath and begin again.

This had to be done. Even now, he had a stern, still handsome profile, the profile of an absolutist, a romantic. His breath indicated an arduous journey, some steep path, altitude.

He seemed to be climbing.

But with that will, that work ethic, that strength, there was also sweet Steve’s capacity for wonderment, the artist’s belief in the ideal, the still more beautiful later.

Steve’s final words, hours earlier, were monosyllables, repeated three times.

Before embarking, he’d looked at his sister Patty, then for a long time at his children, then at his life’s partner, Laurene, and then over their shoulders past them.

Steve’s final words were:

OH WOW. OH WOW. OH WOW.

Mona Simpson is a novelist and a professor of English at the University of California, Los Angeles. She delivered this eulogy for her brother, Steve Jobs, on Oct. 16 at his memorial service at the Memorial Church of Stanford University.

The Genius of Jobs

by Walter Isaacson, nytimes.com
October 29, 2011

ONE of the questions I wrestled with when writing about Steve Jobs was how smart he was. On the surface, this should not have been much of an issue. You’d assume the obvious answer was: he was really, really smart. Maybe even worth three or four reallys. After all, he was the most innovative and successful business leader of our era and embodied the Silicon Valley dream writ large: he created a start-up in his parents’ garage and built it into the world’s most valuable company.

But I remember having dinner with him a few months ago around his kitchen table, as he did almost every evening with his wife and kids. Someone brought up one of those brainteasers involving a monkey’s having to carry a load of bananas across a desert, with a set of restrictions about how far and how many he could carry at one time, and you were supposed to figure out how long it would take. Mr. Jobs tossed out a few intuitive guesses but showed no interest in grappling with the problem rigorously. I thought about how Bill Gates would have gone click-click-click and logically nailed the answer in 15 seconds, and also how Mr. Gates devoured science books as a vacation pleasure. But then something else occurred to me: Mr. Gates never made the iPod. Instead, he made the Zune.

So was Mr. Jobs smart? Not conventionally. Instead, he was a genius. That may seem like a silly word game, but in fact his success dramatizes an interesting distinction between intelligence and genius. His imaginative leaps were instinctive, unexpected, and at times magical. They were sparked by intuition, not analytic rigor. Trained in Zen Buddhism, Mr. Jobs came to value experiential wisdom over empirical analysis. He didn’t study data or crunch numbers but like a pathfinder, he could sniff the winds and sense what lay ahead.

He told me he began to appreciate the power of intuition, in contrast to what he called “Western rational thought,” when he wandered around India after dropping out of college. “The people in the Indian countryside don’t use their intellect like we do,” he said. “They use their intuition instead ... Intuition is a very powerful thing, more powerful than intellect, in my opinion. That’s had a big impact on my work.”

Mr. Jobs’s intuition was based not on conventional learning but on experiential wisdom. He also had a lot of imagination and knew how to apply it. As Einstein said, “Imagination is more important than knowledge.”

Einstein is, of course, the true exemplar of genius. He had contemporaries who could probably match him in pure intellectual firepower when it came to mathematical and analytic processing. Henri Poincaré, for example, first came up with some of the components of special relativity, and David Hilbert was able to grind out equations for general relativity around the same time Einstein did. But neither had the imaginative genius to make the full creative leap at the core of their theories, namely that there is no such thing as absolute time and that gravity is a warping of the fabric of space-time. (O.K., it’s not that simple, but that’s why he was Einstein and we’re not.)

Einstein had the elusive qualities of genius, which included that intuition and imagination that allowed him to think differently (or, as Mr. Jobs’s ads said, to Think Different). Although he was not particularly religious, Einstein described this intuitive genius as the ability to read the mind of God. When assessing a theory, he would ask himself, Is this the way that God would design the universe? And he expressed his discomfort with quantum mechanics, which is based on the idea that probability plays a governing role in the universe, by declaring that he could not believe God would play dice. (At one physics conference, Niels Bohr was prompted to urge Einstein to quit telling God what to do.)

Both Einstein and Mr. Jobs were very visual thinkers. The road to relativity began when the teenage Einstein kept trying to picture what it would be like to ride alongside a light beam. Mr. Jobs spent time almost every afternoon walking around the studio of his brilliant design chief Jony Ive and fingering foam models of the products they were developing.

Mr. Jobs’s genius wasn’t, as even his fanboys admit, in the same quantum orbit as Einstein’s. So it’s probably best to ratchet the rhetoric down a notch and call it ingenuity. Bill Gates is super-smart, but Steve Jobs was super-ingenious. The primary distinction, I think, is the ability to apply creativity and aesthetic sensibilities to a challenge.

In the world of invention and innovation, that means combining an appreciation of the humanities with an understanding of science — connecting artistry to technology, poetry to processors. This was Mr. Jobs’s specialty. “I always thought of myself as a humanities person as a kid, but I liked electronics,” he said. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.”

The ability to merge creativity with technology depends on one’s ability to be emotionally attuned to others. Mr. Jobs could be petulant and unkind in dealing with other people, which caused some to think he lacked basic emotional awareness. In fact, it was the opposite. He could size people up, understand their inner thoughts, cajole them, intimidate them, target their deepest vulnerabilities, and delight them at will. He knew, intuitively, how to create products that pleased, interfaces that were friendly, and marketing messages that were enticing.

In the annals of ingenuity, new ideas are only part of the equation. Genius requires execution. When others produced boxy computers with intimidating interfaces that confronted users with unfriendly green prompts that said things like “C:\>,” Mr. Jobs saw there was a market for an interface like a sunny playroom. Hence, the Macintosh. Sure, Xerox came up with the graphical desktop metaphor, but the personal computer it built was a flop and it did not spark the home computer revolution. Between conception and creation, T. S. Eliot observed, there falls the shadow.

In some ways, Mr. Jobs’s ingenuity reminds me of that of Benjamin Franklin, one of my other biography subjects. Among the founders, Franklin was not the most profound thinker — that distinction goes to Jefferson or Madison or Hamilton. But he was ingenious.

This depended, in part, on his ability to intuit the relationships between different things. When he invented the battery, he experimented with it to produce sparks that he and his friends used to kill a turkey for their end of season feast. In his journal, he recorded all the similarities between such sparks and lightning during a thunderstorm, then declared “Let the experiment be made.” So he flew a kite in the rain, drew electricity from the heavens, and ended up inventing the lightning rod. Like Mr. Jobs, Franklin enjoyed the concept of applied creativity — taking clever ideas and smart designs and applying them to useful devices.

China and India are likely to produce many rigorous analytical thinkers and knowledgeable technologists. But smart and educated people don’t always spawn innovation. America’s advantage, if it continues to have one, will be that it can produce people who are also more creative and imaginative, those who know how to stand at the intersection of the humanities and the sciences. That is the formula for true innovation, as Steve Jobs’s career showed.


The Arab Intellectuals Who Didn’t Roar

IN mid-June, the Syrian poet known as Adonis, one of the Arab world’s most renowned literary figures, addressed an open letter to the Syrian president, Bashar al-Assad. The stage was set for one of those moments, familiar from revolutions past, in which an intellectual hero confronts an oppressive ruler and eloquently voices the grievances of a nation.

Instead, Adonis — who lives in exile in France — bitterly disappointed many Syrians. His letter offered some criticisms, but also denigrated the protest movement that had roiled the country since March, and failed even to acknowledge the brutal crackdown that had left hundreds of Syrians dead. In retrospect, the incident has come to illustrate the remarkable gulf between the Arab world’s established intellectuals — many of them, like Adonis, former radicals — and the largely anonymous young people who have led the protests of the Arab Spring.

More than 10 months after it started with the suicide of a Tunisian fruit vendor, the great wave of insurrection across the Arab world has toppled three autocrats and led last week in Tunisia to an election that many hailed as the dawn of a new era. It has not yielded any clear political or economic project, or any intellectual standard-bearers of the kind who shaped almost every modern revolution from 1776 onward. In those revolts, thinkers or ideologues — from Thomas Paine to Lenin to Mao to Vaclav Havel — helped provide a unifying vision or became symbols of a people’s aspirations.

The absence of such figures in the Arab Spring is partly a measure of the pressures Arab intellectuals have lived under in recent decades, trapped between brutal state repression on one side and stifling Islamic orthodoxy on the other. Many were co-opted by their governments (or Persian Gulf oil money) or forced into exile, where they lost touch with the lived reality of their societies. Those who remained have often applauded the revolts of the past year and even marched along with the crowds. But they have not led them, and often appeared stunned and confused by a movement they failed to predict.

The lack of such leaders may also be the hallmark of a largely post-ideological era in which far less need is felt for unifying doctrines or the grandiose figures who provide them. The role of the intellectual may be shrinking into that of the micro-blogger or street organizer. To some, that is just fine. “I don’t think there is a need for intellectuals to spearhead any revolution,” says Sinan Antoon, an Iraqi-born poet and novelist who has written extensively on the Arab Spring and now teaches at New York University. “It is no longer a movement to be led by heroes.”

That belief may soon be tested. As revolts continue in Syria, their leaderless quality — so useful in deterring crackdowns by the secret police — has become a liability. Organizers in and out of the country are now struggling to shape a set of shared political goals, and intellectual coherence and leadership are increasingly seen as important in that process. “No one wants to be accused of hijacking the revolution,” says Sadik Jalal al-Azm, a Syrian philosopher and advocate of greater civic freedoms. “This excessive fear is becoming a hindrance.”

To some extent, the intellectual silence of the current uprising is a deliberate response to the hollow revolutionary rhetoric of previous generations. The Arab nationalist movement began in the 1930s and ’40s with idealistic young men who hoped to lead the region out of its colonial past, backwardness and tribalism. The Syrian political philosopher Michel Aflaq and other young writers and activists found inspiration in 19th-century German theories of nationalism, and envisioned their Baath Party as an instrument for modernization and economic justice.

But the party and its misty ideas were soon hijacked and distilled into slogans by military officers in Syria and Iraq, whose “revolutionary” leadership was really just the old tribalism and autocracy in a different guise. In Egypt too, Arab socialism soon became little more than a pretext for dictatorship and reckless policies at home and abroad. Arab nationalism reached its zenith — or its nadir — in Col. Muammar el-Qaddafi, who saw himself as a godlike intellectual, publishing his own fiction and imposing his delusional Third Universal Theory on Libya’s hapless people. Everything in Colonel Qaddafi’s Libya was styled “revolutionary.” When the rebels overthrew his government this year, they found it difficult to separate the names of their own revolutionary councils from the ones they were overthrowing.

The protesters who led the Arab Spring had grown tired of the stale internationalist rhetoric of their forebears, which had achieved little for the Palestinians and had deepened the divisions among Arab states rather than unifying them. They wanted to focus instead on the failures of their own societies. “Previously, everything was reduced to the exterior: are you pro- or anti-American, what is the role of Israel, and so on,” says Hazem Saghieh, the political editor of the London-based Arab newspaper Al Hayat. “This revolution is entirely different.”

The shift in emphasis to civil rights and democracy at home did not come out of the blue. Some Arab intellectuals began speaking this language long ago, including Mr. Azm, the Syrian philosopher, who after the humiliation of the 1967 war with Israel published a groundbreaking book called “Self-Criticism After the Defeat.” Others followed suit gradually, and during the short-lived “Damascus Spring” a decade ago, Syrian intellectuals signed the Declaration of the 99, a call for greater civil rights and openness. Many were jailed afterward. The bravery and persistence of these intellectuals — and others like them in Egypt — may have quietly prepared the ground for the uprisings this year.

But in recent years their voices often went unheard, because their secular language had little resonance in societies where political Islam was becoming a dominant force. Nor did Islamic reformers fare much better when they tried to cast their political critique in religious terms. The Egyptian scholar Hassan Hanafi, for instance, in the 1980s began calling for the creation of an “Islamic Left,” a socialist ideology rooted in religion. He was branded a heretic and had to seek police protection after receiving death threats from jihadists. His work gained an audience in Indonesia, but not in his own country, said Carool Kersten, a lecturer at King’s College London who has written on Islamic reformers.

Not all Arab intellectuals fell into these traps. Alaa al-Aswany, the Egyptian novelist, became a fierce critic of the government of Hosni Mubarak in recent years, protected from arrest by his celebrity. He was among the first writers to speak to the protesting crowds in Tahrir Square in January, and in March, he delivered a punishing performance during a televised debate with Ahmed Shafiq, the prime minister appointed by Mr. Mubarak. The following day Egypt’s ruling military council fired Mr. Shafiq, and many credit Mr. Aswany with the achievement.

But Mr. Aswany made clear from the first that his only real goal was to serve as a bullhorn for the demands of the protesters in Tahrir Square. He offered no ideas of his own.

Inevitably, and perhaps unfairly, the current Arab tumult has been compared with the uprising against Communism in Eastern Europe in 1989, the last great social upheaval of comparable scale. Intellectuals played a much more prominent role in those movements. In Poland, for instance, “the unification of intellectuals and labor unions was really important,” said Anne Applebaum, a columnist and the author of an authoritative book on the Soviet gulags. “They helped shape the movement and ran its publications. They facilitated conversations between various workers’ groups. They functioned like the Facebook page of their era.”

The dissident Czech playwright Vaclav Havel wrote an essay, “The Power of the Powerless,” that became a kind of blueprint for how to survive with dignity in a totalitarian country, and later emerged as a champion of his country’s Velvet Revolution.

It may be that the connecting role these figures played is less needed today. It may also be that the ideological platforms of earlier revolutions are obsolete, given the speed of communications and the churn of new perspectives. “It is too fluid, too fast-moving, too complex,” says Peter Harling, a senior analyst with the International Crisis Group. “It is too difficult to come up with a paradigm. People are looking for short pieces that illuminate some aspect of what they’re going through, not grand theories.”

Still, Mr. Harling added, among Syrian intellectuals, “none of them has articulated any kind of forward-looking political platform,” and that failure has contributed to anxieties about the protest movement’s direction.

To the extent that any ideas have arisen from the Arab Spring, they relate to the “Turkish model” — the often-heard hope that Turkey’s blend of mildly Islamist ideology and democratic governance can inspire similar success in Arab lands. But this analogy is a facile one, and may well yield disappointment in the months and years to come.

Turkey’s experience is hard to replicate, in part because the country has had the kind of thoroughgoing revolution against tradition that Arab intellectuals of the 20th century only talked about. Starting in the early 1920s, Turkey’s great autocrat, Mustafa Kemal Ataturk, overhauled the country’s education system, bringing over the American reformer John Dewey to advise him. He abolished the caliphate and gutted the country’s legal system, instituting a strict separation of church and state. The first elections took place in 1946, and only after decades of struggle (and several coups d’état) did Turkey start earning applause for its democratic ways.

Without that punishing preparation, the Arab world’s new revolutionaries may end up repeating history, even if they do study it. Last week, amid the euphoria over Colonel Qaddafi’s death, a few skeptical voices could be heard in the din of triumphant Internet messages in Arabic.

“Let the killing of Qaddafi be a lesson to the revolutionaries as much as to the rulers,” one Arab Twitter user wrote. “And let revolutionaries everywhere remember that Qaddafi came to power by making his own revolution 40 years ago.”

Robert Worth is a staff writer for The New York Times Magazine who has reported from Egypt, Yemen and Libya.

WE OWE IT ALL TO THE HIPPIES 
http://www.time.com/time/printout/0,8816,982602,00.html


Wednesday, Mar. 01, 1995


By STEWART BRAND

Newcomers to the Internet are often startled to discover themselves not so much in some soulless colony of technocrats as in a kind of cultural Brigadoon — a flowering remnant of the ’60s, when hippie communalism and libertarian politics formed the roots of the modern cyberrevolution. At the time, it all seemed dangerously anarchic (and still does to many), but the counterculture’s scorn for centralized authority provided the philosophical foundations of not only the leaderless Internet but also the entire personal-computer revolution.

We — the generation of the ’60s — were inspired by the “bards and hot-gospellers of technology,” as business historian Peter Drucker described media maven Marshall McLuhan and technophile Buckminster Fuller. And we bought enthusiastically into the exotic technologies of the day, such as Fuller’s geodesic domes and psychoactive drugs like LSD. We learned from them, but ultimately they turned out to be blind alleys. Most of our generation scorned computers as the embodiment of centralized control. But a tiny contingent — later called “hackers” — embraced computers and set about transforming them into tools of liberation. That turned out to be the true royal road to the future. “Ask not what your country can do for you. Do it yourself,” we said, happily perverting J.F.K.’s Inaugural exhortation. Our ethic of self-reliance came partly from science fiction. We all read Robert Heinlein’s epic Stranger in a Strange Land as well as his libertarian screed-novel, The Moon Is a Harsh Mistress. Hippies and nerds alike reveled in Heinlein’s contempt for centralized authority. To this day, computer scientists and technicians are almost universally science-fiction fans. And ever since the 1950s, for reasons that are unclear to me, science fiction has been almost universally libertarian in outlook.

As Steven Levy chronicled in his 1984 book, Hackers: Heroes of the Computer Revolution, there were three generations of youthful computer programmers who deliberately led the rest of civilization away from centralized mainframe computers and their predominant sponsor, IBM. “The Hacker Ethic,” articulated by Levy, offered a distinctly countercultural set of tenets. Among them:

“Access to computers should be unlimited and total.”

“All information should be free.”

“Mistrust authority — promote decentralization.”

“You can create art and beauty on a computer.”

“Computers can change your life for the better.” Nobody had written these down in manifestoes before; it was just the way hackers behaved and talked while shaping the leading edge of computer technology.

In the 1960s and early ’70s, the first generation of hackers emerged in university computer-science departments. They transformed mainframes into virtual personal computers, using a technique called time sharing that provided widespread access to computers. Then in the late ’70s, the second generation invented and manufactured the personal computer. These nonacademic hackers were hard-core counterculture types — like Steve Jobs, a Beatle-haired hippie who had dropped out of Reed College, and Steve Wozniak, a Hewlett-Packard engineer. Before their success with Apple, both Steves developed and sold “blue boxes,” outlaw devices for making free telephone calls. Their contemporary and early collaborator, Lee Felsenstein, who designed the first portable computer, known as the Osborne 1, was a New Left radical who wrote for the renowned underground paper the Berkeley Barb.

As they followed the mantra “Turn on, tune in and drop out,” college students of the ’60s also dropped academia’s traditional disdain for business. “Do your own thing” easily translated into “Start your own business.” Reviled by the broader social establishment, hippies found ready acceptance in the world of small business. They brought an honesty and a dedication to service that was attractive to vendors and customers alike. Success in business made them disinclined to “grow out of” their countercultural values, and it made a number of them wealthy and powerful at a young age.

The third generation of revolutionaries, the software hackers of the early ’80s, created the application, education and entertainment programs for personal computers. Typical was Mitch Kapor, a former transcendental-meditation teacher, who gave us the spreadsheet program Lotus 1-2-3, which ensured the success of IBM’s Apple-imitating PC. Like most computer pioneers, Kapor is still active. His Electronic Frontier Foundation, which he co-founded with a lyricist for the Grateful Dead, lobbies successfully in Washington for civil rights in cyberspace.

In the years since Levy’s book, a fourth generation of revolutionaries has come to power. Still abiding by the Hacker Ethic, these tens of thousands of netheads have created myriad computer bulletin boards and a nonhierarchical linking system called Usenet. At the same time, they have transformed the Defense Department-sponsored ARPAnet into what has become the global digital epidemic known as the Internet. The average age of today’s Internet users, who number in the tens of millions, is about 30 years. Just as personal computers transformed the ’80s, this latest generation knows that the Net is going to transform the ’90s. With the same ethic that has guided previous generations, today’s users are leading the way with tools created initially as “freeware” or “shareware,” available to anyone who wants them.

Of course, not everyone on the electronic frontier identifies with the countercultural roots of the ’60s. One would hardly call Nicholas Negroponte, the patrician head of M.I.T.’s Media Lab, or Microsoft magnate Bill Gates “hippies.” Yet creative forces continue to emanate from that period. Virtual reality — computerized sensory immersion — was named, largely inspired and partly equipped by Jaron Lanier, who grew up under a geodesic dome in New Mexico, once played clarinet in the New York City subway and still sports dreadlocks halfway down his back. The latest generation of supercomputers, utilizing massive parallel processing, was invented, developed and manufactured by Danny Hillis, a genial longhair who set out to build “a machine that could be proud of us.” Public-key encryption, which can ensure unbreakable privacy for anyone, is the brainchild of Whitfield Diffie, a lifelong peacenik and privacy advocate who declared in a recent interview, “I have always believed the thesis that one’s politics and the character of one’s intellectual work are inseparable.”

Our generation proved in cyberspace that where self-reliance leads, resilience follows, and where generosity leads, prosperity follows. If that dynamic continues, and everything so far suggests that it will, then the information age will bear the distinctive mark of the countercultural ’60s well into the new millennium.

Steve Jobs: 'Computer Science Is A Liberal Art'


Copyright ©2011 National Public Radio®. For personal, noncommercial use only. See Terms of Use. For other uses, prior permission required.

Heard on Fresh Air from WHYY

October 6, 2011 - TERRY GROSS, host: When Steve Jobs died yesterday, many of us felt a sense of personal loss because his work transformed computer technology and changed our lives. He was a visionary. He co-founded Apple and played a key role in the creation of the Mac, the iPod, iTunes, the iPhone, iPad and other innovative devices and technologies, which so many other companies have done their best to imitate.

Jobs was 56 and had pancreatic cancer. He had a liver transplant in 2009 and stepped down as Apple's CEO last August. We're going to listen back to an excerpt of the interview I recorded with Steve Jobs in 1996, 11 years after he was ousted from Apple. He returned to Apple the year after we spoke.

(SOUNDBITE OF ARCHIVAL RECORDING)

GROSS: From what I've read, it sounds like you were really the advocate for having a mouse on the Mac. Why did you push for that and what was the argument against it?

STEVE JOBS: Well, as I mentioned earlier, I went to Xerox PARC, Palo Alto Research Center, in 1979 and I saw the early work on graphical user interfaces that they had done. And they had a mouse and it was obvious that you needed a pointing device and a mouse seemed to be the best one. We tried a bunch of other ones subsequently at Apple and a mouse indeed was the best one. We refined it a little bit.

We found that, you know, Xerox's had three buttons. We found that people would push the wrong button or be scared that they were going to push the wrong button, so they always looked at the mouse instead of the screen. So we got it down to one button so that you could never push the wrong button. Made some refinements like that.

The Xerox, you know, mouse cost about $1,000 a piece to build. We had to engineer one that cost 20 bucks to build. So we had to do a lot of those kinds of things. But the basic concept of the mouse came originally from a company called SRI, through Xerox and then to Apple. And there were a lot of people at Apple that just didn't get it. We fought tooth and nail with a variety of people there who thought the whole concept of a graphical user interface was crazy, but fortunate...

GROSS: On what grounds?

JOBS: On the grounds that it either couldn't be done, or on the grounds that real computer users didn't need, you know, menus in plain English, and real computer users didn't care about, you know, putting nice little pictures on the screen. But fortunately I was the largest stockholder and the chairman of the company, so I won.

(SOUNDBITE OF LAUGHTER)

GROSS: I know at Apple there was, at least early on, a very informal, you know, non-corporate type of atmosphere. I wonder if there are any lessons you learned about what worked and didn't work in the corporate lifestyle at Apple that you've applied to your current companies, NeXT and Pixar.

JOBS: Well, you know, I don't know what a corporate lifestyle is. I mean, Apple was a corporation, we were very conscious of that. We were very driven to make money so that we can continue to invest in the things we loved. But it had a few very big differences to other corporate lifestyles that I'd seen. The first one was a real belief that there wasn't a hierarchy of ideas that mapped into the hierarchy of the organization. In other words, great ideas could come from anywhere and that we better sort of treat people in a much more egalitarian sense in terms of where the ideas came from.

And Apple was a very bottoms-up company when it came to a lot of its great ideas. And we hired, you know, truly great people and gave them the room to do great work. A lot of companies - I know it sounds crazy - but a lot of companies don't do that. They hire people to tell them what to do. We hired people to tell us what to do. And that led to a very different corporate culture, and one that's really much more collegial than hierarchical.

GROSS: What do you think the state of the computer would be if it weren't for Apple? This is a chance, I guess, for a really self-serving answer. But, I mean, I'm really curious what you think.

JOBS: I usually believe that if, you know, if one group of people didn't do something within a certain number of years, the times would produce another group of people that would accomplish similar things. I think that, personally, our major contribution was a little different than some people might think. I think our major contribution was in bringing a liberal arts point of view to the use of computers.

GROSS: Yeah, explain what you mean by that.

JOBS: What I mean by that is that, you know, if you really look at the ease of use of the Macintosh, the driving motivation behind that was to bring not only ease of use to people - so that many, many more people could use computers for nontraditional things at that time - but it was to bring, you know, beautiful fonts and typography to people, it was to bring graphics to people, not for, you know, plotting laminar flow calculations, but so that they could see beautiful, you know, photographs, or pictures, or artwork, et cetera, to help them communicate what they were doing potentially.

Our goal was to bring a liberal arts perspective and a liberal arts audience to what had traditionally been, you know, a very geeky technology and a very geeky audience. And...

GROSS: What made you think that that more liberal arts direction was the direction to head in?

JOBS: Because in my perspective, and the way I was raised, was that science and computer science is a liberal art. It's something that everyone should know how to use, at least, and harness in their life. It's not something that should be, you know, should be relegated to five percent of the population over in the corner. It's something that everybody should be exposed to, everyone should have a mastery of to some extent, and that's how we viewed, you know, computation or these computation devices.

GROSS: And you think that, you know, that that concept really caught on in the whole industry, eventually?

JOBS: You know, it's in the - Apple certainly - that's the seed of Apple, you know, computers for the rest of us. And I think the sort of - the liberal arts point of view still lives at Apple. I'm not so sure that it lives that many other places. I mean, one of the reasons I think Microsoft took 10 years to copy the Mac was 'cause they didn't really get it at its core.

GROSS: Steve Jobs, recorded in 1996, before he oversaw the creation of the iPod, iPhone and iPad. We want to add our thanks to him for his many innovations.


Jobs' Biography: Thoughts On Life, Death And Apple


Walter Isaacson's biography of Apple co-founder Steve Jobs was published Monday, less than three weeks after Jobs' death on Oct. 5. (Joe Raedle/Getty Images)

October 25, 2011

When Steve Jobs was 6 years old, his young next-door neighbor found out he was adopted. "That means your parents abandoned you and didn't want you," she told him.

Jobs ran into his home, where his adoptive parents reassured him that he was theirs and that they wanted him.

"[They said] 'You were special, we chose you out, you were chosen,'" says biographer Walter Isaacson. "And that helped give [Jobs] a sense of being special. ... For Steve Jobs, he felt throughout his life that he was on a journey — and he often said, 'The journey was the reward.' But that journey involved resolving conflicts about ... his role in this world: why he was here and what it was all about."

When Jobs died on Oct. 5 from complications of pancreatic cancer, many people felt a sense of personal loss for the Apple co-founder and former CEO. Jobs played a key role in the creation of the Macintosh, the iPod, iTunes, the iPhone, the iPad — innovative devices and technologies that people have integrated into their daily lives.


Jobs detailed how he created those products — and how he rose through the world of Silicon Valley, competed with Google and Microsoft, and helped transform popular culture — in a series of extended interviews with Isaacson, the president of The Aspen Institute and the author of biographies of Albert Einstein and Benjamin Franklin. The two men met more than 40 times throughout 2009 and 2010, often in Jobs' living room. Isaacson also conducted more than 100 interviews with Jobs' colleagues, relatives, friends and adversaries.

His biography tells the story of how Jobs revolutionized the personal computer. It also tells Jobs' personal story — from his childhood growing up in Mountain View, Calif., to his lifelong interest in Zen Buddhism to his relationship with family and friends.

In his last meetings with Isaacson, Jobs shifted the conversation to his thoughts regarding religion and death.

"I remember sitting in the back garden on a sunny day [on a day when] he was feeling bad, and he talked about whether or not he believed in an afterlife," Isaacson tells Fresh Air's Terry Gross. "He said, 'Sometimes I'm 50-50 on whether there's a God. It's the great mystery we never quite know. But I like to believe there's an afterlife. I like to believe the accumulated wisdom doesn't just disappear when you die, but somehow it endures.'"

Jobs paused for a second, remembers Isaacson.

"And then he says, 'But maybe it's just like an on/off switch and click — and you're gone.' And then he paused for another second and he smiled and said, 'Maybe that's why I didn't like putting on/off switches on Apple devices.' "

'The Depth Of The Simplicity'

Jobs' attention to detail on his creations was unrivaled, says Isaacson. Though he was a technologist and a businessman, he was also an artist and designer.

"[He] connected art with technology," explains Isaacson. "[In his products,] he obsessed over the color of the screws, over the finish of the screws — even the screws you couldn't see. Even with the original Macintosh, he made sure that the circuit board's chips were lined up properly and looked good. He made them go back and redo the circuit board. He made them find the right color, find the right curves on the screw. Even the curves on the machine — he wanted it to feel friendly."

That obsessiveness occasionally drove his Apple co-workers crazy — but it also made them fiercely loyal, says Isaacson.


"It's one of the dichotomies about Jobs: He could be demanding and tough and irate. On the other hand, he got all A-players and they became fanatically loyal to him," says Isaacson. "Why? They realized they were producing, with other A-players, truly great products for an artist who was a perfectionist — and wasn't always the kindest person when they failed — but he was rallying them to do great stuff."

He relays one story about Jobs that shows, he says, how much he was able to connect great ideas and innovations together. In the early 1980s, Jobs visited Xerox PARC, a research company in Palo Alto that had invented the laser printer, object-oriented programming and the Ethernet. Jobs noticed that the computers running at PARC all featured graphics on their desktops that allowed users to click icons and folders. This was new at the time: Most computers used text prompts and a text interface.

"Steve Jobs made an arrangement with Xerox and he took that concept [of the graphical user interface] and he improved it a hundred-fold," says Isaacson. "He made it so you could drag and drop some of the folders; he invented the pull-down menus. ... So what he was able to do was to take a conception and turn it into a reality."

That's where Jobs' genius was, Isaacson says. Jobs insisted that the software and hardware on Apple products needed to be fully integrated for the best user experience. It was not a great business model at first.

"Microsoft, which licensed itself promiscuously to all sorts of manufacturers, ends up with 90 to 95 percent of the operating system market by the beginning of 2000," says Isaacson. "But in the long run, the end-to-end integration system works very well for Apple and for Steve Jobs. Because it allows him to create devices [like the iPod and iPad] that just work beautifully with the machines."

Isaacson says working with Jobs gave him an additional insight into the design of Jobs' products.

"I see the depth of the simplicity," he says. "[I appreciate] the intuitive nature of the design, and how he would repeatedly sit there with his design engineers and his user-interface software people, and say, 'No, no no, I want to make it simpler.' I also appreciate the beauty of the parts unseen. His father taught him that the back of a fence or the back of a chest of drawers should be as beautiful as the front because [he] would know the craftsmanship that went into it. So somehow, it comes through — the depth of the beauty of the design."


Jobs was a perfectionist with a famously mercurial temperament. He was an artist and a visionary who "could be demanding and tough and irate," says Isaacson. (Jeff Chiu/AP)

Interview Highlights

On what Jobs thought of the Microsoft operating system

Isaacson: "When it first came out — I can't use the words on the air — but [Jobs thought it was] clunky and not beautiful and not aesthetic. But as always is the case with Microsoft, it improves. And eventually Microsoft made a graphical operating system — Windows — and each new version got better until it was a dominating operating system."

On the rivalry between Jobs and Bill Gates


Isaacson: "There are all sorts of lawsuits where Apple is trying to sue Microsoft for Windows, for trying to steal the look and feel. Apple loses most of the suits but they drag on and there's even a government investigation. By the time Steve Jobs comes back to Apple in 1997, the relationship is horrible. And when we say that Jobs and Gates had a rivalry, we also have to realize they had a collaboration and a partnership. It was typical of the digital age — both rivalry and partnership."

On the relationship between Jobs and Google

Isaacson: "I think there was an unnerving historic resonance for what had happened a couple of decades earlier [with Microsoft]. Suddenly you have Google taking the operating system of the iPhone and mobile devices and all of the touch-screen technologies and building upon it, and making it an open technology that various device makers could use. ... Steve Jobs felt very possessive about all of the look, the feel, the swipes, the multitouch gestures that you use — and was driven to absolute distraction when Android's operating system, developed by Google and used by hardware manufacturers, started doing the exact same thing. ... He was furious but that probably understates his feeling. He was really furious and he let Eric Schmidt, who was then the CEO of Google, know it."


On Jobs' adoptive parents

Isaacson: "When Steve got placed with [parents who were not college graduates], his biological mother balked at first, but ... the Jobs family made a pledge that they would start a college fund and make sure that Steve went to college."

On approaching Isaacson to write his biography

Isaacson: "It was 2004 and he had broached the subject of doing a biography of him and I thought, 'Well, this guy's in the midst of an up-and-down career and he has maybe 20 years to go.' So I said to him, 'I'd love to do a biography of you but let's wait 20 or so years until you retire.' Then off and on after 2004, we would be in touch. ...

"I finally talked to his wife, who was very good at understanding his legacy, and she said, 'If you're going to do a book on Steve, you can't just keep saying, 'I'll do it in 20 years or so.' You really ought to do it now.' This was 2009. Steve Jobs, that year, had had a liver transplant and I realized how sick he was. ... And so, that was when I realized that this was a very fascinating tale and this guy may or may not make it. I thought he was going to live much longer. But at the very least, he was facing the prospect of his mortality so it was time for him to be reflective and do a book."

On his final meeting with Jobs

Isaacson: "He was pretty sick. He was confined to the house. And he said to me, at the end of our long conversation, 'There will be things in this book I don't like, right?' And I said, 'Yes.' Partly because you can interview people right after a meeting they've had with Steve Jobs [and] you interview five people and get five different stories about what happened. ... People have different perceptions of who he is. ...

"He said, 'I'll make you this promise. I'm not going to read the book until next year, until after it comes out.' And it made me feel a grand emotion, of 'Oh! That's great. Steve is going to be alive for another year.' Because when you're around him, the power of his thinking really grabs you. I remember leaving his house and thinking, 'Oh, I'm so relieved. He'll be alive in a year. He just told me so.' Logically, I should have said, 'He doesn't know what ups and downs he's going to have with his health.' But I think that he always felt some miracle would come along because all of his life, miracles had come along."


Apple's Steve Jobs, Nevada biological father had chance meeting



Abdulfattah John Jandali

Steve Jobs never sought a relationship with his biological father, a Reno casino manager and one-time Las Vegas resident. But it turns out the pair shared at least one handshake before they ever knew they were related.

In his biography of the late Apple CEO, out in stores today, author Walter Isaacson writes that the pair met briefly when Jobs’ biological father, Abdulfattah John Jandali, ran a restaurant in Silicon Valley.

At the time, neither man realized they were father and son, and their meeting at the restaurant was strictly by chance.

“I remember meeting the owner, who was from Syria,” Jobs told Isaacson in an interview that aired Sunday night on "60 Minutes." “And I shook his hand and he shook my hand and that’s all.”

Jobs began searching for his biological mother in the 1980s, according to Isaacson. He met his mother, Joanne Simpson, who introduced him to his biological sister, novelist Mona Simpson.

“Then they go on a quest, a journey to find the birth father,” Isaacson told "60 Minutes." “Especially Mona wants to find what she calls the lost father.”

But when Mona Simpson found their father managing a restaurant in Sacramento, Jobs said he had no interest in meeting Jandali.

“I learned a little bit about him and I didn’t like what I learned,” Jobs said in an interview with Isaacson. “I asked her to not tell him that we had ever met and not tell him anything about me.”

Jandali, however, bragged to Mona Simpson about running “one of the best restaurants in Silicon Valley,” where everyone, “even Steve Jobs,” used to eat. At the time, Jandali apparently had no idea Jobs was his son.

Later, after he found out, Jandali for years closely guarded the secret of his biological son from all but his closest friends.

He often lamented that he never met Jobs, telling a British tabloid that his Syrian pride wouldn’t let him for fear it would appear he was after his son’s wealth.

Jandali, 80, is general manager of Boomtown Hotel & Casino. He’s managed a variety of restaurants in Reno and Las Vegas and worked briefly as a political science professor at the University of Nevada, Reno in the late 1960s.

Joanne Simpson and Jandali, who met as graduate students, put their infant son up for adoption in San Francisco. The couple later married and had a daughter, Mona Simpson. They divorced, however, and Jandali had no hand in raising either of his children.

Jandali, who is described by friends as having a keen intellect and a calm and friendly demeanor, has granted few interviews. Most in Reno were unaware of his famous progeny.

In the "60 Minutes" interview, Isaacson noted that a sense of abandonment resonated throughout Jobs’ life — including in 1985 when he was kicked out of Apple.

“He always had that feeling of abandonment,” Isaacson said. “There was nothing worse than being abandoned by Apple.”

But Jobs’ adoption also gave him a sense of “being chosen” and special enough to attempt the seemingly impossible.

“I remember right here on the lawn telling (the girl from across the street) that I was adopted,” Jobs said. “And she said, ‘So does that mean your real parents didn’t want you?’ Ooooo, lightning bolt! I remember running into the house crying, asking my parents. They sat me down and said, ‘No, you don’t understand. We specifically picked you out.’”

That moment, that sense of being chosen, Isaacson said, “is the key to understanding Steve Jobs.”

The Biographer’s Dilemma


I was halfway through Walter Isaacson’s new biography of Steve Jobs when I suddenly went searching through my bookshelf for the book he wrote about Benjamin Franklin. I had read the latter biography when it came out in 2003, and I remembered it fondly. I was trying to figure out why “Steve Jobs,” despite being full of new information about the most compelling businessman of the modern era, was leaving me cold.

It didn’t take long to find the answer. “Benjamin Franklin is the founding father who winks at us,” wrote Isaacson early in Chapter One of “Benjamin Franklin: An American Life.” Oh, for such a sentence in “Steve Jobs”! Oh, for such an insight.

Let me acknowledge that the task facing Isaacson was daunting. He began his research in early 2009, knowing that Jobs had cancer and that his time remaining on this earth was likely to be brief. Although many books have been written about Jobs, Isaacson was the first writer the Apple co-founder had ever cooperated with. (Indeed, it was Jobs who approached Isaacson about writing his biography.) They spoke more than 40 times, about all aspects of Jobs’s life — including his personal life, which he had always guarded fiercely.

Combine that with the enormousness of Jobs’s accomplishments — from starting the personal computer industry in his garage to creating a half-dozen of the most iconic consumer products ever invented — and it’s practically a miracle that Isaacson’s book was published as quickly as it was. (The official publication date was Monday.)

Its 627 pages are, indeed, chock-full of revelations, from Jobs’s difficult relationship with a daughter he fathered in his early 20s — and then abandoned for years — to the lessons he learned from his adoptive father, whom he adored. We go behind the scenes during the boardroom battle that forced Jobs out of Apple in 1985 — as well as the one that brought him back a decade later.

“Steve Jobs” offers so many examples of his awful behavior — incorrigible bullying, belittling and lying — that you’re soon numb to them. Isaacson gives us the back story of all of Jobs’s creations, from the Apple II to the iPad. His descriptions of the more recent products — iPod to iPhone to iPad — have a flat, rushed quality, as if the author was racing to finish before his subject died. Chances are, he was.

That there is such a hunger for information about this most private of men is undeniable; that’s why the book went to No. 1 on Amazon’s best-sellers list practically the moment Jobs died. But facts alone — even previously unknown facts — do not, by themselves, make for great biographies. What is required for that is genuine insight. And that is where “Steve Jobs” falls down.

Part of the problem, I think, is that the bond that developed between subject and writer made it nearly impossible for Isaacson to get the kind of critical distance he needed to take his subject’s true measure. He didn’t just interview Jobs; he watched him die. There is a moving scene near the end of the book, with an emaciated Jobs, lying in bed, leafing through photographs with Isaacson, reminiscing. How can one possibly keep critical distance from a subject when such moments are part of one’s experience of him?

“I think there will be a lot in your book that I won’t like,” Jobs tells Isaacson during that conversation, two months before he died. Isaacson agrees, but I don’t. Jobs’s bad behavior is something he never denied. He rationalized it as his way of getting the most out of people — and Isaacson largely accepts this rationalization. An alternative notion — that Jobs was an emotional child his whole life — is something the readers have to come to themselves, by reading between the lines.

When you think about it, it is rare for a truly great biography to be written about someone who is living; in my lifetime, the only one I can think of is “The Power Broker,” Robert Caro’s monumental biography of Robert Moses. When the subjects are alive — and Jobs was still alive when this book was finished — biographers always feel them looking over their shoulders, and pushing back. Jobs does that often with Isaacson, rejecting, for instance, the idea that his own abandonment by his natural parents had a major effect on him. Invariably, at such moments, Isaacson backs off and gives Jobs the last word.

There is another kind of distance biographies of the living lack — the distance of time. It can take decades to truly understand the context in which the subject’s life and achievements played out. Often we need to see what happens after he is gone to realize his true impact on our world. Steve Jobs has been dead for three weeks. We’re not even close to that understanding.

In “Steve Jobs,” Walter Isaacson has recounted a life — a big, sprawling, amazing life. It is a serious accomplishment. What remains for future biographers is to make sense of that life.