Is Sam Altman Creating Himself In The Image Of Robert Oppenheimer Or Bill Gates?

  • October 3, 2023
Some claim OpenAI CEO Sam Altman is the 21st-century version of Robert Oppenheimer, the director of the laboratory at Los Alamos, New Mexico, where the atomic bomb was designed. Oppenheimer was an administrator; Altman is an administrator, venture capitalist, and opportunist. Altman is a pedestrian programmer who clawed his way to the top of a budding empire, just like Bill Gates, who created Microsoft in 1975.

Gates purchased all the early software from others. The programmers he hired after that did all the work while he wined and dined the corporate elite and governments around the world. Gates tried desperately to posture himself as a genius, just as Altman is doing today.

Sam Altman is creating himself in the image of Bill Gates, not Oppenheimer. ⁃ TN Editor

This past spring, Sam Altman, the 38-year-old CEO of OpenAI, sat down with Silicon Valley’s favorite Buddhist monk, Jack Kornfield. This was at Wisdom 2.0, a low-stakes event at San Francisco’s Yerba Buena Center for the Arts, a forum dedicated to merging wisdom and “the great technologies of our age.” The two men occupied huge white upholstered chairs on a dark mandala-backed stage. Even the moderator seemed confused by Altman’s presence.

“What brought you here?” he asked.

“Yeah, um, look,” Altman said. “I’m definitely interested in this topic” — officially, mindfulness and AI. “But, ah, meeting Jack has been one of the great joys of my life. I’d be delighted to come hang out with Jack for literally any topic.”

It was only when Kornfield — who is 78 and whose books, including The Wise Heart, have sold more than a million copies — made his introductory remarks that the agenda became clear.

“My experience is that Sam … the language I’d like to use is that he’s very much a servant leader.” Kornfield was here to testify to the excellence of Altman’s character. He would answer the question that’s been plaguing a lot of us: How safe should we feel with Altman, given that this relatively young man in charcoal Chelsea boots and a gray waffle henley appears to be controlling how AI will enter our world?

Kornfield said he had known Altman for several years. They meditated together. They explored the question: How could Altman “build in values — the bodhisattva vows, to care for all beings”? How could compassion and care “be programmed in in some way, in the deepest way?”


Throughout Kornfield’s remarks, Altman sat with his legs uncrossed, his hands folded in his lap, his posture impressive, his face arranged in a manner determined to convey patience (though his face also made it clear patience is not his natural state). “I am going to embarrass you,” Kornfield warned him. Then the monk once again addressed the crowd: “He has a pure heart.”

For much of the rest of the panel, Altman meandered through his talking points. He knows people are scared of AI, and he thinks we should be scared. So he feels a moral responsibility to show up and answer questions. “It would be super-unreasonable not to,” he said. He believes we need to work together, as a species, to decide what AI should and should not do.

By Altman’s own assessment — discernible in his many blog posts, podcasts, and video events — we should feel good but not great about him as our AI leader. As he understands himself, he’s a plenty-smart-but-not-genius “technology brother” with an Icarus streak and a few outlier traits. First, he possesses, he has said, “an absolutely delusional level of self-confidence.” Second, he commands a prophetic grasp of “the arc of technology and societal change on a long time horizon.” Third, as a Jew, he is both optimistic and expecting the worst. Fourth, he’s superb at assessing risk because his brain doesn’t get caught up in what other people think.

On the downside: He’s neither emotionally nor demographically suited for the role into which he’s been thrust. “There could be someone who enjoyed it more,” he admitted on the Lex Fridman Podcast in March. “There could be someone who’s much more charismatic.” He’s aware that he’s “pretty disconnected from the reality of life for most people.” He is also, on occasion, tone-deaf. For instance, like many in the tech bubble, Altman uses the phrase “median human,” as in, “For me, AGI” — artificial general intelligence — “is the equivalent of a median human that you could hire as a co-worker.”


At Yerba Buena, the moderator pressed Altman: How did he plan to assign values to his AI?

One idea, Altman said, would be to gather up “as much of humanity as we can” and come to a global consensus. You know: Decide together that “these are the value systems to put in, these are the limits of what the system should never do.”

The audience grew quiet.

“Another thing I would take is for Jack” — Kornfield — “to just write down ten pages of ‘Here’s what the collective value should be, and here’s how we’ll have the system do that.’ That’d be pretty good.”

The audience got quieter still.

Altman wasn’t sure if the revolution he was leading would, in the fullness of history, be considered a technological or societal one. He believed it would “be bigger than a standard technological revolution.” Yet he also knew, having spent his entire adult life around tech founders, that “it’s always annoying to say ‘This time it’s different’ or ‘You know, my thing is supercool.’” The revolution was inevitable; he felt sure about that. At a minimum, AI will upend politics (deep fakes are already a major concern in the 2024 presidential election), labor (AI has been at the heart of the Hollywood writers’ strike), civil rights, surveillance, economic inequality, the military, and education. Altman’s power, and how he’ll use it, is all of our problem now.

Yet it can be hard to parse who Altman is, really; how much we should trust him; and the extent to which he’s integrating others’ concerns, even when he’s on a stage with the intention of quelling them. Altman said he would try to slow the revolution down as much as he could. Still, he told the assembled, he believed that it would be okay. Or likely be okay. We — a tiny word with royal overtones that was doing a lot of work in his rhetoric — should just “decide what we want, decide we’re going to enforce it, and accept the fact that the future is going to be very different and probably wonderfully better.”

This line did not go over well either.

“A lot of nervous laughter,” Altman noted.

Then he waved his hands and shrugged. “I can lie to you and say, ‘Oh, we can totally stop it.’ But I think this is …”

Altman did not complete this thought, so we picked the conversation back up in late August at the OpenAI office on Bryant Street in San Francisco. Outside, on the street, is a neocapitalist yard sale: driverless cars, dogs lying in the sun beside sidewalk tents, a bus depot for a failing public-transportation system, stores serving $6 lattes. Inside, OpenAI is low-key kinda-bland tech corporate: Please help yourself to a Pellegrino from the mini-fridge or a sticker of our logo.

In person, Altman is more charming, more earnest, calmer, and goofier — more in his body — than one would expect. He’s likable. His hair is flecked with gray. He wore the same waffle henley, a garment quickly becoming his trademark. I was the 10-billionth journalist he spoke to this summer. As we sat down in a soundproof room, I apologized for making him do yet one more interview.

He smiled and said, “It’s really nice to meet you.”

On Kornfield: “Someone said to me after that talk, ‘You know, I came in really nervous about the fact that OpenAI was gonna make all of these decisions about the values in the AI, and you convinced me that you’re not going to make those decisions,’ and I was like, ‘Great.’ And they’re like, ‘Nope, now I’m more nervous. You’re gonna let the world make these decisions, and I don’t want that.’”

Even Altman can feel it’s perverse that he’s on that stage answering questions about global values. “If I weren’t in on this, I’d be, like, Why do these fuckers get to decide what happens to me?” he said in 2016 to The New Yorker’s Tad Friend. Seven years and much media training later, he has softened his game. “I have so much sympathy for the fact that something like OpenAI is supposed to be a government project.”

The new nice-guy vibe can be hard to square with Altman’s will to power, which is among his best-established traits. A friend in his inner circle described him to me as “the most ambitious person I know who is still sane, and I know 20,000 people in Silicon Valley.”

Still, Altman took an aw-shucks approach to explaining his rise. “I mean, I am a midwestern Jew from an awkward childhood at best, to say it very politely. And I’m running one of a handful …” He caught himself. “You know, top few dozen of the most important technology projects. I can’t imagine that this would have happened to me.”

Altman grew up the oldest of four siblings in suburban St. Louis: three boys, Sam, Max, and Jack, each two years apart, then a girl, Annie, nine years younger than Sam. If you weren’t raised in a midwestern middle-class Jewish family — and I say this from experience — it’s hard to imagine the latent self-confidence such a family can instill in a son. “One of the very best things my parents did for me was constant (multiple times a day, I think?) affirmations of their love and belief that I could do anything,” Jack Altman has said. The stores of confidence that result are fantastical, narcotic, weapons grade. They’re like an extra valve in your heart.

The story that’s typically told about Sam is that he was a boy genius — “a rising star in the techno whiz-kid world,” according to the St. Louis Post-Dispatch. He started fixing the family VCR at age 3. In 1993, for his 8th birthday, Altman’s parents — Connie Gibstine, a dermatologist, and Jerry Altman, a real-estate broker — bought him a Mac LC II. Altman describes that gift as “this dividing line in my life: before I had a computer and after.”

The Altman family ate dinner together every night. Around the table, they’d play games like “square root”: Someone would call out a large number. The boys would guess. Annie would hold the calculator and check who was closest. They played 20 Questions to figure out each night’s surprise dessert. The family also played Ping-Pong, pool, board games, video games, and charades, and everybody always knew who won. Sam preferred this to be him. Jack recalled his brother’s attitude: “I have to win, and I’m in charge of everything.” The boys also played water polo. “He would disagree, but I would say I was better,” Jack told me. “I mean, like, undoubtedly better.”

Sam, who is gay, came out in high school. This surprised even his mother, who had thought of Sam “as just sort of unisexual and techy.” As Altman said on a 2020 podcast, his private high school was “not the kind of place where you would really stand up and talk about being gay and that was okay.” When he was 17, the school invited a speaker for National Coming Out Day. A group of students objected, “mostly on a religious basis but also just, like, gay-people-are-bad basis.” Altman decided to give a speech to the student body. He barely slept the night before. The last lines, he said on the podcast, were “Either you have tolerance to open community or you don’t, and you don’t get to pick and choose.”
