It’s not easy to get an interview with Sam Altman: just ask Adam Bhala Lough, the filmmaker behind the recent documentary Deepfaking Sam Altman.
Lough originally planned a documentary exploring the promise and perils of AI that would center on a conversation with the OpenAI CEO. But, after having his requests ignored for months, he opted instead to commission a chatbot that mimicked Altman’s speech patterns and approximated his facial expressions by way of a digital avatar.
The real Altman did sit down, however, for the new documentary The AI Doc: Or How I Became an Apocaloptimist, which hits theaters March 27. So did Dario Amodei, the CEO of Anthropic, and Demis Hassabis, a cofounder and CEO of Google’s DeepMind Technologies. (Though the filmmakers say they requested interviews with Meta’s Mark Zuckerberg and X’s Elon Musk, neither made an appearance.)
It’s an impressive level of access for codirector and documentary protagonist Daniel Roher, whose 2022 documentary Navalny, about the Russian opposition leader Alexei Navalny, won an Academy Award. The problem is that once they’re on camera, Altman et al. say little we haven’t heard before, and they skate by on glib answers concerning their responsibilities to the rest of their species. When Roher asks Altman why anyone should trust him to guide the rapid acceleration of AI, given its extreme ramifications, Altman replies: “You shouldn’t.” The line of questioning ends there.
The AI Doc is framed by Roher’s anxiety over the impending arrival of his son and first child with his wife, filmmaker Caroline Lindy. He wonders what kind of a world his son will inherit and whether the rise of artificial intelligence will preclude the experiences that shape us into self-sufficient adults. In Roher’s first several interviews, all his worst fears seem to be confirmed. Tristan Harris, cofounder of the nonprofit Center for Humane Technology, delivers one of the worst gut punches: “I know people who work on AI risk who don’t expect their children to make it to high school,” he says, invoking a scenario in which the technology demolishes the very infrastructure of traditional education.
Despite the sense of mounting panic, Roher and codirector Charlie Tyrell present an admirably robust crash course in AI and the biggest questions it poses, helped along by Roher’s insistence on defining terms in plain language rather than startup buzzwords. Visually, the film is charmingly human, featuring colorful drawings and paintings by Roher, while whimsical stop-motion sequences hint at the influence of producer Daniel Kwan, the Oscar-winning codirector of Everything Everywhere All at Once. The vibrant creativity amid portents of doom provides some of the hope that Roher is desperately seeking.
Yet early interviews with Silicon Valley techno-optimists promising AI that conquers diseases and climate change, followed by the CEOs striking their usual balance between hype and the tones of sober caution, pass without much interrogation of grandiose claims. There is hardly a moment spent considering why or how we should expect the current crop of fallible large language models to give rise to the mythical “artificial general intelligence” (AGI) that would outstrip human cognition. There are, at best, euphemistic acknowledgements (from venture capitalist Reid Hoffman, for example) that any benefits will come along with unspecified harms.
Even when the top players say that the near-term implications of AI are as important as the advent of nuclear weapons, they are defaulting to a familiar playbook, presenting their products as singularly consequential one way or another, hinting that only they can be trusted to advance them.