Nonprofit group joins Elon Musk’s effort to block OpenAI’s for-profit transition


Encode, the nonprofit org that co-sponsored California’s ill-fated SB 1047 AI safety bill, has requested permission to file an amicus brief in support of Elon Musk’s motion for an injunction to halt OpenAI’s transition to a for-profit.

In a proposed brief submitted to the U.S. District Court for the Northern District of California on Friday afternoon, counsel for Encode said that OpenAI’s conversion to a for-profit would “undermine” the firm’s mission to “develop and deploy … transformative technology in a way that is safe and beneficial to the public.”

“OpenAI and its CEO, Sam Altman, claim to be developing society-transforming technology, and those claims should be taken seriously,” the brief read. “If the world truly is at the cusp of a new age of artificial general intelligence (AGI), then the public has a profound interest in having that technology controlled by a public charity legally bound to prioritize safety and the public benefit rather than an organization focused on generating financial returns for a few privileged investors.”

OpenAI was founded in 2015 as a nonprofit research lab. But as its experiments became increasingly capital-intensive, it created its current structure, taking on outside investments from VCs and companies, including Microsoft.

Today, OpenAI has a for-profit org controlled by a nonprofit with a “capped profit” share for investors and employees. But in a blog post this morning, the company said it plans to begin transitioning its existing for-profit into a Delaware Public Benefit Corporation (PBC), with ordinary shares of stock and the OpenAI mission as its public benefit interest.

OpenAI’s nonprofit will remain but will cede control in exchange for shares in the PBC.

In late November, Musk filed for a preliminary injunction to halt OpenAI’s long-planned transition to a for-profit. He accused OpenAI of abandoning its original philanthropic mission of making the fruits of its AI research available to all, and of depriving rivals, including his own AI startup, xAI, of capital through anticompetitive means.

OpenAI has called Musk’s complaints “baseless” and simply a case of sour grapes.

Facebook’s parent company and AI rival Meta is also supporting efforts to block OpenAI’s conversion. In December, Meta sent a letter to California Attorney General Rob Bonta arguing that allowing the shift would have “seismic implications for Silicon Valley.”

Lawyers for Encode said that OpenAI’s plans to transfer control of its operations to a PBC would “convert an organization bound by law to ensure the safety of advanced AI into one bound by law to ‘balance’ its consideration of any public benefit against ‘the pecuniary interests of [its] stockholders.’”

Encode’s counsel notes in the brief, for example, that OpenAI’s nonprofit has committed to stop competing with any “value-aligned, safety-conscious project” that comes close to building AGI before it does, but that a for-profit OpenAI would have less incentive to honor that pledge. The brief also points out that, once the restructuring is complete, the nonprofit’s board will no longer be able to cancel investors’ equity if needed for safety.

OpenAI continues to deal with an outflow of high-level talent due in part to concerns that the company is prioritizing commercial products at the expense of safety. One former employee, Miles Brundage, a longtime policy researcher who left OpenAI in October, said in a series of posts on X that he worries about OpenAI’s nonprofit becoming a “side thing” that gives license to the PBC to operate as a “normal company” without addressing potentially problematic areas.

“OpenAI’s touted fiduciary duty to humanity would evaporate, as Delaware law is clear that the directors of a PBC owe no duty to the public at all,” Encode’s brief continued. “The public interest would be harmed by a safety-focused, mission-constrained nonprofit relinquishing control over something so transformative at any price to a for-profit enterprise with no enforceable commitment to safety.”

Encode, founded in July 2020 by high school student Sneha Revanur, describes itself as a network of volunteers focused on ensuring that the voices of younger generations are heard in conversations about AI’s impacts. Beyond SB 1047, Encode has contributed to state and federal AI policy efforts, including the White House’s AI Bill of Rights and President Joe Biden’s executive order on AI.

