Sam Altman, chief executive officer of OpenAI Inc., speaks during BlackRock's 2026 Infrastructure Summit in Washington, DC, US, on Wednesday, March 11, 2026.
Daniel Heuer | Bloomberg | Getty Images
When OpenAI CEO Sam Altman took the stage at BlackRock's U.S. Infrastructure Summit earlier this month, he acknowledged his company is facing a harsh reality: data centers are hard.
"Anything at this scale, it's just like so much stuff goes wrong," Altman said during a fireside chat at the conference in Washington, D.C.
Altman gave an example of a severe weather event at a data center campus in Abilene, Texas, that temporarily "brought things down." The facility serves as the flagship site of OpenAI, Oracle and SoftBank's $500 billion Stargate project. Altman said his company has also been navigating supply chain challenges and pressure to meet tight deadlines.
The stakes for Altman are rising as he aims to turn OpenAI, which was valued at $730 billion in a record fundraising round last month, from a private market darling into an investable asset for a more discerning class of public market fund managers. That has meant retreating from some hefty spending plans, shelving certain ambitious projects and accepting OpenAI's role as a buyer of huge amounts of cloud capacity rather than as a builder of mammoth data centers.
"OpenAI has come to the realization that the market doesn't necessarily appreciate the reckless approach to growth and spending," Daniel Newman, CEO of Futurum Group, told CNBC in an interview. "The market wants to see OpenAI's revenues rolling at a pace in which the spending can be justified. The pivot, in my opinion, has been to try to show a little bit more fiscal responsibility."
The strategic shift means OpenAI may have to settle for doing less while simultaneously trying to compete with Anthropic, Google and a host of other companies developing AI models, apps and features. OpenAI trains and runs AI models that require enormous amounts of computational resources, including chips, processing power, memory and energy. Altman and other OpenAI executives have for years stressed that compute is a major bottleneck for the company, which has proceeded to raise astronomical sums of money, including $110 billion earlier this year, with $50 billion coming from Amazon.
In a post on X in November, Altman wrote that OpenAI and other companies "have to rate limit our products and not offer new features and models because we face such a severe compute constraint."

Up to that point, the big story for OpenAI last year was the extreme lengths Altman went to secure capacity. The company inked a flurry of multibillion-dollar infrastructure deals with companies including Nvidia, Advanced Micro Devices and Broadcom. Altman said in his November post that OpenAI had made commitments of roughly $1.4 trillion over the next eight years.
The deals rattled public markets, sparked fears about a potential AI bubble and led many investors to question how OpenAI could afford to make such eye-popping commitments with $13.1 billion in revenue for the year.
OpenAI's most notable announcement was with Nvidia. The chipmaker, which is also the world's most valuable company, agreed in September to invest up to $100 billion in the startup over a number of years, with capital distribution tied to OpenAI's buildout and use of Nvidia's technology. OpenAI said it planned to deploy at least 10 gigawatts of Nvidia systems, with the first $10 billion of funding arriving alongside completion of the first gigawatt, a unit of power that is roughly comparable to the electricity consumption of a mid-sized city.
The press release said the partnership "enables OpenAI to build and deploy at least 10 gigawatts of AI data centers."
Analysts told CNBC at the time that the deal was reminiscent of the vendor financing that fueled the dot-com bubble in the late 1990s. Altman repeatedly brushed off concerns about OpenAI's ambitious infrastructure plans, suggesting that revenue would balloon into the hundreds of billions by 2030.
But in recent months, as the company has been gearing up for a potential IPO later this year, OpenAI has tempered expectations and outlined a more measured strategy. OpenAI told investors in February that it is now targeting roughly $600 billion in total compute spend by 2030, a figure that is meant to tie more directly to its expected revenue growth.
The company is emphasizing discipline across other corners of its business as well. In December, OpenAI declared a "code red" to focus on improving its ChatGPT chatbot in the face of rising competition from Google and Anthropic.
Fidji Simo, OpenAI's CEO of applications, held an all-hands meeting with staffers earlier this month about the enterprise business, and said the company is "orienting aggressively" toward high-productivity use cases.
"What really matters for us right now is staying focused and executing extremely well," Simo said, according to a partial transcript of the meeting reviewed by CNBC.
‘This is the race’
The Stargate AI data center in Abilene, Texas, US, on Wednesday, Sept. 24, 2025.
Kyle Grillot | Bloomberg | Getty Images
OpenAI does not currently own any data centers, and may not for the foreseeable future, according to people familiar with the matter who asked not to be named because they weren't authorized to speak publicly.
Instead, it has opted to lean heavily on partners like Oracle, Microsoft and Amazon, trying to piece together as much capacity as possible.
A year ago, things looked very different for OpenAI. In January 2025, President Donald Trump unveiled the Stargate project alongside Altman, SoftBank CEO Masayoshi Son and Oracle Chairman Larry Ellison during an event at the White House. The companies pledged to deploy $500 billion over four years to build out new AI infrastructure in the U.S.
OpenAI would be responsible for project operations, while SoftBank would be in charge of the finances, according to a blog post at the time. Oracle and Nvidia were named as key initial technology partners.
"Oracle, Nvidia, and OpenAI will closely collaborate to build and operate this computing system," the release said.
As Stargate got underway, OpenAI was prepared to develop large portions of the project itself, and it aimed to directly lease or own some data center campuses, according to a report from The Information. But after the company came face to face with practical construction issues and struggled to secure backing from lenders, it pivoted.
Oracle is leasing Stargate's data center campus in Abilene, and has been funding the buildout by taking on tens of billions of dollars in debt.
OpenAI and Nvidia said in their September release that the first gigawatt of Nvidia systems would be deployed in the second half of 2026. Experts said that timeline would be tough in the best of circumstances.
Walid Saad, an engineering professor at Virginia Tech, said building a 1-gigawatt data center from start to finish could take anywhere from three to 10 years. Challenges can crop up at every step of the way, from finding a site, securing the proper permits, accessing power and constructing the physical building to delivering the hardware and finally bringing it online.
"There's regulations, there's permits, different locations have different processes," Saad said. "There are processes they cannot control. You never know what pops up."
Those obstacles have become very real for OpenAI, Arun Chandrasekaran, an AI analyst at Gartner, told CNBC in an interview.
"They're starting to say, 'You know what, let's try to secure the capacity that we can from the providers that are willing to give us that capacity now,'" Chandrasekaran said.
OpenAI did not provide a comment for this story.

As part of OpenAI's $110 billion financing announcement last month, the company agreed to consume roughly 2 gigawatts of Trainium capacity through Amazon Web Services infrastructure. Trainium is AWS' custom AI chip. Amazon introduced the latest version, Trainium3, in December.
Nvidia also contributed to OpenAI's funding round, investing $30 billion. OpenAI said it expanded its collaboration with Nvidia as part of the deal, and agreed to use 3 gigawatts of dedicated inference capacity and 2 gigawatts of training capacity on Nvidia's forthcoming Vera Rubin systems.
"OpenAI is doing what it must do, which is gain access to compute at scale," Futurum Group's Newman said, adding that Meta, Anthropic and Google are doing the same. "This is the race."
Nvidia's investment landed after months of speculation about the status of the major infrastructure deal that the companies announced in September. The chipmaker disclosed in a quarterly filing in November that the $100 billion deal might not come to fruition, and The Wall Street Journal reported in January that the agreement was "on ice."
Nvidia noted in a February filing that there was "no assurance" that the company would enter into an "investment and partnership agreement with OpenAI or that a transaction will be completed."
At a conference earlier this month, Nvidia CEO Jensen Huang reined in expectations even further, saying that the opportunity to invest $100 billion in OpenAI is probably "not in the cards."
The latest investment is not tied to any deployment milestones, and is distinct from the deal structure the companies touted six months ago. Huang said it "might be the last time" Nvidia invests in OpenAI ahead of its IPO.
"To their credit, they built an incredible growth story. It's just that the rest of the ride won't be a free one," Newman said of OpenAI. "And because their cost structure is so high, their route to profitability will be scrutinized every step of the way."
–CNBC’s Kate Rooney contributed to this report
WATCH: OpenAI renews focus on enterprise in all-hands meeting amid IPO push
