OpenAI has struck letters of intent with South Korea’s leading chipmakers Samsung Electronics and SK Hynix to supply advanced memory wafers for its massive “Stargate” AI infrastructure initiative. In Seoul, CEO Sam Altman met with President Lee Jae-myung and chip executives to formalize the arrangement. Under the agreement, Samsung and SK Hynix will jointly target a production scale of up to 900,000 DRAM wafers per month — a demand that could represent as much as 40 percent of anticipated global DRAM output. The partnership also contemplates building AI data centers in South Korea, with OpenAI collaborating with SK Telecom and Samsung affiliates to explore floating data center designs and expanded local campus deployments. The deal reflects the urgent push by OpenAI to secure supply chains for memory — a critical component in scaling its AI compute operations.
Key Takeaways
– The agreement targets supplying OpenAI with up to 900,000 DRAM wafers per month, a volume that could approach 40 percent of global DRAM capacity.
– The partnership is more than chip procurement — it includes plans for data center construction and technological integration in South Korea, positioning the country as a strategic node in AI infrastructure.
– Market reaction was swift: shares of Samsung and SK Hynix surged after the announcement, reflecting investor confidence in long-term demand for AI-driven semiconductor supply chains.
In-Depth
OpenAI’s rapid scaling ambitions under its Stargate project demand not just compute power but also reliable, high-throughput memory. DRAM, and the high-bandwidth memory (HBM) stacks built from it, is a foundational building block of high-performance AI workloads, so securing supply from established leaders in the field is a strategic necessity. Samsung and SK Hynix, which together dominate the DRAM and HBM markets, are well positioned to meet that demand.
In Seoul, OpenAI CEO Sam Altman, President Lee Jae-myung, Samsung’s leadership, and SK’s top executives convened to formalize a letter of intent. The agreement tasks Samsung and SK Hynix with scaling advanced memory production to support OpenAI’s AI data center fleet. The targeted throughput — 900,000 wafers monthly — isn’t a casual footnote; analysts suggest that amount could approach 40 percent of global DRAM wafer starts. That scale underscores how staggering the memory demand is when building AI infrastructure at hyperscale.
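For a rough sense of scale, the sketch below simply inverts the figures cited above: the implied global base of about 2.25 million wafer starts per month is a back-calculation from the 900,000-wafer target and the analysts' approximate 40 percent share, not an independently reported number.

```python
# Back-of-the-envelope check on the figures in the announcement.
# The 40 percent share is the analysts' estimate cited above, not a hard figure,
# so the implied global capacity is an inference, not reported data.

target_wafers_per_month = 900_000      # Samsung / SK Hynix target for OpenAI
estimated_share_of_global = 0.40       # analysts' approximate share of DRAM wafer starts

implied_global_capacity = target_wafers_per_month / estimated_share_of_global
print(f"Implied global DRAM wafer starts: ~{implied_global_capacity:,.0f} per month")
# -> Implied global DRAM wafer starts: ~2,250,000 per month
```

Read the other way, the same arithmetic is what makes the 40 percent figure plausible: against an industry base on the order of a couple of million wafer starts per month, a single customer asking for 900,000 is an enormous share.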
Beyond chips, the deal outlines collaboration in hardware deployment, logistics, and infrastructure design. OpenAI is exploring partnerships for AI data centers in South Korea, including working with SK Telecom and Samsung’s business units to integrate ChatGPT Enterprise, operate local campus sites, and even build floating data centers (which can help with cooling and land constraints). These steps deepen the relationship from supplier to strategic collaborator.
The deal also carries geopolitical and industrial implications. South Korea gains a prominent role in the global AI supply chain, positioning it to attract both investment and talent. For OpenAI, the agreement helps mitigate one of the biggest bottlenecks in scaling AI: memory supply. Investors reacted favorably, with SK Hynix shares jumping by double digits and Samsung also posting gains, reflecting market conviction that demand for AI-related memory is here to stay.
Still, challenges remain. The timeline for ramping wafer supply and turning those wafers into usable memory modules has not been fully spelled out, and fab capacity, advanced packaging, quality control, and global logistics all have to be coordinated. Absorbing demand at this scale will also strain production capacity that must be balanced against existing customers' orders. In short, while the agreement is a potent signal of commitment and direction, execution will demand flawless engineering, management, and alignment across multiple sectors.

