In the three years since ChatGPT launched, interest in AI (Artificial Intelligence), and especially in LLMs (Large Language Models), has exploded, driving plenty of buzz and movement of money, if not always innovation. In the US alone, the top five companies by market cap (NVIDIA, Apple, Alphabet, Microsoft, and Amazon) are all deeply tied to AI and together account for over a quarter of total U.S. market capitalization. Over the last few years we have seen eye-popping figures for investment in, energy usage of, and capacity projections for a new wave of data centers built to power next-generation AI. Recent months have brought growing debate over whether these numbers point to a bubble, but for now there is no doubt that significant work is going into developing a new generation of AI-focused data centers, with plenty of that work falling on the power electronics side.
Back in 2011, an open-source consortium called the Open Compute Project (OCP) was founded. It began as an internal Facebook working group built around the idea of sharing data center designs to streamline the supply chain [1], setting a de facto standard not unlike Tesla's strategy of building first-to-market 400 V chargers and forcing other manufacturers to follow suit (at least in the United States) [2]. Today, OCP maintains standards covering many aspects of data center design, including those that helped usher in the move from 12 V to 48 V (54 V) rails. That move enabled higher power delivery (3.6 kW, up from 3.2 kW) in new data center builds across a variety of applications.
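To see why raising the rail voltage helps, a quick back-of-the-envelope sketch is enough: for the same delivered power, bus current scales inversely with voltage, and resistive conduction loss scales with the square of the current. The snippet below is illustrative only, using the 3.6 kW figure quoted above and basic Ohm's-law arithmetic; it is not drawn from any OCP specification.

```python
# Illustrative sketch (not from an OCP spec): how moving a distribution rail
# from 12 V to the 48 V (nominally 54 V) class affects bus current and
# I^2 * R conduction losses for the same delivered power.

def bus_current(power_w: float, voltage_v: float) -> float:
    """Current drawn on the distribution bus for a given power and rail voltage."""
    return power_w / voltage_v

power = 3600.0  # 3.6 kW, the figure quoted above

i_12v = bus_current(power, 12.0)   # 300 A
i_54v = bus_current(power, 54.0)   # ~66.7 A

print(f"12 V bus current: {i_12v:.0f} A")
print(f"54 V bus current: {i_54v:.1f} A")

# For a fixed bus resistance R, I^2 * R loss drops by (54/12)^2 = 20.25x.
loss_ratio = (i_12v / i_54v) ** 2
print(f"Conduction-loss reduction: {loss_ratio:.2f}x")
```

The same arithmetic also explains the thinner busbars and cabling that 48 V-class racks can use: a roughly 4.5x lower current relaxes conductor sizing as well as loss budgets.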