Unless you are just returning from an extended vacation on a different planet, you have likely heard a thing or two about the exponential projections for the penetration of AI applications into every aspect of everyday life, across the full spectrum of consumer to industrial use cases. Like any major hype cycle [1], the excitement (and zealous investment) tends to be driven by the return on investment (ROI) upside that comes with an unbalanced perspective on the opportunities, while less attention goes to the challenges and the pragmatism required to execute on all those fantastic PowerPoint projections (see “vaporware”). I do not believe it is a coincidence that “hype” is the prefix of the word “hyperbole,” but I am no etymologist.
I will admit I thought there was more hyperbole than reality when I first heard these crazy, exponential projections of year-over-year (YoY) growth in data center power footprint demands. To be fair, after spending the last 2+ years digging deeper into the rumors, from high levels to low, things do not seem quite as hyperbolic as they initially appeared. NVIDIA seems poised to enable the >1 MW server rack before the end of the decade; solid-state (even bidirectional!) replacements for line-frequency (MV-to-LV) transformers, circuit breakers, and relays are coming to market; EV supercharger and battery management system (BMS) technologies are being leveraged to support 800 Vdc (or +/−400 Vdc) power delivery networks within the data center infrastructure; and so on.