We power AI

AI is the transformational technology of our time, enabling important use cases such as large language models, healthcare, and financial applications. But we rarely talk about what it takes to power all these applications. Why does this topic matter? Behind the brilliance of AI lies a compute- and electricity-intensive process with a staggering CO2 footprint. At Infineon, we are proud to work at the forefront of energy-efficient, robust solutions to power AI from the grid to the core. Dive into the key challenges AI brings and explore the different ways to address them, guided by our team of experts.

Learn more about Infineon’s solutions for data centers here.


Any questions or comments? Write us at wepowerAI@infineon.com

We power AI with Adam White

Everybody talks about AI, but what are the consequences of this surge in data generation and processing? Our expert Adam White dives into what GenAI means in terms of power requirements and environmental impact, and introduces possible solutions.

The power struggle: tackling AI data centers’ energy requirements

Is power becoming a bottleneck in AI servers? In this second episode of our “We power AI” podcast series, Athar Zaidi, Senior Vice President and Business Line Head of Power ICs and Connectivity Systems in Infineon’s Power & Sensor Systems Division, explains how data center infrastructure must adapt to sustain the increasing power requirements of GenAI, outlining different strategies to boost efficiency.

Optimizing AI server power flow with 48 V architectures and vertical power delivery

How is GenAI impacting power design? In this third episode of our “We power AI” podcast series, Carl Smith, Senior Director of the Information & Communication Technologies application, explains how data center infrastructure is shifting from the traditional 12 V power distribution architecture and lateral core rail power delivery to a more efficient, higher-density 48 V architecture with vertical power delivery.
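The motivation behind the 48 V shift can be sketched with a back-of-the-envelope calculation: for a fixed power draw, current scales as I = P / V, and the conduction loss in the distribution path scales as I²R, so quadrupling the bus voltage cuts that loss by a factor of 16. The figures below (1 kW load, 1 mΩ path) are illustrative assumptions, not values from the episode:

```python
def conduction_loss(power_w: float, bus_voltage_v: float, path_resistance_ohm: float) -> float:
    """I^2 * R loss for delivering `power_w` over a path with the given resistance."""
    current_a = power_w / bus_voltage_v
    return current_a ** 2 * path_resistance_ohm

# Hypothetical numbers: 1 kW load, 1 milliohm distribution path.
loss_12v = conduction_loss(1000, 12, 0.001)   # ~6.9 W lost at 12 V
loss_48v = conduction_loss(1000, 48, 0.001)   # ~0.43 W lost at 48 V

print(loss_12v / loss_48v)  # 16.0 — four times the voltage, one sixteenth the loss
```

The same reasoning favors vertical power delivery: shortening the path from the voltage regulator to the processor core reduces the resistance R itself.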

The power of efficiency: the sweet spot for Si, SiC, and GaN in data centers

How can we improve power conversion efficiency, especially at the AC-DC stage? In this fourth episode of our “We power AI” podcast series, Gerald Deboy, Fellow and Head of the PSS Innovation Lab at Infineon, dives into the different stages of power conversion and explains why data centers should embrace a combination of Si, SiC, and GaN semiconductors to meet GenAI requirements.
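Why does every conversion stage matter? End-to-end efficiency is the product of the per-stage efficiencies, so losses compound from the AC input down to the processor core. A minimal sketch, using hypothetical stage efficiencies rather than measured Infineon figures:

```python
def chain_efficiency(stage_efficiencies):
    """Overall efficiency of a power chain is the product of its stage efficiencies."""
    eta = 1.0
    for stage_eta in stage_efficiencies:
        eta *= stage_eta
    return eta

# Hypothetical chain: AC-DC supply, 48 V intermediate bus converter, point-of-load stage.
eta = chain_efficiency([0.975, 0.98, 0.93])
print(round(eta, 3))  # 0.889 — over 11% of the input power is dissipated as heat
```

Even a one-point efficiency gain at a single stage, multiplied across every server in a data center, translates into substantial energy savings, which is why matching Si, SiC, or GaN to the stage where each performs best pays off.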

Robustness, efficiency and quality in data centers

Why are robustness and quality so important when it comes to data centers? In this fifth episode of the “We power AI” podcast series, our expert Danny Clavette, Distinguished System Architecture Engineer, shares insights and examples on server rack structure and failure rates in data center components, highlighting how Infineon solutions can improve reliability in the GenAI world.

48 V: topologies, benefits, and applications

What are the main trade-offs to consider when designing a server power supply? When is it better to use unregulated topologies over regulated ones? Find the answers and delve into the different 48 V topologies and applications with the expert guidance of our Principal Engineer Roberto Rizzolatti in the sixth episode of the “We power AI” podcast series.