Microsoft Azure's AI inference accelerator Maia 200 aims to outperform Google TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute.
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
China’s race to build homegrown artificial intelligence chips has collided head-on with Nvidia’s H200, the U.S. company’s latest workhorse for training and running large models. The result is ...
Raspberry Pi has begun selling the AI HAT+ 2, an add-on board that is a significant upgrade over the AI HAT+ launched in 2024. While ...