The hyperscaler leverages a two-tier Ethernet-based topology, a custom AI Transport Layer, and software tools to deliver a tightly integrated, low-latency platform ...
Calling it the highest-performing chip among custom cloud accelerators, the company says Maia is optimized for AI inference across multiple models.
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...
Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
The company said Maia 200 offers three times the compute performance of Amazon Web Services Inc.’s most advanced Trainium processor on certain popular AI benchmarks, while exceeding Google LLC’s ...
Microsoft’s Maia 200 AI chip highlights a growing shift toward vertical integration, in which one company designs and ...
Microsoft has committed to “paying its way” to ensure its data centers will not drive up residential utility rates, becoming ...
The developments have sharpened focus on earnings from Microsoft (MSFT) and Meta Platforms (META), which have invested heavily in an AI-focused data center buildout underpinned by those chips. The ...