Following up on Apple’s recent move to its own Arm-based chips, Amazon said yesterday that the AWS data centers powering its cloud-based Alexa and facial recognition services will also be moving from Nvidia processors to custom Amazon chips.
This marks the latest in a trend of major tech companies choosing to take their processing in-house. It’s also not the first time Amazon’s made this move, as most of its other AWS servers now run on the company’s Arm-based Graviton 2 chips.
Instead of Graviton 2, however, Alexa and the company’s facial recognition service (named Rekognition) will now run on the “Inferentia” chip, which is designed to accelerate machine learning tasks like text-to-speech conversion and image recognition.
According to Reuters, “Amazon said the shift to the Inferentia chip has resulted in 25% better latency…at a 30% lower cost.”
Currently, Nvidia supplies the chips for Alexa data centers, though Amazon hasn’t disclosed which chips Rekognition uses at the moment. This raises an interesting question: we don’t yet know whether Inferentia is Arm-based, but if it is, Nvidia’s pending acquisition of Arm would mean Nvidia still supplies the underlying chip architecture.
Amazon hasn’t yet announced when we can expect new Inferentia-powered AWS services, but Rekognition may not be a top priority at the moment. The service has come under fire over its use by law enforcement, and after the police killing of George Floyd this June, Amazon announced a one-year moratorium on police use of the tool.
Still, the shift to Inferentia is part of a broader trend of major tech companies ditching off-the-shelf chips in favor of their own silicon. Data center sales are currently Nvidia’s largest moneymaker, so we’re curious to see how the company adapts.