Amazon is making the most of AI

Amazon develops artificial intelligence systems


According to CNBC, Amazon is now working on two custom chips designed specifically for AI workloads.


According to the source, one of the chips is known as Inferentia and the other as Trainium, targeting inference and model training respectively.


Amazon is attempting to compete with NVIDIA, the leading producer of artificial intelligence chips, whose Grace Hopper superchips currently set the standard for AI hardware.


Development of the new chips is being handled by Amazon Web Services (AWS), the company's cloud computing division.


Amazon is no stranger to custom silicon: it created the Nitro chip roughly a decade ago, and the company says every one of its servers now contains at least one.


With the new chips, Amazon's clients will be able to build large language models (LLMs) on AWS servers using Amazon's own silicon instead of NVIDIA hardware.


For now, however, some of the cloud computing services Amazon offers its clients still rely on the NVIDIA H100 Tensor Core GPU.
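In practice, the choice between Amazon's own accelerators and NVIDIA GPUs largely comes down to which EC2 instance type a customer requests. Below is a minimal sketch using the boto3 SDK: the instance families (trn1 for Trainium, p5 for H100) are real EC2 offerings, but the AMI ID is a placeholder and the rest is purely illustrative.

```python
# Minimal sketch: requesting a Trainium-backed vs. an H100-backed EC2 instance.
# Assumes AWS credentials are configured; AMI_ID is a placeholder, not a real image.
import boto3

AMI_ID = "ami-0123456789abcdef0"  # placeholder machine image ID
ec2 = boto3.client("ec2", region_name="us-east-1")

def launch(instance_type: str) -> str:
    """Launch a single on-demand instance of the given type and return its ID."""
    response = ec2.run_instances(
        ImageId=AMI_ID,
        InstanceType=instance_type,
        MinCount=1,
        MaxCount=1,
    )
    return response["Instances"][0]["InstanceId"]

# Trainium (Amazon's own training chip) vs. NVIDIA H100 (p5 family).
trainium_instance = launch("trn1.2xlarge")
h100_instance = launch("p5.48xlarge")
print(trainium_instance, h100_instance)
```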


According to Amazon, one of its main advantages is that millions of customers already run on AWS and are familiar with its servers and their capabilities.


According to Mai-Lan Tomsen Bukovec, vice president of technology at AWS, "how quickly companies can move to develop generative AI applications is driven by starting first with the data they have on AWS servers and using our computing and machine learning tools."


More than 100,000 clients, according to Amazon, are currently using AWS services for machine learning.


According to Amazon CEO Andy Jassy, every one of the company's businesses currently has multiple generative AI initiatives in progress.
