Amazon Rushes Out Latest AI Chip to Take On Nvidia, Google

Amazon is accelerating development of its own AI chips to reduce its reliance on Nvidia and improve capital expenditure efficiency, positioning itself to compete with Nvidia and Google in the AI hardware space. Meanwhile, Apple risks falling behind because of a slower AI strategy and a lack of strong partnerships, which could weaken its competitiveness in both hardware and software innovation over the next few years.

The video discusses Amazon’s strategic move to develop its own AI chips to compete with industry giants like Nvidia and Google. Amazon, the largest cloud provider with nearly 50% market share, has traditionally relied on Nvidia GPUs for AI training, as have most other players except Google. Amazon is now following Google’s example by designing proprietary chips to reduce its dependency on Nvidia and cut capital expenditure; spending on Nvidia hardware currently accounts for roughly 20 to 25% of Amazon's capital outlay. While Google has seen more success with this approach so far, Amazon is accelerating its efforts to catch up in the AI chip race.

The conversation highlights the growing competition among hyperscalers such as Amazon and Google, which are becoming formidable players in the AI hardware space. The discussion turns to the future landscape of AI, particularly looking ahead to 2026, when the winners will be those who optimize their cloud infrastructure and hardware investments most efficiently. Unlike companies such as Meta, which have different business models, Amazon and Google operate large public cloud businesses, making capital expenditure efficiency a critical metric for success in the AI arms race.

Capital expenditure efficiency is expected to become a central focus for cloud providers in the coming years. The video explains that while the past couple of years saw a “gold rush” to acquire GPUs and build AI training capacity, the next phase will emphasize optimizing spending and improving the cost-effectiveness of AI infrastructure. Companies that can deliver high-performance AI training and inference while managing their capital expenses effectively will hold a competitive advantage in the evolving market.

The video also touches on Apple’s position in the AI landscape, noting concerns about the company’s slower progress toward a coherent AI strategy. Apple has lagged behind other major players in investing in AI models and integrating AI capabilities into its operating systems. Recent developments from competitors like BBC and ByteDance, which are releasing advanced AI models that can run natively on operating systems, underscore the urgency for Apple to catch up. The lack of a clear partnership with an AI leader such as OpenAI or Google further complicates Apple’s position.

Finally, the video warns that while Apple’s hardware sales have not yet been significantly affected, the absence of a strong AI strategy could hurt the company over the next two years. If Apple fails to develop native AI models and integrate them effectively into its ecosystem, it risks losing ground in both hardware and software innovation. The overall message underscores AI integration and capital expenditure efficiency as the key factors shaping the future competitiveness of major tech companies in the AI era.