Amazon CEO Andy Jassy: Cost of AI still more expensive than it should be and will be

Amazon CEO Andy Jassy discussed the high costs of AI technologies and the company’s commitment to expanding its infrastructure to meet growing demand, despite potential tariff impacts. He emphasized the need to reduce AI costs, particularly in inference and chip development, to make AI more accessible and foster innovation across industries.

In a recent discussion, Amazon CEO Andy Jassy addressed the ongoing demand for AI technologies and the company’s commitment to expanding its infrastructure despite potential tariff impacts. Jassy emphasized that, given high demand, Amazon has no plans to scale back its data center buildout, signaling a strong belief in the continued growth of AI applications. He also acknowledged the excitement surrounding the introduction of DeepSeek, a new large language model that some believed could reduce the need for extensive data centers and processing power.

Jassy pointed out that while the cost of AI has decreased over time, it remains higher than it should be. He expressed a desire to make AI more affordable for customers, which he believes would lead to broader adoption and innovation. He noted that the challenges companies face in deploying AI remain the same regardless of advances in models like DeepSeek. Jassy highlighted that lowering AI costs would enable customers to innovate more, much as in the early days of Amazon Web Services (AWS), when falling computing costs led customers to spend more on infrastructure overall.

The CEO discussed the importance of reducing the cost of AI compute resources, particularly in relation to chips and inference costs. He explained that most current spending in AI is focused on training large models, but as applications scale, the majority of costs will shift to inference, which involves making predictions based on the trained models. Jassy emphasized that addressing these costs is crucial for the future of AI deployment.
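As a rough illustration of the training-versus-inference dynamic Jassy describes, the sketch below uses purely hypothetical figures (the training cost, per-request cost, and request volume are assumptions, not numbers from the discussion) to show how a one-time training bill can be dwarfed by recurring inference costs once an application serves traffic at scale.

```python
# Illustrative only: every figure below is a hypothetical assumption,
# not a number cited by Jassy or Amazon.

TRAINING_COST = 50_000_000      # one-time cost to train a large model (USD, assumed)
COST_PER_1K_REQUESTS = 2.00     # recurring inference cost per 1,000 requests (USD, assumed)

def cumulative_costs(requests_per_day: float, days: int) -> tuple[float, float]:
    """Return (one-time training cost, cumulative inference cost) after `days` of serving."""
    inference_cost = requests_per_day * days * COST_PER_1K_REQUESTS / 1_000
    return TRAINING_COST, inference_cost

# A hypothetical application serving 100M requests/day over three years.
train, infer = cumulative_costs(requests_per_day=100_000_000, days=3 * 365)
print(f"Training (one-time):       ${train:,.0f}")
print(f"Inference (3 yrs, cumul.): ${infer:,.0f}")
print(f"Inference share of total:  {infer / (train + infer):.0%}")
```

Under these assumed numbers, inference accounts for roughly 80% of total spend after three years, which is the kind of shift Jassy suggests will make inference cost the dominant concern as AI applications scale.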

To tackle these challenges, Jassy pointed to Amazon’s efforts to develop its own custom AI chips, designed to offer better price performance than existing GPU instances. He said these chips could deliver a 30% to 40% improvement in price performance, which is essential for reducing overall AI costs. By focusing on both chip development and inference cost reduction, Amazon aims to make AI more accessible and cost-effective for its customers.
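To make the price-performance claim concrete, here is a back-of-the-envelope calculation; the baseline instance price and throughput are assumptions chosen for illustration, and only the 30% to 40% improvement range comes from Jassy’s remarks. Since price performance is work delivered per dollar, a 30% to 40% improvement works out to roughly a 23% to 29% reduction in cost for the same workload.

```python
# Back-of-the-envelope: what a 30-40% price-performance improvement means
# for cost per unit of work. Baseline figures are assumptions for illustration.

BASELINE_PRICE_PER_HOUR = 32.77     # hypothetical GPU instance price (USD/hr, assumed)
BASELINE_THROUGHPUT = 1_000_000     # hypothetical tokens processed per hour (assumed)

def cost_per_million_tokens(price_per_hour: float, throughput_per_hour: float) -> float:
    return price_per_hour / throughput_per_hour * 1_000_000

baseline_cost = cost_per_million_tokens(BASELINE_PRICE_PER_HOUR, BASELINE_THROUGHPUT)

for improvement in (0.30, 0.40):
    # "X% better price performance" means X% more work per dollar,
    # so cost per unit of work is divided by (1 + X).
    improved_cost = baseline_cost / (1 + improvement)
    savings = 1 - improved_cost / baseline_cost
    print(f"{improvement:.0%} better price performance -> "
          f"${improved_cost:.2f} per 1M tokens ({savings:.0%} lower cost)")
```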

In conclusion, Jassy’s insights reflect Amazon’s strategic approach to AI, emphasizing the need for continued investment in infrastructure and technology to drive down costs. He believes that as AI becomes more affordable, it will unlock new opportunities for innovation and growth across various industries. The company’s commitment to improving AI cost structures is seen as a vital step in fostering broader adoption and enhancing the capabilities of AI technologies.