Why AI is slowing down in 2026

The video argues that AI progress is slowing in 2026 mainly because of physical infrastructure bottlenecks, especially limited energy supply and hardware constraints, rather than shortfalls in research, funding, or regulation. Expanding energy and hardware capacity is therefore presented as essential to accelerating future AI development.

The video explores why artificial intelligence (AI) progress is slowing as of 2026, despite immense interest and investment. The main bottleneck is not research or funding but physical infrastructure, particularly energy. The power grid's limited capacity, slow interconnection processes, and the need for massive transformers and new generation facilities are the primary obstacles. New data centers often wait up to seven years for a grid connection, and projected energy demand from AI data centers far outpaces current supply. Solutions such as microgrids, on-site generation, and direct nuclear connections are being considered, but each requires significant time and regulatory intervention.

Another major constraint is the AI hardware supply chain, especially high-bandwidth memory (HBM) and advanced chip packaging (chip-on-wafer-on-substrate, or CoWoS). GPU shortages have largely been resolved, but memory and packaging are now the limiting factors, with most manufacturing capacity booked by major players such as Nvidia. This has created shortages in other sectors, including consumer electronics and automotive. The market is expected to resolve these shortages eventually, as it did with GPUs, but that will take time. Energy infrastructure, by contrast, is less responsive to market forces and requires coordinated government action.

Operational friction within enterprises is also slowing AI adoption. Most AI pilot projects never reach production because of poor data quality, integration challenges with legacy systems, and a shortage of skilled AI engineers. Insurers, meanwhile, are hesitant to cover AI-related risks because they lack the data to price those risks accurately. This insurance gap has become a significant, if somewhat ironic, barrier to broader AI deployment, since many enterprises are unwilling to proceed without adequate coverage.

Public debates about AI safety, ethics, and regulation are largely dismissed as "noise" by the video's creator, who argues that these issues are not significant barriers to progress. The real friction, in this view, comes from geopolitical competition (especially between the US and China), export controls, and regulatory environments such as the EU's AI Act, which the creator argues stifles innovation in Europe. The US, by contrast, is pulling ahead in compute capacity, and most business adoption of AI proceeds regardless of public sentiment or regulatory debate.

The current period (2026–2028) is described as a "digestion phase," in which the hype of previous years gives way to the realities of scaling up infrastructure and integrating AI into the economy. The focus is shifting from building ever-larger models to improving efficiency and making the most of available resources. The video concludes that accelerating AI depends on overcoming physical and operational bottlenecks, especially in energy and hardware, rather than on philosophical or regulatory debates. The message is clear: to speed up AI, society must build and upgrade the physical infrastructure that supports it.