Insight Loop is Techloy’s insight-based monthly newsletter for paid subscribers that dives deep into groundbreaking technology and the industries around it.

That $600 laptop you're eyeing in 2026 might look identical to last year's model on the shelf. But open it up, and you may find 8GB of RAM where 16GB used to be standard. You’re paying the same price for less machine. And it’s starting to happen across the industry.

Dell and Lenovo are already reportedly cutting back on memory in lower-cost devices to manage rising costs. What looks like a quiet downgrade is actually the surface of a much larger shift. And to understand what’s changing, you have to look at where the money in tech is going.

AI is now one of the most heavily funded industries in human history. In 2025, AI startups raised about $211 billion, nearly half of global venture funding. And since 2013, total corporate spending in the space has reached roughly $1.6 trillion. That capital isn't sitting idle; it's being deployed into infrastructure on a scale the industry hasn't seen before: data centres, specialised chips, and the systems required to run AI continuously.

And as that buildout accelerates, one constraint is starting to matter more than anything else.

Memory.
