Gavin Baker is the Managing Partner and CIO of Atreides Management. We cover the competition at scale among the Magnificent Seven, what he's learned from observing Nvidia's growth over a quarter-century, and how investors should approach the current AI paradigm.
Principles & Lessons:
1) Large-scale AI work depends on fundamental, system-wide efficiency rather than just better chips. Gavin broke down what he called the “unified AI efficiency equation,” which includes software efficiency on the chip, system FLOPS efficiency across networking and storage, checkpointing frequency, and power utilization. “If you have a higher MFU [model FLOPS utilization],” he explained, “you can choose between faster time to market, better quality, or lower cost.” He emphasized that today’s GPUs have grown 50x in speed while the rest of data-center components have only improved by around 5x, and bridging this gap is now the core bottleneck.
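The MFU trade-off Gavin describes can be sketched with simple arithmetic: achieved throughput is the cluster's peak FLOP/s scaled by MFU, and training time falls in direct proportion as MFU rises. All numbers below (total training FLOPs, GPU count, peak FLOP/s per accelerator) are illustrative assumptions, not figures from the conversation.

```python
# A minimal sketch of the MFU arithmetic. All constants are assumed for illustration.

def training_days(total_flops: float, n_gpus: int,
                  peak_flops_per_gpu: float, mfu: float) -> float:
    """Wall-clock days to train, given cluster peak throughput and achieved MFU."""
    achieved = n_gpus * peak_flops_per_gpu * mfu  # sustained FLOP/s across the cluster
    return total_flops / achieved / 86_400        # seconds -> days

TOTAL = 2e25   # total training FLOPs (assumed, roughly frontier-model order)
GPUS = 20_000  # assumed cluster size
PEAK = 1e15    # ~1 PFLOP/s peak per accelerator (assumed)

low  = training_days(TOTAL, GPUS, PEAK, mfu=0.30)
high = training_days(TOTAL, GPUS, PEAK, mfu=0.45)
print(f"MFU 30%: {low:.1f} days, MFU 45%: {high:.1f} days")
```

Raising MFU from 30% to 45% cuts training time by a third at identical hardware cost, which is exactly the "faster time to market, better quality, or lower cost" menu Gavin describes.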
2) A race to create a “Digital God” is reshaping the competitive landscape. Gavin noted that leaders at major tech firms—like Sundar Pichai at Google or Mark Zuckerberg at Meta—“believe scaling laws are going to continue” and view losing that race as existential. He recounted: “Larry Page has evidently said, ‘I am willing to go bankrupt rather than lose this race.’” Because the perceived value of full-scale general AI is so immense, these companies are pouring huge capital into AI with little regard for short-run ROI.
3) Synthetic data is critical for ongoing model improvement. Gavin acknowledged that real-world data is finite, but “it looks like synthetic data works.” No one fully understands why, yet labs have shown they can improve large models by training on model-generated data. This reduces the fear of a data ceiling, though Gavin qualified that “nobody knows if it will continue working” as models reach new scales.
4) Specialized AI hardware isn’t the only route; powerful software stacks and data-center design matter. While some assume chips alone dictate performance, Gavin stressed that AMD, for example, had competitive raw FLOPS but a much lower “maximum achievable matrix multiplication FLOPS” in real environments until its software improved. Similarly, a “redesigned data center from first principles,” as Elon Musk is doing with xAI, can link tens of thousands of GPUs coherently. “It’s no longer about who builds the best GPU; it’s also who ties it all together with networking, storage, and reliability.”
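The gap between theoretical and “maximum achievable” matmul FLOPS can be measured directly: time a matrix multiplication and divide the FLOPs performed by the elapsed time. The sketch below uses NumPy; `PEAK_FLOPS` is an assumed figure for the host machine, not a real spec, so the utilization it prints is only illustrative.

```python
import time
import numpy as np

# Measure achieved matmul throughput vs an assumed hardware peak.
N = 512
A = np.random.rand(N, N).astype(np.float32)
B = np.random.rand(N, N).astype(np.float32)

A @ B  # warm-up run so the timed loop reflects steady-state performance

reps = 5
t0 = time.perf_counter()
for _ in range(reps):
    A @ B
elapsed = (time.perf_counter() - t0) / reps

flops = 2 * N**3               # one multiply + one add per inner-product term
achieved = flops / elapsed     # FLOP/s actually sustained
PEAK_FLOPS = 1e11              # assumed peak (100 GFLOP/s); substitute your own
print(f"achieved: {achieved / 1e9:.1f} GFLOP/s, "
      f"utilization vs assumed peak: {achieved / PEAK_FLOPS:.0%}")
```

On a GPU the same idea applies at far larger scale; the point is that achieved FLOPS, not the datasheet number, is what determines training throughput.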
5) Device-side inference will eventually merge with cloud-based AI. Despite all the emphasis on centralized infrastructure, Gavin forecasted a shift toward “local when you can, cloud when you must.” He pointed to Apple’s push for on-device AI, noting that phones with more DRAM could run small (but still powerful) models locally, reducing expensive GPU use in the cloud. “For simple questions,” he said, “think of it as like a 100 IQ model” on the phone, with bigger queries going up to the cloud. That suggests a new “super phone” era with local, personalized agents.
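The “local when you can, cloud when you must” pattern is essentially a routing decision. Here is a toy sketch of such a router; the model functions and the thresholds are hypothetical placeholders, not any real on-device or cloud API.

```python
# A toy hybrid router: cheap on-device model for simple queries, cloud for the rest.
# All names and thresholds are hypothetical, for illustration only.

def run_local(prompt: str) -> str:
    return f"[small on-device model] {prompt[:40]}"

def run_cloud(prompt: str) -> str:
    return f"[large cloud model] {prompt[:40]}"

def route(prompt: str, needs_tools: bool = False, max_local_tokens: int = 64) -> str:
    """Send short, tool-free queries to the on-device model; escalate the rest."""
    approx_tokens = len(prompt.split())  # crude proxy for query complexity
    if needs_tools or approx_tokens > max_local_tokens:
        return run_cloud(prompt)
    return run_local(prompt)

print(route("What time is sunset today?"))                        # short -> local
print(route("Analyze these quarterly filings", needs_tools=True)) # complex -> cloud
```

In Gavin's framing, the local path is the “100 IQ model” handling everyday questions, with escalation to cloud GPUs reserved for queries that genuinely need them.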
6) Robotics may have an even larger near-term impact on society than pure AI assistants. Gavin argued that Tesla’s real-world video data and “scaling laws for Full Self-Driving” are closing in on a breakthrough, describing a step change from FSD version 12.5 on AI 4 hardware. “All the progress of the last 10 years was in that one release,” he said of a recent upgrade. He predicted “abject humiliation” for FSD skeptics within 12 to 18 months and believes humanoid robots, aided by large language models, could automate a wide range of blue-collar tasks soon.
7) Big tech incumbents likely gain more from these transformations—unless a new AI pioneer matches their scale. Gavin highlighted how leading internet platforms (such as Meta, Google, Microsoft, and emerging labs like xAI) have unique data sets, distribution channels, and the capital to train giant models. “It’s not in the lab’s interest to open-source their secret sauce data,” he said, “but they can open-source the model if they already have the distribution advantage and proprietary real-time user feedback.”
8) Investors will need to adapt their own practices and tools. For public markets, Gavin expects that combining fundamental judgment with AI will become critical. “Tools that let me harness large language models on all my research notes have taken my usage from one hour a day to four,” he said. He likened the shift to a new “meta” in a competitive game: everyone using generative AI will raise the bar for alpha, but strong human judgment and domain knowledge still matter. In private markets, he argued that offering advanced operational help is now essential to stand out in late-stage funding—simply writing checks based on growth metrics is no longer enough.
Transcript