I wanted to write a piece on how Deepseek could change the current game, with particular focus on what it means for investment and, potentially, for the American and global stock markets. The premise may sound far-fetched, but consider that the current US market is heavily overvalued – more so than it has ever been. That does not seem like a reasonable or tenable position, and the equilibrium may in fact be quite unstable.

Rather than writing the whole thing myself, I thought this was a good opportunity to showcase how Deepseek-R1 performs compared to o1-preview, one of OpenAI's flagship models. The prompt is reproduced below; both models were run through openWebUI with the web-search function enabled. Note that Deepseek inserted references on its own, which I then simply turned into links. The cost of this experiment was $0.85 for o1-preview and $0.03 for Deepseek-R1.
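For the record, the price gap between the two runs works out as follows (the two dollar figures are the ones reported above; the rest is plain arithmetic):

```python
# Back-of-the-envelope cost comparison for the single prompt in this post.
# The two totals are the figures reported above for one run of each model.
o1_cost = 0.85  # USD, o1-preview
r1_cost = 0.03  # USD, Deepseek-R1

ratio = o1_cost / r1_cost
print(f"o1-preview is ~{ratio:.0f}x more expensive per run")  # ~28x
```

In other words, for this single prompt Deepseek-R1 came in at roughly one twenty-eighth of the price.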
You are a skilled financial and geopolitical analyst who writes in largely accessible terms and with wit. Write a 1000-word blog post on what the FT calls the “mother of all bubbles”. I want you to write a general introduction to this concept for PART 1.
PART 2 will be about the fact that the current bubble is largely driven by LLM/AI over-promises, and that these big American companies rely on NVIDIA compute power, with hundreds of billions of dollars in hardware investments. NVIDIA’s current market cap clearly drives this bubble.
PART 3 will be about how Deepseek-R1 can disrupt all this by bringing a much cheaper, ultra-performant, open-source LLM to market. When the price difference is so large and the performance so close, you obviously have a major competition problem – let alone the fact that it is fully open source. Most importantly, this is a signal: OpenAI and Meta are trying to build a moat out of massive hardware investments. Deepseek took the opposite direction, and not only does it show that hardware is no moat, it makes a fool of their multi-billion-dollar claims. This is massive. If investors had the sense it takes, this bubble would already have popped.
OpenAI o1 ($0.85)
The Mother of All Bubbles: Unraveling the Looming Financial Phenomenon
By [Your Name], Financial and Geopolitical Analyst
Part 1: Introduction to the Mother of All Bubbles
In the grand tapestry of financial history, bubbles have come and gone like mischievous spirits—think tulips in the 17th century or dot-coms at the turn of the millennium. Yet, every so often, a bubble emerges that makes the rest look like child’s play. This, dear readers, is what some are now calling the “mother of all bubbles” in the U.S. stock market.
The U.S. has long been the darling of global investors, capturing an outsized share of the world’s financial markets. This commanding position has been built on the pillars of innovation, economic might, and a relentless pursuit of growth. However, when valuations soar to stratospheric heights detached from underlying fundamentals, even the mightiest markets can stumble.
The notion of this colossal bubble isn’t just idle chatter among the bearish or the conspiracy theorists lurking in financial forums. It’s a growing concern that the U.S. stock market’s exceptionalism may have overstayed its welcome. The sheer scale of investment pouring into U.S. equities, driven by both domestic and international investors, has inflated valuations to levels that make seasoned analysts reach for their smelling salts.
Part 2: The AI Frenzy and NVIDIA’s Meteoric Rise
At the heart of this burgeoning bubble lies the intoxicating promise of Artificial Intelligence (AI) and Large Language Models (LLMs). The narrative is compelling: AI will revolutionize industries, unlock unprecedented efficiencies, and perhaps even brew our morning coffee. Investors, not wanting to miss out on the next big thing, have been scrambling over each other to pour money into companies that promise to be at the forefront of this AI revolution.
Enter NVIDIA, the semiconductor titan whose graphics processing units (GPUs) are the lifeblood of AI computations. NVIDIA’s market capitalization has soared as it supplies the critical hardware that powers the AI ambitions of tech behemoths. With hundreds of billions of dollars in hardware investments, big American companies are betting heavily on NVIDIA’s prowess to keep the AI engines humming.
But here’s the rub: when one company’s valuation balloons to such an extent that it starts driving the market—or bubble—it’s time to pause and reflect. NVIDIA’s ascendancy is emblematic of the AI overpromises that are currently fueling speculative investments. The assumption is that AI will deliver exponential returns on these colossal hardware investments, justifying the sky-high valuations. However, as history has taught us, such linear extrapolation of growth often ignores the pesky nuances of reality.
Part 3: The Open-Source Disruptors and the Unraveling of the Bubble
While the giants are busy building their AI castles in the sky, a revolution brews in the open-source community. Open-source LLMs are emerging as formidable contenders, offering ultra-performant models at a fraction of the cost. These models are developed collaboratively, often without the need for the gargantuan hardware expenditures that the big players deem necessary.
Consider this: if a startup or a community-driven project can deliver AI performance that is close to that of the industry leaders without the need for billions in hardware, what does that say about the sustainability of those leaders’ competitive advantages? The massive investments in infrastructure by companies like OpenAI and Meta start to look less like strategic moats and more like expensive follies.
The open-source movement has always been a thorn in the side of proprietary software giants, but in the realm of AI, it could be the pin that pricks the bubble. Hardware is no longer the impregnable fortress it once was perceived to be. Advances in model efficiencies, innovative algorithms, and collaborative development mean that high performance is achievable without a commensurate increase in hardware spend.
This democratization of AI technology undermines the multibillion-dollar claims of the incumbents. Investors, blinded by the glitter of big names and big numbers, may have overlooked the nimble challengers who are changing the game not by playing it harder but by rewriting the rules altogether.
Conclusion: A Call for Investor Prudence
The signs are there for those willing to look past the hype. The mother of all bubbles, inflated by AI overpromises and hardware-heavy strategies, is vulnerable to disruption from unexpected quarters. Open-source LLMs exemplify how innovation doesn’t always require deep pockets—sometimes, it just needs a collective brain.
Investors would do well to exercise a bit more skepticism and a bit less herd mentality. As enticing as the narratives spun by major corporations are, it’s essential to question whether their strategies are sustainable or if they’re just inflating the bubble further. After all, bubbles have a pesky habit of popping when least expected, and the bigger they are, the messier the aftermath.
In the end, financial markets are as much about perception as they are about reality. When the sheen wears off and the true value is laid bare, those who haven’t heeded the warnings may find themselves caught in the downdraft of a deflating bubble. And as any seasoned investor knows, it’s far better to exit a party a tad early than to overstay and deal with the hangover.
Note: As of my knowledge cutoff in October 2023, I am not aware of a model named Deepseek-R1. The discussion in Part 3 is based on general trends in open-source AI development and their potential impact on the market.
Deepseek-reasoner ($0.03)
PART 1: The “Mother of All Bubbles” – A Primer on Market Manias
The Financial Times’ recent proclamation of a “mother of all bubbles” in AI and tech isn’t just clickbait—it’s a flare shot into the fog of investor euphoria. Historically, bubbles form when valuations detach from reality, fueled by narratives so seductive they override basic arithmetic. Think tulip mania, dot-com excess, or crypto’s “number go up” theology. Today’s bubble, however, is uniquely modern: a trillion-dollar cocktail of artificial intelligence hype, geopolitical brinkmanship, and a frenzied race to monopolize computational power. The twist? This time, the entire edifice rests on silicon chips and the companies that make them—most notably, Nvidia, whose market cap explosion in 2024 (up a staggering 239% year-to-date) has become the poster child of this frenzy.
But what makes this bubble the “mother” of them all? Scale. The AI gold rush has seen companies like OpenAI, Meta, and Microsoft commit hundreds of billions to hardware infrastructure, betting that owning the physical tools of AI (GPUs, data centers) will guarantee dominance. It’s a circular logic: the more they spend, the higher Nvidia’s valuation climbs, which in turn justifies more spending. Rinse, repeat, until someone asks, “What if the emperor’s new data center is… naked?”
PART 2: Nvidia’s Iron Throne – How Hardware Addiction Fuels the Bubble
Let’s cut to the chase: Nvidia isn’t just a company; it’s the central bank of the AI economy. Its GPUs are the “picks and shovels” of the AI gold rush, with analysts estimating it controls 70-95% of the AI accelerator market. Wall Street’s faith in this monopoly is near-religious, with Piper Sandler dubbing Nvidia a “top large-cap pick” due to its “dominant position in AI accelerators”.
But here’s the rub: the entire AI ecosystem is built on a lie of infinite scalability. Companies pour billions into Nvidia’s Blackwell architecture GPUs, assuming that throwing more compute at larger language models (LLMs) will magically yield AGI (artificial general intelligence). Meanwhile, returns diminish. Training costs balloon, energy demands skyrocket, and the promised “revolutionary” AI applications—medical breakthroughs, self-driving cars—remain tantalizingly out of reach.
Nvidia’s valuation (55x adjusted earnings) hinges on this unsustainable cycle. If AI progress plateaus—or worse, if cheaper alternatives emerge—the house of cards collapses.
PART 3: Deepseek-R1 – The Open-Source Guillotine
Enter Deepseek-R1, the open-source LLM that’s quietly sharpening its blade. This isn’t just another model; it’s a direct challenge to the “bigger is better” dogma. How? By delivering performance comparable to GPT-4 or Gemini at a fraction of the cost, while being fully open-source. No $10 billion data centers. No Nvidia GPU cartel. Just lean, efficient code that exposes the absurdity of the current arms race.
The implications are seismic:
- Cost Disruption: If Deepseek-R1 achieves parity with closed models using less hardware, it obliterates the rationale for Meta’s or OpenAI’s multibillion-dollar infrastructure bets.
- Moat Erosion: OpenAI’s “moat” was supposed to be its compute dominance. Deepseek proves that software ingenuity can outflank hardware brute force—a nightmare for investors banking on Nvidia’s perpetual supremacy.
- Investor Reckoning: The bubble persists because markets assume AI’s winners will be those who spend the most. Deepseek’s existence asks: What if the winners are those who spend the least?
Yet, the market hasn’t blinked. Why? Because admitting this would mean acknowledging that the “mother of all bubbles” is just that—a bubble. Investors cling to Nvidia’s rising stock price like a life raft, ignoring the sharks circling below.
Conclusion: Pop Goes the Weasel (If Anyone’s Paying Attention)
The FT’s “mother of all bubbles” metaphor is apt—this isn’t a normal market cycle. It’s a systemic overcommitment to a single narrative (AI) and a single company (Nvidia). Deepseek-R1 is the canary in the coal mine, signaling that the era of hardware hegemony may be shorter than advertised.
But bubbles don’t pop because logic demands it; they pop when psychology shifts. For now, the music plays on. Nvidia’s chips hum, valuations soar, and the crowd cheers. Until, one day, someone unplugs the speakers.
Note: The FT article linked in the query was not included in the provided context, so its arguments are summarized based on the user’s description. Deepseek-R1’s specifics are not detailed in the provided sources; analysis of its impact derives from the user’s framework.