● NVIDIA Stalls, AI Gold Rush Still On
NVIDIA Has Been Sideways for Months… Why the “Is It Over?” Call Keeps Being Wrong (AI, Semiconductors, and Macro All in One)
This article covers the following:
1) What exactly triggered the turn from the 2022 crash to the 2023–2025 surge (and how supply chains changed after ChatGPT)
2) Why H100 and Blackwell shortages are not just temporary issues but signals of an “infrastructure shift in the AI economy”
3) Why the stock repeatedly recovers whenever fears of NVIDIA substitutes like DeepSeek or TPUs appear
4) The next growth pipeline leading to robots (physical AI) and autonomous driving, and where NVIDIA is aiming for the real money
5) The core point that other news/YouTube rarely mention: the defensive line is not the GPU but the “ecosystem + deployment/operational standards”
1) Market timeline: Why “crash → shortage → market cap No.1” was not a coincidence
2022 saw inflation surge and the Fed raise rates, shaking growth stocks (especially big tech).
NVIDIA was no exception, falling about 70% from its peak during that period, and commentators repeatedly said “it’s over.”
But inside the company the signals were the opposite.
The mood was more like “the stock is down, but the offices are busier than ever. Something big is coming.”
That “something big” turned out to be the popularization of generative AI, kicked off by ChatGPT, which launched on November 30, 2022.
Once demand appeared, generative AI spread quickly, and the market recognized the shift just as fast.
Running AI requires compute, and the view that NVIDIA GPUs were effectively the standard hardware for that compute spread rapidly.
From that point, the H100 was treated like a “golden ticket,” and supply could no longer keep up with demand.
As a result, the stock surged about 4.5x in 2023, and in 2024 it passed Apple to become the most valuable listed company (with a market cap around $3 trillion at the time).
This was not just a thematic rally; the company was revalued as the holder of core equipment for AI infrastructure.
2) Why the “AI era’s oil” analogy resonates: It’s about computing infrastructure, not just semiconductors
Goldman Sachs calling NVIDIA “the oil of the AI era” is very intuitive.
If oil was the energy foundation that powered cars, planes, and factories in the 20th century,
then in the 21st century computing (the operations that run AI) is becoming the productivity foundation for almost every industry.
NVIDIA is not simply a “chip seller.”
It has become more like a supplier that has effectively standardized how AI is run in the cloud.
So viewing NVIDIA’s rally as “just the stock going up a lot” misses the core point.
This trend is reflected in global supply chains as well.
AI data center expansion → investments in power/cooling/networking/memory (HBM) → broader adoption of AI services… these link together.
Ultimately, NVIDIA stands at the center of the AI data center investment cycle.
3) Key point for 2025: Blackwell shortages are a phase where scarcity translates into results
One of the points emphasized in the original text was that “the hardest thing to obtain in IT right now is Blackwell.”
This is not just a hot new product; it should be read as a signal that companies are actually attaching AI to their operations.
When projects move from experiments and proofs-of-concept to being deployed in services, operations, and customer touchpoints, inference demand explodes.
Another important factor is the profitability structure.
As noted in the original text, gross margins in the 70% range mean supplier advantage (pricing power) is very strong.
This level of margin is not a sign of a fad but of “no immediate viable alternative suppliers.”
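To make the margin point concrete, here is a minimal back-of-envelope sketch in Python. All numbers are hypothetical (an assumed unit cost and a margin in the range cited above), not NVIDIA’s disclosed unit economics; the point is simply that a gross margin around 75% implies a selling price of roughly four times the cost of goods.

```python
# Back-of-envelope: what a gross margin in the 70% range implies.
# All numbers are hypothetical, not NVIDIA's actual unit economics.
unit_cost = 10_000        # assumed cost to build and ship one accelerator, in USD
gross_margin = 0.75       # roughly the margin range cited above

# gross margin = (price - cost) / price  =>  price = cost / (1 - margin)
price = unit_cost / (1 - gross_margin)
print(f"implied selling price: ${price:,.0f}")  # -> $40,000, i.e. ~4x cost

# Margins like this persist only while buyers see no comparable alternative,
# which is why the text reads them as evidence of pricing power.
```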
When a narrative of large backlog demand is layered on top of margins like these,
the market tends to focus more on “next quarter / next year’s secured revenue” than on short-term results.
In such phases, even if the stock stalls, the underlying trend often remains intact.
4) Why NVIDIA is truly formidable: the competition is not about chips but an ecosystem war
Fears about substitutes surface again and again.
When high-efficiency models like DeepSeek appear, people say “we’ll need fewer GPUs,”
and when Google uses TPUs at scale for training, headlines ask “do we need NVIDIA anymore?”
But the core point from the original text was this.
For nearly two decades, NVIDIA has locked researchers, developers, and enterprises into the CUDA ecosystem and the workflows built on top of it.
What matters there is switching cost.
It’s not just changing a model; development, debugging, libraries, optimization, operations, and monitoring are all entangled.
So even if TPUs or other chips look better in raw performance, migration does not happen as easily as it seems.
In the end, NVIDIA’s defense is more like an established platform standard than merely GPU performance.
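To see where that switching cost actually lives, here is a minimal, purely illustrative PyTorch sketch (the model, tensor sizes, and training step are invented for illustration): even routine training code reaches for CUDA-specific APIs for device selection, mixed precision, and memory monitoring, and the profiling and operations tooling around it is CUDA-specific as well.

```python
# Minimal sketch (illustrative only) of how ordinary ML code assumes the CUDA
# stack: device selection, mixed precision, and memory stats all go through
# CUDA-flavored APIs, and that surface area is what makes migration costly.
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = torch.nn.Linear(1024, 1024).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

x = torch.randn(32, 1024, device=device)
target = torch.randn(32, 1024, device=device)

# Mixed precision is expressed through CUDA-oriented call sites; porting to
# another accelerator means rewriting and re-validating these, plus the
# debugging, profiling, and monitoring tools built around them.
scaler = torch.cuda.amp.GradScaler(enabled=(device.type == "cuda"))
with torch.autocast(device_type=device.type, enabled=(device.type == "cuda")):
    loss = torch.nn.functional.mse_loss(model(x), target)

scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()

if device.type == "cuda":
    # Operational monitoring is also CUDA-specific (here, peak GPU memory).
    print(f"peak GPU memory: {torch.cuda.max_memory_allocated() / 1e9:.2f} GB")
```

Multiply this by every custom kernel, library dependency, and monitoring dashboard in a production pipeline, and the scale of the migration cost becomes clear.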
5) “If efficiency improves, will GPU demand fall?” → Jevons paradox means demand can actually rise
The DeepSeek episode in the original text explained why the stock plunged and then recovered.
When high-efficiency models appear, it initially looks like “we’ll need fewer GPUs,” but
in reality, when costs fall, AI gets attached to many more use cases, and total demand often increases sharply.
This is the Jevons paradox.
If compute efficiency improves, AI becomes cheaper and gets applied more often and to far more services,
so overall GPU usage ends up rising.
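As a toy illustration of that arithmetic (all numbers below are invented, not market data), a large per-query efficiency gain can coexist with rising total compute spend whenever usage grows faster than costs fall:

```python
# Toy Jevons-paradox arithmetic with invented numbers (not real market data).
cost_per_query_before = 1.00    # hypothetical GPU cost per AI query, in USD
cost_per_query_after = 0.20     # a 5x efficiency gain from a better model

queries_before = 1_000_000      # hypothetical daily query volume
queries_after = 12_000_000      # cheaper AI gets attached to far more use cases

spend_before = cost_per_query_before * queries_before   # $1,000,000 of compute
spend_after = cost_per_query_after * queries_after      # $2,400,000 of compute

print(f"compute spend before: ${spend_before:,.0f}")
print(f"compute spend after:  ${spend_after:,.0f}")
# A 5x efficiency gain combined with a 12x jump in usage still means 2.4x more
# total compute demand. Whether that happens depends on how price-elastic AI
# demand turns out to be, which is the real question behind the DeepSeek scare.
```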
So “model efficiency = NVIDIA collapse” is an oversimplification.
A more important factor is that AI is still in the early stages of adoption.
6) Next megatheme: expansion into robots, autonomous driving, and digital twins (physical AI)
The reason GTC 2025 carried the message “next is the era of robots” is connected to the previous points.
If generative AI focused on text, images, and code,
the next step is bringing intelligence down into the physical world (factories, logistics, robot arms, autonomous driving) — physical AI.
NVIDIA is targeting two things here.
First, demand for digital twins used for simulation-based factory and warehouse design and operational optimization.
Second, the spread of AI compute platforms embedded in edge devices (robots/vehicles), such as the Jetson family.
In short, NVIDIA is not just after data centers.
It aims to lay its ecosystem across the “intelligence layer” of real-world industrial automation.
If successful, NVIDIA’s market could expand beyond GPU sales into industrial productivity investment cycles.
7) 2026 Rubin architecture watch points: narrowing the cost-performance gap lengthens the game
One advantage of substitutes like TPUs is often cited as cost-performance.
If the next-generation Rubin architecture narrows that gap,
then from NVIDIA’s perspective “performance advantage + ecosystem advantage” could be sustained longer.
This message matters for investors.
Markets are always looking for a peak in growth,
and if an architectural roadmap is credible, the market tends to push that “peak” further into the future.
8) Risk checklist: the “it’s over” call may keep being wrong, but that does not mean there are no risks
The original tone is strongly optimistic, but real-world risks should be considered as well.
1) The speed at which big tech’s custom chips (e.g., TPUs) spread
2) Competitors expanding supply, such as AMD and Huawei
3) Bottlenecks for AI data centers in power/regulation/permits (grid capacity, cooling water, carbon regulations)
4) Macro variables: renewed inflation could raise rates and pressure growth stock valuations
5) Supply chain: prolonged bottlenecks in HBM, packaging (CoWoS, etc.) that delay shipments
However, what the market has repeatedly missed is that these risks have tended to increase short-term volatility rather than cause a collapse in demand.
9) News-style summary: seven key headlines for today
– After the 2022 crash driven by inflation and rate hikes, NVIDIA was revalued once ChatGPT popularized generative AI
– After ChatGPT, AI data center investment exploded and the H100 became a “golden ticket” due to shortages
– The 2023 stock surge and 2024 rise to No.1 market cap reflect a revaluation as an infrastructure standard rather than a mere theme
– In 2025 Blackwell shortages continued, and high margins translated into results
– High-efficiency models like DeepSeek cause short-term shocks, but Jevons paradox suggests potential for overall demand expansion
– TPU and custom chip threats exist, but the CUDA ecosystem creates switching costs that serve as a defense line
– The next growth axis is expanding into physical AI centered on robots, autonomous driving, and digital twins
10) The most important thing that other YouTube channels/news rarely point out
Most content ends at “NVIDIA GPUs are expensive / competitors will emerge / the stock rose a lot.”
But the real core point is this.
The core point of NVIDIA’s moat is not the chip but that it has preempted the enterprise standard for deploying and operating AI workflows.
GPUs can eventually be replaced.
But the more daunting reality for customers is that once the entire training-tuning-serving-monitoring-optimizing process runs on NVIDIA’s stack,
they cannot switch quickly even if competitors offer alternative chips.
So when evaluating NVIDIA, it is better to view it not only as a semiconductor company but also as a platform company that dominates AI infrastructure standards; that perspective makes the direction clearer.
This is also why the market does not easily give up on NVIDIA even during months of sideways trading.
< Summary >
NVIDIA fell in 2022 due to rate and inflation shocks, but the popularization of generative AI after ChatGPT at the end of 2022 led to a revaluation placing it at the center of the AI data center investment cycle.
H100 and Blackwell shortages are signals that AI deployment and operations have begun in earnest; even when high-efficiency models and TPU threats recur, the CUDA ecosystem’s switching costs act as a strong defense.
The next expansion axis is physical AI centered on robots, autonomous driving, and digital twins, and the core point of NVIDIA’s moat is that it has preempted AI infrastructure standards (workflows), not just GPUs.
[Related articles…]
- NVIDIA: Comprehensive Summary of AI Semiconductor Hegemony and the Data Center Investment Cycle
- Possibility of Inflation Reigniting: Effects on Rates, Stocks, and Big Tech
*Source: [ 월텍남 – 월스트리트 테크남 ]
– 엔비디아 수개월 째 횡보… 이제 상승 끝났다고? (NVIDIA has been sideways for months… is the rally over?)


