AI Demand – No Ceiling

● AI Demand Is Unstoppable

“Memory efficiency has come as far as it can, but demand has no ceiling”… a signal that the ‘next money route’ for AI investment is changing

1) Key takeaway first: Money is moving in the market in the order ‘model → infrastructure → service’

  • The strongest message that came through in this interview is exactly this.
  • The LLM (model) race is largely settled and infrastructure still holds the advantage, but there was a suggestion that services could draw renewed attention in 2026~2027.
  • The investment strategy is also framed not as a simple “AI will lift everything,” but as watching which segment (memory / power / data centers / services) money is gathering in.

2) Three most important expressions (points you must take from this article)

  • There is a ceiling to compression
  • There is simply no ceiling on demand
  • Services are fragmented, so it’s hard to pick the companies that survive (which is why a bundling strategy comes up)

3) Why say that memory efficiency has “come all the way to the end”?

  • On the memory semiconductor/cache side (e.g., the KV cache), the claim is that a significant degree of efficiency improvement has already been achieved technically.
  • In particular, innovations like “cutting the number of bits with efficiency technologies such as TurboCompute” shake the market up once as an event, but the judgment is that, logically, there is a limit to how much further bits can be reduced.
  • In other words, the components that bottleneck the AI ecosystem keep improving, but those improvements become attractive as investments only when ‘demand growth’ backs them up.
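The “compression ceiling” can be made concrete with rough arithmetic. A minimal sketch below uses illustrative model parameters of my own (not figures from the interview): KV-cache memory scales with layers × heads × head dimension × context length × bits per element, so quantizing from 16-bit to 4-bit is a one-time 4× saving, after which little room remains.

```python
# Rough KV-cache sizing sketch. All parameters are illustrative
# assumptions, not numbers from the interview.
# KV cache size ≈ 2 (K and V) * layers * kv_heads * head_dim * seq_len elems.

def kv_cache_gib(layers, kv_heads, head_dim, seq_len, bits_per_elem):
    """Return KV-cache size in GiB for one sequence."""
    elems = 2 * layers * kv_heads * head_dim * seq_len
    return elems * bits_per_elem / 8 / 2**30

# A hypothetical 70B-class model serving a 128k-token context:
base = kv_cache_gib(80, 8, 128, 128_000, 16)   # fp16 baseline
quant = kv_cache_gib(80, 8, 128, 128_000, 4)   # 4-bit quantized cache

print(f"fp16: {base:.1f} GiB, 4-bit: {quant:.1f} GiB, saving: {base / quant:.0f}x")
# Dropping from 16 to 4 bits cuts memory 4x, but you cannot quantize
# below ~1 bit per element: that hard floor is the "ceiling" in question.
```

The point of the sketch: each halving of bit-width is a one-off gain, which is why efficiency alone cannot carry the investment story once demand growth stops.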

4) Why demand doesn’t turn downward: the “Jevons paradox” repeating in AI

  • The interview explains “Jevons paradox” by connecting it to AI cost/usage issues.
  • Put simply: when costs fall (or usage becomes easier), usage rises, and as a result total demand increases.
  • So even if “compression (efficiency) reaches its limit,” the logic is that if the actual amount AI is used (billing/usage/workflow) keeps growing, infrastructure demand can remain strong.
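The Jevons logic above reduces to simple elasticity arithmetic. The sketch below uses my own illustrative numbers (not data from the interview): if usage responds more than proportionally to a price cut, total spend rises even as the unit price falls.

```python
# Jevons-paradox sketch: falling unit cost, rising total spend.
# All numbers are illustrative assumptions, not data from the interview.

def total_spend(unit_cost, baseline_usage, elasticity, cost_cut):
    """Total spend after a price cut, with constant-elasticity usage response."""
    new_cost = unit_cost * (1 - cost_cut)
    new_usage = baseline_usage * (unit_cost / new_cost) ** elasticity
    return new_cost * new_usage

before = 1.00 * 100                                       # $1/unit * 100 units
after = total_spend(1.00, 100, elasticity=1.5, cost_cut=0.5)

print(f"before: ${before:.0f}, after 50% cost cut: ${after:.0f}")
# With elasticity > 1, halving the cost more than doubles usage,
# so total demand (spend) grows: efficiency gains feed demand.
```

The same arithmetic run with elasticity below 1 would show spend shrinking, which is exactly the scenario the interview argues against for AI workloads.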

5) Where does the “smart money” move?

  • The conclusion is “some of it moves from infrastructure to services.”
  • However, because services also carry a higher risk of failure (fragmentation), the flow appears alongside strategies that bundle and diversify risk rather than betting on specific winners.
  • And the hint people refer to at this point is the “private-market AI fund track” that’s popular in the U.S. these days.

6) Three types of private-market AI fund tracks popular in the U.S. + ‘AI defense’

  • The categories of private funds mentioned in the interview can be broadly organized like this.

6-1) AI infrastructure funds

  • Investing in the costs required to run AI, such as data centers/semiconductors/power/networks

6-2) AI power (electricity) funds

  • With electricity and power infrastructure becoming bottlenecks in the AI era, the viewpoint that “the flow of money goes to power as well”

6-3) AI services funds (diversifying fragmentation risk)

  • Services spread across various industries such as healthcare, forecasting, payments/trading, and prediction markets (e.g., Polymarket)
  • The core here is “It’s too hard to win in services → you need an ETF/basket-style approach”

6-4) AI ‘future war’ (defense-linked) themed funds

  • A flow where companies like Palantir bring lists of acquisition targets
  • Expectations reflected that “technology + data + decision-making” could become important in war/national strategy

7) Why services get renewed attention in 2026~2027: “You have to make money at the end”

  • The big frame from the interview is this.
  • The infrastructure/model side has already shown its potential, but the place where money is actually made is the service end.
  • So a logic appears: in 2026~2027, services need to show revenue/cash flow so that investment in infrastructure and models can keep going.
  • In other words, it’s read as a signal that the important thing is the timing of monetization, not that it just ends as a theme-driven rally in the short term.

8) A realistic answer to “Should SpaceX/Anthropic be added no matter what?”

  • One hotly debated question: “If there’s an opportunity before listing, do you have to get in no matter what?”
  • The answer was conditional rather than emotional.

8-1) Anthropic (private) secondary deal: ‘If it’s a long-term play, it could make sense’

  • There was mention of private-market trading/secondary share opportunities, and the interviewer/CEO perspective is “if you’re going long-term, it’s worth considering.”
  • The point, however, is “it’s about an individual’s judgment.”
  • The core point the CEO made: don’t be swayed by urgency (or by the name itself); check whether your own assumptions actually hold.

8-2) SpaceX is related not only to ‘space’ but also to AI infrastructure

  • In the interview, they viewed SpaceX as having a connection point to AI infrastructure such as data centers/networks, not just launch vehicles.
  • And it’s wrapped up in a way like: “Space may look like fiction, but it’s actually an area where money flows.”

9) How to choose “what the next big tech will be”: the conditions for superstars in a domain

  • The advice here is quite practical.
  • It’s not enough to say “the technology is great”—you have to look at the following.

9-1) Can you solve the problem ‘perfectly’ enough to become a domain superstar?

  • The perspective is that to become a leader in a domain (industry/sector), you need to strongly capture a specific bottleneck/process, not just something ‘general-purpose.’

9-2) If that company stops, can you expand to the next stage (durability)?

  • There’s talk about needing to see team/product/technology potential that can expand beyond a single win—like AlphaMA.

9-3) Unlike infrastructure where “the players are predetermined,” services are ‘fragmented’

  • Infrastructure has relatively clear top players.
  • With services, on the other hand, it’s emphasized that the probability of picking correctly is lower, because too many companies could plausibly survive.

10) Two things investors especially should look at: ecosystem lock-in & gateway

  • The interview explains the “reason for investing in Nvidia” using an example.

10-1) Ecosystem lock-in (a structure where switching to alternatives raises costs/transition friction)

  • The context isn’t just “the GPUs are good”; it’s whether, within the workflow, they become an ‘effectively mandatory passage.’

10-2) Gateway (a business you can’t avoid)

  • It’s summarized as a request to check whether there’s a structure that you “must go through.”
  • From this perspective, the argument is that companies securing the ‘mandatory path’ (a passage you cannot avoid) are more likely to become the next giant.

11) Why ‘private-market AI ETFs/baskets’ are trending: structurally reducing service risk

  • This part is genuinely practical.
  • Individual private-market services inevitably have a lower success rate.
  • So strategies appear like “don’t put all your wealth into one company; bundle multiple companies and hold only the top tier.”

11-1) A Korean-style ETF sensibility could become even stronger in private markets too

  • In the interview, using examples like Stripe/Databricks, they connect this to the logic that “there are private companies where huge enterprise value has already formed, and they may also stall after listing.”
  • Ultimately, the point is that the basket absorbs variables like a bubble before listing and a correction after listing.

11-2) The structure: even if 8 fail, if 2 succeed the fund goes up

  • Because services are fragmented, winner-take-all outcomes are common, and the basket strategy is a way to harness that winner-take-all dynamic.
  • So there’s mention of a flow where concepts like KETF (and similar ideas) are becoming popular in the U.S. as well.
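The “even if 8 fail, 2 succeed” structure is just power-law arithmetic. A minimal sketch with assumed payoff multiples of my own (not fund data):

```python
# Basket payoff sketch for a winner-take-all service market.
# All payoff multiples are illustrative assumptions, not fund data.

def basket_multiple(payoffs):
    """Equal-weight basket return as a multiple of invested capital."""
    return sum(payoffs) / len(payoffs)

# 10 names: 8 near-total failures (0.1x) and 2 big winners (15x).
payoffs = [0.1] * 8 + [15.0] * 2
print(f"basket: {basket_multiple(payoffs):.2f}x")

# A single concentrated bet here loses ~90% of capital 8 times out of 10;
# the equal-weight basket turns the same distribution into a 3.08x multiple.
```

This is why fragmentation pushes private-market service exposure toward baskets: the basket converts a lottery-like distribution into a dependable average, provided the winners’ multiples are large enough.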

12) The most important content in this article that “people don’t really talk about elsewhere” (separate summary)

  • The core is not that ‘AI infrastructure will rise forever,’ but the asymmetric structure of “a compression limit (efficiency has a ceiling) vs. unbounded demand (usage has no ceiling).”
  • Because of this structure, memory/cache efficiency will certainly hit a limit, but if the total amount of AI usage keeps growing (enough to offset compute costs such as TurboCompute), infrastructure investment is more likely to endure even through short-term corrections.
  • And that is where investment strategy diverges:
  • infrastructure is where “the players are predetermined”
  • services are “fragmented” → so risk diversification via baskets, rather than individual bets, gains strength

13) The main conclusion you should take away (investment perspective summary)

  • Saying “the LLM (model) game is over” may be an exaggeration, but the direction of money movement in the market is clear.
  • Moving forward, it looks like the checklist should be changed along these lines.
  • For semiconductors, power, and data centers, look at both the “efficiency ceiling” and the “demand expansion” at the same time
  • For services, pay attention to the possibility that monetization metrics will become important again in 2026~2027
  • For services, diversify risk with a basket (private-market AI ETF/fund) rather than trying to pick individual winners
  • When choosing the next big tech, check whether it has a “gateway (mandatory passage) + lock-in (ecosystem entrenchment)”

SEO core keywords reflected naturally (5 words you’d be sorry to miss)

  • It’s good to read this article focusing on the following flow.
  • AI semiconductors
  • data centers
  • memory efficiency improvements
  • AI services
  • private-market ETF

< Summary >

  • Memory efficiency improvements have reached a technical limit (the compression ceiling), but total AI usage keeps growing, so demand (the usage ceiling) doesn’t break; this asymmetry is what the interview emphasizes.
  • Investment money is generally moving from infrastructure toward some services, and especially in 2026~2027, service monetization could draw renewed attention.
  • Because the service market is highly fragmented and it’s hard to choose individual targets, a flow is emerging where risk-diversifying strategies in the form of private-market AI funds/baskets are gaining popularity.
  • When picking the next tech, the core advice is to check the domain-superstar conditions (sharpness in solving a problem) + ecosystem lock-in + gateway (you can’t avoid it).
  • As for pre-IPO opportunities like SpaceX/Anthropic, the conclusion is that you need to choose carefully from the perspective of “long-term/risk/opportunity cost,” not emotion.

[Related articles…]Latest article on Anthropic secondary shares/pre-listing investment strategy
Latest article on AI semiconductor demand/supply cycles and outlook

*Source: [ 티타임즈TV ]

– “Memory efficiency has come as far as it can. But demand has no upside ceiling.” (Cho Yong-min, CEO of Unbound Labs / 조용민 언바운드랩스 대표)


