It will become clear in hindsight...
Daniel Newman · 11 Aug, 03:25
I don't think people fully appreciate how much demand inference will create for AI infrastructure. Training trillion-parameter models will pale in comparison to the volumes of trillions of concurrent tokens running 24/7/365 across every enterprise, organization, and consumer activity. I am so bullish on this buildout. And of course $NVDA is doing well, but names in chips, servers, power, networking, and agentic platforms will all rise along with it. 💪🏻 Mark it 😎🚀
@ShanuMathew93 @danielnewmanUV ♥️👇🏽
Shanu Mathew · 4 Aug, 23:39
Highlights from the AMZN call:
- AI Infrastructure Capacity Shortfall: AWS has "more demand than we have capacity," with power the single biggest constraint; expects supply issues to persist for "several quarters" despite $31.4 billion in quarterly CapEx on "chips, data centers and power"
- Multi-Year Infrastructure Buildout: Acknowledges that building sufficient AI capacity will take "several quarters" but expects improvement each quarter; views AI as the "biggest technology transformation in our lifetime," requiring sustained heavy investment
- Inference Economics Driving Strategy: Expects 80-90% of AI costs to shift from training to inference at scale; positions custom silicon as a critical advantage, since customers will anchor on better price performance as they did with CPUs
- Massive Revenue Opportunity Constrained: AWS is at a $123 billion annualized run rate but "could be doing more revenue and helping customers more" if capacity existed; generative AI is already a "triple-digit year-over-year percentage multibillion-dollar business"