It will seem obvious in hindsight...
Daniel Newman · Aug 11, 03:25
I don't think people fully appreciate how much demand inference will create for AI infrastructure. Training trillion-parameter models will pale in comparison to trillions of concurrent tokens running 24/7/365 across every business, organization, and consumer activity. I'm very bullish on this expansion. And, of course, $NVDA will do well, but names in chips, servers, power, networking, and agentic platforms will rise along with it. 💪🏻 Mark it 😎🚀
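To put the training-versus-inference comparison in rough numbers, here is a minimal back-of-envelope sketch. All inputs (a 1T-parameter dense model, ~20T training tokens, ~1T served tokens per day) are illustrative assumptions, not figures from the post; the ~6ND training and ~2N-per-token inference FLOP counts are standard rules of thumb for dense transformers.

```python
# Back-of-envelope comparison of training vs. inference compute.
# All inputs are illustrative assumptions, not figures from the thread.

N_PARAMS = 1e12              # ~1T model parameters (assumption)
TRAIN_TOKENS = 2e13          # ~20T training tokens (assumption)
INFER_TOKENS_PER_DAY = 1e12  # ~1T tokens served per day, all users combined (assumption)

# Rules of thumb for dense transformers:
#   training  ~ 6 * N * D   FLOPs (one-time)
#   inference ~ 2 * N       FLOPs per generated token (recurring)
train_flops = 6 * N_PARAMS * TRAIN_TOKENS
infer_flops_per_day = 2 * N_PARAMS * INFER_TOKENS_PER_DAY

days_to_match_training = train_flops / infer_flops_per_day
yearly_ratio = (infer_flops_per_day * 365) / train_flops

print(f"Training compute:           {train_flops:.2e} FLOPs")
print(f"Inference compute per day:  {infer_flops_per_day:.2e} FLOPs")
print(f"Days of serving to equal one training run: {days_to_match_training:.0f}")
print(f"One year of serving vs. one training run:  {yearly_ratio:.1f}x")
```

Under these assumed volumes, about two months of serving already matches the one-time training run, and a year of serving is several multiples of it, which is the sense in which inference demand can dwarf training.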
@ShanuMathew93 @danielnewmanUV ♥️👇🏽
Shanu Mathew · Aug 4, 23:39
Highlights from AMZN call:
- AI Infrastructure Capacity Shortfall: AWS has "more demand than we have capacity," with power being the single biggest constraint; expects supply issues to persist for "several quarters" despite $31.4 billion quarterly CapEx in "chips, data centers and power"
- Multi-Year Infrastructure Buildout: acknowledges building sufficient AI capacity will take "several quarters" but expects improvement each quarter; views AI as the "biggest technology transformation in our lifetime," requiring sustained heavy investment
- Inference Economics Driving Strategy: expects 80-90% of AI costs will shift from training to inference at scale (see the sketch after this list); positions custom silicon as a critical advantage, since customers will anchor on better price performance as they did with CPUs
- Massive Revenue Opportunity, Capacity-Constrained: AWS at a $123 billion annualized run rate but "could be doing more revenue and helping customers more" if capacity existed; generative AI already a "triple-digit year-over-year percentage multibillion-dollar business"
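As a rough illustration of why an 80-90% inference share is plausible: if serving compute runs at a few multiples of training compute per year (as in the earlier sketch), inference quickly dominates lifetime cost. The inputs below are illustrative assumptions, not AWS numbers.

```python
# Rough illustration of inference's share of lifetime AI cost.
# Inputs are illustrative assumptions; only the 80-90% target band
# comes from the call summary above.

train_cost = 1.0               # one-time training cost, normalized to 1 (assumption)
inference_cost_per_year = 6.0  # yearly serving cost relative to training (assumption)

for years in (1, 2, 3):
    inference_cost = inference_cost_per_year * years
    share = inference_cost / (train_cost + inference_cost)
    print(f"After {years} year(s) of serving, inference is {share:.0%} of total cost")
```

With these assumed ratios the inference share lands in the high-80s to low-90s within a year or two, consistent with the band cited on the call.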