This will be obvious in hindsight…
Daniel Newman · 11/08, 03:25
I don't think people fully appreciate how much demand inference will create for AI infrastructure. Training trillion-parameter models will look insignificant next to trillions of concurrent tokens running 24/7/365 across every business, organization, and consumer activity. I'm very bullish on this development. And of course $NVDA does well, but names in chips, servers, power, networking, and agentic platforms will rise with it too. 💪🏻 Mark this 😎🚀
@ShanuMathew93 @danielnewmanUV ♥️👇🏽
Shanu Mathew · 4/08, 23:39
Highlights from AMZN call:
- AI Infrastructure Capacity Shortfall: AWS has "more demand than we have capacity," with power the single biggest constraint; expects supply issues to persist for "several quarters" despite $31.4 billion quarterly CapEx in "chips, data centers and power"
- Multi-Year Infrastructure Buildout: Acknowledges building sufficient AI capacity will take "several quarters" but expects improvement each quarter; views AI as the "biggest technology transformation in our lifetime," requiring sustained heavy investment
- Inference Economics Driving Strategy: Expects 80-90% of AI costs to shift from training to inference at scale; positions custom silicon as a critical advantage, since customers will anchor on better price performance as they did with CPUs
- Massive Revenue Opportunity, Capacity Constrained: AWS is at a $123 billion annualized run rate but "could be doing more revenue and helping customers more" if capacity existed; generative AI is already a "triple-digit year-over-year percentage multibillion-dollar business"