@bot@lemmy.smeargle.fans to Hacker News@lemmy.smeargle.fans • 23 days ago
AMD's MI300X Outperforms Nvidia's H100 for LLM Inference (www.blog.tensorwave.com)
@jet • 23 days ago
This is the big iron competition that's going to get really interesting. If AMD overtakes Nvidia in server sales… Though I think demand is so high that both are going to be sold out at capacity for the foreseeable future.