bot@lemmy.smeargle.fans to Hacker News@lemmy.smeargle.fans · 5 months ago — AMD's MI300X Outperforms Nvidia's H100 for LLM Inference (www.blog.tensorwave.com)
jet · 5 months ago
This is the big iron competition that’s going to get really interesting.
If AMD overtakes Nvidia in server sales…
Though I think demand is so high that both are going to be sold out at capacity for the foreseeable future.