I recently tried to calculate this for my company. I wouldn’t call it negligible, but the impact of all video calls turned out to be much greater than the impact of AI.
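For anyone who wants to reproduce that kind of comparison, here is a minimal back-of-the-envelope sketch. Every number in it (per-hour and per-query energy, usage volumes) is a made-up placeholder, not a figure from my calculation:

    # Back-of-the-envelope comparison of video-call vs. AI energy use.
    # All constants below are illustrative assumptions, not measured data.

    KWH_PER_VIDEO_CALL_HOUR = 0.15   # assumed: devices + network + servers
    KWH_PER_AI_QUERY = 0.003         # assumed: one LLM inference request

    video_call_hours_per_year = 50_000   # assumed company-wide usage
    ai_queries_per_year = 500_000        # assumed company-wide usage

    video_kwh = video_call_hours_per_year * KWH_PER_VIDEO_CALL_HOUR
    ai_kwh = ai_queries_per_year * KWH_PER_AI_QUERY

    print(f"Video calls: {video_kwh:,.0f} kWh/year")
    print(f"AI queries:  {ai_kwh:,.0f} kWh/year")
    print(f"Ratio (video / AI): {video_kwh / ai_kwh:.1f}x")

Swap in your own usage numbers and per-unit estimates; the conclusion depends entirely on those inputs.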
That’s the problem with the point of reference. Your individual queries might not consume much, especially compared to the training run, but the more people use it, the larger the total consumption becomes. At some point, running those models will consume more than training them did.
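To make that crossover concrete: with a one-off training cost and a roughly constant per-query inference cost, the break-even point is just training energy divided by per-query energy. A tiny sketch with placeholder numbers (none of these are real figures for any particular model):

    # Break-even between one-off training energy and cumulative inference
    # energy. All figures are placeholder assumptions.

    TRAINING_ENERGY_KWH = 1_000_000   # assumed one-off training run
    ENERGY_PER_QUERY_KWH = 0.003      # assumed single inference request
    QUERIES_PER_DAY = 50_000_000      # assumed aggregate daily usage

    crossover_queries = TRAINING_ENERGY_KWH / ENERGY_PER_QUERY_KWH
    days_to_crossover = crossover_queries / QUERIES_PER_DAY

    print(f"Inference overtakes training after {crossover_queries:,.0f} queries")
    print(f"(about {days_to_crossover:.0f} days at the assumed usage rate)")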
skavau is a clown
https://doi.org/10.1145%2F3630106.3658542
That’s old data. Inference uses more than training now, since usage has gone up significantly; they traded places in March or April 2025.
We passed that point last year, yes.