SnausagesinaBlanket@lemmy.world to Ask Lemmy@lemmy.world · 4 days ago
Is there currently an accurate way to say how much power per prompt LLMs use?
lime!@feddit.nu · 3 days ago
While this is true in isolation, the number of users means that inference now uses more power than training for the large actors.
Michal@programming.dev · 3 days ago
The question is about per-prompt energy, so the number of users is not relevant. What matters more is the number of tokens in and out. If anything, more users will decrease power use per prompt through economies of scale.
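To illustrate the token-based reasoning above, here is a back-of-envelope sketch. Every number in it is an illustrative assumption (the per-token energy cost and the input/output cost ratio are made up for the example, not measured values from any provider); the point is only that per-prompt energy scales with token counts, not user counts.

```python
# Back-of-envelope sketch: per-prompt energy scales with tokens, not users.
# ASSUMPTIONS (not real measurements): ~0.3 Wh per 1k output tokens, and
# input tokens costing ~10x less per token since they are processed in
# parallel during prefill, while each output token needs a full forward pass.

def prompt_energy_wh(tokens_in: int, tokens_out: int,
                     wh_per_1k_output_tokens: float = 0.3) -> float:
    """Rough per-prompt energy estimate in watt-hours."""
    effective_tokens = tokens_in * 0.1 + tokens_out
    return effective_tokens * wh_per_1k_output_tokens / 1000

# A short chat turn vs. a long generation: the long one costs more
# per prompt, regardless of how many other users share the cluster.
short_turn = prompt_energy_wh(tokens_in=200, tokens_out=100)
long_turn = prompt_energy_wh(tokens_in=2000, tokens_out=1500)
```

Under these assumptions a prompt's cost is fixed by its own token counts; adding users raises total power but leaves the per-prompt figure unchanged (or lowers it, via better hardware utilization).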