Pro@programming.dev to Technology@lemmy.world · English · 1 day ago
The Collapse of GPT: Will future artificial intelligence systems perform increasingly poorly due to AI-generated material in their training data? (cacm.acm.org)
52 comments
Grandwolf319@sh.itjust.works · 16 hours ago
Maybe, but even if that's not an issue, there is a bigger one: the law of diminishing returns. Doubling performance takes far more than double the data. Right now LLMs aren't profitable even though throwing more data at them is their most efficient lever. All this AI craze has taught me is that the human brain is remarkably advanced given its performance, even though it runs on the energy of a light bulb.
AItoothbrush@lemmy.zip · 7 hours ago
It's very efficient specifically at what it does. Doing math in your brain is very inefficient, the same way doing brain stuff on a math machine is.
rottingleaf@lemmy.world · 8 hours ago
> All this AI craze has taught me is that the human brain is super advanced given its performance even though it takes the energy of a light bulb.

That seemed superficially obvious. The human brain is a system whose optimization took the energy of evolution since the start of life on Earth, that is, a vastly larger amount of data. It's like comparing a barrel of oil to a barrel of soured milk.
RaptorBenn@lemmy.world · 14 hours ago
If it weren't a fledgling technology with a lot more advancements still to be made, I'd worry about that.