“But it also takes a lot of energy to train a human,” Altman said. “It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you.”

So in his view, the fair comparison is, “If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way.”

  • wonderingwanderer@sopuli.xyz · 1 month ago
    Take off your rose-colored glasses. The billionaires control the implementation, and they don’t care about universal healthcare or education, and certainly not reducing wealth inequality.

    Also, LLMs will never evolve into AGI, because they’re fundamentally different processes. The human brain has many parts, and Broca’s area is a very small one that cannot generalize its functions to perform the tasks of all the other areas of the brain.

    And ASI is pure sci-fi. Circuits will never be sentient; the best they can ever do is mimic sentience.

    • qualia@lemmy.world · 1 month ago
      • The most common way people give up their power is by thinking they don’t have any. The corrupting nature of power is known by the majority. Billionaires just ignore it because capitalism rewards executives who exhibit psychopathic symptoms.

      • I don’t believe anyone’s arguing LLMs will evolve into AGI.

      • That’s an assumption of substrate dependence. What’s so special about the matter (rather than the process) that facilitates human cognition that makes it impossible to occur in other materials?