this post was submitted on 21 Sep 2024
52 points (78.9% liked)

Asklemmy


A loosely moderated place to ask open-ended questions


Wondering if modern LLMs like GPT-4, Claude Sonnet, and Llama 3 are closer to human intelligence or to a next-word predictor. Also not sure if this graph is the right way to visualize it.

[–] todd_bonzalez@lemm.ee 3 points 2 months ago (3 children)

Human intelligence created language. We taught it to ourselves. That's a higher order of intelligence than a next-word predictor.

[–] Sl00k@programming.dev 5 points 2 months ago

I can't seem to find it now, but there was a research paper floating around about two GPT models designing a language to use between each other for token efficiency, while still relaying all the information across, which is pretty wild.

Not sure if it was peer-reviewed, though.
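As a rough illustration of the token-efficiency idea (this is not the paper's actual method; the compressed "language" below is made up, and whitespace splitting is a crude stand-in for a real tokenizer), a shorthand two models agree on can carry the same fields in far fewer tokens:

```python
# Illustrative sketch only: a made-up shorthand two agents might agree on,
# compared against plain English. Token counts here are just whitespace
# splits, a crude proxy for a real BPE tokenizer.

plain = "The meeting is scheduled for Tuesday at three in the afternoon in room four"
shorthand = "mtg Tue 15:00 rm4"  # same information, agreed-upon compressed form

def token_count(text: str) -> int:
    """Crude token count: split on whitespace."""
    return len(text.split())

print(token_count(plain))      # 14
print(token_count(shorthand))  # 4
```

The interesting part in the paper, as I remember it, was that the models converged on a shared encoding like this on their own rather than being handed one.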

[–] sunbeam60@lemmy.one 3 points 2 months ago (1 children)

That’s like treating “which came first, the chicken or the egg?” as a serious question.

Eggs existed long before chickens evolved.

[–] CanadaPlus@lemmy.sdf.org 1 points 2 months ago

I mean, to the same degree we created hands. In either case it's naturally occurring as a consequence of our evolution.