
CeeBee_Eh, (edited)

Like fuck it is. An LLM “learns” by memorization and by breaking down training data into its component tokens, then calculating the weights between these tokens.

But this is, at a very basic, fundamental level, how biological brains learn. It’s not the whole story, but it is a part of it.
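To make the “weights between tokens” idea concrete, here is a deliberately crude caricature: a toy bigram counter (my own illustration, not how a transformer actually works — real LLMs learn dense weights over huge vocabularies, not raw co-occurrence counts):

```python
from collections import Counter, defaultdict

# Toy "bigram" model: count how often each token follows another.
# A caricature of "calculating the weight between tokens" -- real LLMs
# learn transformer parameters by gradient descent, not raw counts.
corpus = "2 + 2 = 4 . 2 + 2 = 4 . 3 + 1 = 4 .".split()

counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def predict(token):
    """Return the most frequent next token after `token` in the corpus."""
    return counts[token].most_common(1)[0][0]

print(predict("="))  # "4" -- the strongest association after "=" here
```

Even this trivial counter captures the statistical-association half of the picture; the disagreement in this thread is over whether that association-learning counts as “intelligence.”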

there’s no actual intelligence, just really, really fancy fuzzy math.

You mean sapience or consciousness. Or you could say “human-level intelligence”. But LLMs by definition have real “actual” intelligence, just not a lot of it.

Edit for the lowest common denominator: I’m suggesting a more accurate way of phrasing the sentence, such as “there’s no actual sapience” or “there’s no actual consciousness”. /end-edit

an LLM would learn “2+2 = 4” by ingesting tens or hundreds of thousands of instances of the string “2+2 = 4” and calculating a strong relationship between the tokens “2+2,” “=,” and “4,”

This isn’t true. At all. There are math-specific benchmarks made by experts to specifically test the problem-solving and domain-specific capabilities of LLMs. And you can be sure they aren’t “what’s 2 + 2?”

I’m not here to make any claims about the ethics or legality of the training. All I’m commenting on is the science behind LLMs.
