
theterrasque, (edited)

LLMs don’t ingest information as such. The text gets broken into tokens (parts of words; “catch” might become “cat” + “ch”, for example), which are then run through training. Training basically learns the statistical likelihood of which token follows a given sequence of existing tokens. It’s in some ways similar to a Markov chain, but of course much more complex: it has layers of statistics, and preprocessing that can figure out which tokens in the input text should get higher precedence.
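To make the Markov chain comparison concrete, here’s a toy sketch in Python. It works on whole words rather than subword tokens and only looks at one previous token, so it’s vastly simpler than an actual LLM, but it shows the “learn which token tends to follow which” idea:

```python
import random
from collections import Counter, defaultdict

# Toy illustration (not how a real LLM works): count how often each token
# follows a given context, then sample the next token from those counts.
corpus = "the cat sat on the mat the cat ate the fish".split()

# Build next-token frequency counts for a context of one previous token.
transitions = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    transitions[prev][nxt] += 1

def next_token(prev):
    """Sample the next token in proportion to how often it followed `prev`."""
    counts = transitions[prev]
    tokens, weights = zip(*counts.items())
    return random.choices(tokens, weights=weights)[0]

# Generate a short continuation starting from "the".
token = "the"
output = [token]
for _ in range(6):
    token = next_token(token)
    output.append(token)
print(" ".join(output))
```

An LLM replaces that lookup table with billions of learned parameters and a much longer context, but the job is the same: predict the next token given the tokens so far.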

Basically, the more parameters, the more (and subtler) patterns it can learn. Smaller models are often trained on fewer tokens than bigger ones, but it’s still a massive amount. IIRC it’s something like 1T tokens for the 7b and 13b models, and 1.4T tokens for 33b and 65b. For comparison with the models I linked, ChatGPT 3.5 is rumored to be 175b parameters.

In addition to parameter count, you have quantization of the numbers. Originally each parameter in a model is a 16-bit float; it turns out you can reduce that to an 8-bit int, or even 4 or 3 bits, without too much of a hit to quality. There are different ways to quantize the parameters, with varying impact on the “smartness” of the model. By reducing the resolution of the numbers, the memory needed for the model is reduced, and in some cases the speed of running it is increased.
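As a rough back-of-the-envelope sketch of why that matters for running these models locally (weights only, ignoring context/KV cache and the small per-format overhead of real quantization schemes):

```python
# Approximate memory needed just to hold the model weights.
def model_size_gib(params_billions, bits_per_param):
    bytes_total = params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

for params in (7, 13, 33, 65):
    for bits in (16, 8, 4):
        print(f"{params:>3}b @ {bits:>2}-bit: ~{model_size_gib(params, bits):5.1f} GiB")
```

So a 13b model goes from roughly 24 GiB at 16-bit down to about 6 GiB at 4-bit, which is the difference between needing a big GPU and fitting on ordinary consumer hardware.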

When it comes to training, the best results have been achieved with full 16-bit floats, but there are techniques to train on quantized models too. The results I’ve seen from that are less impressive, but it’s been a while since I last looked at it.

Edit: I mentioned QLoRA previously, which is for training quantized models. I think that’s only available on GPU, though.

Edit 2: This might be a better Markov chain explanation than the previous link.
