
theterrasque,

You can probably run a 7b LLM comfortably in system RAM, maybe even one of the smaller 13b ones.

Software to use
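One common way to run these GGML models on CPU is llama.cpp, for example through the llama-cpp-python bindings. A minimal sketch, assuming a llama-cpp-python version that still reads GGML files (the model filename, context size and thread count are just illustrative):

```python
# Minimal sketch: load a quantized 7b GGML model into system RAM and query it.
# Requires: pip install llama-cpp-python (a GGML-era release)
from llama_cpp import Llama

llm = Llama(
    model_path="./models/vicuna-7b.ggmlv3.q4_0.bin",  # illustrative name; use whatever GGML file you downloaded
    n_ctx=2048,    # context window in tokens
    n_threads=8,   # CPU threads to use for inference
)

output = llm(
    "Q: What is the capital of Norway?\nA:",
    max_tokens=64,
    stop=["\n", "Q:"],  # stop once the answer line is done
)
print(output["choices"][0]["text"].strip())
```

Quantized q4 variants keep a 7b model at roughly 4 GB on disk and in memory, which is what makes plain system RAM workable.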

Models

In general, you want small GGML models. huggingface.co/TheBloke has a lot of them. There are some SuperHOT versions of models, but I'd avoid them for now. They're trained to handle bigger context sizes, but it seems that made them dumber too. There's a lot of new work coming out on bigger context lengths, so you should probably revisit that when you actually need it.
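Individual model files from TheBloke's repos can also be fetched with the huggingface_hub library; a small sketch, where the repo and file names are examples to adapt from the actual model page:

```python
# Sketch: download one quantized GGML file from a TheBloke repo on Hugging Face.
# Requires: pip install huggingface_hub
from huggingface_hub import hf_hub_download

# Repo and filename are examples; check the "Files" tab of the model page
# for the exact quantization you want (q4_0, q4_K_M, q5_K_M, ...).
model_path = hf_hub_download(
    repo_id="TheBloke/vicuna-7B-v1.3-GGML",
    filename="vicuna-7b-v1.3.ggmlv3.q4_K_M.bin",
)
print(model_path)  # local path you can pass to your runner as model_path
```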

Each has its own strengths: Orca is supposed to be better at reasoning, Airoboros is good at longer, more story-like answers, Vicuna is a very good all-rounder, and WizardLM is another notably good all-rounder.

For training, there are some tricks like QLoRA, but the results aren't impressive from what I've read. Also, it can be pretty difficult to get the results you want when training LLMs. You should probably start with just running them and get comfortable with that, maybe try few-shot prompts (prompts with a few examples of the writing style you want), and then go from there.
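For illustration, a few-shot prompt is just the examples pasted in front of the new input; a sketch that reuses the `llm` object from the earlier snippet (the task and examples are made up):

```python
# A few-shot prompt: a couple of examples of the desired style,
# followed by the new input for the model to complete.
few_shot_prompt = """Rewrite each sentence in a formal tone.

Input: gonna be late, traffic is nuts
Output: I will be arriving late due to heavy traffic.

Input: that meeting was a waste of time
Output: The meeting was not a productive use of our time.

Input: can u send me the file
Output:"""

# `llm` is the Llama instance from the sketch above.
result = llm(few_shot_prompt, max_tokens=64, stop=["\nInput:", "\n\n"])
print(result["choices"][0]["text"].strip())
```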
