0laura,

I’m talking about LoRA, not LoRa. I’m a fan of both, though. I’ve been considering getting a Lilygo T-Echo to run Meshtastic for a while. Maybe I’ll build a solar-powered RC plane and put a Meshtastic repeater in it; seems like a cool project.

en.wikipedia.org/…/Fine-tuning_(deep_learning)

Low-rank adaptation (LoRA) is an adapter-based technique for efficiently fine-tuning models. The basic idea is to design a low-rank matrix that is then added to the original matrix.[13] An adapter, in this context, is a collection of low-rank matrices which, when added to a base model, produces a fine-tuned model. It allows for performance that approaches full-model fine-tuning with a smaller space requirement. A language model with billions of parameters may be LoRA fine-tuned with only several million parameters.
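The core idea can be sketched in a few lines of NumPy. This is an illustrative toy, not a real training loop: the layer sizes and rank are made-up numbers, and the adapter matrices would normally be learned by gradient descent rather than initialized and left alone.

```python
import numpy as np

# LoRA sketch: a frozen base weight matrix W gets a trainable low-rank
# update B @ A added on top. Only A and B (the "adapter") are trained
# and stored, not a full copy of W.

d_out, d_in, r = 1024, 1024, 8  # hypothetical layer size and LoRA rank

rng = np.random.default_rng(0)
W = rng.standard_normal((d_out, d_in))      # frozen base weights
A = rng.standard_normal((r, d_in)) * 0.01   # trainable, rank r
B = np.zeros((d_out, r))                    # trainable, starts at zero so
                                            # the adapted model == base model

W_adapted = W + B @ A  # fine-tuned weights = base + low-rank update

base_params = W.size            # 1,048,576
adapter_params = A.size + B.size  # 16,384 — a 64x reduction here
print(base_params, adapter_params)
```

Because B starts at zero, the adapter initially changes nothing; training then only has to move the small A and B matrices, which is why a billions-of-parameters model can be fine-tuned by storing just a few million adapter parameters.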
