
theluddite (edited)

You can’t use an LLM this way in the real world. It’s not possible to make an LLM trade stocks by itself. Real human beings need to be involved. Stock brokers have to complete mandatory regulatory training, and get licenses and fill out forms, and incorporate businesses, and get insurance, and do a bunch of human shit. There is no code you could write that would get ChatGPT liability insurance. All that is just the stock trading – we haven’t even discussed how an LLM would receive insider trading tips on its own. How would that even happen?

If you were to do this in the real world, you’d need a human being to set up a ton of stuff. That person is responsible for making sure it follows the rules, just like they are for any other computer system.

On top of that, you don’t need to do this research to understand that you should not let LLMs make decisions like this. You wouldn’t even let low-level employees make decisions like this! Like I said, we know how LLMs work, and that’s enough. For example, you don’t need to run an experiment to decide whether flipping coins is a good way to determine if someone should get healthcare, because the coin-flipping mechanism is well understood, and that mechanism is not suitable for healthcare decisions. LLMs are more complicated than coin flips, but we still understand the underlying mechanism well enough to know that this isn’t a proper use for them.
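The coin-flip point can be made concrete: a decision procedure whose output is statistically independent of its input cannot encode any judgment about the case, so no outcome experiment is needed to reject it. A minimal sketch (the function name and toy records are hypothetical, just for illustration):

```python
import random

def coin_flip_decision(patient_record: dict) -> bool:
    """Decide by coin flip: the input is never consulted."""
    return random.random() < 0.5

# Because the mechanism ignores its input, two completely different
# cases produce identical decision sequences under the same seed.
random.seed(0)
a = [coin_flip_decision({"condition": "severe"}) for _ in range(10)]
random.seed(0)
b = [coin_flip_decision({"condition": "mild"}) for _ in range(10)]
print(a == b)  # prints True: the input had no effect on the decisions
```

Understanding the mechanism alone is enough to rule it out, which is the same argument being made about LLMs: you don’t need a trading experiment to know the tool is unfit for the job.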
