HumaShah, to technology in Researchers claim GPT-4 passed the Turing test
@HumaShah@mastodon.social avatar

@tourist @vegeta

In the 1970s, psychiatrists couldn't distinguish between PARRY, a chatbot simulating paranoia, and a human patient suffering from schizophrenia. So this 'bot convinces judges that it's a human' phenomenon is not new and tells us more about how humans think.

appassionato, to bookstodon
@appassionato@mastodon.social avatar

The Book of Chatbots: From ELIZA to ChatGPT by Robert Ciesla, 2024

The Book of Chatbots is both a retrospective and a review of current artificial intelligence-driven conversational solutions. It explores their appeal to businesses and individuals as well as their broader social aspects, including the impact on academia. The book explains all relevant concepts for readers with no previous knowledge of these topics.

@bookstodon

mimarek, to academicchatter
@mimarek@universeodon.com avatar

The popularization of AI chatbots has not boosted overall cheating rates in high schools, according to new research from Stanford University.

About 60% to 70% of surveyed students reported engaging in cheating behavior in the past month, a rate no higher than before chatbots became available.

https://edition.cnn.com/2023/12/13/tech/chatgpt-did-not-increase-cheating-in-high-schools/index.html

@academicchatter

estelle, to random
@estelle@techhub.social avatar

The terrible human toll in Gaza has many causes.
A chilling investigation by +972 highlights efficiency:

  1. An engineer: “When a 3-year-old girl is killed in a home in Gaza, it’s because someone in the army decided it wasn’t a big deal for her to be killed.”

  2. An AI outputs "100 targets a day", like a factory whose product is murder:

"According to the investigation, another reason for the large number of targets, and the extensive harm to civilian life in Gaza, is the widespread use of a system called “Habsora” (“The Gospel”), which is largely built on artificial intelligence and can “generate” targets almost automatically at a rate that far exceeds what was previously possible. This AI system, as described by a former intelligence officer, essentially facilitates a “mass assassination factory.”"

  3. "The third is “power targets,” which includes high-rises and residential towers in the heart of cities, and public buildings such as universities, banks, and government offices."

🧶

estelle OP,
@estelle@techhub.social avatar

In 2019, the Israeli army created a special unit to create targets with the help of generative AI. Its objective: volume, volume, volume.
The effects on civilians (harm, suffering, death) are not a priority: https://www.972mag.com/mass-assassination-factory-israel-calculated-bombing-gaza/

@dataGovernance @data @ethics @sociology
@ai @psychology @socialpsych

