There have been multiple accounts created with the sole purpose of posting advertisement posts or replies containing unsolicited advertising.

Accounts which solely or persistently post advertisements may be terminated.


Schnuckster , to random
@Schnuckster@beige.party avatar

This bloke is a long way from home too...

Schnuckster OP ,
@Schnuckster@beige.party avatar

This has reminded N and me of the Bermondsey-born hero of Inherit the Stars, a glorious bit of sci-fi writing which is long overdue a re-issue. 📖 @bookstodon

kim_harding , to random
@kim_harding@mastodon.scot avatar

Academic authors 'shocked' after Taylor & Francis sells access to their research to Microsoft AI
https://www.thebookseller.com/news/academic-authors-shocked-after-taylor--francis-sells-access-to-their-research-to-microsoft-ai
Authors have expressed their shock after the news that academic publisher Taylor & Francis, which owns Routledge, had sold access to its authors’ research as part of an Artificial Intelligence (AI) partnership with Microsoft—a deal worth almost £8m ($10m) in its first year.

NatureMC ,
@NatureMC@mastodon.online avatar

@kim_harding Important for @writing ⬆️

ajsadauskas , to technology
@ajsadauskas@aus.social avatar

It's time to call a spade a spade. ChatGPT isn't just hallucinating. It's a bullshit machine.

From TFA (thanks @mxtiffanyleigh for sharing):

"Bullshit is 'any utterance produced where a speaker has indifference towards the truth of the utterance'. That explanation, in turn, is divided into two "species": hard bullshit, which occurs when there is an agenda to mislead, or soft bullshit, which is uttered without agenda.

"ChatGPT is at minimum a soft bullshitter or a bullshit machine, because if it is not an agent then it can neither hold any attitudes towards truth nor towards deceiving hearers about its (or, perhaps more properly, its users') agenda."

https://futurism.com/the-byte/researchers-ai-chatgpt-hallucinations-terminology

@technology

FlorianSimon ,

Same experience

jerkface ,
@jerkface@lemmy.ca avatar

“Indifference” is a strange word to apply here. What would it mean for ChatGPT to “want” to be accurate?

skinnylatte , to random
@skinnylatte@hachyderm.io avatar

Just read John Rennie Short’s ‘Insurrection’, a good overview of what happened before, during, and after Jan 6. Short and concise.

I didn’t need to feel more depressed, but I also felt I needed to know more

RunRichRun ,
@RunRichRun@mastodon.social avatar

@skinnylatte
"Insurrection: What the January 6 Assault on the Capitol Reveals about America and Democracy"; John Rennie Short
— reviewed by Scott McLemee in "Inside Higher Ed":
https://www.insidehighered.com/opinion/views/intellectual-affairs/2024/05/17/review-john-rennie-shorts-insurrection-opinion
@bookstodon

glasspusher ,
@glasspusher@beige.party avatar

@RunRichRun @skinnylatte @bookstodon thanks. I put it on my list

ton , to random
@ton@social.coop avatar

Book meme: 20 books that have had an impact on who you are. One book a day for 20 days. No explanations, no reviews, just book covers. Alt text!

Day 1/20

"Illuminatus!" by Robert Shea and Robert Anton Wilson


@bookstodon

CommonMugwort ,
@CommonMugwort@social.coop avatar

@ton @bookstodon Oh, hell yes.

Lemniscata ,
@Lemniscata@mastodon.social avatar

@ton @bookstodon

I read that at art school, one of the best for understanding how comics work. ♥️

cian , to random
@cian@mstdn.science avatar

If the ultimate purpose of memory is to guide our actions in future, what is the point of episodic memory?

Why do we remember details of our past experiences?

RossGayler ,
@RossGayler@aus.social avatar

@cian @cogsci
Just off the top of my head (speculation alert)

"1) assume we have a limited storage capacity so aren't good at raw memorisation"
Maybe everything gets encoded and stored, but we aren't so good at retrieval/recall of specific episodes.
Maybe that poor exact episodic retrieval is a consequence of generalisation at retrieval.

"2) even after learning something, we tend to forget details over time."
Assuming we store new episodic memories over time, the accumulation of new memories might make it harder to retrieve specific old episodes through a generalisation at retrieval mechanism. (Also, even if parts of old episodes were to randomly disappear over time, that wouldn't necessarily stop a generalisation at retrieval mechanism. A good generalisation mechanism should be able to cope with partial records of episodes.)

"Neither of these really apply to ANNs?"
Well, you could include weight decay in an ANN and there is the phenomenon of catastrophic forgetting. However, I take the relevance of most current ANNs (feedforward, weight optimising networks) to cognitive concerns with a fairly large pinch of salt.
IMO the theoretical conceptual framework of most current ANNs doesn't make contact with cognitive concerns so you can't really ask these questions of them.
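A toy sketch of the "generalisation at retrieval" idea floated above (my own illustration, not from the thread): every episode is stored verbatim, and recall returns a similarity-weighted blend of stored episodes given a possibly partial cue. As similar episodes accumulate, recall of any single old episode blurs, even though nothing is ever deleted.

```python
# Toy model: episodes stored intact; "forgetting" emerges at retrieval.
from math import sqrt

def similarity(cue, episode):
    """Cosine similarity, ignoring features missing from the cue (None)."""
    pairs = [(c, e) for c, e in zip(cue, episode) if c is not None]
    dot = sum(c * e for c, e in pairs)
    nc = sqrt(sum(c * c for c, _ in pairs))
    ne = sqrt(sum(e * e for _, e in pairs))
    return dot / (nc * ne) if nc and ne else 0.0

def recall(cue, store):
    """Similarity-weighted blend of all stored episodes."""
    weights = [similarity(cue, ep) for ep in store]
    total = sum(weights) or 1.0
    dim = len(store[0])
    return [sum(w * ep[i] for w, ep in zip(weights, store)) / total
            for i in range(dim)]

store = [
    [1.0, 0.0, 1.0],   # episode A
    [1.0, 0.2, 0.9],   # episode B, overlapping with A
    [0.0, 1.0, 0.0],   # episode C, distinct
]
partial_cue = [1.0, None, None]  # a fragment shared by A and B
print(recall(partial_cue, store))  # a blur of A and B, not either one exactly
```

The partial cue matches A and B equally, so recall returns their average rather than either specific episode, which is the sense in which generalisation at retrieval degrades exact episodic recall without any storage loss.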

RossGayler ,
@RossGayler@aus.social avatar

@cian @cogsci

Plus there are cognitive science people who argue that analogy is the core of cognition.

Analogy is a mechanism for generalisation at retrieval and the stored episodes are the input to the analogical mechanism.

tek , to technology
@tek@calckey.world avatar

Switzerland mandates all software developed for the government be open sourced

Switzerland mandates software source code disclosure for public sector: A legal milestone

https://joinup.ec.europa.eu/collection/open-source-observatory-osor/news/new-open-source-law-switzerland

@technology

GregorTacTac ,
@GregorTacTac@lemm.ee avatar

Which country?

nerdschleife ,

India

tek , to technology
@tek@calckey.world avatar

Google faces Italian Antitrust Probe over Bundled/Linked Services

https://en.agcm.it/en/media/press-releases/2024/7/PS12714

@technology

NEW1_ , to fediverse
@NEW1_@mastodon.social avatar
slazer2au ,

Yep, that sure is a moon.

cabbage ,
@cabbage@piefed.social avatar

Nice natural satellite. Here it is spinning around. Cross your eyes to see it in glorious 3D.

Source: https://social.librem.one/@sab/109593170715381505

wildncrazyguy138 , to showerthoughts

Looking on a relief map, the Iranian plateau and the Himalayas look oddly similar

someguy3 ,

worldle and globle

wildncrazyguy138 OP ,

What’s this?
elonjet , to random
@elonjet@mastodon.social avatar

Landed in Austin, Texas, United States. Apx. flt. time 2 h 19 min.

elonjet OP ,
@elonjet@mastodon.social avatar

~ 1,174 gallons (4,442 liters).
~ 7,865 lbs (3,567 kg) of jet fuel used.
~ $6,572 cost of fuel.
~ 12 tons of CO2 emissions.
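The figures above can be sanity-checked with a back-of-envelope script (my own sketch; the conversion factors are standard approximations, not taken from the post):

```python
# Rough consistency check of the flight-fuel figures quoted above.
GAL_TO_L = 3.785        # litres per US gallon
JET_A_LB_PER_GAL = 6.7  # approximate density of Jet A fuel
LB_TO_KG = 0.4536
CO2_PER_KG_FUEL = 3.16  # kg CO2 per kg of jet fuel burned (common estimate)

gallons = 1174
litres = gallons * GAL_TO_L           # ~4,444 L
pounds = gallons * JET_A_LB_PER_GAL   # ~7,866 lb
kg = pounds * LB_TO_KG                # ~3,568 kg
co2_tonnes = kg * CO2_PER_KG_FUEL / 1000  # ~11.3 t

print(round(litres), round(pounds), round(kg), round(co2_tonnes, 1))
```

The results land within rounding distance of the quoted 4,442 L, 7,865 lb, 3,567 kg, and "~12 tons" of CO2.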
