Multiple accounts have been created with the sole purpose of posting advertisements or replies containing unsolicited advertising.

Accounts that solely or persistently post advertisements may be terminated.


elonjet , to random
@elonjet@mastodon.social avatar

Took off from Austin, Texas, United States.

elonjet , to random
@elonjet@mastodon.social avatar

Landed in Austin, Texas, United States. Approx. flight time: 45 min.

elonjet OP ,
@elonjet@mastodon.social avatar

~ 384 gallons (1,455 liters).
~ 2,577 lbs (1,169 kg) of jet fuel used.
~ $2,153 cost of fuel.
~ 4 tons of CO2 emissions.
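
As a rough illustration (not elonjet's actual method), figures like these can be reproduced from the reported flight time with a few assumed constants: a fuel burn rate, fuel density, fuel price, and a CO2 emission factor. All values in the sketch below are illustrative assumptions.

# Back-of-the-envelope estimate of fuel, cost and CO2 from flight time.
# All constants are assumed for illustration, not elonjet's actual parameters.
FUEL_BURN_GAL_PER_HR = 512      # assumed average burn rate for a large business jet
JET_A_LBS_PER_GAL = 6.71        # typical density of Jet A fuel
FUEL_PRICE_USD_PER_GAL = 5.61   # assumed fuel price
CO2_LBS_PER_LB_FUEL = 3.16      # approx. CO2 produced per pound of jet fuel burned

def flight_estimate(flight_minutes: float) -> dict:
    gallons = FUEL_BURN_GAL_PER_HR * flight_minutes / 60
    pounds = gallons * JET_A_LBS_PER_GAL
    return {
        "gallons": round(gallons),
        "liters": round(gallons * 3.785),
        "fuel_lbs": round(pounds),
        "fuel_kg": round(pounds * 0.4536),
        "cost_usd": round(gallons * FUEL_PRICE_USD_PER_GAL),
        "co2_short_tons": round(pounds * CO2_LBS_PER_LB_FUEL / 2000),
    }

print(flight_estimate(45))  # roughly 384 gal, 2,577 lbs, ~$2,154, ~4 tons of CO2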

elonjet , to random
@elonjet@mastodon.social avatar

Took off near Brownsville, Texas, United States.

Books_of_Jeremiah , to random
@Books_of_Jeremiah@zirk.us avatar

During the Balkan Wars of 1912 and 1913, military hospitals for wounded soldiers were opened in all larger towns in Serbia. Most hospitals were in Belgrade, where several thousand wounded soldiers were treated. The number of hospitals in Belgrade increased during the Second Balkan War with the Bulgarians (June-July 1913), reaching thirty-six in total: two permanent and thirty-four emergency hospitals.

Books_of_Jeremiah OP ,
@Books_of_Jeremiah@zirk.us avatar

The last of these, the XXXIV emergency hospital, was located in the onetime Smutnikovac tavern on Topcider hill. The owner of this private hospital was the industrialist, banker and philanthropist Djordje Vajfert, and it was run by his nephew, the lawyer and industrialist Ferdinand Gramberg.

Books_of_Jeremiah OP ,
@Books_of_Jeremiah@zirk.us avatar

The producer Djordje Djoka Bogdanovic filmed everyday moments in this hospital, as well as a stroll of the wounded soldiers, doctors and the hospital staff.

Subtitles in CC.

Courtesy of the Yugoslav Film Archive (Jugoslovenska Kinoteka).

https://vimeo.com/229788768

@historikerinnen

elonjet , to random
@elonjet@mastodon.social avatar

Landed near Brownsville, Texas, United States.

simon_lucy ,
@simon_lucy@mastodon.social avatar

@elonjet

Went to get ice cream.

elonjet , to random
@elonjet@mastodon.social avatar

Took off near Port Mansfield, Texas, United States.

elonjet , to random
@elonjet@mastodon.social avatar

Took off from Austin, Texas, United States.

Nike_Leonhard , to random German
@Nike_Leonhard@literatur.social avatar

19.07.:
Would your story lend itself to an adaptation as a graphic novel/comic? Would you be interested in that?
, or also the , would lend themselves well. They are told linearly, have plenty of plot and interesting settings. Others, such as the vampire novella or the , are less well suited because of the narrative style and a high proportion of introspection.
I would be interested ...

Nike_Leonhard OP ,
@Nike_Leonhard@literatur.social avatar

However, I am very aware that although I can draw reasonably well, I am nowhere near skilled enough to realise a project like this adequately. But if anyone were interested in a collaboration and our styles matched - immediately!
@buechermachen

LiquidParasyte , to internetfuneral
@LiquidParasyte@pawb.fun avatar
celeste ,

Transgender Transhumanism

GlitchyDigiBun ,
@GlitchyDigiBun@lemmy.world avatar

Because the autism void needs a mask.

Schnuckster , to random
@Schnuckster@beige.party avatar

This bloke is a long way from home too...

Schnuckster OP ,
@Schnuckster@beige.party avatar

This has reminded N and me of the Bermondsey-born hero of Inherit the Stars, a glorious bit of sci-fi writing which is long overdue a re-issue. 📖 @bookstodon

kim_harding , to random
@kim_harding@mastodon.scot avatar

Academic authors 'shocked' after Taylor & Francis sells access to their research to Microsoft AI
https://www.thebookseller.com/news/academic-authors-shocked-after-taylor--francis-sells-access-to-their-research-to-microsoft-ai
Authors have expressed their shock after the news that academic publisher Taylor & Francis, which owns Routledge, had sold access to its authors’ research as part of an Artificial Intelligence (AI) partnership with Microsoft—a deal worth almost £8m ($10m) in its first year.

NatureMC ,
@NatureMC@mastodon.online avatar

@kim_harding Important for @writing ⬆️

ajsadauskas , to technology
@ajsadauskas@aus.social avatar

It's time to call a spade a spade. ChatGPT isn't just hallucinating. It's a bullshit machine.

From TFA (thanks @mxtiffanyleigh for sharing):

"Bullshit is 'any utterance produced where a speaker has indifference towards the truth of the utterance'. That explanation, in turn, is divided into two "species": hard bullshit, which occurs when there is an agenda to mislead, or soft bullshit, which is uttered without agenda.

"ChatGPT is at minimum a soft bullshitter or a bullshit machine, because if it is not an agent then it can neither hold any attitudes towards truth nor towards deceiving hearers about its (or, perhaps more properly, its users') agenda."

https://futurism.com/the-byte/researchers-ai-chatgpt-hallucinations-terminology

@technology

FlorianSimon ,

Same experience

jerkface ,
@jerkface@lemmy.ca avatar

“Indifference” is a strange word to apply here. What would it mean for ChatGPT to “want” to be accurate?

skinnylatte , to random
@skinnylatte@hachyderm.io avatar

Just read John Rennie Short’s ‘Insurrection’, a good overview of what happened before, during and after Jan 6. Short and concise.

I didn’t need to feel more depressed, but I also felt I needed to know more

RunRichRun ,
@RunRichRun@mastodon.social avatar

@skinnylatte
"Insurrection: What the January 6 Assault on the Capitol Reveals about America and Democracy"; John Rennie Short
— reviewed by Scott McLemee in "Inside Higher Ed":
https://www.insidehighered.com/opinion/views/intellectual-affairs/2024/05/17/review-john-rennie-shorts-insurrection-opinion
@bookstodon

glasspusher ,
@glasspusher@beige.party avatar

@RunRichRun @skinnylatte @bookstodon thanks. I put it on my list

ton , to random
@ton@social.coop avatar

Book meme: 20 books that have had an impact on who you are. One book a day for 20 days. No explanations, no reviews, just book covers. Alt text!

Day 1/20

"Illuminatus!" by Robert Shea and Robert Anton Wilson


@bookstodon

CommonMugwort ,
@CommonMugwort@social.coop avatar

@ton @bookstodon Oh, hell yes.

Lemniscata ,
@Lemniscata@mastodon.social avatar

@ton @bookstodon

I read that at art school; it's one of the best for understanding how comics work. ♥️

cian , to random
@cian@mstdn.science avatar

If the ultimate purpose of memory is to guide our actions in future, what is the point of episodic memory?

Why do we remember details of our past experiences?

RossGayler ,
@RossGayler@aus.social avatar

@cian @cogsci
Just off the top of my head (speculation alert)

"1) assume we have a limited storage capacity so aren't good at raw memorisation"
Maybe everything gets encoded and stored, but we aren't so good at retrieval/recall of specific episodes.
Maybe that poor exact episodic retrieval is a consequence of generalisation at retrieval.

"2) even after learning something, we tend to forget details over time."
Assuming we store new episodic memories over time, the accumulation of new memories might make it harder to retrieve specific old episodes through a generalisation at retrieval mechanism. (Also, even if parts of old episodes were to randomly disappear over time, that wouldn't necessarily stop a generalisation at retrieval mechanism. A good generalisation mechanism should be able to cope with partial records of episodes.)

"Neither of these really apply to ANNs?"
Well, you could include weight decay in an ANN and there is the phenomenon of catastrophic forgetting. However, I take the relevance of most current ANNs (feedforward, weight optimising networks) to cognitive concerns with a fairly large pinch of salt.
IMO the theoretical conceptual framework of most current ANNs doesn't make contact with cognitive concerns, so you can't really ask these questions of them.
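
A toy sketch of the "generalisation at retrieval" idea being speculated about here (purely illustrative, not a model proposed in this thread): keep every episode verbatim and generalise only at recall time, by blending the stored episodes most similar to the cue. All names and parameters below are made up.

# Toy "generalisation at retrieval": nothing is ever deleted, but recall
# blends the nearest stored episodes, so specific episodes blur together.
import numpy as np

class EpisodicStore:
    def __init__(self):
        self.episodes = []                      # every episode kept verbatim

    def store(self, episode):
        self.episodes.append(np.asarray(episode, dtype=float))

    def recall(self, cue, k: int = 3):
        # Similarity-weighted blend of the k episodes closest to the cue.
        cue = np.asarray(cue, dtype=float)
        dists = np.array([np.linalg.norm(cue - e) for e in self.episodes])
        nearest = np.argsort(dists)[:k]
        weights = 1.0 / (1.0 + dists[nearest])
        weights /= weights.sum()
        return np.average([self.episodes[i] for i in nearest], axis=0, weights=weights)

store = EpisodicStore()
for _ in range(100):                            # many partly overlapping episodes
    store.store(np.random.normal(loc=[1.0, 2.0, 3.0], scale=0.5, size=3))
print(store.recall([1.0, 2.0, 3.0]))            # close to the shared gist, not any single episode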

RossGayler ,
@RossGayler@aus.social avatar

@cian @cogsci

Plus there are cognitive science people who argue that analogy is the core of cognition.

Analogy is a mechanism for generalisation at retrieval and the stored episodes are the input to the analogical mechanism.
