
Zacryon , to world in German warships to cross Taiwan Strait for first time in 22 years, defying Beijing’s warnings

“Funding aggression”. Lol. No wonder Germany got fucked in the ass by Russia and China so many times with such bootlickers among the voting population.

Zacryon , to world in German warships to cross Taiwan Strait for first time in 22 years, defying Beijing’s warnings

2024 - 22 = 2002.
So probably not nazis. No.

Zacryon , to insanepeoplefacebook in Antivax mom is a lying liar.

It already read like one of those “and then they all clapped” posts.

Zacryon , to news in Republicans threaten a government shutdown unless Congress makes it harder to vote

Where is the “mah freedum” and “they are taking away our democracy” crowd now?

Zacryon , to lemmyshitpost in Stay motivated

I wonder how you’re breathing. /j

Zacryon , to greentext in Anon misses something

Reading through the comments here makes one thing apparent again: clear and direct communication about one's intentions can resolve all of these misunderstandings. Being upfront avoids that unnecessary "are they into me or not" over-analysis, or missing such more or less subtle hints altogether.

If you’re interested in someone, go for them! Tell them about your interest. It benefits you both. They’ll know, which can help in case they’re interested as well, and you’ll know what to expect whether they’re interested or not. This can also save you a lot of time, heart- and headache.

Zacryon , to science_memes in Based on a true story

I feel this. I fell into a similar rabbit hole when I tried to get real-time feedback on the program's own memory usage, discerning things like reserved versus actually used virtual memory. It felt like black magic and was, I suppose, ultimately not doable within the expected time constraints without touching the kernel. I spent too much time on it and had to move on with no better solution than measuring/computing the allocated memory of the largest payload data types.
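For illustration, roughly the kind of probing I mean, as a minimal sketch assuming a Linux system where procfs is available: /proc/self/status exposes VmSize (reserved virtual memory) and VmRSS (resident memory actually in use). This only gives coarse, process-wide counters, nowhere near the fine-grained per-allocation feedback I was actually after:

```python
import time

def memory_status():
    """Read the process's own memory counters from /proc/self/status (Linux)."""
    fields = {}
    with open("/proc/self/status") as f:
        for line in f:
            if line.startswith(("VmSize", "VmRSS", "VmData")):
                key, value = line.split(":", 1)
                fields[key] = int(value.split()[0])  # values are reported in kB
    return fields

# Poll a few times for rough "real-time" feedback on the program's own usage.
for _ in range(5):
    mem = memory_status()
    print(f"reserved virtual (VmSize): {mem.get('VmSize', 0)} kB | "
          f"resident, actually in use (VmRSS): {mem.get('VmRSS', 0)} kB")
    time.sleep(1.0)
```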

Zacryon , to memes in adblock on mobile is 95% the only reason I use firefox

Where do you get this attitude that everything should be provided to you for free and you’re entitled to it?

From (non-capitalistic) utopic ideas, where humans try to be excellent to each other.

Zacryon , to technology in The Irony of 'You Wouldn't Download a Car' Making a Comeback in AI Debates

My point is that the following statement is not entirely correct:

When AI systems ingest copyrighted works, they’re extracting general patterns and concepts […] not copying specific text or images.

One obvious flaw in that sentence is the blanket statement about AI systems. There are huge differences between different realms of AI, and failing to address those by at least mentioning them briefly undermines the author's factual correctness. For example, there is a plethora of non-generative AIs, i.e. systems that don't generate text, audio or images/videos but merely operate as classifiers or clustering algorithms, which are, without further modification, not intended to replicate data similar to their inputs but rather to provide insights (see the minimal sketch below).
However, I can overlook this, as the author might simply not have thought about it in the moment of writing.
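To make that distinction concrete, a minimal sketch (scikit-learn is used here purely as an arbitrary illustration, not something the article mentions): a clustering model trained on some data only ever outputs cluster indices for new inputs; it has no pathway for reproducing the ingested samples:

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy "ingested" data: four 2D points forming two obvious groups.
X = np.array([[0.0, 0.1], [0.2, 0.0], [5.0, 5.1], [5.2, 4.9]])

model = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# The output is just a cluster index per query point, not a reproduction
# of the training data.
print(model.predict(np.array([[0.1, 0.05], [5.1, 5.0]])))  # e.g. [0 1]
```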

Next:
While it is true that transformer models like ChatGPT try to learn patterns, i.e. the most likely token for the next output in a sequence of contextually coherent data, given the right context it is not unlikely that they reproduce their training data nearly or even completely identically, as I've demonstrated before. The less data is available to generalise from for a specific context, the more likely it becomes that the model simply replicates its training data. That is in principle fine, because it is what such models are designed to do: draw the best possible conclusion from the available data to predict the next output in a sequence. (That's one of the reasons why they need such an insane amount of training data.)
This can ultimately lead to occurrences of indeed "copying specific text or images".
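As a toy illustration of that mechanism (this is of course not how ChatGPT is implemented, just the same greedy next-token principle scaled down to a character-level n-gram model): when a context occurs only once in the training data, the "most likely continuation" is exactly the training text, so it gets reproduced verbatim:

```python
from collections import defaultdict, Counter

def train_ngram(text, n=8):
    """Count which character follows each n-character context in the training text."""
    counts = defaultdict(Counter)
    for i in range(len(text) - n):
        counts[text[i:i + n]][text[i + n]] += 1
    return counts

def generate(counts, prompt, length=60, n=8):
    """Greedy 'next-token' prediction: always emit the most frequent continuation."""
    out = prompt
    for _ in range(length):
        context = out[-n:]
        if context not in counts:
            break
        out += counts[context].most_common(1)[0][0]
    return out

# Training corpus: lots of common text plus one rare, "copyright-protected" passage.
training_text = ("the cat sat on the mat. " * 50
                 + "copyrighted sentence: all your base are belong to us. ")
model = train_ngram(training_text)

# For a context that occurs only once in the corpus, greedy prediction
# replays the training data verbatim.
print(generate(model, "copyrighted sentence: all y"))
```

Running this replays the rare passage character for character, which is exactly the effect I'm pointing at, just at a ridiculously small scale.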

but the fact that you prompted the system to do it seems to kind of dilute this point a bit

It doesn't matter whether I directly prompted it to do so. I set the right context to elicit this kind of behaviour, because context matters most for transformer models; directly prompting it was just an easy way of establishing the required context. I've also occasionally observed ChatGPT replicating identical sentences from (copyright-protected) scientific literature when I used it to get an overview of a specific topic while having books or papers about it at hand. The latter demonstrates again that transformers become more likely to replicate training data the more "specific" a context becomes, i.e., when there is significantly less training data available for that context than for others.

Zacryon , to technology in The Irony of 'You Wouldn't Download a Car' Making a Comeback in AI Debates

When AI systems ingest copyrighted works, they’re extracting general patterns and concepts - the “Bob Dylan-ness” or “Hemingway-ness” - not copying specific text or images.

Okay.

https://feddit.org/pictrs/image/83156cd8-7e54-4447-82f5-bc9471fbc594.jpeg

https://feddit.org/pictrs/image/b08458e9-adb7-4519-a21b-f0e214cfc39e.jpeg

Zacryon , to science_memes in "Now everyone will have an easy reference table at hand!"

But consider you're stranded in the wild, all technology lost in an accident. It's just you, nature and your skills. How will you know how many days the melons you've foraged will last if you've found N of them and eat one a day? /j

Zacryon , to technology in Amazon cloud boss echoes NVIDIA CEO on coding being dead in the water: "If you go forward 24 months from now, it's possible that most developers are not coding"

Coding is already dead. Most coders I know spend very little time writing new code.

Oh no, I should probably tell this to my whole company and all of their partners. We're just sitting around getting paid for nothing, apparently. I never realised that. /s

Zacryon , to technology in Amazon cloud boss echoes NVIDIA CEO on coding being dead in the water: "If you go forward 24 months from now, it's possible that most developers are not coding"

While I highly doubt that will come true for at least another decade, we can already replace CEOs with AI, you know? (:

independent.co.uk/…/ai-ceo-artificial-intelligenc…

Zacryon , to science_memes in Publishing Revenue

researchers are paid by the university

Not necessarily. A lot are paid by external research grants.

Zacryon , to science_memes in Publishing Revenue

Another one, Frontiers:

www.frontiersin.org
