
memes


Sgt_choke_n_stroke , in ARE TANKIES CONSPIRING TO MAKE SURE YOU HAVE A BAD TIME ON LEMMY.ML?

I’m convinced the “everyone who disagrees with me is a tankie” thing is a new meme.

Cowbee ,
@Cowbee@lemmy.ml avatar

That’s what happens when you defederate from pro-Marxist instances. Zionists and anti-Communists start joining this anti-Marxist space and you have a whole new Red Scare on your hands.

DragonTypeWyvern ,

Shitlibs gonna shitlib

Tankies gonna tank

BuboScandiacus , in How i feel on Lemmy

Can I ask which country you’re from?

TachyonTele ,

You’re asking an account that hasn’t been active in four months, in an eleven-month-old thread.

BuboScandiacus ,

Sad

zakobjoa , in Get rich quick
@zakobjoa@lemmy.world avatar

They will eat massive shit when that AI bubble bursts.

umbrella ,
@umbrella@lemmy.ml avatar

one can only hope

r00ty Admin ,
r00ty avatar

I mean, if LLM/diffusion-type AI is a dead end and the extra investment happening now doesn't lead anywhere beyond that, then yes, the bubble will likely burst.

But, this kind of investment could create something else. We'll see. I'm 50/50 on the potential of it myself. I think it's more likely a lot of loud talking con artists will soak up all the investment and deliver nothing.

frezik ,

It’s looking like a dead end. The content that can be fed into the big LLMs has already been fed in. New stuff is a mix of actual human writing and LLM-generated output, so it runs into an ouroboros problem where the model just eats its own output.
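The feedback loop described here (models retraining on their own output) has a well-known toy illustration: repeatedly fitting a simple model to its own samples collapses the data's diversity. A minimal stdlib-Python sketch, using a Gaussian as a stand-in for "the data":

```python
import random
import statistics

random.seed(42)

# Start with "human" data: samples from a standard normal distribution.
data = [random.gauss(0, 1) for _ in range(20)]
initial_spread = statistics.pstdev(data)

# Each "generation", fit a model (mean/stdev) to the current data, then
# replace the data with samples drawn from that model -- i.e., the model
# trains on its own output.
mu, sigma = statistics.fmean(data), statistics.pstdev(data)
for generation in range(500):
    data = [random.gauss(mu, sigma) for _ in range(20)]
    mu, sigma = statistics.fmean(data), statistics.pstdev(data)

final_spread = sigma
print(f"spread of generation 0:   {initial_spread:.3f}")
print(f"spread of generation 500: {final_spread:.3f}")
# The spread collapses toward zero: diversity is lost generation by generation.
```

A real LLM pipeline is vastly more complicated, of course; this just shows the mechanism by which the ouroboros loses information.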

r00ty Admin ,
r00ty avatar

Yeah, I was thinking more if there's either an evolutionary improvement or revolutionary (or some movement toward AGI). For me it's better if not, so I get to keep my job for a few more years. But, my general feeling is with the cash injection, there's some chance of a breakthrough.

greenskye ,

I mostly agree, with the caveat that 99% of AI usage today is just stupid gimmicks, and very few people or companies are actually using what LLMs offer effectively.

It kind of feels like when schools got sold those Smart Whiteboards that were supposed to revolutionize teaching in the classroom, only to realize the issue wasn’t the tech, but the fact that the teachers all refused to learn and adapt and let the things gather dust.

I think modern LLMs should be used almost exclusively as an assistive tool to help empower a human worker further, but everyone seems to want an AI that you can just tell ‘do the thing’ and have it spit out a finalized output. We are very far from that stage in my opinion, and as you stated LLM tech is unlikely to get us there without some sort of major paradigm shift.

micka190 ,

only to realize the issue wasn’t the tech

To be fair, electronic whiteboards are some of the jankiest piles of trash I’ve ever had to use. I swear to God you need to re-calibrate them every 5 minutes.

linkhidalgogato ,

Bubbles have nothing to do with the technology; the tech is just a tool to build the hype. The bubble will burst regardless of the success of the tech. At most, success will slightly delay the burst, because what is bursting isn't the tech, it's the financial structures around it.

ssj2marx ,

Well, the employees who were hired to service the bubble and will get laid off will eat massive shit. I’m sure NVIDIA and its executives will be fine.

frezik ,

See Sun Microsystems after the .com bubble burst. They produced a lot of the servers that .com companies were using at the time, shriveled up afterward, and were eventually absorbed by Oracle.

Why did Oracle survive the same period? Because they latched onto a traditional Fortune 500 market and never let go, down to this day.

TheRealKuni ,

I doubt it. Regardless of the current stage of machine learning, everyone is now tuned in and pushing the tech. Even if LLMs turn out to be mostly a dead end, everyone investing in ML means that the ability to do LOTS of floating point math very quickly without the heaviness of CPU operations isn’t going away any time soon. Which means nVidia is sitting pretty.

umbrella ,
@umbrella@lemmy.ml avatar

the WWW wasn’t a dead end but the bubble burst anyway. the same will happen to AI because exponential growth is impossible.

DogWater ,

No they won’t. This tech isn’t going to go away even if it plateaus. All the GPUs they make will still get used.

yourgodlucifer ,

The internet didn’t go away but there was still a .com bubble

zakobjoa ,
@zakobjoa@lemmy.world avatar

As far as I understand, the GPUs that LLMs use aren’t exactly interchangeable with your regular GPU. Also, no one needs that many GPUs for any traditional use cases.

Blaster_M ,

It means having a shot at getting a good gaming gpu for cheap

zakobjoa ,
@zakobjoa@lemmy.world avatar

As far as I understand the tech, those things aren’t really interchangeable :(

StaySquared , in Guess how my day started

If you pay close attention to the “power” of your toilet’s flush… you’ll notice when it’s getting close to a clog. That flush will make you second-guess whether something’s right. And if you neglect it, you will sooner or later realize it was in fact on its way to a clog.

The life of a homeowner. Many of you have NO idea the amount of chit you need to learn and pay attention to on a daily basis to make sure your home is well maintained. Adulting fkin sucks.

Karyoplasma , in Big scared

Don’t push alpha to production, silly.

puppy ,

So alpha males are males who are unrefined and flawed? 🤔

samus12345 ,
@samus12345@lemmy.world avatar

Beta males must be further along in development, then.

RizzRustbolt ,

Gold males used to be pretty damn good. Now they’re just as buggy as the rest, and usually need a few patches on release day.

samus12345 ,
@samus12345@lemmy.world avatar

“A delayed male is eventually good, but a rushed male is forever bad.”

davel , in ARE TANKIES CONSPIRING TO MAKE SURE YOU HAVE A BAD TIME ON LEMMY.ML?
@davel@lemmy.ml avatar

Reporter: [REDACTED]
Reason: Spam or Abuse

Reporter is having a bad time on lemmy.ml, what more evidence do you need?

Cowbee ,
@Cowbee@lemmy.ml avatar

That’s just sad, haha.

Grayox , in ARE TANKIES CONSPIRING TO MAKE SURE YOU HAVE A BAD TIME ON LEMMY.ML?
@Grayox@lemmy.ml avatar

More pixels plz

myself , in ARE TANKIES CONSPIRING TO MAKE SURE YOU HAVE A BAD TIME ON LEMMY.ML?

Actually can’t tell if this was made by a tankie or a sane person and I love it

HEXN3T , in Big scared
@HEXN3T@lemmy.blahaj.zone avatar

Alpha Males 🤝 🌈

Sam_Bass , in Perfect

It’d be even funnier if the person being texted was right in front of them.

ekZepp , in Big scared
@ekZepp@lemmy.world avatar

The idea that wolf packs are led by a merciless dictator, or alpha wolf, comes from old studies of captive wolves. In the wild, wolf packs are simply families.

scientificamerican.com/…/is-the-alpha-wolf-idea-a…

UltraMagnus0001 ,

Baboons act more like that, you know, primates like us.

zero_spelled_with_an_ecks ,

Bonobos, primates like us and more closely related than baboons, do not.

Shou ,

I mean. They are a matriarchy. So one could say the oldest is the alpha female. Who only accepts a new female in the group if she’s able to sexually satisfy the matriarch. And in case you want to see what bonobo gay sex looks like… don’t. It looks like two sopping tumors rubbing against each other.

zero_spelled_with_an_ecks ,

Oh thank god the tumors aren’t dry.

ekZepp ,
@ekZepp@lemmy.world avatar

…There are a range of animal behaviors out there, and just because humans choose to identify with some more than others doesn’t mean we have to.

discovermagazine.com/…/the-science-of-alpha-males…

Phegan , in Perfect

What, are we 80 years old here? Posting Facebook memes mad that the damn kids are always on their damn phones?

I am a travel and outdoors enthusiast, I love being outside and experiencing a new city or the wilderness, but also, phones are powerful technology that allow us to find directions or look up where to go.

The real issue isn’t that we are using it, it’s that they are being used to collect and sell our data for advertising, that algorithms are designed to keep us on our phone instead of experiencing the world.

It’s boomer shit to post this meme and be like, society bad, people on phones.

Maven OP ,

This is a parody of Boomer memes.

The classic meme goes “not a phone in sight. Just people enjoying the moment”. This is a play on that by swapping the words and romanticizing phones instead.

ImplyingImplications , in Get rich quick

Worst one is probably Apple. They just announced “Apple Intelligence”, which is just ChatGPT, whose maker’s largest shareholder is Microsoft. Figure that one out.

dependencyinjection ,

Well, most of the requests are handled on device with their own models. If it’s going to ChatGPT for something it will ask for permission and then use ChatGPT.

So the Apple Intelligence isn’t all ChatGPT. I think this deserves to be mentioned as a lot of the processing will be on device.

Also, I believe part of the deal is ChatGPT can save nothing and Apple are anonymising the requests too.

Blue_Morpho ,

Well, most of the requests are handled on device

Doubt.

Voice recognition, image recognition, yes. But actual questions will go to Apple servers.

dependencyinjection , (edited )

Doubt.

Is this conjecture, or can you provide some further reading? In the interest of not spreading misinformation.

Edit: I decided to read the info from Apple.

With Private Cloud Compute, Apple sets a new standard for privacy in AI, with the ability to flex and scale computational capacity between on-device processing, and larger, server-based models that run on dedicated Apple silicon servers. When requests are routed to Private Cloud Compute, data is not stored or made accessible to Apple and is only used to fulfill the user’s requests, and independent experts can verify this privacy.

Additionally, access to ChatGPT is integrated into Siri and systemwide Writing Tools across Apple’s platforms, allowing users to access its expertise — as well as its image- and document-understanding capabilities — without needing to jump between tools.

Say what you will about Apple, but privacy isn’t a concern for me. Perhaps, some independent experts will verify this in time.

Blue_Morpho ,

Which is exactly what I said. It’s not local.

That they are keeping the data you send private is irrelevant to the OP claim that the AI model answering questions is local.

dependencyinjection ,

OP here being me.

Well, most of the requests are handled on device with their own models. If it’s going to ChatGPT for something it will ask for permission and then use ChatGPT.

I feel I was pretty explicit in explaining how some requests will go to ChatGPT.

AdrianTheFrog ,
@AdrianTheFrog@lemmy.world avatar

Apple has published papers on small LLM models and multimodal models already. I would be surprised if they aren’t using them for on-device processing.

Fedizen ,

chatgpt won’t save anything? Doubtful.

dependencyinjection ,

Brother I do not care about your doubts.

I want hard facts here.

Do you think that if you enter into a contract with a company like Apple they’ll just be like, aww shit they weren’t supposed to do that. Anyway let’s carry on.

No. This would open OpenAI up to potential lawsuits.

Even if they did save stuff, it gets anonymised by Apple before even being sent to ChatGPT’s servers.

Fedizen ,

The hard fact is OpenAI is already exposing itself to lawsuits by training on copyrighted material.

So the question here should be: what makes them trustworthy this time?

dependencyinjection ,

Because Apple’s lawyers will go ham.

I don’t want my comments here to be received as shilling Apple; it’s more that I want them to be based on actual information that is provided, not opinion pieces.

The fact is, if they were caught saving data, Apple would just end the contract. Is it worth it for OpenAI to lose out on that cash for the sake of keeping the data, when they can just use all the other sources where they are allowed to do that?

Anyway, I don’t care what anonymised data they may or may not save. It won’t be tied to me.

Edit: Do you have some information on this existing lawsuits and the contracts they broke?

Blue_Morpho ,

Because Apples lawyers will go ham.

Google pays Apple $20 billion a year to keep their search on Apple devices. The subtext of “search” is Google pays Apple for your search data.

Apple has sold your data for the right price to Google, so there should be no expectation that they won’t do the same with other companies.

dependencyinjection ,

They sell Google the right to stay the default, not the data itself.

Again, point me to some proof of them actually selling data. To my understanding, Google pays for the default engine to be Google.

Blue_Morpho ,

That Google is the search engine means Google gets that valuable search data. So they pay to be the default search engine to get your data.

dependencyinjection ,

Sure, but let’s be honest. Even if it wasn’t the majority of people are still using Google anyway.

I prefer Arc Search myself.

micka190 ,

There’s kind of a difference between “we scraped the internet and decided to use copyrighted content anyways because we decided to interpret copyright law as not being applicable to the content we generate using copyrighted content” (omegalul) and “we explicitly agreed to a legally-binding contract with Apple stating we won’t do that”.

linkhidalgogato ,

thing is apple doesnt give a shit about ur privacy

dependencyinjection ,

Finally, a reasonable comment.

I would concede that they want to keep it all for themselves, although a lot of anonymising of data is done.

My point is Apple are not sharing it with every third party on the Earth.

If you’re using Android then you don’t really have a leg to stand on, unless you’re using GrapheneOS and you’ve sandboxed Google services.

I would rather use a device that maybe keeps it all for itself, rather than one where it’s shared with every man and his dog.

Plenty of things you can shit on Apple for, but this isn’t one of them I’m afraid.

ASeriesOfPoorChoices ,

careful, that’s a hardcore tankie troll you replied to.

photonic_sorcerer ,
@photonic_sorcerer@lemmy.dbzer0.com avatar

That’s just not true. Most requests are handled on-device. If the system decides a request should go to ChatGPT, the user is prompted to agree, and no data is stored on OpenAI’s servers. Plus, all of this is opt-in.

Blue_Morpho ,

Most requests are handled on-device.

Literally impossible.

“Hey Siri, what’s the weather forecast for tomorrow.”

< The Farmer’s Almanac that is in my local model says it will rain tomorrow. >

PassingThrough ,

I think there’s a larger picture at play here that is being missed.

Getting the weather has been a standard feature for years now. Nothing AI about it.

What is “AI” is, Hey Siri, what is the weather at my daughter’s recital coming up?

The AI processing, calculated on-device if what they claim is true, is:

  1. the determination of who your daughter is
  2. What is a recital? An event? Are there any upcoming calendar events that match this concept?
  3. Is the “daughter” associated with this event by description or invitation? Yes? OK, what’s the address?
  4. Submit zip code of recital calendar event involving the kid to the weather API, and churn out a reply that includes all this information…

Well {Your phone contact name}, it looks like it will {remote weather response} during your {calendar event from phone} with {daughter from contacts} on {event date}.

That is the split between on-device and cloud processing. The phone already has your contacts and calendar and does that work offline, rather than educating an online server about your family, events, and location, and it requests the bare minimum from the internet: in this case, nothing more than if you opened the weather app yourself and put in a zip code.
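The four steps above can be sketched in a few lines. This is a loose illustration with hypothetical names and made-up data, not Apple's actual API:

```python
# On-device data the assistant already has (hypothetical sample values).
contacts = {"daughter": "Emma"}
calendar = [
    {"title": "piano recital", "who": "daughter", "date": "June 21", "zip": "94103"},
]

def weather_api(zip_code):
    """Stand-in for the one remote call: a plain weather lookup by zip code."""
    return "rainy"

def recital_weather(user_name):
    # Steps 1-3 run locally: resolve "daughter" and find her upcoming event.
    event = next(e for e in calendar if e["who"] == "daughter")
    kid = contacts["daughter"]
    # Step 4: only the event's zip code ever leaves the device.
    forecast = weather_api(event["zip"])
    return (f"Well {user_name}, it looks like it will be {forecast} during your "
            f"{event['title']} with {kid} on {event['date']}.")

print(recital_weather("Alex"))
```

The contact and calendar joins stay on the phone; the remote server only ever sees a zip code.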

Blue_Morpho ,

Nothing AI about it.

Voice processing is AI, and it was done on Apple’s servers. Previously, only the keyword “Hey Siri” was handled locally. Onboard AI chips will allow voice processing to be local; the actual queries will go to the servers. Phones do not have the power to run a useful LLM locally, at least not with the near-instantaneous response times phone users expect. A 56-watt, 128 GB RAM M3 Max does around 8.5 tokens/second.

www.nonstopdev.com/llm-performance-on-m3-max/
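To put that tokens-per-second figure in perspective, here's a quick back-of-envelope check (the reply length is an assumed number, not a measurement):

```python
# Rough back-of-envelope numbers for local LLM latency.
tokens_per_second = 8.5   # the M3 Max figure cited above, for a large model
reply_tokens = 60         # assumed length of a short, Siri-sized answer

generation_time = reply_tokens / tokens_per_second
print(f"~{generation_time:.1f} s to generate a {reply_tokens}-token reply")
# 60 / 8.5 is about 7 seconds -- far from the instant answers users expect.
```

Smaller, task-tuned models would do better, which is part of the disagreement in the replies below.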

PassingThrough , (edited )

Onboard AI chips will allow this to be local.

Phones do not have the power to run useful LLM locally…

Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer? Gotta have those latest and greatest chips.

It will be fun to see how it all shakes out. If the AI can’t run most queries on the phone with all this advertising of local processing…there’ll be one hell of a lawsuit coming up.

EDIT: Finished looking for what I thought I remembered…

Additionally, Siri has been locally processed since iOS 15.

macrumors.com/…/use-on-device-siri-iphone-ipad/

Blue_Morpho ,

Perhaps this is why these features will only be available on iPhone 15 Pro/Max and newer?

I’m not guessing. I linked to the article about the M3 which is much more powerful than the a17 pro in the 15 pro and has the same NPU.

PassingThrough ,

Forgive me, I’m no AI expert able to fully relate the tokens-per-second measurement to the average query Siri might handle, but I will say this:

Even in your article, only the largest model ran at 8 tokens/sec; the others ran much faster, and none of them were optimized for a specific task, just benchmarked.

Would it be impossible for Apple to be running an optimized model specific to expected mobile tasks, and leverage their own hardware more efficiently than we can, to meet their needs?

I imagine they cut out most worldly knowledge and use a lightweight model, which is why there is still a need to hand some requests off to ChatGPT or Apple’s servers. Would this let them trim Siri down to perform well enough on phones for most requests? They also advertised launching AI on M1 and M2 chip devices, which are not M3 Max either…

MotoAsh ,

Literally not what people are talking about. It’s the “AI” part of the task that doesn’t leave the device (unless it prompts to ask ChatGPT). Not that it can magically glean live info without making any request to the web…

Jeeze, fucking… get your shit straight, making me defend Apple… Fucking do better.

lolcatnip ,

The “AI” parts are what they’re saying happens on the device. This isn’t a gotcha.

Rai ,

If you think that’s the WORST ONE, you have no idea about any of this

frezik ,

Yeah, if anything, Apple is behind the curve. Nvidia/AMD/Intel have gone full cocaine nose dive into AI already.

ken27238 ,
@ken27238@lemmy.ml avatar

Not true. Most if not all requests are handled by Apple’s own models, on device or on their own servers. When it does use OpenAI, you need to give it permission each time.

10_0 , in ARE TANKIES CONSPIRING TO MAKE SURE YOU HAVE A BAD TIME ON LEMMY.ML?

Tankies keep on posting walls of text in !memes about communism, or was it the liberals? Anyway, people I don’t like are posting walls of text about communism in !memes… I love this post. For next time, can you draw some conclusions from the survey so I don’t have to think too hard about these numbers? I can’t count past 10.
