
lemmyshitpost


saltesc , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

A waste of perfectly good cattle. I like meat, but I have common ground with the vegans when it comes to excessive farming for wasted animal lives. It’s hard to argue we’re the most humane—obviously—predator when our practices are set up for throwing half of it in the bin.

CodexArcanum ,

I saw a statistic that 12% of Americans eat 50% of all beef produced in the USA and I cannot stop thinking about it. Every time I eat a burger I wonder if I’ve passed into the 12%. When I look at a stack like this, I see a beef 1%er.

saltesc ,

Yeah, that’s it. And then I consider the environmental impacts of sustaining the meat grinder that sends so much of its output to waste. At some point we lost our sense of responsibility for eating meat because it just appears in front of us like magic. It’s good to wonder such things: “Wait a second… am I a big fucking part of the problem?”

anarchrist ,

We must sate the demon lust of Guy Fieri

bruhduh , in what then?
LodeMike ,

That was my response too.

haerrii , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

Incoming heart attack

SteveXVII ,

I was also thinking about a heart attack.

RiikkaTheIcePrincess , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

Separate those six burgers, store one, eat one, give the rest to others. Unless someone needs the stored one, then give that one away too. Am not doing great but I’ll manage 🤷

… Seriously want a burger now though v.v Why’d you do this to me? 🙀

sharkfucker420 , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

Tummy hurt

chagall , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

Tums

xlash123 , in what then?

Then his supporters would also have to drop trousers in support. I support this alternate timeline

MehBlah ,

Have you seen them? Have you seen Trump? Aeeeeeiiiiiiii!

Viking_Hippie , (edited ) in Kill, fuck, marry: Biden, Trump, Kennedy?

Kill, kill, fuck, respectively. Being married to a politician doesn’t mix with social anxiety 😛

Edit: Wait a sec, you mean RFK Jr, not JFK? Kill, kill, kill, then 😛

saltesc , in what then?

Fuck you, Shaun. I didn’t need that in my head and there you are putting it in there, knowing full well what you’re doing.

But joke’s on you! The image of one fist in the air while the other clutches his pants as he’s rushed off… A consolation prize.

TheAgeOfSuperboredom , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅
ekZepp , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅
deranger , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

Put your dick in it.

SatansMaggotyCumFart ,

Yeah I’d cut a hole in the middle and shoot my man jam in there.

Like a sexy Juicy Lucy.

NegativeLookBehind , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

I need you inside me.

unexposedhazard , in The taste of 🦅🇺🇲 Freedom 🇺🇸🦅

You spilled your JPEG compression all over it.

MotoAsh , in I'm sorry, little one

Why use commercial graphics accelerators to run such a narrow, AI-specific workload? There are dedicated cards made to accelerate machine learning that are highly potent with far less power draw than 3090s.

ShadowRam ,

Well yeah, but 10x the price…

MotoAsh ,

Not if it’s for inference only. What do you think the “AI accelerators” they’re putting in phones now are? Do you think they’d be as expensive or power hungry as an entire 3090, for the performance they deliver, if they’re being put in small devices?

ShadowRam ,

Ok,

Show me a PCIe board that can do inference calculations as fast as a 3090 but is less expensive than a 3090.

RandomlyRight OP ,

I’d be interested (and surprised) too

RandomlyRight OP , (edited )

Yeah, show me a phone with 48GB of RAM. It’s a big factor to consider. Actually, some people are recommending a Mac Studio because you can get it with 128GB of RAM or more, and that memory is shared with the GPU/AI accelerator. Very energy efficient, but it sucks as soon as you want to do literally anything other than inference.
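A rough back-of-envelope sketch of why that RAM figure is the deciding factor; the parameter counts and bit widths below are illustrative assumptions, not measurements of any particular model or machine:

```python
# Rough, illustrative estimate of the memory needed to hold a quantized LLM
# for inference. Numbers are assumptions for the sake of the arithmetic.

def model_ram_gb(params_billions: float, bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate resident size of the weights, with a fudge factor for
    KV cache and runtime overhead."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total * overhead / 1e9

for params, bits in [(7, 4), (13, 4), (70, 4), (70, 8)]:
    print(f"{params:>3}B @ {bits}-bit ≈ {model_ram_gb(params, bits):.0f} GB")

# A 70B model even at 4-bit lands far past any phone's RAM, which is why
# 48GB+ of (unified) memory is the sticking point in the comment above.
```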

Fuzzypyro ,

I wouldn’t say it particularly sucks. It could be used as a powerhouse hosting server. Docker makes that very easy to do, no matter the OS, nowadays. Really though, I’d say its competition is more along the lines of Ampere systems in terms of power to performance. It even beats Ampere’s 128-core ARM CPU on power-to-performance ratio, which is extremely impressive in the server/enterprise world. Not to say you’re gonna see them in data centers, because price to performance is a thing as well. I just feel like it fits right into the niche it was designed for.

RandomlyRight OP ,

How could you solve the problem of storage expansion? I assume there exists some kind of Thunderbolt JBOD thing or similar.

Diabolo96 ,

It’s for inference, not training.

MotoAsh ,

Even better, because those are cheap as hell compared to 3090s.

Diabolo96 ,

But can they run Crysis?

GBU_28 ,

Huh?

Stuff like llama.cpp really wants a GPU; a 3090 is a great place to start.
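A minimal sketch of that kind of setup, assuming the llama-cpp-python bindings and a locally downloaded GGUF file (the model path and parameters are placeholders, not a specific recommendation):

```python
# Minimal llama.cpp-style inference with GPU offload via the llama-cpp-python
# bindings. The model path is a placeholder; n_gpu_layers=-1 asks the library
# to offload every layer it can to the GPU backend (e.g. CUDA on a 3090, or
# Metal on Apple Silicon).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/example-7b.Q4_K_M.gguf",  # hypothetical local file
    n_gpu_layers=-1,   # offload all layers to the GPU if a backend is built in
    n_ctx=4096,        # context window
)

out = llm("Explain why unified memory helps large-model inference.", max_tokens=128)
print(out["choices"][0]["text"])
```

With a CUDA build of the library, offloading everything is where the 24GB of VRAM on a 3090 pays off; models that don’t fit spill back to system RAM and slow down sharply.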

mergingapples ,

Because those specific cards are fuckloads more expensive.

d00ery ,

What are you recommending? I’d be interested in something that’s similar in price to a 3090.

VeganCheesecake ,

Would you link one? Because the only things I know of are the small Coral accelerators, which aren’t really comparable, and specialised data centre stuff you need to request quotes for to even get a price, from companies that probably aren’t much interested in selling one direct to customers.
