Barzaria ,

I just put GNU/Linux on a Celeron II 4 core single threaded CPU. It’s running along fine. I didn’t even have a use case, but just felt bad to let the old technology go to waste. This was within the last two weeks.

veganpizza69 ,
@veganpizza69@lemmy.world avatar

team Syncthing

r ,
@r@piefed.social avatar

I love this new twist on this template

Smokeydope OP ,
@Smokeydope@lemmy.world avatar

Thanks randint, glad you liked it!

craigers ,
@craigers@lemmy.world avatar

So… Not really OC like the title suggests… It is more accurate than using an old laptop as a music player tho, lol

ArbitraryValue ,

https://sh.itjust.works/pictrs/image/54597984-a97a-4fe2-897e-760be212d0b7.jpeg

Middle right panel is a cock and balls. (OP is into needle play.)

PastryPaul ,

It’s one of those prehensile dolphin dicks.

Laborer3652 ,

Gross. I prefer my cocks posthensile.

Facebones ,

I don’t discriminate.

Landless2029 ,

Y’all laugh but I’m getting into Linux and dusted off an old i7 laptop with 16 GB of RAM. Slapped a $40 512 GB SSD and Linux Mint on it to get into !selfhost!

…then promptly forgot about the laptop

Liz ,

My current laptop is an i7 with 16 GB of RAM. Hardware requirements have plateaued pretty hard unless you’re trying to run something that requires the latest GPU.

Landless2029 ,

Not really, it’s just another unfinished project.

I want to play with Nextcloud, Home Assistant and TabbyML

Buelldozer ,
@Buelldozer@lemmy.today avatar

Be careful with Home Assistant. Once you board that train it’s near impossible to get off!

criticon ,

i7 doesn’t tell you anything without the full model number; the generation, at least, is super important

Laborer3652 ,

i7

16gb ram

old

One of these is not like the other.

lauha ,

I figure his username is his birth year

pachrist ,

Intel has been on the i3, i5, i7 naming scheme for a while though. I think the oldest ones are probably ~15 years old at this point.

BakedCatboy ,

Yeah, I had the i7-7700K, which was like 7 years ago, with like 64 GB of RAM because I wanted to play with large RAM disks.

morbidcactus ,

I have an Asus ROG laptop I bought in 2013 with a 3rd-gen i7, whatever the GTX 660 mobile chip was, and 16 GB of RAM. It’s definitely old by any definition, but swapping in an SSD makes it super usable; it’s the machine that lives in my garage as a shop/lab computer. To be fair, its job is web browsing, CAD touchups, slicing and PDF viewing most of the time, but I bet I could be more demanding on it.

I had been running Mint w/ Cinnamon on it before, as I was concerned about resource usage; it was a Klipper and OctoPrint host for my printer for a year and a bit. Wiped it and went for Debian with Xfce because, again, I was originally concerned about resource usage, but I ended up swapping to KDE and I don’t notice any difference, so it’s staying that way.

I really hate waste, so I appreciate just how usable older hardware can be. Yeah, there’s probably an era where that’s less true, but I’ll go out on a limb (based on feeling only) and suggest it’ll hold for anything from the last 15 years. That’s going to depend on what you’re trying to do with it, though: you won’t have all the capability of more modern hardware, but frankly a lot of use cases probably don’t need that anyhow (web browsing, word processing, programming, music playback for sure, probably some video playback; I pretty much haven’t hit a wall yet with my laptop).

user224 ,
@user224@lemmy.sdf.org avatar

“Old laptop” has a Core 2 Duo and 4GB of DDR2 RAM.

It also has a better keyboard with plenty of travel, an on-the-go replaceable battery, easily accessible components likely to get replaced/upgraded/cleaned, a large cooler, a large selection of I/O, a higher likelihood of surviving 2 more years than a brand-new laptop, and it can be used as a weapon or an anchor.

secret300 ,

shit that’s better than my main laptop

Finadil ,

Ollama on a ten-year-old laptop? Lol, maybe at 1 T/s for an 8B.

Smokeydope OP , (edited )
@Smokeydope@lemmy.world avatar

TinyLlama 1.1B would probably run reasonably fast. Dumb as a rock for sure, but hey, it’s something! My 2015 T460 ThinkPad laptop with an i7-6600U 2.6 GHz dual-core was able to do llama 3.1 8B at 1.5-2 T/s, which, while definitely slow, was also just fast enough to still have fun in real time.

abbadon420 ,

Then what are the minimal specs to run ollama (llama3 8b or preferably 27b) at a decent speed?

I have an old PC that now runs my Plex and *arr suite. Was thinking of upgrading it a bit and running ollama on it as well. It doesn’t have a GPU, so what else does it need? I don’t have a big budget, so no new Nvidia card for me.

Smokeydope OP , (edited )
@Smokeydope@lemmy.world avatar

“Decent speed” depends on your subjective opinion and what you want it to do. I think it’s fair to say that if it can generate text at around your slowest tolerable reading speed, that’s the bare minimum for real-time conversational use. If you want a task done and don’t mind stepping away to get a coffee, it can be much slower.
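As a rough back-of-the-envelope version of that "slowest tolerable reading speed" baseline (the reading speed and tokens-per-word figures below are illustrative assumptions, not measurements from this thread):

```python
# Rough sketch: convert a reading speed into a minimum generation speed.
# Assumptions: ~200 words/min is a slow-ish reading speed, and ~0.75
# English words per token is a common rule of thumb for LLM tokenizers.

def min_tokens_per_second(words_per_minute=200, words_per_token=0.75):
    """Tokens/s needed so generation roughly keeps up with reading."""
    words_per_second = words_per_minute / 60
    return words_per_second / words_per_token

print(round(min_tokens_per_second(), 1))  # ~4.4 T/s for real-time reading
```

By that yardstick, 1.5-2 T/s is below a comfortable reading pace but still usable, which matches the "just fast enough to have fun" experience above.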

I was pleasantly surprised to get anything at all working on an old laptop. When thinking of AI, my mind imagines supercomputers, thousand-dollar rigs, and data centers, not mobile computers like my ThinkPad. But sure enough, the technology is there, and your old POS can adopt a powerful new tool if you have realistic expectations about matching model capacity with specs.

TinyLlama will work on a smartphone, but it’s dumb. llama3.1 8B is very good and will work on modest hardware, but you may have to be patient with it, especially if your laptop wasn’t top of the line when it was made 10 years ago. Then there’s all the models in between.

The i7-6600U dual-core 2.6 GHz CPU in my laptop trying to run 8B was just barely enough to earn a passing grade for real-time talking needs: at 1.2-1.7 T/s it could say a short word, or half of a complex one, per second. When it needed to process something or recalculate context, it took a hot minute or two.

That got kind of annoying if you were getting into what it was saying. Bumping the PC up to an AMD Ryzen 5 2600 6-core CPU was a night-and-day difference: it spits out a sentence faster than my average reading speed, at 5-6 T/s. I’m still working on getting the 4 GB RX 580 GPU used for offloading, so those numbers are with the CPU bump alone. RAM also matters; DDR5 will beat DDR4 speed-wise.
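Part of why the CPU/RAM bump matters so much: CPU inference is usually memory-bandwidth-bound, so a common rule of thumb puts the generation ceiling near RAM bandwidth divided by model size. A minimal sketch of that heuristic (the bandwidth and quantized-size numbers are illustrative assumptions, not benchmarks):

```python
# Heuristic only: token generation on CPU is roughly memory-bandwidth-bound,
# because each token requires streaming most of the model weights from RAM.
# So peak T/s ≈ RAM bandwidth / model size. Real-world speeds come in lower.

def peak_tokens_per_second(bandwidth_gb_s, model_size_gb):
    """Upper-bound estimate of tokens/s for CPU-only inference."""
    return bandwidth_gb_s / model_size_gb

# e.g. dual-channel DDR4-2666 ≈ 42.7 GB/s; llama 3.1 8B at Q4 ≈ 4.9 GB
print(round(peak_tokens_per_second(42.7, 4.9), 1))  # ≈ 8.7 T/s ceiling
```

This is also why faster RAM (or GPU VRAM, once offloading works) helps more than extra cores for this workload.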

Here’s a tip: most software has the model’s default context size set at 512, 2048, or 4096. Part of what makes llama 3.1 so special is that it was trained with 128k context, so bump that up to 131072 in the settings so it isn’t recalculating context every few minutes…
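If you’re using Ollama specifically, one way to do this (a sketch, assuming the `llama3.1:8b` tag) is a custom Modelfile that overrides the default context window:

```
# Modelfile: bump llama 3.1's context window to the full 128k
FROM llama3.1:8b
PARAMETER num_ctx 131072
```

Then build and run it with `ollama create llama3.1-128k -f Modelfile` and `ollama run llama3.1-128k`. Keep in mind a bigger context window also means a bigger KV cache in RAM, so on older machines a smaller value may be the better trade-off.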
