
linux


nik9000 , in What is the largest file transfer you have ever done?

When I was in high school we toured the local EPA office. They had the most data I’ve ever seen accessible in person. I’m going to guess how much.

It was a dome with a robot arm that spun around and grabbed tapes. It was 2000, so I’m guessing 100 GB per tape. But my memory of the shape of the tapes isn’t good.

Looks like tapes were four inches tall. Let’s round up to six inches for housing and easier math. The dome was taller than me. Let’s go with 14 shelves.

Let’s guess a six-foot shelf diameter. So, like 20 feet of circumference. Tapes were maybe 0.8 inches a pop. With space between for robot fingers and stuff, let’s guess 240 tapes per shelf.

That comes out to about 300 terabytes. Oh. That isn’t that much these days. I mean, it’s a lot. But these days you could easily get that in spinning disks. No robot arm seek time. But with modern hardware (18 TB LTO-9 tapes in the same slots) it’d be 60 petabytes.
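For anyone checking the math, the same estimate as a quick sketch (every constant is a guess from the comment above):

```python
# Back-of-envelope tape-library estimate; every constant is a guess.
SHELVES = 14                      # dome taller than a person, ~6 in per shelf
SHELF_CIRCUMFERENCE_IN = 20 * 12  # ~6 ft diameter -> roughly 20 ft around
TAPE_PITCH_IN = 1.0               # ~0.8 in tape plus room for robot fingers
GB_PER_TAPE_2000 = 100            # LTO-1-era guess
TB_PER_TAPE_MODERN = 18           # LTO-9 native capacity

tapes = SHELVES * int(SHELF_CIRCUMFERENCE_IN / TAPE_PITCH_IN)    # 3,360 tapes
print(f"{tapes * GB_PER_TAPE_2000 / 1000:,.0f} TB in 2000")      # ~336 TB
print(f"{tapes * TB_PER_TAPE_MODERN / 1000:.1f} PB with LTO-9")  # ~60.5 PB
```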

I’m not sure how you’d transfer it these days. A truck, presumably. But you’d probably want to transfer a copy rather than disassemble it. That sounds slow too.

prole ,

This was your local EPA? Do you mean at the state level (often referred to as “DEP”)? Or is this the federal EPA?

Because that seems like quite the expense in 2000, and I can’t imagine my state’s DEP ever shelling out that kind of cash for it. Even nowadays.

Sounds cool though.

nik9000 ,

I think it was the EPA’s National Computer Center. I’m guessing based on location though.

corsicanguppy ,

Tape robots are fun, but tape isn’t as popular today.

Yes, it’s a truck. It’s always been a truck, as the bandwidth is insane.
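To put a rough number on that bandwidth, a sketch with assumed figures (truck capacity, tape count, and drive time are all made up for illustration):

```python
# "Never underestimate the bandwidth of a truck full of tapes."
# Every constant below is an assumption for illustration.
TAPES = 10_000          # a box truck packed with LTO cartridges
TB_PER_TAPE = 18        # LTO-9 native capacity
DRIVE_HOURS = 8         # an overnight interstate run

payload_tb = TAPES * TB_PER_TAPE                       # 180,000 TB = 180 PB
gbit_per_s = payload_tb * 8000 / (DRIVE_HOURS * 3600)
print(f"{payload_tb / 1000:.0f} PB per trip ≈ {gbit_per_s:,.0f} Gbit/s sustained")
# -> 180 PB per trip ≈ 50,000 Gbit/s sustained
```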

data1701d OP ,

If modern LTO drives weren’t so darn expensive…

nichtburningturtle , in Successful move over after years of trial and error

Welcome

theshatterstone54 , in What is the largest file transfer you have ever done?

80 GB: 8 hours of (supposedly) 4K content in MP4 format. Here’s the link (hoping for the piped bot to appear): www.youtube.com/watch?v=VF5JWdaJlvc

Presi300 , in What is the largest file transfer you have ever done?

I’ve imaged an entire 128GB SSD to my NAS…

possiblylinux127 , in Java uses double ram.

Minetest.net

LazaroFilm ,

What is this? A Minecraft knockoff?

possiblylinux127 , (edited )

No, it’s just a voxel engine with games and mods. Originally it was, but not so much now. There are some Minetest games that do try to make a Minecraft clone.

LazaroFilm ,

What’s the connection with the original post?

possiblylinux127 , in 2GB Raspberry Pi 5 on sale now at $50

No thanks

There are so many issues with the RPI

Qkall ,

Genuinely curious, can you enlighten me? I haven’t kept up with RPi stuff since… hm, the 3B, I think. But my PinePhone Pro just crapped out, so I was looking for options.

possiblylinux127 ,

Completely proprietary and overpriced.

myersguy ,

Aren’t most CPUs and chipsets proprietary? Not to mention all of the firmware blobs they require. What are some affordable, non-proprietary options?

possiblylinux127 ,

Most devices will at least boot without proprietary software and a custom kernel. The RPi doesn’t, because Broadcom.

ScottE , in Java uses double ram.

This is normal behavior. There is much more to the JVM’s memory usage beyond what’s allocated to the heap; there are other memory regions as well (metaspace, thread stacks, code cache, GC structures). There are additional tuning options for them, but it’s a complicated subject, and if you aren’t actually encountering out-of-memory issues, you have to ask whether it’s worth the effort to tune.
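For anyone who wants to see where the memory actually goes, stock OpenJDK ships Native Memory Tracking; a minimal sketch (the jar name and heap size are placeholders):

```
# Start the JVM with Native Memory Tracking enabled (small overhead);
# "server.jar" is a placeholder for your application.
java -XX:NativeMemoryTracking=summary -Xmx2g -jar server.jar

# From another terminal, ask the running JVM for a per-region breakdown
# (heap, metaspace, threads, code cache, GC, ...):
jcmd <pid> VM.native_memory summary
```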

UnRelatedBurner OP ,

That’s disappointing, but it makes sense. At least now I know there isn’t a point in trying. Well, this is the easiest solution xd. Thanks.

possiblylinux127 , in What is the largest file transfer you have ever done?

While I haven’t personally had to move a data center I imagine that would be a pretty big transfer. Probably not dd though.

CrabAndBroom ,

I can’t imagine how nerve-wracking it would be to run dd on something like that lol. I still don’t trust myself to copy a USB stick with my unimportant bullshit on it with dd, let alone a server with anything important on it!

eugenia , in 2GB Raspberry Pi 5 on sale now at $50

Not enough RAM, to be honest (at least not to be useful in the near future). I ran an Emby/Jellyfin server with 180 GB of music (nothing else was running, not even the UI), and it ran out of RAM and was swapping like crazy with 1 GB of RAM on my RPi 3. In this day and age you need 2 GB of RAM for servers, but that won’t be enough within a couple of years (and that’s why I don’t suggest this new model with 2 GB of RAM). I personally would only get a new Raspberry Pi if it came with 16 GB of RAM, so I could run a UI properly. You just can’t ever have enough RAM these days. Linux is using less RAM than Win11, but not by much these days. It’s growing too fast in requirements in the last 3-4 years.

GolfNovemberUniform ,

It’s growing too fast in requirements in the last 3-4 years.

I haven’t noticed it tbh.

eugenia ,

Three years ago, Xfce on Debian needed about 450 MB of RAM on a clean boot. It now needs 850 MB. And that’s not so much Xfce’s fault; it’s all the other stuff underneath that has been growing too much too.

I mean, heck, Cosmic should not need more than 500 MB of RAM overall, having such a clean codebase. And yet it’s the heaviest of them all, at 2.5 GB (even GNOME/KDE boot at 1.3 GB on Debian). And it’s not a matter of optimization because it’s an alpha; that’s a cheap explanation. It’s just heavy. Just as much as Windows in terms of RAM usage.

GolfNovemberUniform ,

I think it’s the “unused RAM is wasted RAM” technology. Try it on a machine with no more than 2 GB.

ikidd ,

Not sure what you have going on, but I have plasmashell running right now at 680 MB.
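One possible reason for the disagreement: the numbers may not measure the same thing. A quick sketch of two different measurements using standard tools:

```
# Whole-system usage after a clean boot — what the distro plus the
# desktop environment cost together:
free -h

# Resident set size of just the shell process — closer to a single
# per-process figure like the 680 MB quoted above:
ps -o rss= -p "$(pidof plasmashell)" | awk '{printf "%.0f MB\n", $1/1024}'
```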

marcie , (edited )

You should try Damn Small Linux; it had a revival recently. Though the absolute smallest modern one is probably SliTaz? Or Alpine Linux.

Though you can definitely set up Debian to use less than 500 MB of RAM today; KDE/GNOME are kinda hogs.

ashok36 ,

A Raspberry Pi isn’t, and has never been, a good choice for a server.

For an appliance like a Pi-hole, Home Assistant, or a media center playing files from a real NAS, it’s fine.

eugenia ,

Did you not read what I wrote? I used it as a media center and it was swapping like hell. That’s what Emby and Jellyfin are. Media servers.

ashok36 ,

No, you used it as a media server. A media center can also be a media server, but often is not.

If your Pi is just reading files from the network, it’s fine. If it’s serving files, you’re gonna have a bad time.

Use the right tool for the job.

Matriks404 , in What is the largest file transfer you have ever done?

Probably some video game that is ~150-200 GiB. Does that count?

corsicanguppy , (edited ) in What is the largest file transfer you have ever done?

My cousin once stuffed an ISO through my mail server in '98. His connection up in Bella Bella restricted non-batched comms back then, so he jammed it through the server as email to get on the batched quota.

It took the data and passed it along without error, albeit with some constipation!

Fijxu , in What is the largest file transfer you have ever done?

~340 GB, more than a million small files (~10 KB or less each). It took like a week to move because the files were stored on a hard drive, and it was struggling to read that many files.

GenderNeutralBro , in What is the largest file transfer you have ever done?

Probably ~15 TB through file-level syncing tools (rsync or similar; I forget exactly what I used), just copying up my internal RAID array to an external HDD. I’ve done this a few times, either for backup purposes or to prepare to reformat my array. I originally used ZFS on the array, but converted it to something with built-in kernel support a while back because it got troublesome when switching distros. Might switch it to bcachefs at some point.

With dd specifically, maybe 1 TB? I’ve used it to temporarily back up my boot drive on occasion, on the assumption that restoring my entire system that way would be simpler in case whatever I was planning blew up in my face. Fortunately, I never needed to restore it that way.
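A minimal sketch of that kind of whole-drive image with dd (device and destination paths are illustrative; verify with lsblk before running anything destructive):

```
# Image the boot drive to a file on external storage; /dev/sda and the
# destination path are placeholders — check `lsblk` first.
sudo dd if=/dev/sda of=/mnt/external/boot-drive.img bs=4M status=progress conv=fsync

# Restoring is the same command with if= and of= swapped.
```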

delirious_owl , in What is the largest file transfer you have ever done?

Upgraded a NAS for the office. It was reaching capacity, so we replaced it. The transfer was maybe 30 TB. Just used rsync. That local transfer was relatively fast. What took longer was for the NAS to replicate itself to its mirror, located in a DC on the other side of the country.
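A minimal sketch of that kind of local NAS-to-NAS copy with rsync (the mount points are assumptions; the trailing slash on the source copies its contents rather than the directory itself):

```
# -a (archive) preserves permissions, ownership, timestamps, and symlinks;
# -H keeps hard links; --info=progress2 shows overall transfer progress.
rsync -aH --info=progress2 /mnt/old-nas/ /mnt/new-nas/
```

Because rsync skips files that already match on the destination, re-running the same command after an interruption picks up roughly where it left off.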

CrabAndBroom ,

Yeah, it’s kind of wild how fast (and stable) rsync is, especially when you grew up with the extremely temperamental Windows copying thing, which I’ve seen fuck up a 50 MB transfer before.

The biggest one I’ve done in one shot with rsync was only about 1 TB, but I was braced for it to take half a day and cause all sorts of trouble. But no, it just sent it across perfectly the first time, way faster than I was expecting.

TheUniverseandNetworks ,

Yeah, shout out for rsync also. It’s awesome. Combine it with ssh & it feels pretty secure too.

delirious_owl ,

Never dealt with Windows. rsync just makes sense. I especially like that it’s idempotent, so I can just run it two or three times and it’ll be near instant on the subsequent runs.

brygphilomena , in What is the largest file transfer you have ever done?

In the middle of something: 200 TB for my Plex server, going from a 12-bay system to a 36-bay LFF system. But I’ve also literally driven servers across the desert because it was faster than trying to move data from one datacenter to another.

Dlayknee ,

That’s some RFC 2549 logic, right there.

princessnorah ,

Just thinking about how much data you could transfer using this. MicroSD cards make it a decent amount. Latency would be horrible, but throughput could be pretty good, I think.
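A back-of-envelope sketch of pigeon-plus-microSD throughput (every constant is an assumption for illustration):

```python
# RFC 2549 (IP over avian carriers) with a microSD payload.
# All constants below are assumptions.
CARD_CAPACITY_TB = 1     # 1 TB microSD cards exist
CARD_WEIGHT_G = 0.25     # a microSD card weighs roughly a quarter gram
PIGEON_PAYLOAD_G = 50    # assume a pigeon can carry ~50 g
DISTANCE_KM = 100        # assumed route length
SPEED_KMH = 80           # rough racing-pigeon cruising speed

cards = int(PIGEON_PAYLOAD_G / CARD_WEIGHT_G)   # 200 cards
payload_tb = cards * CARD_CAPACITY_TB           # 200 TB per flight
flight_s = DISTANCE_KM / SPEED_KMH * 3600       # 4,500 s — the "latency"
gbit_per_s = payload_tb * 8000 / flight_s
print(f"{payload_tb} TB per flight ≈ {gbit_per_s:,.0f} Gbit/s, latency ≈ {flight_s/60:.0f} min")
# -> 200 TB per flight ≈ 356 Gbit/s, latency ≈ 75 min
```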

ayyy ,

Amazon will send you a semi truck (AWS Snowmobile).

ReveredOxygen ,

Packet loss would be quite costly though

data1701d OP ,

Which desert? I’ve lived in the desert my entire life.

brygphilomena ,

From LA to Vegas. Took the servers down at end of business one night, drove all night, installed them, and got everything back online before start of business the next day.

data1701d OP ,

As an ex-Vegas resident, I have to ask: why were you moving stuff to Vegas?

brygphilomena ,

It’s got a hell of a datacenter.

https://www.switch.com/las-vegas/
