
What is the largest file transfer you have ever done?

I’m writing a program that wraps around dd to try and warn you if you are doing anything stupid. I have thus been giving the man page a good read. While doing this, I noticed that dd supports sizes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This got me wondering: what’s the largest storage operation you guys have done? I’ve taken a couple of images of hard drives that were a single terabyte large, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.
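For a sense of what I mean by “warn you”, here’s a minimal sketch of one check a wrapper like this might do (the `safe_dd` name and paths are made up for illustration): refuse to write to a target that’s currently mounted.

```shell
# Sketch only: refuse to dd onto a target that shows up as a mounted
# device in /proc/mounts. Everything after the target is passed to dd.
safe_dd() {
    target=$1
    shift
    if grep -qs "^$target " /proc/mounts; then
        printf 'refusing: %s is mounted\n' "$target" >&2
        return 1
    fi
    dd of="$target" "$@"
}

# usage: printf 'hello' | safe_dd /tmp/test.img
```

A real wrapper would want more checks (swap devices, LVM members, size sanity on bs= and count=), but the shape is the same.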

hperrin ,

I transferred my entire NAS storage, which includes all of my backups, cloud files, my family’s backups, and my… Linux ISOs. That was about 12TB.

pixeltree ,

I once deleted an 800 GB log file, does that count?

Laborer3652 ,

I have to copy ~8 TiB of backup files twice a year.

pete_the_cat , (edited )

I’m currently in the process of transferring about 50 TB from one zpool to another (locally), so I can destroy and recreate it.

I’ve downloaded a few torrents that were around 5 TB each; they’re PS4 and Xbox 360 game collections.

narc0tic_bird ,

Around 15 TB migrating to a new NAS.

Yeahboiiii ,

Largest one I ever did was around 4.something TB. New off-site backup server at a friend’s place. Took me 4 months due to data limits and an upload speed that maxed out at 3 MB/s.

bulwark ,

I mean, dd claims it can handle a quettabyte, but how can we be sure?

southsamurai ,

I think 16 terabytes? Might have been twelve. I was consolidating a bunch of old drives and data into a NAS for a friend. He just didn’t have the time between working and school, so he brought me all the hardware and said “go” lol.

Nomecks ,

I did 100 TB: 100 simultaneous 1 TB streams with rsync.

averyminya ,

I once robocopied 16 TB of media.

nobleshift ,

Professionally, Boeing near Seattle moves mind-boggling data. Personally, 1500+ terabytes in a single 6-day jag, aboard a 29’ sailboat running off an inverter and 13-year-old Dell hardware, completely off-grid at anchor. It was a massive rescue operation; I did what I could with what I had where I was.

I didn’t sleep well.

boredsquirrel ,

Local file transfer?

I cloned a 1TB+ system a couple of times.

As the Anaconda installer of Fedora Atomic is broken (yes, ironic) I have one system originally meant for tweaking as my “zygote” and just clone, resize, balance and rebase that for new systems.

Remote? A 10 GB MicroWin 11 LTSC IoT ISO, the least garbage that OS can get.

Also, some leaked stuff, 50 GB over BitTorrent.

seaQueue ,

I routinely do 1-4 TB images of SSDs before making major changes to the disk. I run fstrim on all partitions and pipe dd output through zstd before writing to disk, and the images shrink to the actually used size or a bit smaller. Largest ever backup was probably ~20 TB cloned from one array to another over 40/56 GbE; the deltas after that were tiny by comparison.
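The pipeline looks roughly like this. Sketch only, run here against a scratch file so it’s safe to copy-paste; on a real system the input would be a block device like /dev/sdX, with fstrim run first so the freed space compresses to almost nothing:

```shell
# Stand-in for a real block device (a real run would fstrim first,
# then read from /dev/sdX instead of this scratch file):
truncate -s 16M /tmp/fake-disk.img

# Image the "disk" and compress on the fly with all cores:
dd if=/tmp/fake-disk.img bs=4M status=none | zstd -q -T0 -f -o /tmp/disk.img.zst

# Restore later with:
# zstdcat /tmp/disk.img.zst | dd of=/tmp/fake-disk.img bs=4M
```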

ShittyBeatlesFCPres ,

You should ping CERN or Fermilab about this. Or maybe the Event Horizon Telescope team but I think they used sneakernet to image the M87 black hole.

Anyway, my answer is probably just a SQL backup like everyone else.

slazer2au ,

As a single file? Likely a 20 GB ISO.
As a collective job, 3 TB of videos between hard drives for Jellyfin.
