
What is the largest file transfer you have ever done?

I’m writing a program that wraps dd to warn you if you’re about to do something stupid, so I’ve been giving its man page a good read. While doing this, I noticed that dd supports size suffixes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.

This got me wondering: what’s the largest storage operation you’ve ever done? I’ve taken a couple of images of hard drives that were a single terabyte, but I was wondering if the sysadmins among you have had to do something bigger, e.g. with a giant RAID 10 array.
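For flavor, the warning logic a wrapper like that might use could look something like this rough sketch; every name, suffix table, and check here is a hypothetical illustration, not the OP’s actual program:

```python
"""Hypothetical sketch of a dd sanity-checker (not the OP's actual code)."""

# Binary suffixes GNU dd accepts for bs=/count=, from K up to Q (quetta, 2**100).
_SUFFIXES = {s: 1024 ** i for i, s in enumerate(
    ["", "K", "M", "G", "T", "P", "E", "Z", "Y", "R", "Q"])}

def parse_size(text: str) -> int:
    """Parse a dd-style size such as '4M' or '2Q' into bytes."""
    if text and text[-1].isalpha():
        return int(text[:-1]) * _SUFFIXES[text[-1].upper()]
    return int(text)

def looks_dangerous(target: str) -> bool:
    """Crude heuristic: writing straight to a whole-disk node deserves a warning."""
    whole_disks = ("/dev/sda", "/dev/nvme0n1", "/dev/disk0")  # illustrative list
    return target in whole_disks

# Example: made-up of=/bs= arguments a wrapper might inspect before running dd.
target, bs = "/dev/sda", "4M"
if looks_dangerous(target):
    print(f"warning: of={target} is a whole disk (bs={parse_size(bs)} bytes)")
```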

Davel23 ,

Not that big by today's standards, but I once downloaded the Windows 98 beta CD from a friend over dialup, 33.6k at best. Took about a week as I recall.

absGeekNZ ,

Yep, downloaded XP over 33.6k modem, but I’m in NZ so 33.6 was more advertising than reality, it took weeks.

pete_the_cat ,

I remember downloading the scene on American Pie where Shannon Elizabeth strips naked over our 33.6 link and it took like an hour, at an amazing resolution of like 240p for a two minute clip 😂

fuckwit_mcbumcrumble ,

Entire drive/array backups will probably be by far the largest file transfer anyone ever does. The biggest I’ve done was a measly 20TB over the internet which took forever.

Outside of that the largest “file” I’ve copied was just over 1TB which was a SQL file backup for our main databases at work.

cbarrick ,

+1

From an order of magnitude perspective, the max is terabytes. No “normal” users are dealing with petabytes. And if you are dealing with petabytes, you’re not using some random poster’s program from reddit.

For a concrete cap, I’d say 256 tebibytes…

Neuromancer49 ,

In grad school I worked with MRI data (hence the username). I had to upload ~500GB of it to our supercomputing cluster: somewhere around 100,000 MRI images. I wrote 20 or so different machine learning algorithms to process them. All said and done, I ended up with about 2.5TB on the supercomputer. About 500MB of that ended up being useful and made it into my thesis.

Don’t stay in school, kids.

mEEGal ,

You should have said no to math, it’s a helluva drug

data1701d OP ,
Urist ,

I obviously downloaded a car after seeing that obnoxious anti-piracy ad.

d00phy ,

I’ve migrated petabytes from one GPFS file system to another. More than once, in fact. I’ve also migrated about 600TB of data from D3 tape format to 9940.

neidu2 , (edited )

I don’t remember how many files, but typically these geophysical recordings clock in at 10-30 GB each. What I do remember, though, is the total transfer size: 4TB. It was a bunch of .segd files (geophysics stuff) stored on a server cluster mounted in a shipping container, and some geophysics processors needed them on the other side of the world. There was nobody physically heading in the same direction as the transfer, so we figured it would just be easier to rsync it over 4G. It took a little over a week.

Normally when we have transfers of a substantial size going far, we ship it on LTO. For short-distance transfers we usually run a fiber, and I have no idea how big the largest transfer job has been that way. It must be in the hundreds of TB. The entire cluster is 1.2PB, but I can’t recall ever having to transfer everything in one go, as the receiving end usually has a lot less space.

data1701d OP ,
neidu2 ,

The alternative was 5mbit/s VSAT. 4G was a luxury at that time.

ramble81 ,

I’ve done a 1PB sync between a pair of 8-node SAN clusters as one was being physically moved, since it’d be faster to seed the data and start a delta sync rather than try to do it all over a 10Gb pipe.

MajorHavoc ,

I’ll let you know… If it finishes.

avidamoeba ,

~15TB over the internet via 30Mbps uplink without any special considerations. Syncthing handled any and all network and power interruptions. I did a few power cable pulls myself.

pete_the_cat ,

How long did that take? A month or two? I’ve backfilled my NAS with about 40 TB before over a 1 gig fiber pipe in about a week or so of 24/7 downloading.

avidamoeba ,

Yeah, something like that. I verified it with rsync afterwards; no errors.
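A post-transfer verification like that can also be sketched in Python with hashlib: hash every file on both sides and compare, much like an rsync checksum pass. The paths and demo trees below are made up for illustration:

```python
import hashlib
from pathlib import Path

def tree_digest(root) -> dict:
    """Map each file's path (relative to root) to its SHA-256 hex digest."""
    base = Path(root)
    return {
        str(p.relative_to(base)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(base.rglob("*")) if p.is_file()
    }

# Tiny demo trees standing in for the real source and destination.
src, dst = Path("/tmp/ver_src"), Path("/tmp/ver_dst")
for d in (src, dst):
    d.mkdir(parents=True, exist_ok=True)
    (d / "f.txt").write_text("payload\n")

print(tree_digest(src) == tree_digest(dst))  # True when the copy is intact
```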

Larvitz ,

@data1701d Downloading Forza Horizon 5 on Steam, around 120GB, is the largest web download I can remember. On the LAN, I migrated my old FreeBSD NAS to my new one, which was roughly a 35TB transfer over NFS.

data1701d OP ,

How long did that 35TB take? 12 hours or so?

southsamurai ,

I think 16 terabytes? Might have been twelve. I was consolidating a bunch of old drives and data into a NAS for a friend. He just didn’t have the time between work and school, so he brought me all the hardware and said “go” lol.

pixeltree ,

I once deleted an 800 gb log file, does that count

Krafting ,

Rsynced 4.2TB of data from one server to another but with multiple files

tedvdb ,

Today I migrated my data from my old ZFS pool to a new, bigger one; the rsync of 13.5TiB took roughly 18 hours. It’s slow spinning-disk storage, so that’s fine.

The second and third runs of the same rsync took like 5 seconds, blazing fast.
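Those near-instant re-runs are rsync’s “quick check” at work: by default it skips any file whose size and modification time already match on both sides, so the second pass is a metadata scan rather than a data copy. The decision can be sketched roughly like this (a simplified illustration, ignoring rsync’s actual delta algorithm; paths are made up):

```python
import shutil
from pathlib import Path

def needs_transfer(src: Path, dst: Path) -> bool:
    """rsync-style quick check: resend only if dst is missing or its
    size/mtime differ from src (rsync's default behavior, without -c)."""
    if not dst.exists():
        return True
    s, d = src.stat(), dst.stat()
    return s.st_size != d.st_size or int(s.st_mtime) != int(d.st_mtime)

# Demo: after a copy that preserves timestamps (like rsync -a), the second
# pass finds nothing to resend: a metadata scan, not a data copy.
src, dst = Path("/tmp/qc_src"), Path("/tmp/qc_dst")
src.mkdir(exist_ok=True)
dst.mkdir(exist_ok=True)
(src / "a.bin").write_bytes(b"x" * 1024)
shutil.copy2(src / "a.bin", dst / "a.bin")  # copy2 preserves the mtime
print(needs_transfer(src / "a.bin", dst / "a.bin"))  # False: skipped
```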

Yeahboiiii ,

Largest one I ever did was around 4.something TB. New off-site backup server at a friend’s place. Took me 4 months due to data caps and an upload speed that maxed out at 3MB/s.
