
Sorgan71 ,

God you guys can’t have any fun. Yeah it uses power, but can’t we have cool things once in a while?

xthexder ,

So how is the total power over 500x that of the GPU power? If it’s all LEDs, that thing must get brighter than the damn sun.

Jackfinished ,

Funny enough, you can barely see it during the day.

xthexder ,

I don’t know what they need so many GPUs for. There’s 16 displays inside, and the sphere itself has fewer pixels than even 1 of the internal displays. You could probably run the sphere off a laptop if you aren’t trying to do anything fancy.

Maybe they plan on doing crazy live simulations on it or something. I can’t imagine what kind of displayed image would actually use all 150 of them. Nvidia A6000 cards are damn powerful.

shasta ,

I guess the practicality of the decision depends on the finances. Did they actually buy the cards or were they gifted by nvidia for free advertising?

xthexder ,

It does seem suspiciously like they picked 150 completely arbitrarily to make the project sound impressive, when they could have easily done it with 20. I’m sure a bunch of people in the middle made a bunch of money off that transaction too. Or like you said, maybe this is Nvidia doing some guerrilla marketing.

WhyFlip ,

You don’t know. Full stop.

xthexder ,

My job has been to run things on GPUs for almost 10 years now. The only practical things anyone is doing on that many GPUs are AI training, massive scientific simulations, or crypto mining. 1 or 2 of them is enough to run something like ChatGPT.

Real-time graphics, it turns out, don’t scale well across multiple GPUs. There’s a reason SLI has gone away for consumer GPUs. At the current ratio, each of those $3000+ GPUs is only driving 8000 pixels (assuming each LED puck is being used as 1 pixel, given their size). It makes no sense other than bragging rights.
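That back-of-envelope works out exactly, assuming the widely reported figure of roughly 1.2 million LED pucks on the exterior (a number from news coverage, not from this thread):

```python
# Sanity check of the pixels-per-GPU ratio, assuming ~1.2 million exterior
# LED pucks (reported figure, an assumption here) and the 150 GPUs from
# the article, with each puck treated as a single pixel.
pucks = 1_200_000
gpus = 150

pixels_per_gpu = pucks // gpus
print(pixels_per_gpu)  # 8000
```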

Rolive ,

Pretty sure it’s just for bragging rights.

yardy_sardley ,

Probably have a few cards running the displays and the rest of them mining some sphere-themed memecoin

fender_symphonic584 ,

You all go ooo and aaaa then yell at oil companies for climate change.

SpaceCowboy ,

The power usage wouldn’t be a problem if the electricity were generated in a green way.

If only the energy sector had a workforce experienced in building offshore structures that could build offshore wind farms. And maybe a workforce that had experience in drilling that could develop geothermal energy.

Of course, we’d also need an energy sector that has a lot of financial resources to put into these kinds of investments.

If only the energy sector had these kinds of resources, a big sphere drawing a lot of electricity wouldn’t be a problem.

fender_symphonic584 ,

As an industry insider, I can tell you old oil and gas wells are being converted to geothermal where possible. There is lots of innovation in the works!

dan ,

28,000,000 watts

That’s usually written as 28 MW. I know some Americans don’t like metric much, but one of the points of metric is that you don’t ever need to write that many zeroes - you just need to use the right prefix (kilo, mega, giga, tera, etc) on the unit.
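The prefix rule is mechanical enough to sketch in a few lines (a toy formatter for illustration, not any standard library function):

```python
# Toy SI-prefix formatter: pick the largest prefix whose factor fits,
# so 28,000,000 W comes out as "28 MW" instead of a wall of zeroes.
def format_si(watts: float) -> str:
    prefixes = [(1e9, "GW"), (1e6, "MW"), (1e3, "kW")]
    for factor, unit in prefixes:
        if abs(watts) >= factor:
            return f"{watts / factor:g} {unit}"
    return f"{watts:g} W"

print(format_si(28_000_000))  # 28 MW
print(format_si(650))         # 650 W
```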

rc__buggy ,

you just need to use the right prefix (kilo, mega, giga, tera, etc) on the unit.

Oh, thanks.

Bruh, it’s PC Gamer.

quick edit: Hey! Why aren’t you converting it to Joules?

Remavas ,

Because the joule is the SI unit of energy, while the watt is the SI unit of power, equivalent to one joule per second.

“Converting” joules to watts would be like converting m/s to US dollars.

HerrBeter ,

I liked the analogy, but I think it would be clearer to say something like joules = money in your bank account and watts = spending per second.
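In numbers, the rate-versus-quantity distinction looks like this (a small sketch; the one-hour window is an arbitrary example):

```python
# Watts are a rate (joules per second); joules are a quantity.
# Energy = power x time, so a steady 28 MW draw over one hour "spends":
power_w = 28_000_000      # 28 MW
seconds = 3600            # one hour

energy_j = power_w * seconds                      # joules used in that hour
energy_kwh = (power_w / 1000) * (seconds / 3600)  # same energy in kWh

print(energy_j)     # 100800000000 (about 100 GJ)
print(energy_kwh)   # 28000.0
```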

ipkpjersi ,

True, but 28 million watts really puts things in perspective when your average PSU is less than 1000w.

dan ,

That’s true.

average PSU is less than 1000w

Unrelated but I wish it was easier to find lower-wattage PSUs. My local PC store doesn’t have anything under 650W. I know modern GPUs use a lot of power, but not all PCs use a GPU! I have a home server where 400W would be more than enough, yet the smallest I could find was 550W, in stock from just one manufacturer (Be Quiet).

tomkatt ,

I mean, it should be fine. Just because the PSU can provide more watts doesn’t mean the system is actually using that much power. I have an 800 W PSU in my gaming rig, but its average load is only 240-320 W during gaming (I’ve measured it by powering the system with a portable Ecoflow battery).

dan ,

It runs fine, it’s just less efficient.

riodoro1 ,

Where are you getting this from? Intuition?

I think the quiescent current and losses are lower in a well-engineered PSU.

hedidwot ,

This is verifiable in manufacturers’ data sheets.

Efficiency at less than 20% and greater than 80% load isn’t great relative to the range in between.

This is compounded by lower-wattage PSUs being more limited with regard to features and benefits.

If you end up with a 650 W PSU and your system idles at 80 W for the bulk of a working day, you spend long periods of time in this less efficient window.

We need to see some quality 300 W to 600 W designs come back onto the market.
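A rough sketch of why the idle case hurts, using a made-up step curve shaped like typical 80 Plus data-sheet numbers (the exact percentages are assumptions, not measurements from any real unit):

```python
# Assumed efficiency curve: best between 20% and 80% load, worse outside.
# These numbers are illustrative only, not from a real data sheet.
def efficiency(load_fraction: float) -> float:
    if load_fraction < 0.20:
        return 0.80   # light load: noticeably worse
    if load_fraction <= 0.80:
        return 0.92   # the sweet spot
    return 0.88       # near full load: drops off again

psu_rated_w = 650
idle_draw_w = 80                      # DC-side load while the system idles

load = idle_draw_w / psu_rated_w      # ~12% load -> the inefficient window
wall_draw_w = idle_draw_w / efficiency(load)
print(round(wall_draw_w))             # 100: ~20 W lost as heat at the wall
```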

magi ,

Exactly. This is literally a PC Gamer article. Writing it out like that really puts it into perspective for the average reader.

calcopiritus ,

Way easier to compare 28 MW to 1 kW.

intensely_human ,

The non-stop anti-humanity propaganda is exhausting.

Tattorack ,

Wouldn’t just one GPU be enough to run the Sphere, or am I getting something wrong?

I remember hearing that it’s not exactly high resolution, each “pixel” being a bunch of pretty large lamps.

ilinamorato , (edited )

Wikipedia says it’s 16,000x16,000 (which is way less than I thought). The way the math works, that’s about 31x the pixels of a 4K monitor, so something like 31 GPUs would make sense. And there’s a screen inside and one outside, so double that. But I still can’t figure out why it needs more than double that again. Redundancy? Poor optimization? I dunno.
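The pixel math, assuming the interior display really is 16,000x16,000 and taking a standard 3840x2160 4K monitor for comparison:

```python
# Compare the quoted 16,000 x 16,000 display to a 3840 x 2160 4K monitor.
sphere_px = 16_000 * 16_000   # 256,000,000 pixels
uhd_px = 3840 * 2160          # 8,294,400 pixels

ratio = sphere_px / uhd_px
print(round(ratio))           # 31: roughly 31 "4K monitors" worth of pixels
```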

Tattorack ,

But wouldn’t that only be necessary if it needed to render real-time graphics at such a scale? If I’m correct, all it’s doing is playing back videos.

st14 ,

Live audio visualization in game engines is definitely a thing, e.g. youtu.be/IZL7VAt97ws?si=H74SwrLZYfsYNTY8

stormeuh ,

Even if it’s just playing back videos, it still has to compensate for the distortion of the spherical display. That’s a “simple” 3D transformation, but with the number of pixels, the coordination between the GPUs, and some redundancy, it doesn’t seem like an excessive amount of computing power. The whole thing is still an impressive excess, though…
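For illustration, a minimal sketch of that kind of transform: mapping a direction on the sphere back to a pixel in an equirectangular source video. The function name and layout are assumptions made up for this example, not how the Sphere’s pipeline actually works:

```python
import math

# Map a unit direction vector on the sphere to (u, v) pixel coordinates in
# an equirectangular video frame. Illustrative only; real display pipelines
# also handle interpolation, seams, and per-panel warping.
def sphere_to_equirect(x: float, y: float, z: float, width: int, height: int):
    lon = math.atan2(y, x)   # longitude, -pi..pi around the sphere
    lat = math.asin(z)       # latitude, -pi/2..pi/2 up/down
    u = (lon / (2 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# A pixel facing straight "forward" lands in the middle of the video frame.
print(sphere_to_equirect(1.0, 0.0, 0.0, 3840, 2160))  # (1919.5, 1079.5)
```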

ilinamorato ,

I think it’s doing some non-trivial amount of rendering, since it’s often syncing graphics with music played live.

Anyolduser ,

I’m guessing it’s the department of redundancy department, is my guess.

ilinamorato ,

Someone elsewhere in the thread suggested it might be a marketing thing on Nvidia’s part, and that makes a lot of sense.

markpaskal ,

I work for a digital display company, and it is definitely redundancy. There will be at least two redundant display systems that go to the modules separately so they can switch between them to solve issues. If a component fails on one side they just switch to the other.

ilinamorato ,

Ah, nice. Thank you for bringing your expertise to my nonsense.

umbraroze ,

The way I think of it, it’s possible a really small number of GPUs would be enough to render the framebuffer; you’d just need an army of low-power graphics units to receive the data and draw it on the screens.

Having a high-power GPU for every screen is definitely a loss unless the render job is distributed really well and there’s also people around to admire the results at the distance where the pixel differences no longer matter. Which is to say, not here.

menemen ,

I mean it is cool. But really a testament to why we deserve extinction at this point…

Usernameblankface ,

Ok, so it’s “capable of drawing” enough power for 20,000 homes in the area. How much does it actually use day to day? Does it dim at night and brighten in the daytime to keep those ads rolling in the sunshine?

SunDevil ,

“Capable of drawing 28,000,000 watts of power” doesn’t tell us anything. As was noted, it should’ve been given in megawatts (28 mW) or kilowatts (28,000 kW). Clickbait aside, how many kilowatt-hours (kWh) does it actually use?

28 mW isn’t that much energy, relatively speaking. As of 2015, Forbes estimated LV uses 8000 mW on an average summer day.

The potential is impressive. I doubt it pulls anywhere near that. Unless I did my math wrong, this seems sensationalist.

Gsus4 , (edited )

I don’t get it, are they implying that each GPU can draw 200 kW? A home is like 10 kW max. Wtf is a GPU that can consume more power than 20 homes? Mine at home draws 300 W peak…

Each of those GPUs feature over 10,752 cores, 48 GB of memory and have a 300 W TDP, for a grand total of 1,612,800 cores, 7,200 GB of GDDR6 memory, and a potential maximum power draw of 45,000 W at full tilt (via Wccftech).

ok, monster gpus, got it.
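The quoted totals do check out:

```python
# Verify the quoted totals for 150 RTX A6000-class cards.
gpus = 150
cores_per_gpu = 10_752
mem_gb_per_gpu = 48
tdp_w_per_gpu = 300

print(gpus * cores_per_gpu)   # 1612800 cores
print(gpus * mem_gb_per_gpu)  # 7200 GB
print(gpus * tdp_w_per_gpu)   # 45000 W, i.e. 45 kW at full tilt
```

So even flat out, the GPUs account for only about 45 kW of the quoted 28 MW capability, which lines up with the earlier point that most of that power budget must be going to the LEDs and the rest of the venue.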

Zeoic ,

Just FYI, mW is milliwatts and MW is megawatts. Agreed though, I doubt it draws that much day to day.

SunDevil ,

Thank you, I stand corrected!

acockworkorange ,

28 MW? What are they doing in there, aluminum?
