“Capable of drawing 28,000,000 watts of power” doesn’t tell us anything. As was noted, it should’ve been given in megawatts (28 MW) or kilowatts (28,000 kW). Clickbait aside, how many kilowatt-hours (kWh) does it actually use?
28 MW isn’t that much power, relatively speaking. As of 2015, Forbes estimated LV uses 8,000 MW on an average summer day.
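The comparison above is easy to put in perspective; a quick sketch, using the 28 MW peak figure and the ~8,000 MW summer-day number attributed to Forbes (peak capability, not measured usage):

```python
# Assumed figures from the thread: 28 MW peak draw for the Sphere,
# ~8,000 MW for Las Vegas on an average summer day (2015 estimate).
sphere_mw = 28
vegas_mw = 8_000

share = sphere_mw / vegas_mw
print(f"{share:.2%}")  # -> 0.35% of the city's summer load, at absolute peak
```

Even running flat out, that's well under one percent of the city's demand, which supports the "relatively speaking" point.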
The potential is impressive. I doubt it pulls anywhere near that. Unless I did my math wrong, this seems sensationalist.
I don’t get it, are they implying that each GPU can draw almost 200 kW (28 MW across 150 cards)? A home is like 10 kW max. Wtf is a GPU that can consume more power than 20 homes? Mine at home peaks at 300 W…
Each of those GPUs features 10,752 cores, 48 GB of memory, and a 300 W TDP, for a grand total across all 150 cards of 1,612,800 cores, 7,200 GB of GDDR6 memory, and a potential maximum power draw of 45,000 W at full tilt (via Wccftech).
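Those totals are just straight multiplication over the reported 150 RTX A6000-class cards; a quick check:

```python
# Per-card specs quoted in the comment above (RTX A6000-class):
gpus = 150
cores_each = 10_752   # CUDA cores per card
mem_gb_each = 48      # GDDR6 per card, in GB
tdp_w_each = 300      # TDP per card, in watts

total_cores = gpus * cores_each    # -> 1,612,800 cores
total_mem_gb = gpus * mem_gb_each  # -> 7,200 GB
total_tdp_w = gpus * tdp_w_each    # -> 45,000 W, i.e. 45 kW at full tilt

print(total_cores, total_mem_gb, total_tdp_w)
```

Note that 45 kW is the GPUs alone; the 28 MW figure in the headline covers the whole building, not the render farm.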
Ok, so it’s “capable of drawing” enough power for 20,000 homes in the area. How much does it actually use day to day? Does it dim at night and brighten in the daytime to keep those ads rolling in the sunshine?
Wikipedia says it’s 16,000x16,000 (which is way less than I thought). The way the math works, that’s about 31x the pixels of a 4k monitor, so ~31 GPUs would make sense. And there’s a screen inside and one outside, so double that. But I also can’t figure out why it needs more than double that again. Redundancy? Poor optimization? I dunno.
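Checking the pixel arithmetic under the 16,000x16,000 figure, with a standard 3840x2160 4K monitor as the unit of comparison:

```python
# Assumed resolutions: 16,000 x 16,000 interior canvas (per Wikipedia,
# as cited above), 3840 x 2160 for a "4K" monitor.
interior_px = 16_000 * 16_000   # 256,000,000 pixels
four_k_px = 3_840 * 2_160       # 8,294,400 pixels

ratio = interior_px / four_k_px
print(round(ratio, 1))  # -> 30.9, i.e. roughly 31 "4K monitors" worth
```

So one GPU per 4K-monitor's-worth of pixels lands around 31 cards, or ~62 for two such surfaces, still well short of 150.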
Even if it’s just playing back videos, it still should compensate for the distortion of the spherical display. That’s a “simple” 3d transformation, but with the amount of pixels, coordinating between the GPUs and some redundancy, it doesn’t seem like an excessive amount of computing power. The whole thing is still an impressive excess though…
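The "simple 3d transformation" mentioned above is essentially a per-pixel remap: each LED at some latitude/longitude on the sphere samples a flat source frame. A minimal sketch of the idea, assuming an equirectangular source; the function name and layout are illustrative, not taken from the Sphere's actual pipeline:

```python
def sphere_to_source_uv(lat_deg: float, lon_deg: float) -> tuple[float, float]:
    """Map a point on the sphere (latitude/longitude in degrees) to
    normalized (u, v) coordinates in an equirectangular source image."""
    u = (lon_deg + 180.0) / 360.0   # longitude spans the image width
    v = (90.0 - lat_deg) / 180.0    # latitude spans the image height
    return u, v

# The point on the equator facing longitude 0 samples the image center:
print(sphere_to_source_uv(0.0, 0.0))  # -> (0.5, 0.5)
```

Cheap per pixel, but multiplied by hundreds of millions of pixels per frame and split across many GPUs, the coordination overhead is where the real work goes.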
I work for a digital display company, and it is definitely redundancy. There will be at least two redundant display systems that go to the modules separately so they can switch between them to solve issues. If a component fails on one side they just switch to the other.
The way I think of it, a really small number of GPUs would probably be enough to render the framebuffer; you’d just need an army of low-power graphics units to receive the data and put it on the screens.
Having a high-power GPU for every screen is definitely a waste unless the render job is distributed really well, and there are also people around to admire the results at a distance where the pixel differences no longer matter. Which is to say, not here.
That’s usually written as 28 MW. I know some Americans don’t like metric much, but one of the points of metric is that you never need to write that many zeroes - you just use the right prefix (kilo, mega, giga, tera, etc.) on the unit.
Unrelated but I wish it was easier to find lower-wattage PSUs. My local PC store doesn’t have anything under 650W. I know modern GPUs use a lot of power, but not all PCs use a GPU! I have a home server where 400W would be more than enough, yet the smallest I could find was 550W, in stock from just one manufacturer (Be Quiet).
I mean, it should be fine; just because the PSU can provide more watts doesn’t mean the system is actually using that much power. I have an 800 W PSU in my gaming rig, but its average load is only 240–320 W during gaming (I’ve measured it by powering the system from a portable EcoFlow battery).
PSU efficiency below ~20% and above ~80% load isn’t great compared to the middle of the curve.
This is compounded by lower-wattage PSUs being more limited with regard to features.
If you end up with a 650 W PSU and your system idles at 80 W for the bulk of a working day, you spend long stretches in that less efficient window.
We need to see some quality 300 W to 600 W designs come back onto the market.
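To put a rough number on the idle scenario above; the efficiency figures here are illustrative assumptions (light-load efficiency varies a lot by model), not measurements of any specific PSU:

```python
# Hypothetical numbers: a system drawing 80 W DC at idle, through a PSU
# whose efficiency at that light load is assumed to be ~75%, versus ~88%
# if the same load sat in the PSU's 20-80% sweet spot.
idle_dc_w = 80
eff_light_load = 0.75   # assumed efficiency at <20% load
eff_sweet_spot = 0.88   # assumed efficiency in the 20-80% range

wall_light = idle_dc_w / eff_light_load   # watts drawn from the wall
wall_sweet = idle_dc_w / eff_sweet_spot
extra_heat_w = wall_light - wall_sweet

print(round(extra_heat_w, 1))  # -> 15.8 W extra, dumped as heat, all day
```

Fifteen-odd watts isn't dramatic, but over thousands of idle hours a year it adds up, which is the argument for right-sizing the PSU.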
The power usage wouldn’t be a problem if the electricity were generated in a green way.
If only the energy sector had a workforce experienced in building offshore structures that could build offshore wind farms. And maybe a workforce that had experience in drilling that could develop geothermal energy.
And of course an energy sector that had a lot of financial resources to put into these kinds of investments.
If only the energy sector had these kinds of resources, a big sphere drawing a lot of electricity wouldn’t be a problem.
I don’t know what they need so many GPUs for. There are 16 displays inside, and the sphere itself has fewer pixels than even one of the internal displays. You could probably run the sphere off a laptop if you aren’t trying to do anything fancy.
Maybe they plan on doing crazy live simulations on it or something. I can’t imagine what kind of displayed image would actually use all 150 of them. Nvidia A6000 cards are damn powerful.
It does seem suspiciously like they picked 150 completely arbitrarily to make the project sound impressive, when they could have easily done it with 20. I’m sure a bunch of people in the middle made a bunch of money off that transaction too. Or, like you said, maybe this is Nvidia doing some guerrilla marketing.
My job has been running things on GPUs for almost 10 years now. The only things anyone is practically doing on that many GPUs are AI training, massive scientific simulations, or crypto mining. One or two of them is enough to run something like ChatGPT.
Real-time graphics, it turns out, don’t scale well across multiple GPUs; there’s a reason SLI has gone away for consumer cards. At the current ratio, each of those $3,000+ GPUs is only driving about 8,000 pixels (assuming each LED puck is used as one pixel, given their size). It makes no sense other than bragging rights.
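The ~8,000-pixels-per-GPU figure follows from the exterior's puck count; a quick check, assuming roughly 1.2 million LED pucks (a commonly cited figure for the exterior) split evenly across 150 cards:

```python
# Assumed: ~1.2 million exterior LED pucks, each used as one pixel,
# shared evenly across the reported 150 GPUs.
pucks = 1_200_000
gpus = 150

print(pucks // gpus)  # -> 8000 pixels per card
```

For scale, a single modern card comfortably drives an 8.3-million-pixel 4K display, roughly a thousand times that per-card load.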
That article stutters so much and makes my (relatively high-end) laptop’s fans scream so hard you’d think the website was designed for that kind of hardware.
Vegas is almost entirely powered by the Hoover Dam. It’s already pretty green as far as energy goes. The question will be where they get their power in a few years when Lake Mead dries up.
That’s not true. The Hoover Dam contributes to Vegas’s power supply, but it’s nowhere near “almost entirely powered” by the dam, except in Fallout: New Vegas.
Fallout: New Vegas is powered by my ever dwindling sanity. I am currently trying to get my mods to play nice.
Also, it’s implied in-game that only the Strip is powered by the damn dam and that Freeside and West Vegas get either limited or no power, which is why directing the electricity from Helios One to the area is such a big deal.
In addition to the other thing, dams have a dramatic and disastrous impact on the ecology in the immediate area and the entire riparian system they connect to. It’s “green” in terms of emissions but they’re still harmful and we should be phasing them out for lower impact alternatives as much as possible.
There’s no such thing as “ASAP” for nuclear power. If you had the permits signed off today, it would take 10 years before a single GWh of new nuclear energy goes to the grid.
Instead, maybe we shouldn’t build giant spherical advertising displays?
It’s one of the smaller atrocities in Vegas, particularly when compared to the Bellagio Fountain or the food waste generated by all those casino dining halls.
Apples to oranges, dude; this is pure spectacle that wears off after five minutes. Plus any data gained from it came out of the lab where they prototyped it, in Burbank I believe. This ain’t really a sign of progress, and it’ll be funny to see what happens to it when it inevitably breaks.