I agree. But also add in the movie industry, which has been complete trash for a while now. Not to mention books. I’m not sure we’ll ever see another Harry Potter level book again, at least in our lifetimes.
My take is we’ve already left the golden ages of movies, music, and books and probably won’t get another for an extremely long time.
Video games are going through the same downfall that streaming services brought to movies. Physical media left the movie scene as a standard a while ago, but video games took longer. Now it’s going to be all streaming and subscriptions, where you can never own anything.
Once that happens, enshittification will peak, companies won’t be incentivized to make the games good anymore, standards tank, and people will forget how good things once were.
If your blood plasma helps save somebody’s life, either directly as an infusion or indirectly in research, that’s not a scam. The monetary reward is compensation for time and an incentive to try to meet demand. The donation is free, but the time and energy required to make the donation are an expense. That’s what the compensation covers. It’s only a scam if your donation goes to feed a literal or wannabe vampire or their bathing fetish.
Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.
Nvidia is just playing it conservative because it was massively overvalued by the market. The GPU’s use for AI is a stopgap until hardware can be developed from scratch, and the real life cycle of hardware is about 10 years from initial idea to first consumer availability.

The issue holding the CPU back in AI is quite simple: memory bandwidth. It will be solved in a future iteration, and that means the GPU will get relegated back to graphics, or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating point precision. That experiment failed, and it proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task.

The CPU must be restructured around a wider-bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this will probably come with more threading parallelism, and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.
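To put rough numbers behind the bandwidth argument, here is a quick roofline-style sketch. Everything below is an illustrative assumption (the 1 TFLOP/s and 100 GB/s figures are made up, not specs for any real chip); the point is just that small workloads starve the ALU on memory traffic while large ones don’t.

```python
# Roofline-style estimate: is an n x n matrix multiply limited by the
# math units or by memory traffic? All hardware numbers are hypothetical.

def arithmetic_intensity(n):
    """FLOPs per byte for an n x n x n fp32 matmul, assuming ideal reuse."""
    flops = 2 * n ** 3               # one multiply + one add per (i, j, k)
    bytes_moved = 3 * n * n * 4      # read A, read B, write C once, 4 B each
    return flops / bytes_moved

def bound_by(n, peak_flops, mem_bw):
    """Name the limiting resource for this problem size on this chip."""
    ridge = peak_flops / mem_bw      # intensity where compute and memory balance
    return "compute" if arithmetic_intensity(n) >= ridge else "memory"

# Hypothetical CPU-like figures: 1 TFLOP/s peak, 100 GB/s memory bandwidth.
print(bound_by(16, 1e12, 100e9))     # tiny tiles: bound by "memory"
print(bound_by(4096, 1e12, 100e9))   # big tiles amortize traffic: "compute"
```

The ridge point here is 10 FLOPs/byte, so a 16-wide tile (about 2.7 FLOPs/byte) is memory-bound, which is exactly the cache-bandwidth bottleneck the comment describes.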
Human persistence of vision can’t keep up with those higher refresh rates, which are ultimately just marketing. The hardware will likely never support this stuff because no billionaire is putting up the funding to back the marketing with tangible hardware investments. … IMO.
Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises on ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don’t care what everyone else does. I am not for sale, and I will not sell myself for anyone’s legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.
Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.
I’m a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just become a shell over a Unix kernel, and that never went anywhere.
AI still needs a lot of parallelism but has loose latency requirements. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.
Multithreading is parallelism too, and it is poised to scale by a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Massive chunks of silicon lying around unused are a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the load. The AVX-512 instruction set can load a 512-bit-wide word in a single instruction; the problem is just getting those words in and out in larger volume.
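A quick sanity check on that claim. Assuming one AVX-512 FMA issue per cycle and a core that can sustain two 64-byte L1 loads per cycle (a plausible order of magnitude, not a spec for any particular core), operand demand and L1 supply are already dead even, so any hiccup in L2-to-L1 refill stalls the math:

```python
# Back-of-the-envelope: bytes per cycle needed to feed one AVX-512 FMA
# unit on streaming data, versus an assumed L1 load bandwidth.

VECTOR_BITS = 512
BYTES_PER_VECTOR = VECTOR_BITS // 8            # 64 bytes, one cache line

# A fused multiply-add consumes two fresh source vectors per issue
# when data is streamed with no register reuse.
demand_bytes_per_cycle = 2 * BYTES_PER_VECTOR  # 128 B/cycle of operands

# Assumption: the core sustains two 64-byte loads per cycle from L1.
supply_bytes_per_cycle = 2 * 64                # 128 B/cycle available

print(demand_bytes_per_cycle, supply_bytes_per_cycle)
# Demand equals supply with zero slack, so throughput is gated entirely
# by how fast L2 can keep L1 filled -- the bus-width bottleneck above.
```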
I speculate that the only reason this has not been done already is pretty much the marketability of single-thread speeds. Present thread speeds are insane and well into the radio realm of black magic and bearded-nude-virgin wizardry. I don’t think it is possible to make these bus widths wider and maintain the thread speeds, because there are too many LCR consequences. I mean, at around 5 GHz the idea of wires as connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across small gaps.
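The coupling point is easy to put numbers on. A parasitic capacitor’s impedance is |Z| = 1 / (2πfC), so a gap that is effectively an open circuit at low frequency becomes a real current path at 5 GHz. The 10 fF figure below is an assumed, illustrative parasitic, not a measured value:

```python
import math

def capacitive_reactance(freq_hz, cap_farads):
    """Impedance magnitude of a parasitic capacitor: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * freq_hz * cap_farads)

# Hypothetical 10 fF of coupling between two adjacent on-die traces.
z_low  = capacitive_reactance(1e6, 10e-15)  # ~16 Mohm at 1 MHz: open circuit
z_5ghz = capacitive_reactance(5e9, 10e-15)  # ~3.2 kohm at 5 GHz: a real path

print(round(z_low), round(z_5ghz))
```

Kilohm-scale coupling is comparable to on-chip signal impedances, which is why “gaps as insulators” stops being a safe assumption at these clock rates.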
Personally, I think this is a problem that will take a whole new architectural solution. It is anyone’s game, unlike any other time since the late 1970s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently in the age of the 20+ thread CPU. If a redesign can deliver a 50-500 logical core CPU that is slower in single-thread speed but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.
I've kind of been needing a "gaming reset" since lately I've just been dabbling here and there without really committing to anything, so I decided to check in on my current save in Sims 3. I forgot how addicting this game can be lol. I'll probably play this for a few days and then finish my second playthrough of Lies of P that's been sitting for weeks now.
@Mr_No_Swearing This post was reported for fatphobia. I understand that you are using it as an example but we do want to avoid attacks on specific people or groups of people. Like you showed in your later posts, people swear at all sorts of things, like the weather, stubbing their toe, and missing the bus.
Do you mind changing the example in your original post? You can say the example was changed at my request, especially if you think changing it would make the existing comments confusing.
Otherwise, I like your post. It’s an interesting thing to think through and while it seems to be a little unpopular, my thoughts in the shower are often not fully thought through and would stand up poorly to the scrutiny of internet strangers.
Nobody is “my candidate.” I don’t have an agenda. You’ve made up a motivation for me that has nothing to do with what I posted or why. This is just a funny picture.