Holodecks are a terrifying technology.
Imagine your friends beaming you into a running program while you're asleep.
When something out of the ordinary happens, everybody's first reaction would be: "Computer, stop program!"
I have to believe an experienced holodeck user would be able to detect some of the telltale signs pretty easily. Like replicated food: if you see it enough, you'd probably notice "holodeck vase #5" showing up scattered around the background of scenes as clutter. Or even minor visual distortions where it switches from 3D to the false horizon.
I'm sorry to disappoint you, but I randomly rewatched 'Ship in a Bottle', and even Data only recognizes they're still in a simulation because of a blank transporter log. No visual clues, no glitch in the matrix, just an empty log that could have been empty for any other reason.
It's more convenient for me to put a frozen ready meal in the oven for 30 minutes than it is to make dinner myself, even though making dinner might take less than 30 minutes.
But isn't that just the time preference of whatever you're giving up? Making dinner yourself demands your attention the whole time, whereas while food someone else made is "cooking", you're free to do something else.
And that can be about what you enjoy. I like making my computer do shit. Others like fixing engines or playing video games; they're all different ways to scratch that same itch.
And technical expertise, and the ability to use a computer without accessibility aids, and the notion of what a "format" is so that they can open their kids' Halloween homework assignment without the formatting being completely broken, and the ability to solve computer problems on their own without calling Geek Squad or visiting a Genius Bar…
Yeh, totally works. I mean, I'm sitting at my desk at work shaking, and I can hardly read the screen as random words go variously in and out of focus. Despite being hair-trigger alert, I'm also exhausted and on the verge of falling face down unconscious on my keyboard, and I have to read every email 10 times over before actually understanding it, and then somehow still respond in a way that doesn't quite make total sense. But technically, I'm awake and I'm physically here, and nobody can say otherwise.
Oh god. That was me last week. Waking up to “think about things” at 3am, staying awake until six and waking up at eight. I have no tolerance for days like this, I felt like absolute shit.
"Plutocracy" is the term for "Financial Oligarchy" BTW. Worth knowing the term if you live in the U.S. since that's kinda what we have here these days :/
Reminder for everyone that when there are efforts to change the system and have employers pay higher wages instead, the majority of workers are vehemently against it.
You’ll see people in this thread telling you that it’s not the workers’ fault, and that taking it out on the workers by not tipping is not fair, as if they’re victims of the system.
Most pressure to maintain the system (or add tips to new industries) comes from the workers, and I feel that not tipping is entirely appropriate if you want it to change.
When the workers themselves start clamoring for raising wages and getting rid of tipping culture, I will empathize with them more.
People don’t want to constantly pay more fees in the form of “voluntary” tips that are supposed to be a courtesy based on service quality, not a tax and payroll dodge for employees and employers who obviously have no incentive to report cash income like this. And now even more people are jumping on the tip bandwagon, and on top of that they calculate the tip on the total including tax. I’m not giving the government a tip, too. Tips are becoming compulsory in the eyes of far too many service industry employees.
It's far easier for them to shit on customers than it is to assume any of the risks of fighting employers and the established system for real wages: leeching off the hard-earned wages of customers rather than doing the hard work of fighting for a real wage.
We've had general-purpose computers for decades, but every year the hardware requirements for general-purpose operating systems keep increasing. I personally don't think there has been a massive spike in productivity using a computer between when PCs usually had 256–512 MB and now, when you need at least 8 GB to have a decent experience. What has changed is that protocol specs have grown into a bloated mess, joined by poorly optimised programs and bad design decisions.
I personally don't think there has been a massive spike in productivity using a computer between when PCs usually had 256–512 MB to now
For general use/day-to-day stuff like web browsing, sure, I agree, but what about things like productivity and content creation? Imagine throwing a 4K video at a machine with 512 MiB of RAM: it would probably have trouble even playing it, let alone editing/processing it.
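A quick back-of-envelope sketch (my own numbers, not from the thread, assuming uncompressed RGBA frames) shows why 512 MiB struggles with 4K:

```python
# Memory needed for uncompressed 4K video frames.
# Assumes 4 bytes per pixel (RGBA); real decoders often use YUV 4:2:0
# (~1.5 bytes/pixel), but the order of magnitude is the same.
width, height, bytes_per_pixel = 3840, 2160, 4
frame_bytes = width * height * bytes_per_pixel
frame_mib = frame_bytes / 2**20

print(f"one frame: {frame_mib:.1f} MiB")                 # ~31.6 MiB
print(f"512 MiB holds ~{int(512 // frame_mib)} frames")  # ~16 frames
```

At 30 fps that's barely half a second of decoded footage, before counting the OS, the player, or any editing buffers.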
Video production is something you can do on a general-purpose computer because it runs a flexible OS that allows for a wide range of use cases, as opposed to a purpose-built embedded system that only performs the tasks for which it was designed; hence, not general purpose. I believe that was their point, not just a computer for office work or whatever.
Video production is general purpose computing just like opening a web browser to look at pictures of cats is - it’s just that the former is way more resource intensive; it is done in software that runs on an OS that can run a dozen other things which in turn runs on a CPU that can usually run other OSes - as opposed to a purpose built system meant to do very specific things with software often written specifically for it.
We've had video editing software available on most personal computers since at least 1999 with iMovie and 2000 with Windows Movie Maker. IMO this is all general computer users need.
Professional-level video production is not general computing; it's very niche. Yes, it's nice that more people have access to this level of software, but is it responsible?
The post does raise some real issues: increasing hardware specs are not consequence-free. Rapidly increasing hardware requirements have meant most consumers have needed to upgrade their machines, plenty of which could still be in operation to this day. There is a long trail of e-waste behind us that is morally reprehensible.
You don’t need to be a “professional” to edit 4k videos at home, people do that every day with videos they took on their effing phone.
And that's the point. What people do with their computers today requires far more resources than it did in the late 90s. I'm sorry, but it's completely idiotic to believe that most people could get by with 256–512 MB of RAM.
"Morally reprehensible"? Give me a break. You simply don't know what you're talking about, so just stop.
My point is not that we should all go back to using old hardware right now, with the current way we use our tech, because that is impossible.
My point is that the way we look at technology is wrong, as is the way we upgrade without real reason. The average person does not need a 4K camera; it does not make them a better photographer. I've used digital cameras with sub-15 MP sensors, and the photos generally sufficed for family/holiday snaps and even professional photography. Yet there will be people who have thrown out phones because they unnecessarily want the latest camera tech. Wait till people want 8K recording.
That perfectly working phone that was thrown out is an example of the e-waste I was talking about. Producing computers is not without societal and environmental cost, and to throw away perfectly serviceable machines is morally reprehensible. Current culture would agree with me that it's not sustainable, but most people aren't ready to keep their device for 5+ years.
Everyone should keep their current devices as long as possible (until the device breaks or can no longer run work-related software) to push back on upgrade culture. You can shoot 4K now? That's great! Keep the device even if the latest model supports 8K video. The same applies to other hardware/software features.
Somewhat agree. Manufacturers releasing successive models at less than a year's interval is ridiculous, and buying each new one even more so, but on the other hand, using the same phone for 5–6 years just because you can is also a bit drastic (even if you swap the battery midway through, by the time the second one's dead the phone will be obsolete). It's a bit more doable with computers, especially given that you can upgrade one component at a time. 2–3 years seems doable for a phone.
I mean, it's not that crazy; I'm writing this on a Moto Z2 Play. It was released in June 2017, so it's not long till year six, but I hope it goes longer. It's perfectly usable, runs most apps fine, and can even run TFT.
Phones haven't changed that much recently. This model has a great screen, 4 GB of RAM (more than some laptops that are still being released!), and a decent chip. The only issue is the battery is under 3000 mAh, but I know of a few models from around the same time that went up to 5000 mAh.
You do get better mileage running an OS like LineageOS and being de-Googled, since a lot of their tracking processes kill the battery and slow things down.
My landlords: two brothers who inherited a bunch of properties from Daddy. One of them lives in Scottsdale and the other in Hawaii. It really gets my goat knowing that 1 out of every 3 dollars I make goes to some overprivileged daddy's removed boy. I probably pay their golf membership or marina docking fees.
What’s the problem? If you don’t buy a house you need to rent one, houses aren’t free. Yeah those owners never worked for it, but isn’t that the case with every rich kid? Why don’t you buy a crappy house you can fix up yourself?
Most people living paycheck to paycheck don't qualify for a 20-year loan for land and a home, even in the case of crappy manufactured homes. Plus, if they ever defaulted, they would lose the home and probably quite a bit of the equity as well, depending on local laws.
Opals are the superior stone, and they actually look awesome. Transparent glass-like stones are so boring. They are also much cheaper and not harvested with child labor.
My wife’s is made of a purple sapphire as the main stone with a small diamond on each side. She loves the purple. The diamonds I didn’t pay for, they were her grandmother’s that I got from her sister.
Rings, like engagement/wedding rings, can take quite a beating. You need a hard stone or it won’t really last very long.
I have a lab ruby in my engagement ring with lab diamonds around it. The lab ruby is a good alternative because it's a hard stone! Sapphire is just as hard (it's the same mineral as ruby), and alexandrite is nearly as hard; both could be good stones in a ring you'd wear every day.
Hardness absolutely matters in rings. Not as much in pendants or earrings, but people don’t realize how rough they are with their hands. Most people do not take their rings off to wash their hands, or do their laundry, or, or, or. So many things have unexpected abrasives that may just feel a little rough on your skin, but can significantly damage a soft stone like opal. In a rush and accidentally bang your hand against the door frame? Chipped opal. Back of your hand itches, so you rub it against your jeans briefly? Scratched opal. They’re very fragile stones.
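To put rough numbers on this (approximate Mohs hardness values from memory, not from the thread), here's a small sketch. The key threshold is quartz at 7, since quartz grit is in ordinary household dust: anything softer gets scratched by everyday wear.

```python
# Approximate Mohs hardness values; opal actually spans ~5.5-6.5
# depending on water content, so 6 is a rough midpoint.
mohs = {
    "diamond": 10,
    "ruby/sapphire (corundum)": 9,
    "alexandrite (chrysoberyl)": 8.5,
    "quartz (household dust/grit)": 7,
    "opal": 6,
}

QUARTZ = mohs["quartz (household dust/grit)"]
for stone, hardness in sorted(mohs.items(), key=lambda kv: -kv[1]):
    verdict = "OK for daily-wear rings" if hardness > QUARTZ else "needs care"
    print(f"{stone:30s} {hardness:>4}  {verdict}")
```

Which is why opal works fine in pendants and earrings but takes a beating in a ring.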