Maintaining social order, especially in the form of violent repression against demonstrations, indirectly protects the rich’s properties, so all in a day’s work.
There’s plenty of cases where they don’t look for cars either.
Or the cops just straight up steal the car themselves.
My wife’s car was ordered to be towed by, according to the impound lot, the police.
Neat thing was that there was no ticket with the car, no police station within 3 miles had a record of a ticket for her or the car, and the area she had parked had no signs that suggested it was illegal to park where she did, nor does the city have any ordinance about overnight parking.
Best we can figure, a cop or the tow company that works with the city just decided to tow a car for funsies and the 500 bucks it took to get it out of impound.
The police and every organization associated with them are corrupt to the core.
Reading that I almost had a thought like it must have been a mix-up or something, but no, US police will murder people with less thought, so that type of fuckery is completely expected.
Love bikeindex, I actually got my stolen bike back thanks to that site. It was literally two years later, but still, the police probably wouldn’t even have made a report in the city I was in, with bike theft so ubiquitous.
Given the number of times I’ve seen cops on police forums and r/protectandserve use terms like “bikefags”, I think it’s just the typical cop disgust of anything they perceive to be weak or effeminate.
Yeah, I don’t get that. Bicycling requires strength and endurance. It exposes you to the elements. Why is sitting in a cushy car something some people think of as more macho? Is it that you’re in control of a heavier and more powerful machine?
So does cleaning a house, but that’s “women’s work”.
Is it that you’re in control of a heavier and more powerful machine?
That’s it. You didn’t get it at first because you made the mistake of associating manliness with things like patience, strength, hard work, endurance both of toil and hardship; all things that do make up ideals of manliness to normal people. But you need to approach it from the perspective of a wastrel, a weak, foolish, and lazy person who demands the respect and deference of being manly without putting in the hard work—something he has avoided all his life. He might praise hard work in the abstract, but he has no discipline for it and doesn’t respect those who actually do it; he just considers them beneath him. To such a person, the defining aspect of manliness and machismo is mastery, mastery over others and their wills, and since mastery through work is a waste of time to him, he turns to shortcuts.
From there, it’s not hard to see where the thought process goes. Since strength is to him based on control and mastery, he picks something that gives him more command over the road in a direct and in-your-face way. The man who drives a lifted Ram 2500 can confront you by running you the fuck over. By contrast, in his opinion, cyclists are entitled jackasses in minuscule booty shorts who can only confront you on the road by screaming “CRITICAL MASS! FUCKING CAGER!” and throwing sparkplugs at your windows. The difference in power dynamic is proof enough to our friend of who the “real man” is.
To take the mentality to its conclusion, the easiest way to gain mastery in general is through authority, and the easiest way to get that, even easier than joining a gang, is by becoming a cop.
I’m pretty sure any petty theft is very hard to track down. Not just bikes; if someone broke into your house and stole some minor things, it’s almost certainly not gonna get found. Bikes are the same: it’s very easy to resell and repaint them, and nobody registers bikes.
Because even if they look for it and find it, whoever is riding it just says it’s theirs, and there is literally nothing the police can do unless it was caught on video or there is a meaningful identifying feature like a serial number or something else specific and unique.
Seeing a sketchy guy with a black and red bike with the same bike rack you had isn’t enough to prove anything.
If an officer approached me riding my bike around and asked me to prove it’s mine, I couldn’t either despite not being a thief.
Anything that’s not serialized and recorded is basically impossible to find. If you have serial numbers then they can inform local pawn shops, but even then the shops probably aren’t checking serials for anything under $500.
And if the thief just sells it on craigslist then no one is checking serials.
When you’re entering the third act of your love story and you have to get to the church in time to break up the wedding and declare your love, what’s a little bike theft? The universe will take care of it.
It probably depends a lot on where you live. My wife’s bike got stolen and she was woken up by police coming to check on it (one of the maintenance guys at our apartment noticed a man at 7-Eleven riding it and recognized it; he came running back to check if it was indeed missing and called the police). We fully expected the police would do nothing about it (it was the cheapest Walmart bike), but an hour later they called to say they’d found the bike and had the culprit in custody. It did help that the bike was a girly mint green with a wicker basket, so they instantly recognized it when they saw it.
Then again, in San Francisco, when my wife got her car window smashed and wallet stolen (she was late for class and dropped her wallet under the car seat, didn’t stop to take it; but it wasn’t the wallet that caught the thieves’ attention, it was the breast pump bag that looked like a laptop bag; they threw it on the floor when they saw what it was), we never heard anything back from the police.
Fun fact: cops on average have lower IQs and often fail literacy tests. Furthermore, it appears that critical thinking is discouraged on the job, with candidates who lack critical thinking abilities being selected over those who have them.
Calling 25 years a quarter century is a fun way to make it seem way longer. Precedent is kind of important in law, so a precedent that says they can discriminate on the basis of IQ is relevant until it is overturned. Do you have any articles about a ruling overturning that precedent?
I think it’s more nefarious than that. Many departments want a good ol’ boys club where they’re the ultimate authority, and they want their officers to fall in line rather than question department actions.
I’ve bounced off GitHub more than once trying to figure out how to download the .exe file that I assumed must be somewhere. Honestly I still don’t understand the interface, and I’ve submitted bug reports for Jerboa on there. I might have even used GitHub for a project once? Every time I look at it, it’s overwhelming and confusing and none of it is self-explanatory. But that’s fairly true for a lot of stuff in programming.
If there is an exe, it’s under the releases link. On desktop it’s on the right sidebar below “About”. On mobile it’s at the bottom after the readme blurb.
It’s not obvious because the code is the main focus and GitHub would much rather people host their releases somewhere else.
And even if releases are hosted on github, there should ideally be a download links page somewhere that presents the different binaries or installation files in an easier to understand format, especially if the software is designed for non-developers.
I'd agree, but the caveat is that github is primarily about an interface for source control and collaboration between developers for projects. The release page is really just an also-ran in terms of importance.
Imo they aren’t even trying, because it’s not that hard to make it better. Doesn’t even have to be a compromise. Most people just need a visible download button for the programs, that’s all.
Or make a shortcut/link in the readme to the newest release for the most popular OSes.
A decent release page tends to contain all kinds of files for different OS, so ‘regular’ people who just want the .deb or .exe would likely become confused regardless.
There is, it’s literally right there on the home page of the project. You can either copy a URL and download it by cloning the git repo, or you can download the whole project as a zip file. Then you just have to compile it!
That’s not a download button for the program. But there is indeed a link to the release page right on the home page of the project, so you’re still correct.
not only the ux, some devs make it absurdly confusing to find a binary.
I don’t want to throw anyone under the bus, but there’s this one niche app.
their github releases at one point were YEARS out of date, they only linked to the current version in seemingly random issue reports’ comments. And the current versions were some daily build artefacts you could find in a navigation tree many clicks deep in some unrelated website. And you’d better be savvy enough to download a successfully built artefact too. And even then the downloaded .zip contained all kinds of fluff unnecessary for using the app.
The app worked fine, sure, but actually obtaining it was fairly tricky, tbh.
absolutely, but they were in general (IIRC) suggesting them for the main downloads, but just not telling anyone outside the comments, which was the weird part
It’s not black and white. I actually liked a few things better about Bitbucket’s UI. It’s been too long to remember specifics, though I think it was concerning PRs and diffs. I still think GitHub’s review UI is too complicated. It took me literally years to fully understand it.
I’m not so sure. I seem to be able to find my way around a GitLab project in much fewer moves than a GitHub project. But maybe I’m biased because I use it all the time at work. I know they change the sidebar a lot, though.
Comparing bad to bad doesn’t make any of them better lol
I’ve gone nuts trying to download a single file from the git website on my first interactions with it (because somehow adding a download file button when you’re viewing a file on the site is just too much to handle)
It doesn’t have to be a compromise imo. Most people just need a visible download button on the front pages. Wouldn’t hurt devs at all. I mean, even devs sometimes struggle with this lol.
Do MOST people who use GitHub download .exes? In my experience the VAST majority of people are using it for source and version control, not external releases. The overwhelming majority. FOSS and OSS is a small portion of the overall GitHub user base compared to, say, enterprise companies.
The github project page is for developers, and Github already gives you tons of ways to make a user website. Don’t ask your users to visit github.com/group/project, make them visit group.github.io/project, like any sane person.
Same with Gitlab, BTW.
And if you don’t like the full static site, use the wiki, or guide your users in the first paragraphs of the README so they find the user information if they must.
We’re talking about how to design one of the biggest platforms on the internet. Of course there is a compromise. No one is advocating for removing the button, but arguing that the UI is somehow deficient for people wanting to download binaries is really missing the purpose of GitHub.
Literally everyone? I’ve been a software engineer for ten years. My company doesn’t use it, and no company I’ve worked for has. I guess they are not part of “literally everyone?”
Explain to me how GitHub working on one product feature (releases) has no impact on how much they can work on others. Apparently in your rich enterprise software career you’ve found that resources and time are limitless? Or maybe you think it’s trivial for a platform like GitHub to change their UI.
This smacks of a lot of junior software engineers I’ve worked with who think problems are simple and solutions are easy because they’ve never actually DONE anything. I get that you’re very convinced that this is easy and costless, but it’s pretty clear to me you have no idea what you’re talking about.
Again. I’ve said before that release downloads are an additional feature. But it’s a feature most people use. I didn’t say it was easy, nor that it was cheap. Just that it makes sense and that it doesn’t take anything away from the professionals regarding UI quality or focus.
No, what you mean is YOU use it and you’re assuming most people use GitHub the way you do. GitHub is first and foremost a platform for GIT. Git has nothing to do with releases or file downloads per se. Time spent improving the releases UI is time not spent doing other UI improvements. If you need more proof that it’s not worth it to spend time on the release UI, just take note of the fact that GitHub is not spending time on the release UI. If everyone was using it and it was deficient, do you really think that would be the case?
It makes sense from a pure UX perspective. But of course the real goal of GitHub is to make money, and their paying customers are mostly corporate entities using it for enterprise development. Unless those companies decide that a download button/better release feature is desirable, it’s not likely to happen.
Most corporations tie GitHub into their own build system so such a feature isn’t likely to be considered useful. They pay for GitHub to reduce development costs, which is why GitHub spends so much effort on analytics and the dev experience instead of open source/public users.
Thanks for understanding what I was getting at and your well written ‘realistic’ addition to it. There’s not much I can add besides saying you’re absolutely right.
Why would your company use that? Did they use GitHub for public applications targeted at non-technical users? Because that’s what that page is for, and what a huge chunk of GitHub users do.
I use it both ways. As a software engineer I use it for various packages, which don’t even need a releases page. But also as an end-user of open source software, I use it to download pre-built binaries of said software. Idk if you know, but there’s a lot of open-source software out there. And github is the most popular platform for hosting it. And when I say software, I mean the kind where you don’t expect your users to know how to build it from code themselves.
If somebody doesn’t have an idea of what they’re talking about (allegedly) then it would be far more productive to explain it than to keep arguing about it without actually solving anything.
So when you just needed software to run on your machine, you built it yourself. But first read every single line of code to ensure that it’s safe. Did I get that right?
Because if you don’t trust the developer to provide safe binaries then you wouldn’t trust the same developer to provide safe code either.
No, you shouldn’t really be downloading exes from GitHub. It is widely used to spread malware and to pretend that software is open source when it is not. At least look for a link to a store page (including the Microsoft Store), a distro-specific package, or build instructions. Those usually have an AV scan or are at least harder to fake.
Yeah a dude I know got hacked by downloading some random github program, the hacker even started taunting him via discord lol.
But I downloaded plenty of shit from github, like prusaslicer, my 3d printer’s firmware and plugins for octoprint. Always stuff that is verified via another page though. Almost never stuff that comes up during a random search, and if I do, I look it up first to see if it’s safe.
But if you want to put a some text and pictures in very specific locations and never worry about them suddenly jumping into random places, Excel is actually better than Word. That’s why people tend to use Excel for all sorts of weird purposes like that. Unlike with Word, things actually stay where you put them.
Yes and there are definitely people who use excel for art. Just like there are people who use GitHub for its releases page. It’s just not the primary use of either program.
I’ve seen some of the impressive pixel artworks people have made in Excel. However, I prefer to do Excel art by writing a bunch of wild functions and drawing a stacked line chart from the resulting data. The graph itself is the artwork, while the cells behind it are just a necessary part of the process.
Worst part is that this used to be a separate tab in the repo navigation. I still cannot conceive of a reason why they would move it from there to some random heading in the middle of the screen, except maybe so they can sell more GitHub trainings.
If you use it as a developer you don’t care about the releases page. You want to see the code and for latest version you just need the git tags. But I’ve also used it for stuff I just needed to run on my machine as an end-user. And for those you turn to the Releases page. That’s where pre-built binaries go.
But it also depends on the target audience. Some projects, even if meant more as software to run than code to import, still target mainly developers or tech users in general and will not have more than just instructions on how to build them. Others, say a Minecraft launcher, or some console emulator, will target a wider audience and provide a good Releases page with binaries for multiple platforms.
After downloading code from GitHub for years I can still take over a minute finding the file I want to download at times. Now that’s not long, but it’s why I’m there 90% of the time.
Honestly, releases and the readme could be the first page on their own, you can push the code to another tab as long as the clone button is there. There’s at most a 5% chance I’m just gonna raw dog the code straight from the browser anyways.
Elon Musk loves to speak confidently about shit he knows nothing about. This leads to him being a confident speaker on every topic… I just wish we could figure out a way to shut him up.
Oh. This post’s image has him talking types in January and the “obligatory” image above has someone saying he’s been talking software in December, so I thought maybe Musk has been spewing about software for a few weeks or something.
December from '22 not '23. The image was from a few months after he took over twitter and was still going on about that stuff and how it was doing all these useless things that needed to be removed or rewritten. I just remembered another one about how he was going on about a single request to twitter causing thousands of RPCs or something? I think that’s not really unheard of in a microservices infrastructure and it’s not like they’d be synchronous. There’s probably tons of calls that go to things like tracking, analytics, or cross DC sharing I would imagine for such a large and high volume service like twitter.
When he took over twitter there was a bunch of stuff he was spouting about things like Twitter’s stack needing a full rewrite and such. Going so far as to fire the engineer that challenged him on it during a live spaces thing if I recall correctly.
One example that stuck with me is that he said some shit along the lines of 80% of Twitter’s microservices being superfluous and he’ll be shutting them off.
Yes, the dev teams just spent 4/5 of their time building shit no one asked for. It just annoys me so much, because anyone with basic reasoning should be able to work out that this cannot possibly be the case, but it’s easy to give it the benefit of the doubt.
Well, except that many, many Twitter outages followed.
Yeah. As a software dev, it was pretty awkward explaining this to colleagues who rely on Twitter/X.
“It sounds like you think Twitter is a software company and that Elon is utterly unqualified to run a software company. That can’t possibly be true, right?”
I’ve heard horror stories on the programming subreddits of incompetent managers that require their employees to write X new lines of code per week. Those code bases probably could have huge chunks taken off.
He also seems to have the idea that the best developer is the one who produces the most code. That shows a pretty major lack of understanding of how software development works. Sometimes the best day is when you produce negative amounts of code.
The rockets are fine. SpaceX has a team specifically tasked with distracting Musk and keeping him away from the actual work on the rockets. Tesla didn't have that though. That's how we ended up with that lame presentation with the weird "S3XY" acronym. That was really the point where I realized that he was just an idiot frat boy with too much money. He really is his own worst enemy.
What’s your source on the spacex team distracting him? I can’t find anything supporting that. I do find some interviews from anonymous employees saying it’s calmer now that he’s so focused on twitter.
the thing about spacex is everything they do is because of nasa and government.
the only thing spacex has going for it is the fact that they can spend a billion dollars exploding a rocket five times before it slightly works the sixth whereas the government can’t do that.
As someone who does know about this field and absolutely despises Musk, that’s not quite true. SpaceX is very successful thanks to help from the US government, and despite the influence of Musk, but also because they are a team of very competent people who have actually innovated and pushed the boundaries of launch vehicles. To say they have nothing going for them and are being propped up by the government is not at all accurate, and they have been much more successful than traditional government contractors.
To say they have nothing going for them and are being propped up by the government is not at all accurate
That isn’t what they’re saying though, is it? They’re saying that SpaceX has the ability to fail more than NASA, because they’re not a government organization funded solely by taxes.
Admittedly I think the biggest failures that hurt NASA were incidents when people, not rockets, blew up. It’ll be interesting to see if things change if/when there is a death from a SpaceX rocket.
People die in work related incidents all the time. The only thing different about deaths from NASA incidents is that they are (usually) spectacular incidents (like massive explosions or cabin fires…not good things, just stunning) and high-profile.
SpaceX does well because they basically ignore Elon.
I was formulating an angry rebuttal in my head, then saw your comment and realised I hadn’t noticed the username. Of course it’s Musk. That’s rebuttal enough.
Because they rest safe in the knowledge that you rarely if ever get taken to court for it. There are millions of web pages, it needs people to take action to do something about it, and just clicking “Yes all of them” to access the content you were just trying to get to is a far better solution in most situations than hiring a lawyer and investing a few years of legal proceedings, nevermind the money.
There is an organization called noyb (I think) pushing back against that and going through the courts to have more sites penalized for their violations. The process is slow, but I see more and more pages adopting the required “reject all”, so there seems to be some pressure on them.
IIRC the EU also ruled that burying the rejection options under additional links counts as a violation. Hence why Google now has a Reject button next to the Accept button. Most sites still bury it, though.
Do you know if there is a EU-wide place to report such behavior?
The biggest privately owned TV channel in my country not only does that, but actually just redirects you to a pdf file if you want to “manage cookies”. And it’s not like I can submit a complaint on a national level, as the ruling party’s website uses google analytics without a cookie notice at all.
I think you report to your nation’s Data Protection Centre, each member has their own that takes the reports. If I was still in the EU I would have put more time into finding out how reports work.
Yeah this is very common, I don’t know why other people on here are gaslighting like it doesn’t happen. It’s this way for major sites like YouTube/Twitter/Twitch/etc too. Hell even embedding a YouTube video on a site is violating GDPR. It’s a good idea, but needs a version 2.0 patch to fix some exploits.
I mean almost all websites fall foul of that. You often have to dig deep and end up with a palette of complicated choices and acceptances of individual tracking companies. It’s a bloody mess. The EU should just have mandated “do not track” adherence. There’s already a standard; just enforce it.
I understood the post as those webpages only refusing to load if the user declines cookies. So they do still want to benefit off of those EU users who click “Accept”.
even worse offenders are the ones with tick boxes for “Legitimate Interest”, since legitimate interest is another grounds for processing (just as freely given consent is one); the fact you got a “tick” box for it makes it NOT legitimate interest within the confines of the GDPR.
it also doesn’t matter what technology you use whether its cookies / urls / images / local storage / spy satellites. its solely about how you use the data…
No, only the first one (supposing they haven’t invented the zeroth law, and that they have an adequate definition of human); the other two are to make sure robots are useful and that they don’t have to be repaired or replaced more often than necessary…
The first law is encoded in the second law, you must ignore both for harm to be allowed. Also, because a violation of the first or second laws would likely cause the unit to be deactivated, which violates the 3rd law, it must also be ignored.
They never were intended to. They were specifically designed to torment Powell and Donovan in amusing ways. They intentionally have as many loopholes as possible.
Remove the first law and the only thing preventing a robot from harming a human if it wanted to would be it being ordered not to or it being unable to harm the human without damaging itself. In fact, even if it didn’t want to it could be forced to harm a human if ordered to, or if it was the only way to avoid being damaged (and no one had ordered it not to harm humans or that particular human).
Remove the second or third laws, and the robot, while useless unless it wanted to work and potentially self destructive, still would be unable to cause any harm to a human (provided it knew it was a human and its actions would harm them, and it wasn’t bound by the zeroth law).
That would allow for like, 2 trillion devices? Feels like a bandaid, my dude. Next you’re gonna suggest a giant ice cube in the ocean once a year to stop global warming.
Hurricanes cannot cross the equator. The equator is an imaginary line, and hence has zero mass. We can end every hurricane using zero point zero energy (0.0).
I thought it was pretty clear with me adding 13.37 that I was making a joke, the earlier post spoke about how just adding one octet would still be too few addresses, so I joked about adding one more octet.
You can use a ULA if you want to. That’s essentially the IPv6 equivalent of a private IP.
Why though? Having the same IP for both internal and external solves a bunch of issues. For example, you don’t need to use split horizon DNS any more (which is where a host name has a different IP on your internal network vs on the internet). You just need to ensure your firewalls are set up properly, which you should do anyways.
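If you do want a ULA anyway, here’s a quick sketch using Python’s stdlib `ipaddress` module to show what the range looks like (the fd12:… prefix below is made up purely for illustration):

```python
import ipaddress

# ULAs (RFC 4193) occupy fc00::/7; in practice prefixes are generated
# in the fd00::/8 half, with 40 random bits forming a /48 site prefix.
ula_block = ipaddress.ip_network("fc00::/7")

addr = ipaddress.ip_address("fd12:3456:789a::1")  # made-up ULA
print(addr in ula_block)                                 # True
print(ipaddress.ip_address("2001:db8::1") in ula_block)  # False: global unicast
```

Unlike IPv4 private ranges, the random global ID means two merged networks are unlikely to collide, which is one of the few arguments for ULAs over just using your global prefix everywhere.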
You could follow this logic and add 2 alphanumeric digits before 4 numeric octets. E.g. xf.192.168.1.1
This would at least keep it looking like an IP and not a MAC address. Another advantage would be graceful IPv4 handling with a reserved range starting with “ip”, like ip.10.10.10.1
Oh yeah, great, let’s change the fundamental protocol on which all the networks in the world are based. Now two thirds of the devices in the world crashed because you tried to ping 192.168.0.0.1
Please don’t. Use regex to find something that looks like an IP, then build a real parser. This is madness; it’s extremely hard to read and a mistake is almost impossible to spot. Not to mention that it’s slow.
Just parse [0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3} using regex (for v4) and then have some code check that all the octets are valid (and store the IP as a u32).
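A minimal sketch of that two-step approach in Python (the helper name `parse_ipv4` is just illustrative): the regex only finds IP-shaped strings; the range check and u32 packing happen in plain code where mistakes are easy to spot.

```python
import re

# Regex only matches the *shape* of a dotted quad; it does not validate ranges.
IPV4_RE = re.compile(r"^([0-9]{1,3})\.([0-9]{1,3})\.([0-9]{1,3})\.([0-9]{1,3})$")

def parse_ipv4(s):
    """Return the address packed as a u32, or None if invalid."""
    m = IPV4_RE.match(s)
    if not m:
        return None
    octets = [int(g) for g in m.groups()]
    if any(o > 255 for o in octets):
        return None  # shape was right, but e.g. 999 is not a valid octet
    # Pack the four octets into a single 32-bit integer.
    return (octets[0] << 24) | (octets[1] << 16) | (octets[2] << 8) | octets[3]

print(parse_ipv4("192.168.1.1"))  # 3232235777
print(parse_ipv4("999.1.1.1"))    # None
```

Storing the result as a u32 also gives you canonical comparisons for free, which matters given the alternate-spellings problem discussed below.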
Fuck that, if for whatever reason I’m writing an IP validator by hand I’m disallowing leading zeros. Parsers are very inconsistent, some will parse 010 as 10, others as 0o10 == 8 (you can try that right now with a POSIX ping). Talk about a footgun.
True enough for database or dictionary storage, but a lot of times things get implemented in arrays where you still wind up with two copies of the same uint32.
Because 1.2.3.4 and 1.02.003.04 both map to the same number.
But 10.20.30.40 and 010.020.030.040 map to different numbers. It’s often best to reject IPv4 addresses with leading zeroes to avoid the decimal vs. octal ambiguity.
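To see the ambiguity concretely, here’s a small Python sketch; `check_octet` is a made-up helper showing the reject-leading-zeros policy:

```python
octet = "010"
print(int(octet, 10))  # 10: what a decimal parser yields
print(int(octet, 8))   # 8: what an octal-style (BSD inet_aton) parser yields

def check_octet(s: str) -> bool:
    """Strict octet check that refuses leading zeros entirely."""
    if not s.isdigit():
        return False
    if len(s) > 1 and s.startswith("0"):
        return False  # ambiguous: reject rather than guess decimal vs octal
    return int(s) <= 255

print(check_octet("10"))   # True
print(check_octet("010"))  # False: rejected as ambiguous
```

Rejecting outright means both parsers agree on every address you accept, instead of silently disagreeing on a subset.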
Holy hell yeah you did. How would you go about doing that in a single expression? A bunch of back references to figure out the country? What if that’s not included? Oy.
You wouldn’t. It’s not possible. Which is what I told them.
And why would you want to? Legally, if you change the given address and it fails to get delivered, that is on you, not them.
Some countries have addresses that are literally ‘Last house on the left by the Big Tree. Bumban(Neighborhood). NN (Country)’. Any US Centric validation would fail this but I assure you - mail gets delivered just fine.
The only valid regex is (.+). Maybe add a separate country field (especially because some Americans wholeheartedly believe that the entire world should understand that “foobar, TX” means “foobar, Texas, United States”) (don’t get me started on states whose abbreviations are also ISO country codes).
Unfortunately I guess business people only care about getting fewer support calls for missing shipping details, not correctness or a couple of calls from customers who live in the boonies. Then the proper answer is a form with a bunch of fields… which Americans will inevitably fuck up by making the “State” field mandatory despite most countries not having an equivalent.
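For what it’s worth, the “(.+) plus a country field, no mandatory state” idea sketches out to something like this (field names and rules are purely illustrative, not any real checkout API):

```python
def validate_address(form: dict) -> dict:
    """Return a dict of field errors; an empty dict means 'good enough'."""
    errors = {}
    # Free-text address: anything non-blank passes, per the (.+) philosophy.
    if not form.get("address", "").strip():
        errors["address"] = "required"
    # Country is its own required field, so "foobar, TX" stays unambiguous.
    if not form.get("country", "").strip():
        errors["country"] = "required"
    # Deliberately no mandatory "state" field; most countries have no equivalent.
    return errors

print(validate_address({"address": "Last house on the left by the Big Tree",
                        "country": "NN"}))  # {} - let the post office sort it out
```

The point is that the form catches genuinely missing data (the support-call driver) without rejecting any address format a real postal service can deliver to.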
What I’d really do is use one of those services that automatically fill in the address using Google Maps or whatever. Not perfect, probably not free, but a whole lot less work for presumably way fewer PEBCAKs from customers.
If you’re using one of those services then PLEASE allow manual entry/override, because I’ve had forms like that which I was blocked from filling in because they didn’t acknowledge that my address existed.
You haven’t seen the best part yet. They’re holding back security updates, if you don’t do this whole Pro-shit. I really don’t know how much pot their executives smoked to get that awful idea.
And like, to be fair, for personal use, you can get Pro for free, so you ‘just’ need to create an account to get a secure OS.
But yeah, you basically don’t really hear people complaining, because we simply don’t use Ubuntu. Plenty better Linux distros to choose from. I only know this shit, because my work laptop unfortunately comes with it and I’m not necessarily allowed to change it.
Security updates for packages that are so old that they aged out of official support.
Who determines how old a package needs to be before they start charging money for it?
Well they do, of course.
Tune in next year when they turn off free Snap patches.
Kinda a shit take. Canonical is very generous with licensing. They give you 5 free personal licenses per account AND they license per physical host which is practically unheard of now. Like everything is per VM or container or CPUs or sockets etc now. One pro license on an ESXi host could have hundreds of VMs and Canonical is OK with that.
Source: I work with and use ubuntu pro. Canonical’s alright in my book. More than I can say for the RHEL team
Locks community access behind a corporate license agreement
Hands control of community-created content to a corporation
Prevents indexing by web search engines
Antithetical to interoperability
Privacy-hostile
A web forum is far better in most cases. If you can’t manage to run your own, there are plenty of lemmy servers that will do it for you. Even an email list (with searchable archives) would be better than Discord.
If you have collaborative documents that outgrow the forum format, use a wiki.
If real-time chat is needed, irc or matrix.
A project hosting its community on Discord is a project that won’t get my contributions.
I recently went through these exact pains trying to contribute to a project that exclusively ran through Discord and eventually had to give up when it was clear they would never enable issues in their GitHub repos for “reasons.”
It was impossible to discover the history behind anything. Even current information was lost within days, having to rehash aspects that were already investigated and decided upon.
It’s the “see no evil” approach. If you didn’t report the issue while the admin was online, then they aren’t compelled to do anything about it. Convenient for the project maintainer who doesn’t actually like maintaining things. Awful for the rest of us.
It’s sad when a web forum is better than the tool you’re considering. Bumps, aggressive garbage collection, no resurrection of old threads; it’s weird.
I’m old, I guess. I miss NNTP, mainly for the archived posts I could discuss with the authors for an updated take or revised solution or some clarification. And yes, I know there’s a good webUI front-end for an NNTP server as a back-end. ;-)
The worst thing is that the mods can ban you for any or no reason, locking you completely out of the information they’re providing. That is beyond an unreasonable amount of power that they can have over a user, and you just KNOW they’re going to use that for political reasons.
Also the fact that they can delete stuff in a way that makes it invisible to law enforcement, so a lot of illegal shit goes down there too. Combine that with the naturally hierarchical structure of Discord and you get a lot of people using that power to abuse some of the more vulnerable members, and of course once you call it out, poof go the messages and poof goes your access to their server.
Just as an aside, I’m an American that emigrated to Canada. My province (BC) is currently passing a law to make one attempt at IVF free for everyone (starting midyear in 2025)… laws actually can be used for good.
Lots of us know, but we mostly live in urban centers where life is better (and often a bit less car centric, for example). Our voting and election finance laws erase lots of our voices.
Just be lucky that when motivated, we still vastly outvote the right wing nuts.
California tries its best… There’s a bunch of pro-consumer laws that other states don’t have. There’s the CCPA which is similar to GDPR (including the right to know and the right to be forgotten). You must be able to cancel a service easily online if you can sign up online. Store gift cards aren’t allowed to have expiration dates. Gift cards with less than $10 on them must be redeemable for cash. Stricter laws against false advertising. And a bunch of other useful laws.
Not as good as the Australian Consumer Law, but better than pretty much every other US state.
Actually, probably not. Not without major concessions. The pound would have to go, which they will never accept unless they have absolutely no other choice.
Some joined when the rules stated that you could choose. Some others are just waiting to meet conditions that will allow them to enter the Eurozone (like Croatia did last year)
Apparently it’s dependent on the signing of a certain agreement before a certain date, which the UK did sign, so it’s actually debated whether or not Brexit made that signature null.
That would be such a mistake and only serve to cause more division, because as you say, the UK would never accept it. Neither would multiple countries already in the EU that also use their own currency.
The EU is, generally, pragmatic. They’d much rather get other concessions than waste political capital on trying to force the Euro on the UK.
E: downvote all you like, but that’s realpolitik. The EU isn’t going to pass up the second largest economy in the continent over something so trivial that they don’t even pressure much smaller countries into it. Pure fantasy from people who don’t have a clue.
The UK adopts various EU rules, and a lot of stuff sold in Northern Ireland has to abide by EU rules (so say Apple did make separate Lightning and USB-C phones: they’d have to run separate operations to sell specific models in some parts of the UK and not others; it probably would have been easier for them to just sell the European models).
Yes the compiler/interpreter can figure it out on the fly, that’s what we mean by untyped languages. And as stated both have their merits and their faults.
Elon doesn’t know what the words mean and just chimes in with his AI future BS.
Well that would depend on the definition and what you exactly mean by untyped.
The untyped part usually refers to the way the programmer interacts with the language, for example not setting a type for variables and parameters. But then there is the question of whether the programmer is ever allowed to explicitly set the type. Furthermore, if the programmer explicitly sets the type, does this mean the type can’t change at a later point? Another question could be: can the programmer check or enforce what type a variable or parameter is? And if there is only one type of data in the language, would that be a typed or untyped language? But I would consider these to be details that all fall under the untyped umbrella, with untyped just meaning not-typed.
Then there’s the question of the technical implementation of the language. Defining a language is one thing; actually having it run on a real system is another. Usually technical systems at some point require explicit types. Something somewhere needs instructions on how to handle the data, and this usually leads to some kind of typing instructions being added along with the data. But depending on how many abstraction layers there are, this can soon become a very pedantic discussion. I feel what matters is the design, definition, and intent of a language; the actual technical implementation isn’t what matters, in my opinion.
I feel like there are so many programming languages and technical systems at this point, every variation and exception exists. And if you can think of one that doesn’t exist, expect a follow up comment of somebody pointing out it does exist after all, or them having started a project to make it exist in the near future.
From what I know about those, I would consider them to be typed languages. Even if the programmer doesn’t explicitly assign the types, they need to be aware of them and take into account what type something will be. I am familiar with F#, and it’s strongly typed, for example.
We’re also at the point where traditionally untyped languages can be strictly typed (strict TypeScript), and statically typed languages can lean on inference (Java’s var)
Assembly probably? So low level you kinda just play with bits. That’s all I can think of for an untyped language. Everything else I’m aware of is dynamically or statically typed
I kind of feel like “untyped” is a term that doesn’t really have a proper definition right now. As far as I can tell when people say “untyped” they usually mean it as a synonym for whatever they consider “dynamically typed” to mean (which also seems to vary a bit from person to person, haha). Sometimes people say assembly is untyped exactly for this reason, but you could also consider it to have one type “bits” and all of the operations just do things on bits (although, arguably different sized registers have different types). Similarly, people sometimes consider “dynamically typed languages” to just be “unityped” (maybe monotyped is more easily distinguished from untyped, haha) languages at their core, and if you squint you can just think of the dynamic type checks as a kind of pattern matching on a giant sum type.
In some sense values always have types because you could always classify them into types externally, and you could even consider a value to be a member of multiple types (often programming languages with type systems don’t allow this and force unique types for every value). Because you could always classify values under a type it feels kind of weird to refer to languages as being “untyped”, but it’s also kind of weird to refer to a language as “typed” when there isn’t really any meaningful typing information and there’s no type system checking the “types” of values. Types sort of always exist, but also sort of only exist when you actually make the distinctions and have something that you call a “type system”… In some sense the distinction between static and dynamic typing is sort of an arbitrary implementation detail too (though, of course, it has impacts on the experience of programming, and the language design makes a bit of a difference in terms of what’s decidable :) (and obviously the type system can determine what programs you consider to be “valid”)… But you can absolutely have a mix of static type checking and dynamic typing, for instance… It’s all a little more wishy washy than people tend to think in my opinion).
Well, assembly arguably has “int types” and “float types”, since there are specific instructions for those operations, but those instructions don’t actually care whether the bits are for a float or an int. Types in a language are used to restrict the valid operations. In a statically typed language you cannot call cat.bark() or dog.meow(), because the properties of the type, i.e. what things you can do with it, are known before the program runs. In a dynamically typed language such as Python, cat.bark() might or might not be valid, so it has to check at runtime, throwing an error if the method doesn’t exist.
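That runtime check can be seen directly in Python (the class names are just the toy example from the comment):

```python
# Python only discovers whether cat.bark() is valid when the call happens.
class Cat:
    def meow(self):
        return "meow"

cat = Cat()
try:
    cat.bark()  # no such method: fails at runtime, not before
except AttributeError as err:
    print("runtime check failed:", err)

print(cat.meow())  # → meow
```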
Static/dynamic typing is a difference of when. Java has static typing, but you can also just pass raw Objects around and cast when needed. It even throws a runtime exception similar to how Python or JavaScript would fail. However, Java is of course ultimately statically typed; everything just shares a common parent class and has types at runtime, which allows for some pseudo-dynamic behavior
There’s operations that treat bits like floats and operations that treat them like various kinds of ints, but the meaning of bits is in the eye of the beholder. There’s even good examples of mixing and matching integer and floating point operations to clever effect, like with the infamous fast inverse square root. I feel like people often think mathematical objects mean something beyond what they are, when often math is kind of just math and it is what it is (if that makes sense… it’s kind of like anthropomorphizing mathematical objects and viewing them through a specific lens, as opposed to just seeing them as the set of axioms that they are). That’s kind of how I feel with this stuff. You can treat the bits however you want and it’s not like integer operations and bitwise operations have no meaning on supposedly floating point values, they do something (and mixing these different types of operations can even do useful things!), it just might not be the normal arithmetic operations you expect when you interpret the number as a float (and enjoy your accidental NaNs or whatever :P).
The difference of static and dynamic typing being when you perform the type checking is partially why I consider it to be a somewhat arbitrary distinction for a language (obviously decidable static type checking is limited, though), and projects like typescript have shown that you can successfully bolt on a static type system onto a dynamic language to provide type checking on specific parts of a program just fine. But obviously this changes what you consider to be a valid program at compile time, though maybe not what you consider to be a valid program overall if you consider programs with dynamic type errors to be invalid too (which there’s certainly precedent for… C programs are arguably only real C programs when they’re well-defined, but detecting UB is undecidable).
I guess “untyped” could mean “weakly typed”, like how shell and DOS batch are, where everything is a string until you say “hey I want to do math on this” at which point the interpreter turns it into a number, does math on the number, and then turns it back into a string before saving it back to the variable
Dynamic vs Static is a huge debate that I’m not qualified to answer. My personal preference is static because I like to know my mistakes at compile time instead of after running and something weird happens. That goes along with my preference that all variables should be declared at the top of a function.
Well, if there is no fixed (explicit or implicit) type it’s impossible for the compiler to optimise your code. Also, imho, programming with typed languages is way easier because your IDE can recognize function arguments before you compile/run. I tried Python and found it baffling how anyone can get any work done with it :D
By typed they mean declaring a type for your variables.
In some languages, variables need to be told what kind of data they can hold. That’s their type. For instance, a number without decimals would be an integer type, while text might be a string type or a list of character types.
Other languages don’t require types and sometimes don’t even support them. They will just infer the type from the data that’s in the variable.
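A quick sketch of that inference in Python, where the type travels with the value currently in the variable rather than with the variable itself:

```python
# The same variable can hold values of different types over its lifetime;
# the interpreter infers the type from whatever data is currently bound.
x = 42
print(type(x).__name__)  # → int

x = "forty-two"
print(type(x).__name__)  # → str
```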
The fun thing with AI that companies are starting to realize is that there’s no way to “program” AI, and I just love that. The only way to guide it is by retraining models (and LLMs will just always have stuff you don’t like in them), or using more AI to say “Was that response okay?” which is imperfect.
The fallout of image generation will be even more incredible imo. Even if models do become even more capable, training off of post-'21 data will become increasingly polluted and difficult to distinguish as models improve their output, which inevitably leads to model collapse. At least until we have a standardized way of flagging generated images opposed to real ones, but I don’t really like that future.
Just on a tangent, OpenAI claiming video models will help “AGI” understand the world around it is laughable to me. 3blue1brown released a very informative video on how text transformers work, and in principle all “AI” is at the moment is very clever statistics and lots of matrix multiplication. How our minds process and retain information is far more complicated; we don’t fully understand ourselves yet, and we are a grand leap away from ever emulating a true mind.
All that to say, I can’t wait for people to realize: oh hey, that’s just Silicon Valley trying to replace talent in film production.
Yeah, I read one of the papers that talked about this. Essentially, putting AI-generated data into a training set will pollute it and cause the model to just fall apart. LLMs especially are going to be a ton of fun, as there were absolutely no rules about what to do, and bots and spammers immediately used them everywhere on the internet. And the only solution is to… write a model to detect it. Then they’ll make models that bypass that, and there will just be no way to keep the dataset clean.
The hype of AI is warranted - but also way overblown. Hype from actual developers and seeing what it can do when it’s tasked with doing something appropriate? Blown away. Just honestly blown away. However, hearing what businesses want to do with it, the crazy shit like “We’ll fire everyone and just let AI do it!”? Impossible. At least with the current generation of models. Those people remind me of the crypto bros saying it’s going to revolutionize everything. It might, but you need to actually understand the tech and its limitations first.
Building my own training set is something I would certainly want to do eventually. I’ve been messing with Mistral Instruct using GPT4All, and it’s genuinely impressive how quickly my 2060 can hallucinate relatively accurate information, but its limitations are also evident. E.g., if I tell it I do not want to use AWS or another cloud hosting service, it will just return a list of suggested services not including AWS. Most certainly a limit of its training data, but still impressive.
Anyone suggesting using LLMs to manage people or resources is better off flipping a coin on every thought; more than likely, companies that are insistent on it will go belly up soon enough.
You’re describing an arms race, which makes me wonder if that’s part of the path to AGI. Ultimately the only way to truly detect a fake is to compare it to reality, and the only way to train a model to understand whether it is looking at reality or a generated image is to teach it to understand context and meaning, and that’s basically the ballgame at that point. That’s a qualitative shift, and in that scenario we get there with opposing groups each pursuing their own ends, not with a single group intentionally making AGI.
AIs can be trained to detect AI generated images, so then the race is only whether the AI produced images get better faster than the detector can keep up or not.
More likely, as the technology evolves, AIs, like humans, will just train in real-time-ish from video taken from their camera eyeballs.
…and then, of course, it will KILL ALL HUMANS.
It’s definitely a qualitative shift. I suspect most of the fundamental maths of neural network matrices won’t need to change, because they are enough to emulate the lower level functions of our brains. We have dedicated parts of our brain for image recognition, face recognition, language interpretation, and so on, very analogous to the way individual NNs do those same functions. We got this far with biomimicry, and it’s fascinating to me that biomimicry on the micro level is naturally turning into biomimicry on a larger scale. It seems reasonable to believe that process will continue.
Perhaps some subtle tuning of those matrices is needed to really replicate a mind, but I suspect the actual leap will require first of all a massive increase in raw computation, as well as some new insight into how to arrange all of those subsystems within a larger structure.
What I find interesting is the question of whether AI can actually fully replace a person in a job without crossing that threshold and becoming AGI, and I genuinely don’t think it can. Sure it’ll be able to automate some very limited tasks, but without the capacity to understand meaning it can’t ever do real problem solving. I think past that point it has to be considered a person with all of the ethical implications that has, and I think tech bros intentionally avoid acknowledging that, because that would scare investors.
I see this a lot, but do you really think the big players haven’t backed up the pre-22 datasets? Also, synthetic (LLM generated) data is routinely used in fine tuning to good effect, it’s likely that architectures exist that can happily do primary training on synthetic as well.
I’m sure it would be pretty simple to put a simple code in the pixels of the image; it could probably be done with an offset of the alpha channel or whatever, using relative offsets or something like that. I might be dumb, but fingerprinting the actual image should be relatively straightforward, and an algorithm could be used to detect it. Of course, it would potentially be damaged by bad encoding or image manipulation that changes the entire image, but most people are just going to be copying and pasting, and any sort of error correction and duplication of the code would preserve most of the fingerprint.
I’m dumb though, and I’m sure there is someone smarter than me who actually does this sort of thing who will read this and either get angry at the audacity or laugh at the incompetence.
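For what it’s worth, the bare-bones version of that idea really is simple. This toy sketch hides a short code in the least-significant bits of a channel (a flat list of 0-255 values standing in for the alpha channel); real watermarking schemes add error correction and spread the code across the image, exactly as suggested above:

```python
# Toy LSB watermark: not a real scheme, just the bare concept.
def embed(channel: list[int], code: bytes) -> list[int]:
    """Write each bit of `code` into the LSB of successive channel values."""
    bits = [(byte >> i) & 1 for byte in code for i in range(8)]
    out = channel[:]
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit  # overwrite the LSB only
    return out

def extract(channel: list[int], length: int) -> bytes:
    """Read `length` bytes back out of the LSBs."""
    bits = [v & 1 for v in channel[: length * 8]]
    return bytes(
        sum(bits[b * 8 + i] << i for i in range(8)) for b in range(length)
    )

pixels = [200] * 64          # pretend alpha channel
marked = embed(pixels, b"AI")
print(extract(marked, 2))    # → b'AI'
```

Flipping only the LSB changes each value by at most 1, which is invisible, but as the comment notes, any re-encode or edit that touches those bits destroys this naive version.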
The best part is they don’t understand the cost of that retraining. The non-engineer marketing types in my field suggest AI as a potential solution to any technical problem they possibly can. One of the product owners who’s more technically inclined finally had enough during a recent meeting and straight up told those guys “AI is the least efficient way to solve any technical problem, and should only be considered if everything else has failed”. I wanted to shake his hand right then and there.
I heard the same story when I was a kid, but it was about a boilermaker. The rest was for knowing where to tap his hammer to fix their problem.
It’s an obviously apocryphal story with two great messages. First, don’t undervalue your expertise just because the fix was easy (I still have a problem with that). Second, if you don’t know what you’re doing don’t question the expert just because it looked easy.
I know a version with a graphics designer. They designed something in 10 minutes and asked 1000 USD for it. When confronted on why it is so expensive for just 10 minutes of work, the answer is that it’s not just the 10 minutes of work, but also the 10 years of experience that lead to this 10 minutes of work.
At the beginning of the 20th century, Henry Ford’s electrical engineers had issues with a gigantic generator that they could not solve. Henry Ford called Steinmetz, a genius mathematician working for GE, to help them.
When he arrived at the factory he spent two days and nights listening to the generator and scribbling in his notebook.
After that he asked for a ladder, climbed it, put a chalk mark on a specific spot, and explained to the engineers that they needed to remove the plate and replace sixteen windings behind it. After that the generator worked perfectly, and Ford received a $10,000 bill.
Ford asked for an itemized bill, and Steinmetz sent this:
It’s funny reading this, because the way I heard the story was as a railroad story.
The train engine wouldn’t run. The expert was called; he arrived, and after inspecting the train engine, knew exactly where to apply a little bit of oil to make it run again. His bill was challenged as being overly expensive, and he countered that they were paying for the knowledge of where to apply the oil, not the oil itself.
There’s like all these different versions of the same story with the same philosophy.
NGL I apply to places where I use the software. But it’s not one thing, it’s a dozen things I would fix.
I actually never successfully got the job. Probably because during the interview, I come off like a rambling psychopath pointing out extremely specific things.
Part of my previous company’s hiring process included having the candidate use our software, then asking what they thought of the experience and what improvements they thought would have the most impact. It wasn’t entirely useful because devs weren’t in control of prioritizing changes, but it was always interesting to see which pain points stuck out to the candidate.
It does give some insight into how people think. Some people are bothered by UI events and placement, others wanted to reduce the bandwidth it required, and we had one girl who approached it focused on the accessibility of the software, which unfortunately for us was abysmal. You also need thick skin to invite random Joe off the street to tell you how your software sucks.
Honestly, anybody with a gender studies degree can get into software development nowadays no sweat; the Fortune 500 standards are so low that they’ll just hire anyone on the spot without even questioning it. Honestly only started to take note of this the second Biden got into office, the quality of software overall has gone down. Overall, back to open source, I never truly got the open source movement in general, never been my thing. Proprietary software is inherently more secure, which is why most enterprise systems still use Windows XP.
That’s a misconception. Farmers lobbied heavily against DST. Their work does not abide by the clock; they milk when cows need milking, and they harvest when there’s enough light, no matter what some clock says.
In Europe, DST as we know it now was first introduced by Germany during WW1 to preserve coal, then abandoned after the war, and widely adopted again in the 70s. In the US it was established federally in the 60s.
This is all glossing over a lot of regional differences and older history. But yeah, US farmers were very much against the idea.
So, this is wrong on so many levels. First of all, DST had nothing to do with farmers, it was to save energy usage in the summer as people were doing more things when the evenings were warmer.
IIRC daylight saving was created way back when electricity really didn’t exist, so it allowed the farmers more daylight to harvest their crops.
DST does not increase the amount of daylight on any specific day of the year, it just shifts it later in the day so that people in 8-5 jobs can do more things after work. Farmers don’t work 8-5, they work as needed so if the crops need harvesting they will get harvested based on the weather.
Now with that said there is more technology in today’s farming equipment so DST shouldn’t really exist anymore.
Nowadays farmers have lots of lights and can harvest after the sun goes down, but that has nothing to do with why DST shouldn’t exist. DST shouldn’t exist because it doesn’t save energy due to any populated place having their lights on all night and the actual changing of time leading to negative outcomes like deaths from accidents with no benefits.
Sure, the sun will come up earlier and set later in the summer if we get rid of DST, but the only reason for the time change in the first place was the standard working hours being longer after noon than before.
Farmers don’t care about clocks unless they are scheduling a time to meet and using the clock for clarity.
The sun comes up when it comes up, and that is what matters. Farmers don’t care about the clock for what they consider morning, because morning is before the sun is highest in the sky. They are already getting up a few minutes earlier or later depending on whether the days are getting longer or shorter.
This is beside what I was saying, which was again “if anything”, adding another reason why farmers and DST make no sense. But dude, people live in the world. Farmers are not 1000% in their own bubble. They need to go out to stores, get supplies, and interact with the world and the supply chain. You are now taking the lack of an office schedule to a ludicrous degree with your analogy. I wasn’t even disagreeing with your old points; I was saying “if anything” and adding another reason, but you want to go off on seemingly everyone. Perhaps you’re confusing me with the other guy, but whatever. Cheers.
My understanding is DST did still save appreciable energy until we replaced incandescent lights with fluorescents and LEDs. Longer daylight in the evening when people are awake and less in the early morning when people are asleep means lights aren’t being used as much. The average light bulb used to consume 60 watts or more and also gave off significant undesirable heat, so with a house full of lights DST really did cut back energy usage. Now, though, with LED lights’ low consumption and virtually no heat, it’s not nearly as significant.
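A back-of-the-envelope check of that claim. All the numbers here are assumptions (10 bulbs per household, one hour of lighting avoided per evening, roughly 240 days of DST per year):

```python
# Rough seasonal savings per household from one less hour of lighting a day.
bulbs = 10
hours_saved_per_day = 1
dst_days = 240

for label, watts in (("incandescent", 60), ("LED", 9)):
    kwh_per_season = bulbs * watts * hours_saved_per_day * dst_days / 1000
    print(f"{label}: {kwh_per_season} kWh saved")
```

Under these assumptions the incandescent-era savings (144 kWh) dwarf the LED-era ones (about 22 kWh), which matches the comment’s point.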
Originally being started for WWI and WWII doesn’t contradict my post which talks about the current reasons given to keep it and that it is not saving energy now.
I hate it. I fucking hate it. With every fiber of my being. I spend every winter counting the days until the sun stops setting before I stop working. Our entire lives are scheduled so we are inside under neon light from 9-6, why are we trying to maximize how much of that is during daytime?
On the day that we go back to permanent ST I will turn to hard drugs to make up for the dopamine deficiency. No joke very few things in my life fill me with more dread than having to suffer early evenings for the rest of my life.
Maybe, and hear me out, the problem is that 9 to 6 puts 2/3 of the workday after noon. Instead of changing reality to appease business, work hours could be changed to 8 to 4, with four hours before noon and four after, which gives both more light in the evening than DST and a shorter workday, because people are more productive than they ever have been.
But I guess you would rather let business practices determine when noon is for everyone instead of the sun.
Business hours are no more or less of a social construct than DST or the 24-hour clock.
The only difference is that we have a shot at making everyone agree on a timezone shift or permanent DST, but absolutely NO SHOT at getting every business to switch to an 8-4 schedule. None. It’d be a nice sentiment. But it’s not happening, and I don’t care what the number says on the clock when I leave work as long as it’s sunny outside.
Why is it so important that the sun reaches its zenith at noon anyway? Do you often get confused while looking at your antique sundial?
From a development perspective it certainly sounds easier to have one global timezone with DST than a bunch of smaller ones without it. Would that make sense in reality? Probably not but I definitely think timezones take more work to compensate for properly.
What matters is consistency, and our time system has tons of crazy inconsistent shit in it. Everyone knows about leap years, but do you know about leap seconds? Imagine trying to write a function to convert Unix time to a current date and suddenly all your times are a second off.
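The conversion itself is a one-liner in Python, precisely because Unix time pretends leap seconds don’t exist (every day is exactly 86,400 seconds), so the naive arithmetic “works” while quietly drifting from atomic time:

```python
from datetime import datetime, timezone

def unix_to_date(ts: int) -> str:
    """Convert a Unix timestamp to a UTC calendar date and time."""
    return datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%Y-%m-%d %H:%M:%S")

print(unix_to_date(0))              # → 1970-01-01 00:00:00
print(unix_to_date(1_000_000_000))  # → 2001-09-09 01:46:40
```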
Let’s just have 2 timezones, Chinese time and EST with permanent DST. The most populated timezones for Eurasia and the Americas, and they’re 12 hours apart, so nobody has to do timezone math, just switch AM and PM.
There was actually a really interesting idea I heard to have no time zones at all, and I actually think it could be a good idea. It’ll never happen because people would need to re-learn time, but if it was always the same time everywhere it would make scheduling and business so much easier. No one would need to convert between different zones or be late because of an incorrect conversion. The downside is that times which are conventionally morning or evening would no longer be, so people would have to get used to time just being a construct for scheduling and not a representation of the natural day/night cycle… but it actually doesn’t sound like a half bad idea.
Problem you run into is the areas where we need to tie things to solar days across an area.
You end up with places having to regulate that school starts at 22:00, and gets out 05:00 the next day.
Businesses close for the night at 06:00 and open bright and early later that day at 22:00.
You have places where one calendar day has two different business days in it, so the annoyances faced by people who work overnight shifts spreads to everyone, and worse gets spread to financial calendars, billing systems and the works.
Time is an air bubble trapped under a screen protector. It’s annoying, and you can push it around to try to keep it out of the way, but you can never really fix it.
There’s just too many inherently contradictory requirements for us to end up with a “good” system, and we just need to settle for good enough.
My dream is that we stop changing things. Whatever we have in time zone database today is what we stick with going forwards. No more dst shifts, no more tweaks to the zones, no more weird offsets and shifts, because we don’t get to stop dealing with the old layout when we change, we just add a new one that we think is better.
For the most part, dealing with this stuff is a solved, shitty problem. It’s when we change the rules that problems come up, worse when we change them retroactively. (Territory disputes between nations have been resolved with the conclusion that land was actually in a different time zone in the past because it was actually in another country. Not a problem usually, unless there’s a major stock exchange on an island that was transferred between nations, and retroactively changing what time it was affects which laws were valid at the time certain transactions took place.)
Not really. Timezones, at their core (so without DST or any other special rules), are just a constant offset that you can very easily translate back and forth between; that’s trivial as long as you remember to do it. Having lots of them doesn’t really make anything harder, as long as you can look them up somewhere. DST, leap seconds, etc., make shit complicated, because they bend, break, or overlap a single timeline to the point where suddenly you have points in time that happen twice, or that never happen, or where time runs faster or slower for a bit. That is incredibly hard to deal with consistently, much more so than just switching a simple offset you’re operating within.
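Python’s standard library makes the contrast concrete: a fixed-offset zone is a one-line constant, while a DST “fall back” produces a wall-clock time that exists twice and needs PEP 495’s `fold` attribute to disambiguate. A small sketch (the dates and zones are just illustrative):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# Fixed offset: conversion is just adding a constant. Nothing to go wrong.
ist = timezone(timedelta(hours=5, minutes=30))  # e.g. India's fixed +05:30
noon_utc = datetime(2024, 11, 3, 12, 0, tzinfo=timezone.utc)
print(noon_utc.astimezone(ist))  # 17:30 local, always, no special cases

# DST: on the US "fall back" night, 01:30 happens twice. The wall-clock
# reading alone no longer identifies a unique instant; `fold` picks
# which of the two occurrences you mean.
nyc = ZoneInfo("America/New_York")
first = datetime(2024, 11, 3, 1, 30, tzinfo=nyc, fold=0)   # still on EDT
second = datetime(2024, 11, 3, 1, 30, tzinfo=nyc, fold=1)  # after the shift, EST
print(first.utcoffset(), second.utcoffset())  # UTC-04:00 vs UTC-05:00
```

Same wall-clock digits, two different real instants an hour apart; that kind of ambiguity simply cannot arise with a plain constant offset.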
As I explained in my other comment, there’s no situation where you’re getting any daylight in the evening with DST, that’s just not possible.
Also, daylight in the morning sets your day on a high note. Mornings spent in darkness are what turn your life into a winter-long depression. Coming home in darkness is inevitable and has a lower impact on your mental health. And with DST effectively removing BOTH morning and evening daylight, you’ll be completely fucked.
Strong disagree, under DST I get to experience some sunlight in the evenings. Under standard time I get to watch the sun come up through the window and set through the window.
I don’t know what you mean by evening, but it’s already dark at 16:00 during winter. You only get some light in the morning. DST means no more light in the morning and no more light in the evening. Complete depression. DST should not exist.
i still don’t even understand what DST even is. as far as i care (because i don’t), DST just means we change the time, because god forbid the time be a little funky.
The real problem is that across the globe there are like 50 different implementations of it. Some places have a fucking half hour, or some goofy shit. Really fun handling time zones with that sprinkled on top.
That’s ST obv. Now let’s convert it to DST: that would be 9:03 - 16:53. Let’s say you work a standard 9-5 job. Well, 9:03 is after you start working and 16:53 is before you finish. Thus you get ZERO daylight outside working hours under DST. You get almost an hour in the morning with ST.
No wonder Finland has such high suicide rates during winter…
P.S. It is also worth noting that daylight grows the closer you get to the equator and it grows in the morning, not in the evening. You can see from the examples above that their evening difference is smaller than the morning one. There’s just no point having DST.
I’m missing your point. Do you think that moving the clocks is having an effect on the tilt of the earth? Or are you just trying to explain to me how daylength and latitude are related?
I know quite well how dark it gets in the north. I live in the north. Luckily, the sun still rises and sets at very predictable intervals. If I want to enjoy sunlight, I simply need to be awake at some point that coincides with when the sun is up.
You are also aware that not everyone works the exact same hours, right? And windows exist?
To use a different example to make the opposite point: I’d like the sun to be out for at least an hour after I get home from my “9-5”, so if the sun sets at 1700 in standard time, I am depressed. But in DST, I get to spend an hour in my garden.
See? The debate is stupid. Do you want more daylight in the morning or afternoon. That’s the only question. The amount of daylight is not affected by clocks.
Wut? If it’s DST during winter, you don’t have any light to enjoy after work. You can only enjoy light in the morning with ST. All the explanation is above, with facts.
programmer_humor