I really don’t care about my OS UI since I’m barely actually using it, especially after a few minutes setting up one-click actions. Less than 1% of my time and effort on the computer.
Applications, on the other hand, are where I live and FUCKING HELL!!!
Look, if everyone just decided on a style and everyone went with it within a system I’d be okay with that. It’s not great but at least it wouldn’t be jarring.
But having to live by the whim of 50 different app designers is disgusting. I just want to have a good time, not learn 50 different interfaces.
Though my thoughts on it would also stifle new ideas. So that’s bad.
Enhancement? No, everything I have a problem with is explicitly intended behavior, and GNOME devs are infamous for their “everyone is stupid except me” mentality.
Does GNOME/GTK have an issue board where users vote on issues?
Free software development is not a democracy, and does not get driven by polls. Features and bugs are introduced by those who show up, within a community that works towards a shared goal.
I don’t believe the intentional behavior is desirable and would like to see what other users think.
That’s a dick way of saying fuck off but I mean they do provide a free service. If they have a vision and don’t want to deal with random people whining about it that’s their prerogative. Same as yours to find that utterly insufferable.
They do provide a free service (GTK’s file chooser), one that I find horrible and inconsistent (as per the thread), and intentionally so (based on issues tangential to the example I found, although the proposed configurable behavior would be nice) - so I won’t even entertain the thought of trying to contribute to it, as has been suggested.
I don’t know what is insufferable about that, other than the initial criticism…
It’s like getting into a car you haven’t driven before and you hit the wipers instead of the indicator ×1000. Or playing an FPS and E is now F, C is now Ctrl, X is Shift, and you tap+hold instead of tap. WHY?!?! You can remap, but suddenly there’s conflicting keys for shit the tutorial hasn’t even introduced to you yet, so you don’t know what you can or can’t get away with.
Some designer or dev has a personal opinion they think is better than everything else, and now we all gotta live with it in the hope that it’ll be the new standard. And there’s so many of those arseholes and their Dvorak layouts and putting “Cancel” on the left and “Confirm” on the right of a dialogue popup. “I think it’s better this way and the world will thank my big brain!”
Wait, confirm shouldn’t be on the right? I’m 99% sure most Windows pop-up/modal dialogs had OK on the left and Cancel on the right, but I’m not entirely sure about Linux (also, Factorio has them left to right as in “go back” and “go forward”, but I dunno if that’s RTL-dependent…)
Side by side most of the time. I put them on arms so I can move them around and swivel them. It’s ridiculous having two 34" Ultrawides. But, I can. So I Do. I also run a 3rd 40" 4k display when I need it or want to sit further back.
I just made the switch from three 24-inch monitors to a single 49-inch super-ultrawide. It’s basically three monitors with no bezels. A lot of things are annoying though, like full-screening videos/games, but there are workarounds.
Yeah I went from 1 32" 1440p and two 1080p side monitors to just a single 4k 43" and I’m saying that the time of multi monitor setups has come to an end.
I go out of my way to find components that don’t have RGB lighting on them. When I use my computer, I want to be looking at the screens (the two-monitor part is true), not the case.
I’ve got a piece of black tape over the power light on my computer, because it is too bright. And I have masking tape over the caps/num/scroll-lock lights on my keyboard, because they are also too bright. (The light is much gentler through the masking tape.)
Apart from nearly 2/3 of Americans polled wanting permanent DST, the massive technological advancement, interconnectedness of the entire world, and an ever-growing proportion of renewable energy?
You’re ignoring the fact that technological advancement is exponential, not linear; world interconnectedness; energy storage; and other renewable energy sources such as geothermal, hydro, and wind.
What is in bad faith? I hold the premise that nothing has specifically changed that would make people actually like the sunrise/sunset time change given it was tried before.
You claimed “technology” without giving specific reasons. You claimed renewable energy despite it only being 20% of today’s energy generation.
You are the one who is making bad faith arguments. Then you are getting mad because you have nothing to support your opinion.
I used to feel this way. Over the course of building out two calendar systems in my career (so far) and having to learn the intricacies of date- and time-related data types and how they interact with time zones, I no longer have much disdain for time zones. I’d suggest that anyone who feels the way this meme does read So You Want To Abolish Time Zones.
Also, programmers tend to get frustrated with time zones when they run into bugs around time zone conversion. This is almost always due to the code being written in a way that disregards the existence of time zones until it’s needed, and then tacks on the time zone handling as an afterthought.
If any code that deals with time takes the full complexities of time zones into account from the get-go (which isn’t that hard to do), then it’s pretty straightforward to manage.
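As a sketch of what “from the get-go” can look like in Python (the zone names and timestamps are chosen purely as examples): keep every instant in UTC internally, and attach a concrete zone only at the display edge.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# Store and compare instants in UTC; a zone is a display concern.
stored = datetime(2024, 3, 15, 14, 30, tzinfo=timezone.utc)

def for_display(instant: datetime, zone: str) -> str:
    # Convert at the last possible moment, per viewer.
    return instant.astimezone(ZoneInfo(zone)).strftime("%Y-%m-%d %H:%M %Z")

print(for_display(stored, "America/New_York"))   # 2024-03-15 10:30 EDT
print(for_display(stored, "Australia/Melbourne"))
```

The same stored value renders correctly for any viewer, and no zone-dependent data ever needs to be persisted.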
Time zones are part of it, but daylight saving is also a real pain in the ass. And like you said, it gets particularly complicated when you’re dealing with a system that treats these things as an afterthought, which seems to describe a lot of older time libraries. For instance, the Java date utils are a nightmare and are now considered semi-deprecated, replaced by the new java.time API. That is, of course, no help for the ridiculous number of things that depend on those stupid date utils, which no one wants to spend the dev hours to refactor.
This did little to convince me that timezones are an unnecessary construct. Pretty much every point made was done from the perspective of someone who had already decided their opinion rather than objectively weighing the pros and cons.
Yeah, the article is written like it’s parodying those who want to abolish timezones, but I’d be interested in what specifically you found unconvincing. I read the main point as being that time zones are an arbitrary social convention, but that arbitrary social conventions are pretty useful for humans.
Like, one thing the article does is repeatedly ask the question “but what time is it in Melbourne?”, which I guess sounds pretty silly if you think timezones are unnecessary, since the question would be meaningless if timezones were abolished: people in different parts of the world would have centered their day around their respective parts of the clock, and you would just look up what the times for everything are in another place. But I think the author had already discarded that idea, because it’s just equivalent to timezones - you’d have a lookup table for each part of the world to find out what people do at a certain time, except instead of a single offset it would be a list of times like “school openings”, “typical work hours”, “typical waking hours”, etc. This system is basically timezones but harder for humans to use. So the author asking “but what time is it in Melbourne?” is in the context of this table not actually existing, because if it did exist, you wouldn’t actually have abolished time zones.
Yeah but also if we’re being honest, from a programmer perspective the timezone has no bearing on what you do, and is hence not a problem at all.
After all, much like you translate the language of your UI when displaying in X, you also add Y hours to all times shown in X. Done. You wouldn’t even need to persist the zoned time data anywhere, given their static nature you could decide the final timestamp shown at display time, purely on a client, visual, level.
OTOH, daylight saving time turns itself - and timezones - into an utter mess and whoever invented them hopefully is proud of the raw amount of grief and harm they caused the world. It causes all kinds of issues with persistence, conversion and temporal shifts in displayed time due to the ephemeral nature of the +X minutes added. Or not. That’s the worst part.
So timezones: Fine, it’s just bling bling on display anyways.
DST: Burn it at the stake.
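That “ephemeral +X minutes” is easy to demonstrate in Python: the very same zone maps to two different UTC offsets depending on the date (the zone and dates here are picked only for illustration).

```python
from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

tz = ZoneInfo("America/New_York")
winter = datetime(2024, 1, 15, 12, 0, tzinfo=tz)
summer = datetime(2024, 7, 15, 12, 0, tzinfo=tz)

# Same zone, different offsets: EST is UTC-5, EDT is UTC-4.
assert winter.utcoffset() == timedelta(hours=-5)
assert summer.utcoffset() == timedelta(hours=-4)

# So "add Y hours for this zone" is not a constant: an offset that is
# correct in January silently shows the wrong wall-clock time in July.
```

Without DST, the offset really would be static and display-time conversion would be trivial.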
Yeah, I’m in agreement that DST is kinda pointless and could probably be abolished, but the thread is about abolishing timezones in general (or so I thought).
Abolishing DST doesn’t eliminate all the weird issues with “ephemeral” offsets though. Suppose the user wants to set a reminder for a recurring event at 3pm, and then moves to another country. Do you keep reminding them at 3pm in the new time zone or the old time zone? Maybe the reminder was “walk the dog” and the user meant for it to be at 3pm local time, or maybe it was “attend international meeting” and the user meant it to be at 3pm in the original timezone. (This admittedly only happens to calendar apps so isn’t something that most applications have to deal with, unlike displaying timestamps in general.)
But other than that, I’m of the opinion that as programmers we’re supposed to model the problem space as best we can and write software that fits the problem, rather than change the problem to fit our existing solution. After all, software is written to be used by humans, not the other way round (at least not yet). So if DST is something those wacky humans want and use, then a correct program is one which handles it correctly, and a programmer’s job is to deal with the complexity.
I disagree about the table - if you’re interacting regularly across timezones you tend to convert everything to your local time anyway - India’s on lunch at 9am, US is starting at 14:00, because that’s how it fits into your day.
Leap years are not as bad as timezones, if you think about it. Timezones try to imperfectly solve a local problem - how to match your clock with the position of the sun. Leap years try to reasonably solve a global problem - how to keep your calendar aligned with the seasons.
The British one has “color” changed to “colour” due to the British spelling of color.
The Spanish one has an upside-down question mark because in Spanish you write questions like this: ¿Is this an example question?
The French one is because the French number system makes absolutely no sense and to say 99 you have to say quatre-vingt-dix-neuf (meaning 4 x 20 + 19).
As a guy who hates the French language and was learning it in 1999, I can confirm it was a pain to read the lesson topic and the date. I was so happy when we switched to 2000.
Whole generations of French students that have no idea they escaped having to write “mille neuf cent quatre-vingt dix-neuf” over and over again, in cursive of course.
Joke aside, it’s not taught as 4 × 20 + 10 but simply “90 is pronounced quatre-vingt-dix” — which kinda is a mouthful, but you rarely count to 90 as a kid anyway.
Think of a container as a self-contained box that can be configured to contain everything a program may need to run.
You can give the box to someone else, and they can use it on their computer without any issues.
So I could build a container that contains my program that hosts cat pictures and give it to you. As long as you have docker installed you can run a command “docker run container X” and it’ll run.
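To make that concrete, the whole “box” for a hypothetical cat-picture app could be described in a few lines of Dockerfile (every name and path here is made up for illustration):

```dockerfile
# Start from a box that already contains Python.
FROM python:3.12-slim
WORKDIR /app
# Copy the app's files into the box and install its dependencies.
COPY . .
RUN pip install -r requirements.txt
# What runs when someone starts the box.
CMD ["python", "catpics.py"]
```

Then `docker build -t catpics .` produces the box, and `docker run catpics` starts it the same way on any machine with Docker installed.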
A container is a binary blob that contains everything your application needs to run. All files, dependencies, other applications etc.
Unlike a VM, which abstracts a whole OS, a container abstracts only your app.
It uses filesystem isolation and kernel namespaces to isolate your application so it can’t access anything outside of itself.
So essentially you have one copy of an OS rather than running multiple OSes.
It uses way less resources than a VM.
As everything is contained in the image, if it works on your machine it should work the same on any other. Obviously networking and things like that can still break it.
Sure, but you can still find plenty of info on it by searching for .NET Framework or .NET 4.6. All the documentation is still available. It’s just not in the spotlight any more.
Not an intern, but this week I’ve unraveled some mysteries in ASP.NET MVC 5 (framework 4.8). Poked around the internals for a while, figured out how they work, and built some anti-spaghetti helpers to unravel a nested heap of intermingled C#, JavaScript, and handlebars that made my IDE puke. I emulated the Framework’s design to add a Handlebars templating system that meshes with the MVC model binding, e.g.
and some more shit to implement variable-length collection editors. I just wish I could show all this to someone in 2008 who might actually find it useful.
Given that .net was a TLD long before the framework came out, it was a stupid thing to name it. Caused confusion and the inability to Google things right away.
That’s sort of the problem. It’s easy to Google S3 since it’s a distinct (if obnoxiously short) term. Blob is already an overloaded term.
An example of a great name from Microsoft is Excel, it’s relatively short but meaningless so if you Google “Excel Sum” you’ll get wonderful results… “Blob Get” is going to get you a lot of random stuff.
Edit: the top result for blob get is accurate on Google but you’ll also quickly see this result from that site we all hate:
Need help! How do I get the blob fish, basking shark and dwarf whale?
It was a pretty smart marketing move. Business people hear ‘dot net’ and nod wisely. Tech people hear ‘dot net’ and scrunch their faces. Either way, people keep talking about Microsoft Java.
It makes sense why they did it, but their messed up versioning was the cause to begin with. You should always assume Devs will cut corners in inappropriate ways.
The API is fine. It returns the internal version number (which is 4.0 for Windows 95), not a string. learn.microsoft.com/…/ns-winnt-osversioninfoexa. There’s no built-in API that returns “Windows 95” as a string.
As often happens, using `\` for paths is for backwards compatibility.
Neither CP/M nor MS-DOS 1.0 had folders. When folders were added in MS-DOS 2.0, the syntax had to be backwards compatible. DOS already used forward slashes for command-line options (e.g. DIR /W) so using them for folders would have been ambiguous - does that DIR command have a /W option, or is it viewing the contents of the W directory at the root of the drive? Backslashes weren’t used for anything so they used them for folders.
This is the same reason why you can’t create files with device names like con, lpt1, and so on. DOS 2.0 had to retain backwards compatibility with 1.0, where you could do something like TYPE foo.txt > LPT1 to send a document to a printer. The device names are reserved globally so they can work regardless of what folder you’re in.
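The option-versus-path ambiguity is easy to show with a toy parser (purely illustrative, not real DOS parsing rules):

```python
def readings(token: str) -> list[str]:
    # If '/' marked both options and directories, a leading slash
    # could be read two different ways.
    out = []
    if token.startswith("/") and len(token) > 1:
        out.append(f"command option '{token[1:]}'")
        out.append(f"directory '{token[1:]}' at the drive root")
    elif token.startswith("\\") and len(token) > 1:
        out.append(f"directory '{token[1:]}' at the drive root")
    return out

print(readings("/W"))   # ambiguous: two plausible readings
print(readings("\\W"))  # unambiguous: one reading
```

Reserving `\` for directories gives each meaning its own unmistakable syntax.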
An often repeated urban legend that has no basis in reality. Software checking the version of Windows gets “6.1” for Windows 7 and “6.2” for Windows 8. The marketing name doesn’t matter and is different.
I was about to say that most apps should check the NT version number, but then I remembered that until XP it wasn’t common to run an NT system. Then again, NT 4 existed in basically the same timeframe as 95 did, and even if the argument is “it’s a 9x application”, shouldn’t those OSes at least have some sort of build number or distinct identifier scheme? NT systems were around, so apps would probably need a check for them anyway.
some legacy software checked if the OS name began with “Windows 9” to differentiate between 95 and future versions.
This is a myth. Windows doesn’t even have an API to give you the marketing name of the OS. Internally, Windows 95 is version 4.0 and Windows 98 is 4.1. The API to get the version returns the major and minor version separately, so to check for Windows 95 you’d check if majorVersion = 4 and minorVersion = 0.
Maybe it’s a myth, but it sure sounds plausible. The software that checks the “Windows 9” substring doesn’t even have to exist for this to be reason they chose to skip to version 10 — they just had to be concerned that it might exist.
Sure, maybe there’s no C function that returns the string, but there’s a ver command. It would be trivial to shell out to the command. en.wikipedia.org/wiki/Ver_(command)
This doesn’t prove anything, but there are a TON of examples of code that checks for the substring. It’s not hard to imagine that code written circa 2000 would not be future proof. sourcegraph.com/search?q=context:global+""window…
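The fragile pattern, sketched here in Python rather than the Java or C it typically appeared in (the function names are made up): a prefix test written for 95/98 would also have matched a product actually named “Windows 9”.

```python
def is_win9x_fragile(os_name: str) -> bool:
    # The pattern found in the wild: one prefix test covers 95 and 98...
    return os_name.startswith("Windows 9")

assert is_win9x_fragile("Windows 95")
assert is_win9x_fragile("Windows 98")
assert not is_win9x_fragile("Windows 10")
# ...but it would have misfired on a hypothetical "Windows 9":
assert is_win9x_fragile("Windows 9")

# The robust check uses the internal version numbers instead
# (Windows 95 reports major 4, minor 0 internally):
def is_win95(major: int, minor: int) -> bool:
    return (major, minor) == (4, 0)
```

Whether or not such code actually shipped, a naming decision only had to account for the possibility.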
I have no complaints about just calling it .NET. The distinction between .NET and .NET Framework isn’t much of a problem. It’s the fact that .NET and .NET Core aren’t actually different that’s odd. It underwent a name change without really being a different project, meanwhile the Framework -> Core change was actually a new project.
It underwent a name change without really being a different project
The name difference was only to differentiate the legacy .NET Framework from the new .NET Core while both were being developed concurrently. They never intended to keep the “Core” suffix forever. .NET Core was missing a lot of APIs compared to .NET Framework 4.5, and “.NET 1.0” would have been ambiguous. The name was to signal that it was a new API that wasn’t fully compatible yet.
Once .NET Core implemented nearly all the APIs from the legacy .NET Framework, the version numbers were no longer ambiguous (starting from .NET 5.0), and the legacy framework wasn’t used as much as it used to be, it made sense to drop the “Core” suffix :)
I have the same issue with Java. Oracle JDK, OpenJDK, or some other weird distribution? Enterprise servers or a framework like Spring Boot? It’s always easier if you’re familiar with the technology.
I agree, it was mostly a joke. But as the parent commenter explained, “.net is now dot net” is still confusing. They really should just cut ties with the .net name and start fresh. “.net is now MS Interop Framework” or some such. Adopt more sane server versioning moving forward, so searching for information isn’t so wild across all the possible variations and versions of .net, dot net core, dot net framework, asp.net, etc
The reasoning was to avoid confusion with the .NET Framework 4.x series, and since they went beyond 4.x, it’s just .NET now. I believe the .NET Core moniker was to explicitly distinguish it from Framework versions.
It didn’t help the confusion at all, tch. Being a .net guy since 1.0, you just figure it out eventually
programmer_humor