Some of this is just because some of these frameworks and technologies have been around for a while and they iterate frequently. I see a ton of Azure content that is obsolete after only a few years.
I just go to the official docs, even if they're old, and then switch to the latest version once I'm on the website. Most of the software I use has an easy index for switching between versions.
I’ve started relying more on AI-powered tools like Perplexity for many of my search use cases for this very reason - all results basically warrant a pre-filtering to be useful.
Unfortunately the spam arms race has destroyed any chance of search going back to the good ole days. SEO and AI content farms mean we’ll need a whole new system to categorize webpages, as well as to filter out human-sounding but low-effort spam.
Point being, it’s no longer enough to find a page that’s relevant to the topic - it has to be relevant and actually deliver information, and currently the only feasible tech that can differentiate the two is LLMs.
It would be interesting tho to use an LLM to spot AI/SEO crap and add whole domains to a search blacklist. In that case we wouldn’t need AI to do the actual search, and this could easily just be a database maintained on the search engine’s side for end users (kinda like explicit content filters).
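A minimal sketch of what that server-side blacklist database could look like. Everything here is hypothetical: `looks_like_spam` is a placeholder heuristic standing in for a real LLM classifier call, and the domain names are made up.

```python
import sqlite3
from urllib.parse import urlparse

def looks_like_spam(page_text: str) -> bool:
    # Placeholder heuristic - imagine an LLM here judging whether the page
    # is "human-sounding but low-effort" and returning a spam verdict.
    return page_text.count("in conclusion") > 3

def init_blacklist(db: sqlite3.Connection) -> None:
    db.execute("CREATE TABLE IF NOT EXISTS blacklist (domain TEXT PRIMARY KEY)")

def flag_if_spam(db: sqlite3.Connection, url: str, page_text: str) -> bool:
    # If the classifier flags one page, blacklist the whole domain.
    if looks_like_spam(page_text):
        domain = urlparse(url).netloc
        db.execute("INSERT OR IGNORE INTO blacklist (domain) VALUES (?)", (domain,))
        return True
    return False

def is_blacklisted(db: sqlite3.Connection, url: str) -> bool:
    # The search engine consults this before showing a result to the user.
    domain = urlparse(url).netloc
    return db.execute("SELECT 1 FROM blacklist WHERE domain = ?",
                      (domain,)).fetchone() is not None

db = sqlite3.connect(":memory:")
init_blacklist(db)
flag_if_spam(db, "https://contentfarm.example/seo-page", "in conclusion " * 4)
print(is_blacklisted(db, "https://contentfarm.example/other-page"))  # True
```

The point of keying on the domain rather than the URL is that one flagged page poisons the whole site, which matches how content farms operate.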
I’d call that option “Bullspam filter” and leave it on “moderate” by default.
This is one solution to the issue, and it seems silly you are being downvoted for it.
Google became what it became, and years of SEO-optimisation cat-and-mouse play have reached new heights. Those efforts obviously target Google rather than its competitors, for now.
If only we could have perfect search results - it would be beneficial to Google as well.
redundancy, rolling updates, or Byzantine fault tolerance in a monolith > naïvely assuming that one part of your system going down won’t mess up its overall usability just because you’ve used microservices
Microservices alone aren’t enough. You need proper observability and automation to be able to gracefully handle the loss of some functionality. Microservice architecture isn’t a silver bullet, but one piece of the puzzle for reliable, highly available applications that can handle faults well.
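"Gracefully handle the loss of some functionality" usually means something like a circuit breaker: stop hammering a dead dependency and serve a degraded fallback instead. A minimal sketch (the service names and thresholds are made up for illustration; this applies equally to a module inside a monolith):

```python
import time

class CircuitBreaker:
    def __init__(self, max_failures=3, reset_after=30.0):
        self.max_failures = max_failures  # trips after this many consecutive failures
        self.reset_after = reset_after    # seconds before retrying a tripped circuit
        self.failures = 0
        self.opened_at = None

    def call(self, func, fallback):
        # While the circuit is open, skip the dependency entirely.
        if self.opened_at is not None:
            if time.monotonic() - self.opened_at < self.reset_after:
                return fallback()
            self.opened_at = None  # half-open: allow one trial call through
            self.failures = 0
        try:
            result = func()
            self.failures = 0      # success resets the failure count
            return result
        except Exception:
            self.failures += 1
            if self.failures >= self.max_failures:
                self.opened_at = time.monotonic()  # trip the circuit
            return fallback()

# Usage: a recommendations widget that degrades to a static list
# instead of taking the whole page down with it.
breaker = CircuitBreaker()

def flaky_recommendations():
    raise TimeoutError("recommendation service unreachable")

def fallback_recommendations():
    return ["editor's picks"]

print(breaker.call(flaky_recommendations, fallback_recommendations))  # ["editor's picks"]
```

This is exactly where the observability piece comes in: without metrics on trip/reset events, you never find out the circuit has been open for a week.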
the vim-visual-multi plugin tries to do this. It takes some time to get the hang of it, but even if you use only the simplest features, it’s way better than not having the option.
It is not machine learning and LLMs that piss off Lemmy users. It’s the application of said technologies. I don’t give a flying fuck what people are doing with ChatGPT; it’s novel. I want a generative AI that can help me code.
I don’t agree. Every time I have tried to point out that GitHub Copilot is a helpful tool, the trolls come out of the woodwork to tell me I’m wrong and a shitty developer.
I have luck with it daily. I wouldn’t want to go back to writing code without the little shortcuts it builds into my workflow every day. It certainly saves me time, and some of my teammates agree. There are some luddites on the team too, but I don’t let their fear bog me down. I think everyone who is strictly against Copilot has simply never tried it.