These conversations and debates keep circling back to one core concept of our civilization, one that is slowly becoming outdated because it is the main bottleneck in our development and in the development of technology.
Capitalism and the money system.
Human needs require all of us to make a bit of money in order to survive.
Human greed demands that we go beyond survival and become enormously wealthy, without regard for anything or anyone.
AI is quickly outpacing us, and the possibilities are now limitless; the only thing holding it back is our own collective greed. To AI, the internet and its communications are a place to exchange information, not a place to make money.
And to me the problem is the small group of individuals who want to maintain a system that generates all the wealth for them. The answer is simple: if wealth were distributed more equally, and everyone everywhere were happy and healthy with what they had and no longer had to worry about surviving, there would be no hand-wringing about advertising on the internet or about how to compensate people for their work.
We worry about the money system because 90% of humanity constantly has to fight for a piece of it, while the other 10% has complete control of it and never wants to let go.
This isn’t a problem of internet advertising and compensating creators … it’s just a symptom of wealth inequality and until we solve that problem, AI will just keep chipping away at civilisation beyond our collective control.
Will we also have to go to a time where we’ll have to buy physical newspapers so that journalists can make a living? Or do we expect them to also share information just for the sake of sharing information?
Part of the compromise was supposed to be that we get functional or entertainment value in return for some amount of ads, but enshittification broke that long ago with ever more intrusive ads and a sense of overwhelming entitlement by advertisers. The current web is useless and aggravating without adblocking, and only preys on the elderly and least technical. Yeah, it’s already broken
This seems like an excellent idea because it’s my app, as a tool, summarizing information for me. That seems a lot more legitimate than Google profiting from it.
The hard part is finding someone still doing good journalism that’s relevant. My local paper is long gone, and the nearby major city newspaper is a shadow of its former illustrious self. I do pay a news aggregator, but I have no idea how much of that makes it all the way back to the journalists.
That never left. We’re still buying our local newspaper, which covers a town of 60,000 people. It is way more relevant than any piece of news you might find on the web.
This is only a temporary “problem”. Eventually, ads will be incorporated into the story, and/or advertising companies will include clauses in their contracts. I imagine those clauses will DEMAND that websites include advertising in AI readers or not get paid for any ads they run.
Think enshittification. AI readers are only ad-free now in order to make them seem like an attractive option, and get people hooked on using them. I bet the numbers have already been calculated and decided on. Once AI readers are used by enough people, the ads will start.
Yup. Just like ads on cable TV, ads on streaming services, now ads in your AI. Even worse, the ads in AI may not even be labeled and just tweak your results slightly to favor certain products and the process hidden from the end user since hey, it’s so complicated even human programmers can’t figure out how to make the AI process transparent and verifiable.
Jeffrey solemnly took a swig from his deliciously cold Coca-Cola. “Damn,” he thought, smirking. “That tastes great. I should buy it more often.” He then drew his sword and charged the Viking shield wall, yelling “This is for the Cola!”
This is what I wondered about a few months ago when people were saying that ChatGPT was a ‘google killer’. So we just have ‘AI’ read websites and sum them up, vs. visiting websites? Why would anyone bother putting information on a website at that point?
We are barreling towards this issue. Stack Overflow, for example, has crashing viewer numbers. But an AI isn’t going to help users navigate and figure out a new python library for example, without data to train on. I’ve already had AIs straight up hallucinate functions in R that don’t exist. It seems to happen primarily in the newer libraries, probably the ones with fewer Stack Exchange posts about them.
AI isn’t going to help users navigate and figure out a new python library for example
Current AI will not. Future AI should be able to as long as there is accurate documentation. This is the natural direction for advancement. The only way it doesn’t happen is if we’ve truly hit the plateau already, and that seems very unlikely. GPT-4 is going to look like a cheap toy in a few years, most likely.
And if the AI researchers can’t crack that nut fast enough, then API developers will write more machine-friendly documentation and training functions. It could be as ubiquitous as unit testing.
Current AI can already "read" documentation that isn't part of its training set, actually. Bing Chat, for example, does websearches and bases its answers in part on the text of the pages it finds. I've got a local AI, GPT4All, that you can point at a directory full of documents and tell "include that in your context when answering questions." So we're already getting there.
Getting there, but I can say from experience that it’s mostly useless with the current offerings. I’ve tried using GPT4 and Claude2 to give me answers for less-popular command line tools and Python modules by pointing them to complete docs, and I was not able to get meaningful answers. :(
Perhaps you could automate a more exhaustive fine-tuning of an LLM based on such material. I have not tried that, and I am not well-versed in the process.
I'm thinking a potentially useful middle ground might be to have the AI digest the documentation into an easier-to-understand form first, and then have it query that digest for context later when you're asking it questions about stuff. GPT4All already does something a little similar in that it needs to build a search index for the data before it can make use of it.
That’s a good idea. I have not specifically tried loading the documentation into GPT4All’s LocalDocs index. I will give this a try when I have some time.
I've only been fiddling around with it for a few days, but it seems to me that the default settings weren't very good. By default it'll load four 256-character-long snippets into the AI's context from the search results, which is pretty hit and miss on being informative in my experience. I think I may finally have found a good use for those models with really large contexts: I can crank up the size and number of snippets it loads, and that seems to help. But it still doesn't give "global" understanding. For example, if I put a novel into LocalDocs and then ask the AI about general themes or large-scale "what's this character like" stuff, it still only has a few isolated bits of the novel to work from.
What I'm imagining is that the AI could sit on its own for a while loading up chunks of the source document and writing "notes" for its future self to read. That would let it accumulate information from across the whole corpus and cross-reference disparate stuff more easily.
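The "digest first, query the notes later" idea above can be sketched in a few lines. This is a toy illustration, not GPT4All's actual implementation: `summarize` here is a stand-in stub that just keeps each chunk's first sentence, where a real setup would call an LLM, and retrieval is naive keyword overlap rather than a proper search index.

```python
def chunk(text: str, size: int = 200) -> list[str]:
    """Split a long document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def summarize(chunk_text: str) -> str:
    """Placeholder for an LLM call; keeps the first sentence as the 'note'."""
    return chunk_text.split(".")[0].strip() + "."

def build_notes(corpus: str, size: int = 200) -> list[str]:
    """Pre-digest the whole corpus into short notes, one per chunk."""
    return [summarize(c) for c in chunk(corpus, size)]

def retrieve(notes: list[str], question: str, k: int = 2) -> list[str]:
    """Rank notes by word overlap with the question; return the top k."""
    q_words = set(question.lower().split())
    scored = sorted(notes,
                    key=lambda n: len(q_words & set(n.lower().split())),
                    reverse=True)
    return scored[:k]
```

Because the notes are written once, ahead of time, over the entire corpus, they can accumulate cross-referenced information that per-query snippet retrieval misses; the retrieved notes would then be pasted into the model's context before answering.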
Well, people who make content are already suffering from a collapse in ad prices. News sites are shutting down left and right. Not everything is about money, but they need revenue or external support to continue operating.
I see the advent of AI browsers much like ad blockers; the web has become increasingly user-hostile and users are pushing back. Advertising was never sustainable, and that has only become more apparent over the past decade. This is a long-overdue comeuppance. The cost of the advertising economy is extraordinary and cannot be measured in mere dollars.
I miss the internet from the 90s, when sites were information-dense and operated mostly as a public service by enthusiasts, usually for free. Of course, that was not sustainable as the Internet became more popular, because the cost of serving a thousand people was, like, couch-cushion money, but the cost of serving billions of people…well, I don’t have millions of couch cushions to plunder.
But also, the cost of web site operation today is artificially high, largely because of advertising and the incentives that an ad-driven market creates. What was once a few KB of text is now many MB of ads, scripts, layouts, and graphics, or even GB of videos, all for the sake of manipulating users into viewing more ads. Commercial sites do not compete on the quality of information; they compete over ad impressions. This was not borne out of need, but out of economic incentives that are misaligned with the needs of society, individuals, and, yes, even content producers.
This isn’t new, of course. I remember the same conversations back in the 90s and early 2000s. First with Sherlock, then later with Google.
People who make content for money are suffering from a collapse in ad prices. There are people who make content because they enjoy making and sharing content.
That’s not what we’re talking about… we’re talking about news. Real news, with investigative journalism, costs money. You need to pay for people to be on the ground, travel expenses, etc.
This thought that everything you consume online should be completely free is insane. If everything we consumed online was just someone’s hobby there’d be even more trash.
I have a long-running blog for fun, so you’re preaching to the choir. But some things can’t replace a dedicated journalist, particularly at local level, sitting in city council meetings, chasing leads, and interviewing people.
Thank you to Arc for reminding me how much I enjoy browsing the internet and its many unique pages — these soulless generated results are the opposite of what I want.
Considering how much of the web is AI-generated now (with it predicted to rise to 90% by the end of 2026), we’ve managed to turn a tool for connecting people into a tool for chatbots to talk to one another.
But it could drive even more sales. Just think of all those articles, “nine must-have kitchen tools on sale at Amazon RIGHT NOW”, followed by a list of specific product referrals (embedded in a story across many pages, slideshow style). Currently you can choose to block them, or at least not follow them, but imagine if every search result were a similar generated story, and the AI tools’ authors got caught up in the referral game.
“How does it help creators? Without them there is no web…” After all, if a web browser sucked all the information out of web pages without users needing to actually visit them, why would anyone bother making websites in the first place?
This reminds me of when Mozilla was 0.9 and the web was just taking the baton from Gopher.
When Ben suggests there would be no web without monetization, he seems to forget that HE WAS THERE before the sellout.