- cross-posted to:
- technology@lemmy.world
This is what I wondered about a few months ago when people were saying that ChatGPT was a ‘google killer’. So we just have ‘AI’ read websites and sum them up, vs. visiting websites? Why would anyone bother putting information on a website at that point?
We are barreling towards this issue. Stack Overflow, for example, has crashing viewer numbers. But an AI isn't going to help users navigate and figure out a new Python library, for example, without data to train on. I've already had AIs straight-up hallucinate functions in R that don't exist. It seems to happen primarily with the newer libraries, probably because there are fewer Stack Exchange posts about them.
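One cheap defense against this kind of hallucination is to check whether a suggested function actually exists before trusting it. Here is a minimal Python sketch (the `function_exists` helper is hypothetical, not from any library; the R equivalent would be `exists()`):

```python
import importlib

def function_exists(module_name: str, func_name: str) -> bool:
    """Return True if func_name is a real, callable attribute of the module."""
    try:
        module = importlib.import_module(module_name)
    except ImportError:
        # The module itself may be hallucinated.
        return False
    return callable(getattr(module, func_name, None))

print(function_exists("math", "sqrt"))        # True: a real function
print(function_exists("math", "quick_sqrt"))  # False: a plausible-sounding fake
```

It won't catch wrong signatures or wrong behavior, but it filters out names the AI invented outright.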
“How does it help creators? Without them there is no web.” After all, if a web browser sucked all the information out of web pages without users needing to actually visit them, why would anyone bother making websites in the first place?
This reminds me of when Mozilla was 0.9 and the web was just taking the baton from Gopher.
When Ben suggests there would be no web without monetization, he seems to forget WHEN HE WAS THERE before the sellout.
Definitely not the IT people keeping their AI servers running, that’s for sure.
Thank you to Arc for reminding me how much I enjoy browsing the internet and its many unique pages — these soulless generated results are the opposite of what I want.
More and more of the Internet is being AI-generated, so you’ll get to choose between a soulless summary and soulless SEO spam.