The graph is scary, but I think it's conflating two things:
1. Newbies asking badly written basic questions, barely allowed to stay, and answered by hungry users trying to farm points, never to be re-read again. This used to be the vast majority of SO questions by number.
2. Experienced users facing a novel problem, asking questions that will be the primary search result for years to come.
It's #1 that's being cannibalized by LLMs, and I think that's good for users. But #2 really has nowhere else to go; ChatGPT won't help you when all you have is a confusing error message caused by the confluence of three different bugs between your code, the platform, and an outdated dependency. And LLMs will need training data for the new tools and bugs that are coming out.
The newbies vastly outnumber the experienced people (in every discipline), have more to ask per capita, and are worse at asking it. Category 2 is much smaller. The volume of Stack Overflow was never going to be sustainable and was not reasonably reflective of its goals.
We are talking about a site that has accumulated more than three times as many questions as there are articles on Wikipedia. Even though the scope is "programming languages" as compared to "literally anything that is notable".
I’m going to argue the opposite. LLMs are fantastic at answering well posed questions. They are like chess machines evaluating a tonne of scenarios. But they aren’t that good at guessing what you actually have on your mind. So if you are a novice, you have to be very careful about framing your questions. Sometimes, it’s just easier to ask a human to point you in the right direction. But SO, despite being human, has always been awful to novices.
On the other hand, if you are experienced, it’s really not that difficult to get what you need from an LLM, and unlike on SO, you don’t need to worry about offending an overly sensitive user or a moderator. LLMs never get angry at you, they never complain about incorrect formatting or being too lax in your wording. They have infinite patience for you. This is why SO is destined to be reduced to a database of well structured questions and answers that are gradually going to become more and more irrelevant as time goes by.
Yes, LLMs are great at answering questions, but providing reasonable answers is another matter.
Can you really not think of anything that hasn't already been asked and isn't in any documentation anywhere? I can only assume you haven't been doing this very long. Fairly recently I was confronted with a Postgres problem; LLMs had no idea, it wasn't in the manual, it needed someone with years of experience. I took it to IRC and someone actually helped me figure it out.
Until "AI" gets to the point it has run software for years and gained experience, or it can figure out everything just by reading the source code of something like Postgres, it won't be useful for stuff that hasn't been asked before.
And that is exactly why so many people gripe about SO being "toxic". They didn't present a well posed question. They thought it was for private tutoring, or socializing like on reddit.
All I can say to these is: Ma'am, this is a Wendy's.
So here's an example of SO toxicity. I asked on Meta: "Am I allowed to delete my comments?" question body: "The guidelines say comments are ephemeral and can be deleted at any time, but I was banned for a month for deleting my comments. Is deleting comments allowed?"
For asking this question (after the month ban expired) I was banned from Meta for a year. Would you like to explain how that's not toxic?
Maybe if you haven't used the site since 2020 you vastly underestimated the degree to which it enshittified since then?
I think you overestimate #2 by a long shot. Most problems only appear novel because they're couched in a special field, framework, or terminology; otherwise they would represent years of incremental work. Some are truly novel, but those are more appropriately put in a recreational journal or BB.
The reason the "experts" hung around SO was to smooth over the little things. This created a somewhat virtuous cycle, but it required too much moderation and, as others have pointed out, was ultimately unsustainable even before the release of LLMs.
The first actually insightful comment under the OP. I agree with all of it.
If SO manages to stay online, it'll still be there for #2 people to present their problems. Don't underestimate the number of bored people still scouring the site for puzzles to solve.
SE Inc, the company, are trying all kinds of things to revitalize the site, in the service of ad revenue. They even introduced types of questions that are entirely exempt from moderation. Those posts feel literally like reddit or any other forum. Threaded discussions, no negative scores, ...
If SE Inc decides to call it quits and shut the place down and freeze it into a dataset, or sell it to some SEO company, that would be a loss.