92

The CEO [of SE] recently posted a blog post titled CEO Update: Paving the road forward with AI and community at the center.

Stack Exchange staff haven't posted a request-for-comments on it, so... I did here.

What are your thoughts on that blog post?

13
  • 75
    I don't have enough words to describe how unbelievably out of touch that particular blog post is.
    – 0Valt
    Commented May 31, 2023 at 16:48
  • 67
    @OlegValteriswithUkraine Oh, I have a few words. But they'd violate both the old and new CoC.
    – Mast
    Commented May 31, 2023 at 16:59
  • 29
    Nothing indicates community trust quite like plummeting traffic and engagement numbers.
    – Kevin B
    Commented May 31, 2023 at 17:04
  • 3
    @Mast oh, I do have a couple of those too. And they are against any imaginable CoC too...
    – 0Valt
    Commented May 31, 2023 at 17:20
  • 43
    …but… but… we've got new voting arrows.
    – Tetsujin
    Commented May 31, 2023 at 17:30
  • 8
    Well, the reason for not asking for feedback this time is obvious, and I find it fair. They are well aware of how it would be received, and afraid of it, so they prefer to just throw the bomb and run for shelter. Commented May 31, 2023 at 17:41
  • 6
    "What are your thoughts..." -- unpublishable (if I don't want to get banned).
    – Dan Mašek
    Commented May 31, 2023 at 17:42
  • 4
    @ShadowTheSpringWizard I don't find it fair at all. If you're going to publish something that offends a bunch of people, they're going to have some viewpoints on that, which is why I posted a discussion post even though SE didn't. Especially after the recent rule change, I wasn't very inclined to let them sweep the blog post under the rug.
    – cocomac
    Commented May 31, 2023 at 17:46
  • 23
    I think it's just the usual business speak. Almost empty phrases with low-density information in between. One could probably summarize it as: we fired people and we want to make money with AI. There's not much more to it, really. If only blog posts weren't so long. Commented May 31, 2023 at 18:05
  • 44
    Whenever I see 'CEO' in the question title nowadays I flinch. This guy seems to have no clue as to what we as a community want, nor any care for it.
    – CDR
    Commented May 31, 2023 at 18:09
  • In the future, please never put the word "AI" before "community" in a sentence. Also consider capitalizing Community. Communication is important.
    – 0-One-0
    Commented Jun 4, 2023 at 22:22
  • 1
    @starball You could have linked to the answer in the bounty description. Commented Jun 6, 2023 at 13:05
  • 3
    I read ‘CEO’, I immediately assume it’s BS, and voilà! It is a load of BS. Empty words, sophistry, nothing meaningful. Typical business speak. Next please. Commented Jul 4, 2023 at 21:54

10 Answers

214

We have the domain focus and community trust to responsibly deploy generative AI

Citation needed

10
  • 47
    Is this gaslighting? This feels like gaslighting. Commented May 31, 2023 at 17:23
  • 35
    @ThomasMarkov either that, or that the CEO doesn't have any clue about what's happening right now. Or, which is more likely, doesn't give a flying <insert expletive> about it.
    – 0Valt
    Commented May 31, 2023 at 17:24
  • 44
    We could try to get generative AI to stop hallucinating facts. Or we could just get CEOs to hallucinate facts as well, to level the playing field. The easiest way to get an AI with human-level performance is obviously to decrease human-level performance, and Stack Exchange seems to get that. Good to know they're clearly on top of it.
    – Erik A
    Commented May 31, 2023 at 17:56
  • The CEO will have had nothing to do with that post. He probably has no idea what's happening, as you say, but he also didn't actually write that post.
    – terdon
    Commented May 31, 2023 at 20:11
  • 30
    I’d say it is reasonable to assume that an AI generated CEO would not be worse…
    – Jon Custer
    Commented Jun 1, 2023 at 2:14
  • 5
    @terdon The irony of you replying to an answer consisting of nothing other than "Citation needed" with a comment that contains no citation and is desperately in need of one is not lost on me... On what basis can you make the claim that he definitely didn't write the post and had nothing to do with it?
    – Cody Gray
    Commented Jun 1, 2023 at 7:22
  • 2
    @CodyGray Oh, I don't claim to know for a fact, it just seems vanishingly unlikely. This kind of thing is usually written by a marketing/communication department. I'm sure the CEO endorses it and agrees with it, inasmuch as there's anything concrete there to agree with, I just really doubt he would have personally sat down and written it.
    – terdon
    Commented Jun 1, 2023 at 8:43
  • @terdon Then what do you think he does spend his time at? Partying? Commented Jun 1, 2023 at 13:59
  • 3
    @Andreasdetestscensorship I would be quite disappointed if I heard my CEO was wasting time writing blog posts, of all things. I assume he spends his time doing actual work and not creating vacuous marketing documents, that's what marketing departments are for. The company I work for regularly publishes blog posts "by the CTO" but I don't think she's ever written even one of them. Everything in the post has her approval, of course, but she doesn't waste her time actually composing the post!
    – terdon
    Commented Jun 1, 2023 at 16:38
  • 5
    It's completely irrelevant whether he wrote it himself or not - he signed it, and that means that he's responsible for everything in it.
    – Gloweye
    Commented Jun 20, 2023 at 6:54
118

I've probably put hundreds of hours of unpaid, and admittedly often unsolicited, work into this place.

We continue to evolve as an organization, focusing on a path to profitability in addition to navigating a dynamic external market

Whatever that means. The current CEO has been on Stack Overflow since 2019. I've been here over 13 years. In that time, we've heard the same thing told in many ways: that the company needs to cut back for the good of everyone. Repeatedly. It does not engender trust.

To continue growing, every company needs to do more with the resources it has, and every employee (not just those in Sales) must understand that their actions impact revenue.

Revenue isn't the only real way to see the value of the company. Corporate social responsibility is just as important - we had to lobby for things like hiring into key community roles, and even annual donations to charities. Stack Overflow is what it is because the community put hundreds of hours into answering questions, curating content, and more. While we do not make you money directly (well, outside advertising), the entire value of the Stack Overflow brand, and the willingness of hundreds - maybe thousands - of subject matter experts to recommend and work with your product, depends on goodwill.

Quite often, cutting back can even result in revenue and value being lost. Where do you see Stack Exchange in 5, 10 or 20 years?

We have the domain focus and community trust

Stack Exchange hasn't really had domain focus on Q&A in years. It slipped into being a heavily marketing-driven company, rather than the lean, innovative one that had massive amounts of trust. We had a partnership, and now we have "We have changed the terms of our agreement, pray we do not change it further."

We asked for advice, and were told you trusted us to make our own decisions. We did, working with our individual communities to find the solutions we felt worked best. You then decided we were doing it all wrong.

There are many instances like this where the community has expressed its views on something, only to be ignored or dismissed.

I guess the question is how badly you can poison the goodwill of the community, while claiming that we trust you, before things become irreparable.

For the community to grow, we need the tools and support to build solid cores and to feel that we're welcome in the place we built. And every employee, whether in marketing or working directly with the community - and especially the CEO - needs to realise their actions impact the community. One that's been often hurt, dismissed and pushed aside.

Sustainability needs long-term planning, not chasing the next shiny thing. I've seen communities I've long been part of - sites like Server Fault and Information Security - lose a lot of regular users. My own community spaces - like Super User and Pets - are a lot quieter than they used to be. The current goals seem to focus on the next quarter, while I've literally been trying to get things moving, or 'fixed' in certain respects, for years.

I used to argue that Careers was an unsustainable venture, because it was merely a 'better' product in a hugely competitive space. Everyone is doing generative AI, and... you're competing with Google and Meta - both massively cash-rich companies with significantly better resources.

We've shed 10% of staff and their associated knowledge, and I've heard rumors some engineering teams got hit somewhat worse than that - which seems bizarre. I could be wrong though. And unlike Meta or Twitter, SE didn't make any hugely bad bets, did it?

Stack Overflow, and Stack Overflow for Teams, in particular, is well-suited to the industry-wide shift from “growth-at-all-costs” to profitable, sustainable expansion I mention above.

I'd ask instead - does SE have any intention to ensure the community and network are healthy over the next decade or so, and to help preserve the community you keep bringing up as its strength?

Looking forward to growth and the impact of community on AI

I'm not impressed with the impact of AI on the community. We're hurting, and we're literally in the biggest crisis in trust since Monica.

The community is often our biggest champion in the enterprise; its members want to use a tool they know and trust to manage their proprietary information and collaborate with peers who are likely familiar with Stack Overflow as well. Community is our competitive advantage and a key reason we remain insulated from the worst of the business cycle’s ups-and-downs.

Do you really hear us? Folks are angry. And we really don't appreciate being held up as a strength in one breath while being treated like we don't matter in the next. The company has not been a great steward of the communities for many years - and has drifted away from actually understanding what the community wants or even needs.

I was asked what I did as a moderator at a job interview a little while back. I froze. Honestly - I couldn't recommend Stack Overflow for Teams or any other product because of how the company has treated the community over the years. I love the platform, and the communities I work with. I don't love the years of being treated terribly while being told we matter.

The people we work with are wonderful. The community team clearly cares, even if we do disagree with some of the recent rule changes, and they're busy (which makes the downsizing of their team, rather than getting them the resources to do their jobs, infuriating). We have a few developers who are awesome.

The company as a whole, though, hasn't really earned back the trust it has lost over the years. We can't be your champions unless you're willing to be ours - and this is something that is in your interest, and your responsibility, to do.


Now that we've seen what Stack Overflow Inc's idea of GenAI is...

I'm underwhelmed. We've lost 10% of staff, and put 10% of what is left into what turned out to be, at least initially, a deeply flawed and barely hidden wrapper around ChatGPT or a similar LLM platform. As it stands, it's unfit for purpose.

In order to do this, the company has succeeded in antagonising a significant part of the active moderation community to the point that they are on strike. I've heard many moderators say they're considering quitting because of burnout.

We might have very different definitions of 'putting the community at the center', but as an active community member, and someone who's welcome, trusted and actually there, I can tell you the community is hurting.

I realise that there's a lot of 'sunk cost' but this is a very good time, maybe the best time other than before you started, to re-evaluate the current course of action.

6
  • 5
    « The community team clearly cares». I don’t know… I’m starting to doubt it. They’re quiet, and haven’t talked a whole lot to the community as a whole lately. Obviously, they’re tied by the company, but some of their posts do not bring trust. Their latest actions in the «don’t suspend for AI» thread do not foster trust. I have talked to one or two CMs in chat, and while I understand some criticisms against them, the ones I’ve talked to generally seem like kind people. But they’re absent at this moment. They’re not supporting us (not visibly, at least). I’m not saying they can, though. Commented Jun 1, 2023 at 14:14
    But at the same time, maybe they are wise to stay out of the large number of Q&As that have spawned over the last few days, and to leave them for the community to own. Commented Jun 1, 2023 at 14:15
  • «this is something that is in your interests and responsibility to do.» They’re not going to act on this, because they don’t seem to believe it. It’s hard to grasp that they can be so oblivious, but they clearly ignore us, our needs, and our support. So they need to be convinced, which, seriously, is their own responsibility. But how are we supposed to help them understand, when they don’t even read any of the stuff we write? We’re sending them lots of letters which get instantly dumped in the fireplace. At least it keeps them warm. But it’s a fire. Commented Jun 1, 2023 at 14:19
  • 6
    I'm guessing "sustainable growth" is really code speak for "we can't figure out how to make this profitable, let alone grow the business, but we can hide that behind this 'sustainability' label". The bottom line is brutal; if they can't earn enough money to keep the shop running, that's the end. Even if the site stopped receiving new questions and answers today, they still have a vast amount of contributed content which continues to provide value to many visitors. But they clearly also cannot figure out how to cope with new content and the manual curation that it still requires.
    – tripleee
    Commented Jun 2, 2023 at 6:53
  • 11
    "I couldn't recommend Stack Overflow for Teams or any other product because of how the company has treated the community over the years." <= That should be the main problem for the company, if people invested won't recommend it internally, who will ?
    – Tensibai
    Commented Jun 5, 2023 at 10:31
  • 1
    Very well said.
    – Brad
    Commented Jun 7, 2023 at 3:40
79

I also posted this as a comment on the blog, pending moderation:


Hi Prashanth, you have not mentioned the recent policy change also reported by your VP of Community and broadly criticized by the Stack Exchange/Stack Overflow community: What is the network policy regarding AI Generated content?

This policy change to allow generative AI answers on SE/SO and forbid community moderators from removing this content means that SE/SO will no longer be a useful source of data to train AI. Rather than a place for humans to ask questions to be answered by other humans, SE/SO will now be a place for people to re-post GenAI responses. This adds no value to SE/SO, as askers can produce GenAI answers directly, without SE/SO as a middleman.

Why is your company abandoning the thing that makes SE/SO a higher-quality source for Q&A than these hallucinating GenAI products?


I'll note that this is the same sort of thing I was concerned about in my answer to the previous blog by the CEO: New blog post from our CEO Prashanth: Community is the future of AI

20
  • 20
    It's definitely kinder than the one I submitted ("You’re a liar. You do not have our (the community’s) trust!"). I do admire your steadfast determination to continue explaining and presenting things nicely, but I'm surprised you haven't just given up. They are clearly not listening, and have no idea what they're doing. I feel abused, and that makes me angry, and unwilling to participate in a respectful discourse with them; I fail to see them doing it, so why do it myself? And it hasn't worked so far, so what will work? Commented May 31, 2023 at 17:08
  • 14
    @Andreasdetestscensorship attacking and calling them names has also never worked, why would you continue doing that? If anything does have a chance of getting through, it would be a civil, constructive message like Bryan's. If we all go and give them a piece of our mind, we will quite rightly be ignored as trolls. Just because they are being deceitful and insulting doesn't mean we should be as well; after all, we're claiming the moral high ground here and behaving badly makes us lose it.
    – terdon
    Commented May 31, 2023 at 20:15
  • 3
    "forbid community moderators from removing this content" note that moderators also suspended the users that ran afoul of the policy. It is a heavy handed approach for stuff that we haven't banned users before.
    – Braiam
    Commented Jun 1, 2023 at 20:10
  • 5
    @Braiam Moderators have asked for site changes to make more clear to users that this content is not allowed. Suspensions on SE are always temporary and have no impact on what the user can do once the suspension period ends. Commented Jun 1, 2023 at 20:18
  • 1
    @BryanKrause have you ever been suspended or otherwise disallowed from participating on a site on your first interaction? You say it as if the system is perfect, but it only looks that way if you are not on the wrong end of it.
    – Braiam
    Commented Jun 2, 2023 at 17:44
  • 5
    @Braiam I generally spend enough time on a site before I feel comfortable participating in it to know what the expected behaviors are. Commented Jun 2, 2023 at 18:09
  • 2
    @BryanKrause and back to the point I made, people's first interaction is not "spending enough time" to feel comfortable. It's their first interaction. They will do what seems to be the lowest bar of participation. There's nothing in the immediate guidelines that would tell you about this. We need to understand that perception is king for a new user. If that perception doesn't align with your values, that's a you problem; you need to fix what others perceive of you. It's irrational to do the opposite and try to "fix" your new users.
    – Braiam
    Commented Jun 5, 2023 at 10:45
  • 4
    @terdon Strong language isn't just swearing for swearing's sake, it is often used to convey strong emotion in a way that "civil" language cannot. This is why strong words should be considered more carefully than "civil" ones, not arbitrarily disregarded because of some puritan morality that decrees that everyone has to pretend to be nice to each other when so much is at stake. If you're willing to disregard strong language for the simple fact that it's not "civil", you're part of the problem.
    – Ian Kemp
    Commented Jun 5, 2023 at 11:22
  • 5
    @Braiam Moderators have asked several times for improved guidance on this specific issue for new users, like a site banner or notice on the ask a question page explicitly stating the AI policy. Commented Jun 5, 2023 at 12:53
  • 5
    @Braiam as an aside, I could never understand people who come to a new site/chat/etc and immediately start posting content. I always "lurk" for a while to learn community norms. If you don't, I think that's on you.
    – Esther
    Commented Jun 5, 2023 at 18:54
  • 2
    @Esther "immediately start posting content" that's because you are not one of those people. That's fine. Sadly, not every human is the same. I'm one who does not lurk at all, but I do read the documentation presented. If your documentation sucks and it leads me to a bad experience, that's entirely on you. I have complained very loudly that we do not set proper expectations for new users.
    – Braiam
    Commented Jun 7, 2023 at 15:10
    @BryanKrauseisonstrike ok, how about moderators do the same: give an example of the problem on their sites. So far only physics.se has done something of the sort, and it seems like a nothingburger that hardly needs a new rule, since quality standards and a requirement for references to meet those standards would be enough to deal with these posts.
    – Braiam
    Commented Jun 7, 2023 at 15:12
  • 2
    @Braiam "the documentation" consists of the entire help center. Even just the main help page has a notice about AI-generated content being banned.
    – Esther
    Commented Jun 7, 2023 at 15:36
  • @Esther are you new here? Do you really expect the average user to read all of the help center? For realsies? Frfr, no cap?
    – Braiam
    Commented Jun 15, 2023 at 19:23
    @Braiam whether or not they do, that doesn't mean we can't treat them as if they had ;)
    – Kevin B
    Commented Jun 15, 2023 at 19:25
56

We continue to evolve as an organization, focusing on a path to profitability in addition to navigating a dynamic external market. Part of this evolution led us to make the difficult decision to reduce our headcount by 10% last month. Through these changes, I remain grateful for our community, which is the basis of everything we do, and for the many Stackers who demonstrate great resilience and live our mission and values each day.

You do indeed seem to be making short-sighted leaps to the nearest money bag. You are clearly not grateful for your community, as you continue to dump deeply unwanted stuff on us.

To continue growing, every company needs to do more with the resources it has, and every employee (not just those in Sales) must understand that their actions impact revenue.

In all seriousness, this reads like a complaint aimed at the workforce you just fired.

This is especially important for developers. Often, a new feature or product is the difference between closing a new customer/growing an existing one and seeing deals slip to the next quarter or fall out of the pipeline entirely. In fact, research from McKinsey shows that companies who innovate through crises not only outperform the market by 10% during the crisis but also realize a 30% average long-term gain.

Stack Overflow is not just another company that sells products to people. This is not how you run a community-driven platform. Management like this is the downfall of a platform like SO.

When I think about Stack Overflow’s future, what makes me most excited is how we are innovating, and that’s largely based around the work that we’re doing to incorporate GenAI into our products.

You are not innovating, and you have not been doing so, for a while. You take a piece of garbage (AI), relabel it, and push it onto us.

those companies who embrace opportunities to leverage generative AI and automation in their everyday work flows in intentional ways that assist the productivity of their workers will be the most successful in this next phase of the modern workplace.

No, not really. It goes to show that you lack self-control amid the hype of a useless new technology. I love the future of AI; we're not there yet, and your usage of it is... well, not successful.

Our vision for community and AI coming together means the rise of GenAI is a big opportunity for Stack.

What is your vision, again? To replace us with AI? Use us as free labour for something that we get nothing in return for? But yes, you're correct: this is a big opportunity for Stack Overflow to get pulled to the bottom.

Approximately 10% of our company is working on features and applications leveraging GenAI that have the potential to increase engagement within our public community and add value to customers of our SaaS product, Stack Overflow for Teams.

That's 10% of your company's workforce wasting their time. What it does have the potential to do is decrease engagement within the public community. Your recent new policy allowing AI content is making curators furious.

We believe that the developer community can play a crucial role in how AI accelerates, ultimately helping with the quality coming out of GenAI offerings—and in that, further improving the modern workplace as we know it.

Do not abuse us; we're here for the community, the quality, and the repository of knowledge; we're not here to accelerate your venture into selling AI.

Stack Overflow for Teams is uniquely positioned for this moment. But beyond this clear value prop, what really sets us apart is the strength of the community. The community is often our biggest champion in the enterprise; its members want to use a tool they know and trust to manage their proprietary information and collaborate with peers who are likely familiar with Stack Overflow as well. Community is our competitive advantage and a key reason we remain insulated from the worst of the business cycle’s ups-and-downs.

Then stop insulting us, lying to us, sabotaging us, and generally treating us like shit.

We have the domain focus and community trust to responsibly deploy generative AI to improve our existing suite of products and lead to new solutions and revenue streams.

This is a lie. You do not have the trust of the community.

Second, question reviewers are able to better understand the content of the question, making it easier to suggest edits or improve the post.

Humans have more sophisticated intelligence than your machine learning tools. We are more capable of understanding a question's content through a badly written title than through a title generated by an AI that looked at the content of the question.

Finally, end users of Stack Overflow can more easily understand if the question is relevant to their needs.

What is an "end user"?

This tool is one of many that we are launching in coming weeks.

Madness. Sadness. Horrific.

Our community has given us feedback through the evolution of this tool, and their feedback is critical to how it scales

Then listen to our feedback. Stop closing your ears and eyes!

As the AI landscape continues to evolve, the need for communities that can nurture, inform, and challenge these technologies becomes paramount.

We're not here for this! Stop hammering us with stuff we don't want. It's abuse.

These platforms will not only offer the necessary guidance to refine AI algorithms and models but also serve as a space for healthy debate and exchange of ideas, fostering the spirit of innovation and pushing the boundaries of what AI can accomplish.

So in essence, you want to use us as free labour to develop a product you'll make lots of money from, while we get nothing in return; only burnout.

We want to be able to continue to invest back into our community, and that’s why we’re also exploring what data monetization looks like in the future. LLMs are trained off Stack Overflow data, which our massive community has contributed to for nearly 15 years. We should be compensated for that data so we can continue to invest back in our community. This also boils down to proper attribution. Our data represents significant effort and hard work stored across our site and LLMs are taking those insights and not attributing to the source of that knowledge. This is about protecting the interest in the content that our users have created as much as it is the fight for data quality and AI advancement.

I agree, but I don't believe you anymore.

Then far down, outside the scope of AI:

The trust of our community is as essential as the trust in the data it produces

So stop abusing that trust. Stop shredding it. Work with us, not against us.



Message to other community members:

I do admire the steadfast determination of our community members to continue explaining and presenting things nicely, but I'm surprised many haven't just given up. SE is clearly not listening, and has no idea what it's doing. I feel abused, and that makes me angry, and unwilling to participate in a respectful discourse with them; I fail to see them doing it. It does not seem to have worked so far; we only get overrun with more and more community-hostile content. So excuse my harsh and angry response.

9
  • 9
    "Then stop insulting us, lying to us, sabotaging us, and generally treating us like shit." -- spot on, although I'm afraid that request is about as likely to be fulfilled as if you asked them to stop breathing. Sadly, for some people, that's just the way of life.
    – Dan Mašek
    Commented May 31, 2023 at 18:35
  • 4
    "So in essence, you want to use us as free labour to develop a product you'll make lots of money off from, one of which we get nothing in return from; only burnouts." - the network sites making profit for them is not new. And it's not fully true to say that everyone here gets "nothing" from this and only burnouts- at least- I am not confident that that is a fact (see also meta.stackexchange.com/q/3742/997587). I'm not saying people aren't getting burned out, but not everyone is.
    – starball
    Commented May 31, 2023 at 19:03
  • 5
    In my view, you tend to exaggerate quite heavily in your writings and use charged language. I'm not confident that that's the most effective way to get SE Inc. to listen. But to each their own.
    – starball
    Commented May 31, 2023 at 19:05
  • 1
    @starball I'm not saying that everybody here gets nothing from the work we currently put into the site. Obviously, we do curation because of our affection for the site, and in return, we get a cleaner site that we find highly useful. Without curation, the site wouldn't serve its purpose, and the intended goals (of the community) could not be fulfilled. My point is that it's not fair to use us as free labour to train their models, which they will then sell for other purposes. The ML does not benefit the community, and only a benefit to the community would respect and acknowledge our free labour. Commented May 31, 2023 at 22:13
  • 2
    The fact is, living in a symbiotic relationship in which both sides get a mutual benefit, is the state we must aim for. What SE is doing now, and has been for a while, is turning themselves into a parasite, feeding off of us. Commented May 31, 2023 at 22:14
  • 1
    pasting a comment I just made elsewhere: "When you made an account, you implicitly agreed to the ToS, which also gives SE the right to commercially exploit your content. Not that that brings me great joy, but it's a fact, and you need to accept it." (Of course, I'd love it if the money went back to the community).
    – starball
    Commented May 31, 2023 at 22:15
  • 1
    @starball That comment misses the point of my criticism. I am fully aware that they can commercially exploit our contributions, and I never said otherwise. But I'm not willing to do it for free. If the community gets nothing in return, well, other than burdens and disrespectful treatment, there is no point for us to continue. Just because they wrote that in the ToS, doesn't mean they are morally in the clear. Commented May 31, 2023 at 22:19
  • 8
    I have a feeling we're all about to become "end users", in the sense that this is where we end our usership of Stack Overflow and the rest of the platform.
    – Cody Gray
    Commented Jun 1, 2023 at 7:24
  • 1
    @CodyGray Well, yes, I was considering adding pure speculation about that to the post, as well as further speculation about the complaint aimed at the fired staff. I should perhaps also note that my answer obviously isn’t based solely on the blog post, but on everything that’s been going on lately. Commented Jun 1, 2023 at 7:45
38

This is the second blog post in which the CEO talks about AI and Community.

It is beyond obvious that AI is used as a buzzword, because AI is a hot topic at the moment. Combined with other recent actions, like What is the network policy regarding AI Generated content?, which will have a negative impact on the community and therefore on the sites in the SE network, I am pretty sure that Community is also used by the CEO as a buzzword and nothing more.

The CEO does not show any intention to interact with the Community, not even in the most superficial way (the least he could do would be to post the major announcements here himself); he definitely does not listen to the Community, and he is not even close to having a minimal understanding of it. Combining all that, he is definitely not part of the Community, nor does he have the Community's trust.

Pretty soon the CEO will find out that being the CEO of a company that is built on community contributions, while ignoring that same community, does not work well. The Community is not just a buzzword he can use as he pleases, and the Community has the means to strike back.

13
  • 2
    While it's futile indeed, I do support the Resistance. Where can we join? ;) Commented Jun 1, 2023 at 10:20
  • @ShadowTheSpringWizard Mithical left the link in a comment under Stack Exchange is failing its community Commented Jun 1, 2023 at 10:54
  • 4
    argh... not Discord... Well, I support it morally then. Commented Jun 1, 2023 at 11:27
  • "...nor he has the Community's trust." So you mean either he somehow misinterpreted the amount of trust he has from the community or he isn't completely honest in this blog post? Commented Jun 1, 2023 at 11:44
  • @Trilarion Does it matter which one it is? Or is it a mixture of both, most likely... Commented Jun 1, 2023 at 11:59
  • 3
    @ShadowTheSpringWizard No worries, you can join when (if) action is announced in public. Commented Jun 1, 2023 at 12:00
  • 1
    @ResistanceIsFutile Theoretically yes. According to Hanlon's razor I should simply assume he doesn't know enough about the community or it's a misunderstanding of some sort. But then somehow this is almost beyond my imagination. You cannot be CEO for 3 years without knowing what's going on, I think. Commented Jun 1, 2023 at 12:06
  • @Trilarion He was around for the Monica incident, so I doubt that he is merely misunderstanding, but I prefer giving people the benefit of the doubt, even when there is very little evidence that this is just an innocent mistake or misunderstanding. Commented Jun 1, 2023 at 12:13
  • 1
    Discord is really not a replacement for the community’s existing platforms: the Meta sites. I have mentioned this before. I fully support striking back at them, and will join in on a strike when it eventually hits, even if planned on Discord, but you need to involve the full community in this, at some point, before the strike begins, and that must be done on the Meta sites. Commented Jun 1, 2023 at 13:51
  • 1
    @Trilarion another option that is less discussed is that he knows very well what's going on, but has his hands tied by the "Board", the people with the money, who force him to go in a very specific direction and forbid him from contacting the "lesser people", i.e. us. It was mentioned vaguely during the Monica case by several staff members. (The fact that such a board exists, and its huge impact on management.) Commented Jun 1, 2023 at 14:24
  • @Andreasdetestscensorship Discord is not meant as replacement, but as additional channel of communication which also comes handy in situations like the current one. Commented Jun 1, 2023 at 14:39
  • 1
    @ShadowTheSpringWizard In that case, it’s sad. It’s a possibility I have considered more than once, but he remains the CEO by choice. Commented Jun 1, 2023 at 14:45
  • @Andreas well, a classic case of "Maybe I can change their mind at some point". Just that such a point never arrives. Commented Jun 1, 2023 at 16:27
26

But most significantly, we accelerated our AI efforts internally and look forward to sharing more this summer.

I'm just confused. The community is trying to ban AI, so why is the org pushing for AI?


We have the domain focus and community trust to responsibly deploy generative AI to improve our existing suite of products and lead to new solutions and revenue streams.

This feels like Animal Farm, where the pigs (the ruling class) are surreptitiously changing the rules and brainwashing the citizens. Plus, the community wants the opposite, so that is ________*


Stack Overflow for Teams, our enterprise, private version of Stack Overflow

what really sets us apart is the strength of the community.

Why get Stack Overflow for Teams if it doesn't have the same strong community as regular SO?


I have no idea what the company named Stack Overflow is trying to do.


*I decided to deter red flags by leaving that part out.
5
  • SO for Teams is for companies, for example as an internal knowledge base for questions specific to that company, or as an internal tool
    – cocomac
    Commented May 31, 2023 at 16:53
  • 2
    “I'm just confused. One moment they ban AI and the next moment they welcome it as a holy grail.” — Stack Overflow Inc (the company) never banned AI. We the users did on various sites. When it was banned on Stack Overflow (the site), that was by the volunteer moderators who are users like you and me. We requested SO Inc (the company) ban it network-wide. SO Inc declined to do so then and is welcoming it now. SO Inc's position has been consistent, but also consistently opposite the community. Commented May 31, 2023 at 18:29
  • 14
    That's actually not entirely true, @doppelgreener. SO Inc. staff members strongly encouraged Stack Overflow volunteer community moderators to ban the use of AI generators to post content on SO. This was not a decision that the SO moderators or community reached in isolation, without support from staff. So, yes, this is a complete reversal of the company's position, which was suddenly announced a few days ago without any warning whatsoever.
    – Cody Gray
    Commented Jun 1, 2023 at 7:27
  • @CodyGray Thank you very much for the correction! TIL Commented Jun 1, 2023 at 10:19
  • The linked-to video has an example of an account of using Stack Overflow for Teams (at NI). (For 3 years?) Commented Jun 1, 2023 at 15:54
25

AFAIK, this blog post wasn't also posted to Meta, so I'm starting a question for my feedback.


In the last quarter of our fiscal year [...] that meant announcing the availability of Stack Overflow for Teams in the Microsoft Azure marketplace while launching Topic Collectives and Staging Ground Beta 2 on our public platform. But most significantly, we accelerated our AI efforts internally and look forward to sharing more this summer. I’m excited to see how we leverage our domain focus and special community-driven blend of trust, recognition, and accuracy to GenAI.

Ehh... I like the concept of the SG. I really do, and I think it has a lot of potential. I'm not convinced that "our AI efforts" are really the most significant thing you've done that benefits the community. Possibly, and I think dupe finding has potential, but otherwise, I'm not convinced.

Our vision for community and AI coming together means the rise of GenAI is a big opportunity for Stack. Approximately 10% of our company is working on features [...] leveraging GenAI that have the potential to increase engagement within our public community and add value to customers of our SaaS product, Stack Overflow for Teams.

I don't like that. I'm not convinced it increases engagement (thanks for not sharing any data earlier, by the way). Also... please consult the community (not just mods, but MSE) before you do big AI things, please. I'm aware that something is coming in the summer, but it would be nice to know what it is.

We believe GenAI can be a similar competitive advantage for Stack Overflow. We have the domain focus and community trust to responsibly deploy generative AI to improve our existing suite of products and lead to new solutions and revenue streams

I really don't like that statement. Especially "We have the [...] community trust". I, and I'm guessing many others, don't really believe that. Somewhat harsh, but I think it's true. Sorry.

We want to be able to continue to invest back into our community, and that’s why we’re also exploring what data monetization looks like in the future. LLMs are trained off Stack Overflow data, which our massive community has contributed to for nearly 15 years. We should be compensated for that data so we can continue to invest back in our community. This also boils down to proper attribution. Our data represents significant effort and hard work stored across our site and LLMs are taking those insights and not attributing to the source of that knowledge. This is about protecting the interest in the content that our users have created as much as it is the fight for data quality and AI advancement.

Please define how you're going to "invest back into our community", specifically. Otherwise, it reads like you want to be compensated for our content.

2
  • "features [...] leveraging GenAI that have the potential to increase engagement within our public community". Well, it might, but not with human participants or anyone who has any idea what's really happening here.
    – Laurel
    Commented May 31, 2023 at 18:24
  • Consider removing your opening statement, or move it to the question. It serves no purpose here. Commented Jun 1, 2023 at 7:46
22

This paragraph surprised me given the development of the title suggestion feature, which I strongly suspect is built off a third party LLM:

We want to be able to continue to invest back into our community, and that’s why we’re also exploring what data monetization looks like in the future. LLMs are trained off Stack Overflow data, which our massive community has contributed to for nearly 15 years. We should be compensated for that data so we can continue to invest back in our community. This also boils down to proper attribution. Our data represents significant effort and hard work stored across our site and LLMs are taking those insights and not attributing to the source of that knowledge. This is about protecting the interest in the content that our users have created as much as it is the fight for data quality and AI advancement.

So if SE views the training of LLMs on SE user content (which is relatively openly licensed under CC BY-SA) as legally/ethically suspect due to violating the attribution clause, why is SE moving forward on utilizing LLMs trained on massive, opaque datasets? Surely every copyright owner whose work was fed into the LLM has at least as much cause to "explore data monetization" (a euphemism for "recoup the stolen value of their work"), even more so if their works weren't openly licensed. This would destroy the economic viability of LLMs trained in this manner.

So which is it - is SE hoping that the LLM business model is disrupted by training data monetization? Is it hoping that the status quo is maintained so that SE can use those LLMs for feature development? Is it cynically fencesitting until it becomes more clear which way the legal landscape will move? Or is it hypocritically hoping to have its cake and eat it too, to use its corporate resources to cash in on LLM training data, while smaller rightsholders are unable to fight for their compensation?

2
  • I think it's the hope to get a slice from the big cake (one way or another). Commented Jun 1, 2023 at 3:38
  • 5
    It's only bad when they don't get their share of the profits! It's also interesting how they speak of "our data" and "investing back in our community" and "protecting the interest in the content". You sure can guess who will reap the profits once they start monetizing their valuable content. I mean, it's fine that we don't get compensated for the content we contribute; we never were. But don't stand there and tell us how badly the poor contributors are treated by the evil AI companies when that's all you want to do yourself, and claim that it furthers our interest. Commented Jun 1, 2023 at 9:31
17

After reading that, it sure feels like the Teams communities trump this one, despite the fact that this is the community that forms the backbone of curation across the network and effectively allows it to function.

I don't want to be entitled or arrogant; it's their platform... but the community-focused model of Stack Exchange sours greatly when the company no longer sees the curating community as valuable for anything beyond consulting work for Staging Ground, which is largely the sense I get from this blog.

Despite the 25+ references in the post to "community" and its importance, Stack's first and arguably most important community feels pretty abused and frustrated right now, and we've seen little sign that the Company cares about that whatsoever.

I'm not as cynical as many of the other folks reacting to recent events, but it's getting more difficult not to be, especially if the CEO, captain of Stack's corporate ship, truly believes that his final paragraph accurately characterizes the face of Stack Exchange this week.

A healthy community is essential to our company mission and higher purpose of empowering the world to develop technology through collective knowledge. We will never lose sight of how important, impactful, and unique it is. The trust of our community is as essential as the trust in the data it produces. ...

1
  • 11
    Teams is the money-making part of the company, so that much makes sense. What doesn't make sense is breaking the public sites. One of the big selling points for Teams is that people who are familiar with and have a favorable view of the public SE/SO sites are going to be familiar with the Teams interface that looks and functions a lot like SE/SO. The reputation that they sell Teams on is the reputation of the public network. We're not the customers, we're part of a product they are selling, but it's potentially a collaborative relationship where we both gain. Commented May 31, 2023 at 20:16
17

The one thing I won't take is

special community-driven blend of trust

SE, you have no trust. No one likes the AI changes. No one likes your stupid vote arrow changes either. You pretend to ask and care about our feedback, but you make no changes based on what we think. No one wants to become a content generator for an AI. That's not why we're here. You don't have trust, you have a massive outrage. From a popular meme: Who? Who? Asked [for any of this]? As I said in a comment over on the vote arrow post:

No one wants this, no one asked for this, no one finds this useful, no one can confirm your data. Everyone wants other things, everyone sees the true problems (like Area 51, for example), everyone has asked for things that everyone wants only to see them declined without (good) reason, and nearly everyone is upset about Stack Exchange's management.

11
  • 8
    "No one likes your stupid vote arrow changes either." That is not true. You may not see the upvotes (problem of the software running the platform) but they are there in the aggregate score (that is sometimes called votes misleadingly). Approximately 25% of meta users seem to like the new arrows style. Commented Jun 1, 2023 at 3:44
  • 3
    So therefore 75% don’t like it. Besides, at least for me, a downvote is a stronger level of disagreement than an upvote is of agreement @Trilarion
    – Starship
    Commented Jun 1, 2023 at 10:19
  • 4
    Also it is only 16%
    – Starship
    Commented Jun 1, 2023 at 10:39
  • 11
    @Trilarion Umm, reality check: "We are graduating the updated button styling for vote arrows" is currently in 11th place in the "hall of shame" of least popular discussion-tagged threads of all time in the history of meta.se. It went down like a lead zeppelin after launch and has already passed epic historical fiascos like "SE tries to start audio" and the HNQ remake. That's no small feat; you have to actively try to be bad to be a runner-up for the top 10 worst ideas ever.
    – Lundin
    Commented Jun 1, 2023 at 14:56
  • 8
    Keep in mind that lots of the up-votes on that thread came when it was actually posted as a discussion thread for community feedback before launching.
    – Lundin
    Commented Jun 1, 2023 at 14:58
  • 1
    @Lundin You're right, it gets a place in the bottom ten. But still, my point was that it's not true that nobody likes it. You know as well as I do that the score is approval rate times popularity. A large number of votes means that many people are interested in that topic. Still, for me the relevant quantities are U/(U+D) and U+D, not U-D. The percentage was just from memory. Whatever people say, a certain number of people actually seem to like the new vote button style. Why is it so difficult to admit that? Commented Jun 1, 2023 at 17:50
  • 6
    Maybe by "blend of trust", what they mean is that they have put what remains of the community's trust into a blender.
    – kaya3
    Commented Jun 4, 2023 at 23:59
  • @Trilarion It's not about whether people like or dislike it, it's about fiddling while Rome burns. And that has been the MO of SE Inc for years - pushing through changes that nobody (sorry, "very few" since hyperbole seems to offend you) from the community has requested, while completely ignoring things the community has asked for.
    – Ian Kemp
    Commented Jun 5, 2023 at 11:37
  • @IanKemp I'm just not a fan of hyperboles. Commented Jun 5, 2023 at 11:53
  • @kaya3 you might be right...it would certainly be true at least
    – Starship
    Commented Jun 5, 2023 at 17:04
    The crazy thing about the vote buttons is that a trivial change - one they have already made (or perhaps "defaulted to") on this very page (Meta SE), namely using contrasting colors for the clicked vote button - would solve 80% or more of the problem (100% of it for me, but I know there are other complaints, so let's go with 80%). If they can't respond to and handle the easy stuff, and they won't (obviously) handle the truly controversial stuff (AI), it does not look good. Commented Jul 3, 2023 at 15:04
