cross-posted from: https://sh.itjust.works/post/1062067

In a similar case, the US National Eating Disorder Association laid off its entire helpline staff. Soon after, the replacement chatbot was disabled for giving out harmful information.

  • BurningnnTree@lemmy.one · 1 year ago

    It’s a little annoying how this article is written as a shitty “look who’s getting dunked on on Twitter today” article, even though it’s actually about a serious issue. I don’t care about Twitter drama, I care about the fact that people are losing their jobs to AI.

    • Oliver Lowe@lemmy.sdf.org · 1 year ago

      Because they can’t, or aren’t willing to, investigate what happened at this particular company or to its staff. The thrust of the story is therefore what’s happening on Twitter (“getting absolutely roasted”), because people connect with action.

      A better story could recount the events up to now. Maybe something like this?

      1. Find some fired staff members. How long were they working there?
      2. Tell a little story of the day the staff first heard of the layoffs.
      3. Show the layoff message, or paraphrase what the CEO (or whoever) said to them.
      4. Interesting point: Were they told they were being replaced by a large language model or some “AI” tech?
      5. Now include the obnoxious tweet by the CEO.

      Finding this information and weaving it into a story that makes people ask “And then what happened?!” is difficult and takes time. It’s hard to justify when you can get clicks from shit like this article.