• tehmics@lemmy.world · 27 days ago

    It’s not scalable. Sure, you could have humans comb through the 1,000 or so most common results, but there have to be billions of unique searches every day.

    • ssj2marx@lemmy.ml · 27 days ago

      Yeah, but downranking one AI-generated page downranks it across all search results, and you could add rules like downranking specific service providers or companies that are repeat offenders. I don’t think it would be easy, but I think it’s the only way to get something better than what we have with the techniques that currently exist.
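
      Something like this, as a rough sketch of the rule idea (the penalty values, flagged pages, and repeat-offender domains are all made up, not how any real engine works):

      ```python
      from urllib.parse import urlparse

      # Hypothetical penalties applied on top of a base relevance score.
      AI_SPAM_PENALTY = 0.2          # multiplier for pages flagged as AI-generated
      REPEAT_OFFENDER_PENALTY = 0.5  # extra multiplier for domains flagged repeatedly

      flagged_pages = {"https://contentfarm.example/ai-listicle"}  # human-reviewed once
      repeat_offender_domains = {"contentfarm.example"}            # rule-based, no extra review

      def adjusted_score(url: str, base_score: float) -> float:
          """Downrank a page once; the penalty then applies to every query it appears in."""
          score = base_score
          if url in flagged_pages:
              score *= AI_SPAM_PENALTY
          if urlparse(url).hostname in repeat_offender_domains:
              score *= REPEAT_OFFENDER_PENALTY
          return score
      ```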

      • tehmics@lemmy.world · 27 days ago

        The amount of man-hours this would require would bankrupt even Google. You’d be better off building a new index of whitelisted sites.
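
        A whitelist could even be bolted onto an existing index rather than built from scratch. Rough sketch only; the hostnames and result format are invented:

        ```python
        from urllib.parse import urlparse

        # Example whitelist entries; a real list would be curated and much longer.
        WHITELIST = {"en.wikipedia.org", "stackoverflow.com"}

        def whitelist_results(results: list[dict]) -> list[dict]:
            """Keep only results whose host is on the curated whitelist."""
            return [r for r in results if urlparse(r["url"]).hostname in WHITELIST]
        ```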