Research Findings:

  • reCAPTCHA v2 is not effective in preventing bots and fraud, despite its intended purpose
  • reCAPTCHA v2 can be defeated by bots 70-100% of the time
  • reCAPTCHA v3, the latest version, is also vulnerable to attacks and has been beaten 97% of the time
  • reCAPTCHA interactions impose a significant cost on users, with an estimated 819 million hours of human time spent on reCAPTCHA over 13 years, corresponding to at least $6.1 billion USD in wages (see the arithmetic check after this list)
  • Google has potentially profited $888 billion from cookies [created by reCAPTCHA sessions] and $8.75–32.3 billion per sale of their total labeled data set
  • Google should bear the cost of detecting bots, rather than shifting it to users
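
A quick sanity check on the wage figure (this is just arithmetic on the numbers above, not a claim taken from the paper): dividing the estimated wage total by the estimated hours gives an implied rate of about $7.45 per hour, close to the US federal minimum wage of $7.25.

$$
\frac{\$6.1\times10^{9}}{819\times10^{6}\ \text{hours}} \approx \$7.45\ \text{per hour}
$$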

“The conclusion can be extended that the true purpose of reCAPTCHA v2 is a free image-labeling labor and tracking cookie farm for advertising and data profit masquerading as a security service,” the paper declares.

In a statement provided to The Register after this story was filed, a Google spokesperson said: “reCAPTCHA user data is not used for any other purpose than to improve the reCAPTCHA service, which the terms of service make clear. Further, a majority of our user base have moved to reCAPTCHA v3, which improves fraud detection with invisible scoring. Even if a site were still on the previous generation of the product, reCAPTCHA v2 visual challenge images are all pre-labeled and user input plays no role in image labeling.”
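
For context on what “invisible scoring” means in practice: a reCAPTCHA v3 page requests a token in the background, and the site’s backend exchanges that token for a 0.0–1.0 score with no visible challenge. The sketch below shows the server side of that exchange; the siteverify endpoint and response fields come from Google’s public reCAPTCHA documentation, while the 0.5 threshold and the function name are illustrative choices.

```python
import requests

SITEVERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def passes_recaptcha_v3(token: str, secret_key: str, min_score: float = 0.5) -> bool:
    """Exchange a reCAPTCHA v3 token for a score and apply a site-chosen threshold."""
    resp = requests.post(
        SITEVERIFY_URL,
        data={"secret": secret_key, "response": token},
        timeout=5,
    )
    result = resp.json()
    # A v3 verification response includes: success, score (0.0 = likely bot,
    # 1.0 = likely human), action, challenge_ts, hostname, and optional error-codes.
    return bool(result.get("success")) and result.get("score", 0.0) >= min_score
```

The 97 percent figure quoted later in the thread refers to attacks that make automated traffic score well enough to clear exactly this kind of threshold.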

  • siph@lemmy.world

    Considering the article states that reCAPTCHA v2 and v3 can be broken/bypassed by bots 70-100% of the time, they are obviously not the solution.

    • conciselyverbose@sh.itjust.works

      At what cost?

      A 100% success rate isn’t even moderately useful if it costs $5 per pass. The discussion is completely pointless without a concrete, documented analysis of the actual hardware and energy costs involved.

    • radivojevic@discuss.online

      “Google should bear the cost”

      Google should shut it down and make sites roll their own verification. Give everyone a month to implement a new solution on millions of websites.

      • AeroLemming@lemm.ee

        This is unironically the answer. You can’t make a general-purpose captcha solver AI if every website or group of websites uses a completely different kind of captcha. (A rough sketch of what that could look like follows this sub-thread.)

        • radivojevic@discuss.online

          I’m actually 100% for rolling your own… almost everything.

          Twenty years ago I made an e-commerce website for a client. Looking at the code now, I’m embarrassed by how insecure it is. However, because it was totally custom, no one ever found the bugs and it has never been cracked (knock on wood). That’s the benefit of not using a prebuilt solution that’s a common target for mass exploits.
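
          To make the “roll your own” idea concrete, here is a minimal sketch of a site-specific, signed challenge. Everything in it is illustrative: the question pool, the secret, and the function names are made up, and a real deployment would at least add rate limiting and per-session nonces.

          ```python
          import hashlib
          import hmac
          import secrets
          import time

          SITE_SECRET = b"rotate-this-per-site"  # illustrative; each site uses its own secret

          # A question pool unique to this site; the point is that no two sites share a format.
          QUESTIONS = [
              ("Type the last word of our shop's name, 'Example Widget Shop'", "shop"),
              ("Which day is named in our banner, 'Free shipping every Friday'?", "friday"),
          ]

          def issue_challenge() -> tuple[str, str]:
              """Pick a question and return (prompt, signed_token) to embed in the form."""
              idx = secrets.randbelow(len(QUESTIONS))
              payload = f"{idx}:{int(time.time())}"
              sig = hmac.new(SITE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
              return QUESTIONS[idx][0], f"{payload}:{sig}"

          def verify_answer(token: str, answer: str, max_age_s: int = 600) -> bool:
              """Check the token's signature and age, then compare the submitted answer."""
              try:
                  idx_s, issued_s, sig = token.split(":")
              except ValueError:
                  return False
              payload = f"{idx_s}:{issued_s}"
              expected = hmac.new(SITE_SECRET, payload.encode(), hashlib.sha256).hexdigest()
              if not hmac.compare_digest(sig, expected):
                  return False
              if time.time() - int(issued_s) > max_age_s:
                  return False
              return answer.strip().lower() == QUESTIONS[int(idx_s)][1]
          ```

          Whether something like this is actually stronger than reCAPTCHA is exactly what the rest of the thread argues about; the one property it clearly has is the one described above: there is no single mass-produced solver that targets it.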

    • polonius-rex@kbin.run

      where does the 70-100% figure come from?

      the best bots doing it 70-100% of the time is very different to the kind of bot your average spammer will have access to

      • siph@lemmy.world

        Did you read the article or the TL;DR in the post body?

        The paper, released in November 2023, notes that even back in 2016 researchers were able to defeat reCAPTCHA v2 image challenges 70 percent of the time. The reCAPTCHA v2 checkbox challenge is even more vulnerable – the researchers claim it can be defeated 100 percent of the time.

        reCAPTCHA v3 has fared no better. In 2019, researchers devised a reinforcement learning attack that breaks reCAPTCHAv3’s behavior-based challenges 97 percent of the time.

        So yeah, while these are research numbers, it wouldn’t be surprising if many larger bots have access to ways around that - especially since those numbers are from 2016 and 2019 respectively. Surely it is even easier nowadays.

        • polonius-rex@kbin.run

          “researchers were able to defeat reCAPTCHA v2 image challenges 70 percent of the time”

          that doesn’t answer the question?

          “researchers devised a reinforcement learning attack that breaks reCAPTCHAv3’s behavior-based challenges 97 percent of the time”

          i’d argue “bespoke system, deployed in a very limited context, built by researchers at the top of their field” is kind of out of reach for most people? and any bot network becomes easier to detect the further you scale it up

          the cost of just paying humans to break these is already at or below pennies per challenge

      • siph@lemmy.world

        Maybe a billion-dollar company has the budget to come up with something?

        Looking at the numbers in this post, reCAPTCHA exists to make Google money, not to keep bots out.

        I’d rather have no reCAPTCHA than the current state.

        • OsrsNeedsF2P@lemmy.ml

          Hi, it’s me. I work for a billion-dollar company with a budget. We have no ethical ideas on how to stop bots. Thanks for coming to my tech talk.

          • siph@lemmy.world

            Yeah, that’s about the way I’d expect it to go.

            “Traffic resulting from reCAPTCHA consumed 134 petabytes of bandwidth, which translates into about 7.5 million kWhs of energy, corresponding to 7.5 million pounds of CO2. In addition, Google has potentially profited $888 billion from cookies [created by reCAPTCHA sessions] and $8.75–32.3 billion per each sale of their total labeled data set.”
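
            (Just arithmetic on the quoted numbers, treating petabytes as decimal: those figures imply roughly 0.056 kWh per GB transferred and exactly 1 lb of CO2 per kWh.)

            $$
            \frac{7.5\times10^{6}\ \text{kWh}}{134\ \text{PB}} \approx 0.056\ \text{kWh per GB},
            \qquad
            \frac{7.5\times10^{6}\ \text{lb CO}_2}{7.5\times10^{6}\ \text{kWh}} = 1\ \text{lb CO}_2\ \text{per kWh}
            $$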

            There might be a tiny chance they’re not interested in changing things.