Bots make up a shocking percentage of internet traffic. In some industries, in fact, bots account for far more site visits than humans do. Keeping bots from interfering with a website's functionality and degrading the experience of human users is an ongoing challenge that Google and many other companies are struggling to address.

Google's first attempt at reining in bot activity took the form of its reCAPTCHA system, which required a website visitor to type in a string of graphically warped numbers and letters to prove their humanity.

reCAPTCHA worked, but in response, the people behind all that bot traffic trained their bots to decipher the warped text, creating a kind of digital arms race.

Google's next version of reCAPTCHA had users clicking on images to prove their humanity, identifying such mundane sights as street signs, buses, storefronts, intersections and the like. This second version had the advantage of letting people who correctly identified the images in question through with minimal fuss.

Even so, it was far from a perfect solution, creating annoying busywork for humans who just wanted to see the content on the website in question.

Now Google is taking another stab at the problem with the release of reCAPTCHA v3, which promises to let humans pass through without a single click and without having to decipher and type warped text strings. Google calls it "zero friction" for the user. This new version is a tool for webmasters rather than something visitors interact with directly: instead of an annoying reCAPTCHA challenge, human users get an invisible reCAPTCHA and an uninterrupted experience of whatever the site is offering.

The latest version has been in testing with a large user group for more than a year. It relies heavily on machine learning focused on understanding how humans interact with websites and how that behavior differs from bot interaction. An admin console gives site administrators wide latitude to set their own thresholds and protocols, which determine what traffic ultimately gets through the gateway, so legitimate users are never interrupted or annoyed by a reCAPTCHA challenge. The webmaster decides how this new approach is applied on each page of the site.
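To make the threshold idea concrete, here is a minimal sketch of what the server side of a reCAPTCHA v3 check can look like: the site posts the token it received from the visitor's browser to Google's `siteverify` endpoint, gets back a score between 0.0 (likely a bot) and 1.0 (likely a human), and applies its own cutoff. The 0.5 threshold and the function names are illustrative assumptions, not part of Google's API.

```python
# Sketch of server-side reCAPTCHA v3 verification.
# The threshold value and function names below are illustrative only.
import json
import urllib.parse
import urllib.request

VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def verify_token(token: str, secret: str) -> dict:
    """POST the client-supplied token to Google's siteverify endpoint
    and return the parsed JSON response ('success', 'score', etc.)."""
    data = urllib.parse.urlencode({"secret": secret, "response": token}).encode()
    with urllib.request.urlopen(VERIFY_URL, data=data) as resp:
        return json.load(resp)

def is_probably_human(result: dict, threshold: float = 0.5) -> bool:
    """Apply the site's own threshold to the 0.0-1.0 score:
    the closer to 1.0, the more likely the visitor is human."""
    return bool(result.get("success")) and result.get("score", 0.0) >= threshold
```

Because the webmaster picks the threshold per page, a low-risk page (say, a blog post) can use a lenient cutoff while a sensitive one (a login form) uses a strict one.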

It remains to be seen how successful this new approach will be, but hopes and expectations are high. At long last, Google may have figured out a way to separate bot traffic from human traffic, and to do so in a way that cuts down on the annoyance. Kudos to Google for its continued efforts on this front! You can watch the Google video below that explains how this works: