One concept I have been playing with a lot lately is interesting ways to take the robot out of CAPTCHA solving while still solving it subversively. Sure, we came up with the mechanical turk methods, the porn proxy, using kids' games, and a variety of other low-tech solutions. However, the other day I came up with a concept for an actual service that does this. Let me explain:
CAPTCHAs, or automated Turing tests in general, attempt to tell whether the consumer is a robot by throwing up an image and testing whether a human can read it. Webmasters use them to detect whether a user is real. So both sides have a need: webmasters want to detect whether a user is really a person, and spammers want to solve those CAPTCHAs in whatever way is effective. So here's the concept.
By setting up a central proxy with APIs for webmasters, you can solve both problems at once. The webmaster gets unique CAPTCHAs by querying the proxy through the API. The proxy pulls a CAPTCHA from somewhere on the Internet that a spammer wants to break. The spammer uses their own API to decide whether the consumer typed in the correct password and sends a decision back to the webmaster through the proxy. The webmaster can then allow the user to succeed or fail as they choose. The only motivation for the black-hat webmaster to do this is if they represent a lower-value target than the websites the spammer tends to attack, and/or if they don't care about other websites' security problems.
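The relay flow above can be sketched in a few lines of Python. Everything here is hypothetical: the function names (`fetch_target_captcha`, `spammer_validate`, and so on), the token format, and the stubbed-out target site are all invented for illustration, and no real service or API is implied.

```python
# Hypothetical sketch of the CAPTCHA relay proxy described above.
# All names and behaviors here are invented for illustration only.

import uuid

# The proxy's in-flight challenges: challenge_id -> token for the target site's session
PENDING = {}

def fetch_target_captcha():
    """Stand-in for the proxy pulling a CAPTCHA from a site the spammer
    wants to break. Returns (image_bytes, target_session_token)."""
    return b"<captcha image bytes>", "target-session-123"

def proxy_get_captcha():
    """Webmaster-facing API: hand out a relayed CAPTCHA to show a visitor."""
    image, target_token = fetch_target_captcha()
    challenge_id = str(uuid.uuid4())
    PENDING[challenge_id] = target_token
    return challenge_id, image

def spammer_validate(target_token, answer):
    """Stand-in for the spammer side: submit the visitor's answer to the
    target site and report whether it was accepted. Here we pretend the
    target site's answer is the made-up string "xyzzy"."""
    return answer == "xyzzy"

def proxy_check_answer(challenge_id, answer):
    """Webmaster-facing API: relay the visitor's answer through to the
    spammer and return pass/fail to the webmaster."""
    target_token = PENDING.pop(challenge_id)
    return spammer_validate(target_token, answer)

# A webmaster's flow: fetch a challenge, show the image, check the answer.
cid, img = proxy_get_captcha()
passed = proxy_check_answer(cid, "xyzzy")  # the spammer gets a free solve either way
```

Note that every successful validation does double duty: the webmaster learns the visitor is (probably) human, and the spammer walks away with a solved CAPTCHA for the target site.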
Of course this is entirely black-hat, and provides no good service whatsoever, but it does solve two different people's problems at the same time. This symbiosis does introduce latency, slowing the consumer down while they wait for the proxy and the spammer to validate the entry. Maybe a credit system based on latency would need to be put in place to ensure quality. This service exploits one of the two fatal flaws in CAPTCHAs - even if one works perfectly and can detect whether the user is a person, it cannot detect their intentions (the second flaw being that anything created by a computer can be read by a computer). Yah, evil, I know.