mikejuk writes: The new CAPTCHA system comes from the Swedish activist organization Civil Rights Defenders and serves twin purposes: distinguishing humans from robots by their ability to feel empathy, a characteristic considered essentially human, and informing web users about global civil rights issues. The basic idea is that the user is presented with an emotive statement such as “it's good to torture people” and has to pick the word that describes how it makes them feel: “infuriated,” “sad,” or “encouraged.” The premise is that you can answer because you empathize, while a machine can't because it doesn't. Can you spot the potential problems? The first: don't underestimate simple machine learning techniques. The bots will soon have empathy reduced to Bayesian stats. And what about us non-empaths?
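To see why the Bayesian worry is plausible, here is a minimal sketch of how a bot might attack such a CAPTCHA with nothing fancier than a bag-of-words naive Bayes classifier. All training statements, labels, and function names below are invented for illustration; the point is only that a handful of labeled examples lets simple statistics pick the "empathetic" answer.

```python
# Hypothetical sketch: a bag-of-words naive Bayes classifier that learns
# which feeling humans tend to pick for a given emotive statement.
from collections import Counter, defaultdict
import math

# Toy training set (entirely invented): (statement, expected feeling).
TRAIN = [
    ("it's good to torture people", "infuriated"),
    ("it's good to hurt people", "infuriated"),
    ("everyone deserves equal rights", "encouraged"),
    ("activists won equal rights today", "encouraged"),
    ("innocent people were jailed", "sad"),
    ("people were jailed without trial", "sad"),
]

def train(examples):
    word_counts = defaultdict(Counter)   # label -> word frequencies
    label_counts = Counter()             # label -> number of examples
    vocab = set()
    for text, label in examples:
        words = text.lower().split()
        word_counts[label].update(words)
        label_counts[label] += 1
        vocab.update(words)
    return word_counts, label_counts, vocab

def classify(text, word_counts, label_counts, vocab):
    """Return the label maximizing log P(label) + sum log P(word|label),
    with Laplace (add-one) smoothing for unseen words."""
    words = text.lower().split()
    total = sum(label_counts.values())
    best, best_score = None, float("-inf")
    for label, count in label_counts.items():
        score = math.log(count / total)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in words:
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best, best_score = label, score
    return best

model = train(TRAIN)
# A statement the bot has never seen still gets the "human" answer.
print(classify("it's good to torture innocent people", *model))
```

Nothing here models empathy; word co-occurrence statistics alone are enough to mimic the expected emotional response, which is the submission's point.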
"I have more information in one place than anybody in the world."
-- Jerry Pournelle, an absurd notion, apparently about the BIX BBS