One massive side effect of the pandemic has been the rapid acceleration of the shift to digital. Many organizations had robust digital presences in advance of March 2020, but the subsequent lockdowns pushed everybody—including and especially B2C businesses—to find new and effective ways to connect with customers and prospects from a distance.
The web apps deployed to support this shift brought with them their own unintended side effect: they created additional surfaces for cybercriminals to attack in search of personally identifiable information (PII), payment information, or credentials for reuse in later credential stuffing attacks. And the countermeasure to that secondary effect was the use of tools intended to spot and stymie automation.
However, as with the best-laid plans of mice and men, these tools did little to stop the actual threats; instead, they created a challenging and frustrating experience for the actual humans simply trying to log into a web portal or purchase something online.
Fraud in the machine
“Fraud” is one of those words that makes people flinch: it has all the elements of a great curse word (one syllable, a big open vowel sound, sounds fantastic when you shout it out loud), and it immediately raises your hackles. That’s one reason it’s so important to use the word when we describe the problems bots create throughout the digital experience: it helps establish the severity of the action.
Fraud doesn’t happen by accident. Ad fraud happens when someone makes the conscious choice to break the digital advertising system to steal money. Marketing fraud happens when someone makes the conscious choice to interfere with modern marketing technology stacks to steal money. Account takeover happens when someone makes the conscious decision to gain access to and take control of an account that does not belong to them, for nefarious purposes.
That’s why it’s so important for organizations victimized by fraud to take proactive steps to combat it. If fraud is a deliberate action, organizations need to respond with at least an equal and opposite reaction (thank you, Isaac Newton) to fight back. Passive responses don’t fight back; they merely create friction.
The friction martini
User friction, like the I Spy-style challenges deployed on all who enter, has become a kind of proxy for effective cybersecurity. Introduce enough friction that attackers can’t move rapidly through the process, but not so much that legitimate users abandon the transaction altogether, and you’ve found that magical inflection point.
That attitude, however, treats user friction the way Hawkeye Pierce in the TV show M*A*S*H orders his martinis: by pouring straight gin into the glass while looking in the direction of Italy. There’s no actual vermouth (substance) in the final concoction, just a gesture acknowledging that it’s theoretically part of the process.
Rejecting the premise
That balancing act, searching for the equilibrium between fraud reduction and user friction, rests on a false premise. HUMAN recently completed research into the impact of one fraud reduction tactic on user friction, and the results suggested that many users’ tolerance for friction is remarkably low. The pandemic-accelerated shift to digital has opened numerous new doors for fraud, and the friction-as-security response has left many consumers dissatisfied.
In future posts in this series, we’ll explore individual fraud reduction tactics and their impacts on user friction, how this challenge is reflected in specific attack patterns, and new research into how consumers view the issue.