How We Game the Algorithm to Tame the Algorithm

Algorithms are everywhere. Sometimes we see traces. Once in a while we feel the effects. Mostly, we go about our days vaguely aware of an invisible algorithmic presence. We set out to learn how people live with algorithms now and what it means for the future.

One thing became clear as we asked people to tour us around their personal, algorithmically determined worlds — people are expert, if somewhat misguided, algorithm trainers. Algorithms adapt experiences to a version of what we do. The problem, according to our participants, is that what we do is not who we are. So, people try to shape the algorithm.

People use different strategies to change their experience with a site or app. One thing they never do: use settings. Most never even think to look at them. Jennifer from Illinois summed up the general feeling, “I’ve never used settings, but I can control my feed without them.”

Forget playing around with settings. People try to game the algorithm to tame the algorithm.

When we asked people to look at their ad preferences in Facebook, Google, and Acxiom, they had a hard time finding them. There were too many places to look, and without our guidance, most were not even sure what to look for. Once participants did find their settings, they were unpleasantly surprised. Like many others in the study, Mary from Virginia was upset, “It’s a little embarrassing to see what Facebook thinks of me. I have no idea how I could have ended up with some of these.”

What the algorithm knew was sometimes a little too close for comfort, creating a kind of uncanny valley effect. At the same time, it wasn’t quite right. That was disturbing too. While a few people removed the mismatches, many were reluctant to do so. Brittany from Florida noted that “Without knowing how this is really used, I’m not sure I can make a good choice.” Others simply did not want the algorithm to know them too well.

The algorithm spawns a second self, or maybe someone else entirely.

It may not be news to learn that most people try to spend as little time with settings as possible. When there’s a disconnect, the impulse is to game the algorithm rather than actually adjust the settings. All the liking and reacting and following and unfollowing come down to four basic algorithm-shaping techniques.

  1. Evading
    People don’t want the algorithm to follow their every move, so the day becomes a delicate balance of private browsing sessions and “for the record” browsing. Evading is about keeping certain key information private. Matthew from Wisconsin described a common practice, “I move back and forth between private windows all day long.” That said, people aren’t sure whether it makes much of a difference.
  2. Crafting
    Because algorithms seem to be one step behind, people engage in some activities purely for the sake of the algorithm. Following, unfollowing, liking, posting from multiple browsers or devices and using multiple social media identities are done with a strategy in mind.
  3. Questioning
    Once in a while, people will notice something that seems off. Whether an ad, or people missing from their feed, or odd recommendations, the first impulse is to ask for help or just complain about it. This is how mythologies about how algorithms work are born. And more gaming ensues.
  4. Pranking
    When an algorithm makes itself too present — an intrusive ad or an awkward bot — people can’t seem to help themselves. They pull pranks. Same goes for algorithms that get a little too personal, like How-Old.net. Pranking comes from an urge to show the algorithm who is really smartest.

When the algorithm doesn’t quite sync up with who we are, it’s a problem. “It’s like another me made from clicks and likes. So, it’s half-formed and honestly pretty strange,” according to Matt from New Jersey.

Once in a while, we get a glimpse of this other version of ourselves. For some people, it is like seeing a distorted self-portrait. For others, it is another person entirely, often one they don’t much like. Why? “Because whatever it knows about me now, was from me a year ago, or even a day ago. It’s not me right now,” says Kate from Washington.

Ultimately, we want to see ourselves as the algorithm sees us and have more say in how the algorithm defines us. The best algorithms will grow and change with us over time. The future of algorithms — whether feed or chatbot or recommendation — will invite human collaboration.



Read the original article here, or the (free) full report from Change Sciences: Living with Algorithms

Pamela Pavliscak

Pamela is a sought-after expert on emotion and technology. She advises designers, developers, and decision-makers on how to create technologies with greater emotional intelligence. Pamela is faculty at Pratt Institute School of Information and has lectured at Parsons School of Design, Stanford Design School, and ASU’s Center for Science and the Imagination. Collaborating with a global committee of scholars and practitioners, she is helping to shape IEEE Standards for ethics and artificial intelligence. Pamela often speaks on creativity in the digital age, generation Z, and emotion and technology, recently at SXSW and Collision.