Google Chrome extensions can protect you from computer algorithms

Computer algorithms shape our world in ways we can’t always comprehend.

An algorithm is a set of rules for solving a problem: a step-by-step procedure that can, for example, take historical data and predict future outcomes.
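To make that concrete, here is a minimal sketch of an "algorithm" in the sense described above: it takes historical data and extrapolates a future value. The function name, the linear-trend approach, and the example numbers are illustrative assumptions, not anything any particular platform actually runs.

```python
def predict_next(history):
    """Fit a straight line to past values and extrapolate one step ahead.

    A toy prediction algorithm: historical data in, predicted outcome out.
    """
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    # Ordinary least-squares slope and intercept, computed by hand.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope * n + intercept

daily_clicks = [10, 12, 14, 16]    # historical data
print(predict_next(daily_clicks))  # → 18.0
```

Real recommendation systems are vastly more complicated, but the shape is the same: past behavior is the input, and a predicted future behavior is the output.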

Ever think about how Facebook uses your data to predict which content you will “like”, keeping you coming back for more?

There are a number of open-source Chrome extensions to protect your data. Take Data Selfie, for example. This extension monitors your Facebook interactions for patterns, then crunches all the collected data into reports to help you gain insight about your personality and habits.

Facebook News Feed Eradicator replaces your entire news feed with an inspirational quote like “There are still many causes worth sacrificing for, so much history yet to be made.” ~ Michelle Obama. This extension lets users largely sidestep the algorithms: with no mindless scrolling, there is very little history of likes or shares for them to collect.

Or try Friends Feed for Facebook, which only shows you stories posted by your friends and the pages you follow. It hides stories about what your friends liked or commented on, as well as “suggested” stories and “people you may know” boxes.

Mariana M. Rodriguez, a Facebook user, deleted her account in October 2016 but reactivated it when she moved to Thailand a year later. Her original sign-off message read, “Right now I’m done with the click-bait, pseudo-science articles, sensationalist and shocking awful news that I can’t stop reading if shown on my feed, the mindless scrolling, the way we have been programmed to feel a high with likes, the fact that it is engineered to keep you addicted, the aggressively targeted ads, the lack of respect for users privacy, the waste of the most important non-renewable resource we have, which is life TIME, the Donald Trump memes, the overwhelming feeling that your brain has just soaked in a hundred friends watered down opinions without having a real conversation, burning out like a computer with 99 tabs open.”

She should have just gotten the Facebook News Feed Eradicator, probably. But she raises some interesting points.

In the age of the algorithm, what we see on social media is filtered, job performance is evaluated, and our health is monitored; the list goes on. Algorithms now do work humans have always done, with the goal of more objective measurement. That goal isn’t always met, but these mathematical models aren’t always evil, either.

A University of British Columbia computer scientist has created new software that can produce a design sketch of an everyday object, addressing the challenge of accurately describing shapes.

“If you try to explain what your computer mouse looks like to someone who has never seen a mouse before, you’re going to struggle to verbally describe its shape,” said Sheffer.

“Humans are good at verbally describing colour or dimensions, but cannot easily articulate geometric properties. The easiest way to describe shapes is to sketch them.”

What’s evil about algorithms is how they are used.

Cathy O’Neil, a mathematician and author of the book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy, has a metaphor that speaks to the reality of every algorithm reflecting the choices of its human designer. In an interview with Roman Mars of 99% Invisible, O’Neil gives the example of cooking dinner for her family. The ingredients in her kitchen are the “data” she has to work with, “but to be completely honest I curate that data because I don’t really use [certain ingredients] … therefore imposing my agenda on this algorithm. And then I’m also defining success, right? I’m in charge of success. I define success to be if my kids eat vegetables at that meal …. My eight year old would define success to be like whether he got to eat Nutella.”
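O’Neil’s dinner metaphor can be sketched in a few lines of code. Everything here is a hypothetical illustration, not her code: the point is simply that the designer both curates the input data and defines the success metric, so the “algorithm” inherits the designer’s agenda.

```python
# The available "data": everything in the kitchen.
PANTRY = ["spinach", "carrots", "pasta", "nutella"]

def curate(ingredients, banned):
    """The designer drops data she doesn't want to use."""
    return [i for i in ingredients if i not in banned]

def parent_success(meal):
    """Mom's definition of success: the kids eat vegetables."""
    return any(i in ("spinach", "carrots") for i in meal)

def kid_success(meal):
    """The eight-year-old's definition of success: Nutella."""
    return "nutella" in meal

meal = curate(PANTRY, banned=["nutella"])
print(parent_success(meal))  # True  — success, by the designer's metric
print(kid_success(meal))     # False — failure, by the kid's metric
```

The same meal is a success or a failure depending on whose definition is baked in, which is exactly the choice every algorithm designer quietly makes.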


According to O’Neil, it’s up to researchers to “examine cases where algorithms fail, paying special attention to the people they fail and what demographics are most negatively affected by them.”

