  • Providing quality teaching resources for the 'computer subjects' (CAT and IT) since 1995.
  • We believe that all learners should be comfortable with computers as part of their lives.
  • Content is presented through real-life examples and scenarios, so that learners may identify with the material more easily and make it relevant to their lives / experiences.
  • We provide videos, PowerPoint presentations, solutions to exercises and data files for exercises - all to make life easier for teachers and learners.

Study Opportunities' Blog

Brain Hacking - are we all Zucked?

“Social media is the toilet of the internet.” - Lady Gaga.


Q. How often do we touch our phones?

A. Oh, only about 2,617 times a day.


Q. How often do we touch anything that’s *not* made by Google or Facebook?

A. Nearly half of all touches happen in apps made by Alphabet and Facebook. The other half are split among the other 700+ apps.


Social Media is a new phenomenon. It is unprecedented in its reach and adoption speed. Its influence is hard to measure and judge accurately. No living human has ever seen anything like it before. We are living through its birth and evolution - and trying to make sense of it - at the same time that we are expected to teach and guide our learners about its functions, uses and (as the syllabus is fond of saying) its advantages and disadvantages.

Whilst much of the fuss and furore about social media centres on the concept of privacy and how the social media companies exploit and mine our data for their own financial benefit, too little attention is paid to the even larger and more insidious problem of manipulative and addictive design. Think about this:

We live and work in an attention economy. In many ways, information is less important than user attention (also known as 'engagement').

  • The holy grail for a Social Media company is user engagement.
  • User Engagement boils down to one simple metric - how long you spend on the site / using the app.
  • Every extra second / minute / hour you spend ‘engaged’ translates directly into money for the company.
  • Social Media companies will do anything to keep you on the site longer - and coming back for more as often as possible.

Hundreds of thousands of hours of research and effort are therefore dedicated to figuring out how to make the 'service' something that you cannot live without - something you crave with all the intensity and harrowing, overwhelming need of the most enslaved drug addict.

Not because you really need it.

Not because it adds that much value and utility to your life.

Not because it makes you feel better.

Simply because it (the 'service') is purposefully designed to exploit every aspect of psychology, behavioural science, neuroscience and design finesse to hook you - so that a (very substantial) profit can be made from your addiction.

Whilst writing this article I checked on LinkedIn (i.e. on just one source) - there were 635 psychology-related jobs advertised by Facebook for the US alone.

There's a name for it. Persuasive Design. Books have been written about Persuasive Design. Courses created and presented at universities and online. Conferences arranged to help people and companies learn how to create and use it. Privacy is discarded and data gathered (with and without user consent) to be better able to implement it.

The biggest flaw of persuasive design is that we tend to focus on helping ourselves (the programmers / companies) rather than helping the users.

The motivation behind persuasive design is ostensibly to improve the user experience, to create a product that they enjoy using and want to use repeatedly.

The problem is economics. Creating, maintaining and hosting an online service is not cheap, and its owners want to make a profit - as big a profit as they can. Users, on the other hand, want to pay as little as possible, and subscriptions are not popular. The only solution anyone has come up with is advertising. Magazines and newspapers have done it for the longest time: economically speaking, they exist not as a source of news or entertainment but as a way to collect eyeballs and attention, so that their owners can make a profit by selling advertising. Their disadvantage is that they are not interactive - they cannot provide the immediate Pavlovian feedback that keeps users coming back for more. They are unable to leverage the

"...subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,”

Advertisers want to know that their messages are being seen. For that to happen Social Media sites need users to actively spend time on the site - time that they can measure and show as proof that adverts are being seen. Repeat Visitors and Time on Site are important metrics. They reflect an increased chance that the user is likely to see your advert multiple times which, in turn, increases the chance that they will respond to it or at least develop a familiarity with your brand or product - making it more likely they will seek it out when they need such a product in the future.

“The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”.

We are all Experimental Subjects

It's not just all design and psychology theory though.

There are many millions of experiments being run on the internet every day to work out the best way to capture and keep user attention - and how to prod / tempt / guide / lure users into online actions that may not be in their own best interests.

How many of you / your learners know what A/B testing is?

How many know that every pixel of the Facebook interface is monitored and tested to see just how to maximise user engagement?

For example: If this 'Like' button is a little bigger and a slightly different shade of blue are users more or less likely to click on it? What about if the icon is bigger? Or if the icon is different? Or if we put the button above or below the article?

The best way to find out is to create multiple different versions of a web page, show the different versions to different groups of users, and measure their responses. Then use statistical analysis to figure out which version is better at getting users to do what you want them to do. This is what A/B testing is.
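That statistical step can be sketched in a few lines. This is a minimal, hedged example using a two-proportion z-test to compare the click-through rates of two hypothetical button variants - the click counts, sample sizes and 0.05 threshold are invented for illustration, not from any real experiment:

```python
import math

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Compare the click-through rates of two page variants.

    Returns (z, p) where p is the two-sided p-value: roughly, the
    probability of seeing a difference this large by pure chance
    if variants A and B really perform the same.
    """
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click-through rate under the assumption of no real difference
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value
    return z, p_value

# Variant A: the original 'Like' button; Variant B: a bigger button.
# 10,000 users saw each variant (all numbers invented).
z, p = two_proportion_z_test(clicks_a=480, views_a=10_000,
                             clicks_b=560, views_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
if p < 0.05:
    print("Variant B wins - the difference is unlikely to be chance.")
```

Real A/B testing platforms add refinements (random assignment, sequential testing, multiple metrics), but the core idea is exactly this: measure both groups, then ask whether the difference is bigger than chance alone would produce.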

And it happens all the time. Without our knowledge or consent. We are all part of one continuous experiment on how to make users do things to maximise engagement and profits.

There are many web services to help you do A/B testing. Google even offers one for free, called Google Optimize, which walks you through the steps of creating and running an A/B test. You can do it yourself. It's easy!

We are all slaves to the Algorithm

The final trick in the arsenal is that social media controls what we see. On the one hand, user engagement can be maximised by reducing 'friction'. It's simple - don't display content that conflicts with the user's (measured and tracked) world view, interests, likes, political viewpoint, etc. If the user is liberal, decrease the number of conservative articles, posts, etc. in their feed, or remove them entirely. If they are conservative, then remove the liberal content.

Data is gathered and collected, and then algorithms process that data to narrow down the content presented to us. This is done in an effort to make the social media site a place we feel comfortable, relaxed and at home in, because our core identities are not being challenged by views that differ from our own.

Being comfortable and secure in our own identities helps to keep us online and 'engaging' for longer. Eli Pariser first popularised this concept in his book The Filter Bubble.
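The filtering idea above can be shown with a toy sketch. Everything here is invented for illustration - the profile, the posts and the scoring rule are not any real platform's algorithm, which would use far richer signals - but the principle is the same: score each post by how well it matches the user's tracked leanings, and content that challenges the user sinks to the bottom of the feed.

```python
# Inferred from the user's past clicks, likes and dwell time (invented values)
user_profile = {"liberal": 0.9, "conservative": 0.1}

posts = [
    {"title": "Op-ed A", "leaning": "liberal"},
    {"title": "Op-ed B", "leaning": "conservative"},
    {"title": "Op-ed C", "leaning": "liberal"},
]

# Rank the feed by profile match, highest first. Posts that conflict
# with the user's world view end up last - or, with a cutoff, unseen.
ranked = sorted(posts,
                key=lambda post: user_profile.get(post["leaning"], 0.0),
                reverse=True)

for post in ranked:
    print(post["title"])
```

Even this three-line scoring rule produces a filter bubble: run it repeatedly, feed the clicks back into the profile, and the conservative content's score only falls further.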

But algorithms don't just keep us in a safe protected filter bubble. Being in a safe, comfortable, familiar environment is not enough. The algorithms go further, seeking out ways to provoke us into response, into clicking and posting and forwarding and coming back later to carry on the 'engagement'.

For this the algorithms like to target our 'Lizard brain' - the most basic and primitive urges we all feel. Sure, a cute and happy post can make us feel good. But does it promote engagement? Not as much as something far more provocative.

"... anger is addictive—it feels good and overrides moral and rational responses because it originates from our primordial, original limbic system—the lizard brain"

"... anger makes people indiscriminately punitive, careless thinkers, and eager to take action. It colors our perception of what’s happening and skews ideas about what right action might be"

It contradicts common sense, but User Engagement is maximised most effectively by content that plays on our fears and provokes outrage, misery, jealousy or despair. This content does not objectively examine or discuss opposing views but rather presents them as a threat or disparages them as ridiculous. Your identity is affirmed by creating an 'us vs them' scenario that provokes and outrages you without making the site feel like a less safe place. Rather, the site is your bastion, your place of security from which you can hurl abuse at your foes and be cheered on by like-minded people without ever having to listen to the voice of reason.

Complex issues are simplified to fit in a tweet or headline and the messages make us feel good, even while they make us mad. The simplification creates an illusion that problems are easier to solve than they are, indeed that all problems would be solved if only they (whoever they are) thought like us.

Algorithms maximise the spread of this type of content over longer, more rational arguments aimed at discovering the truth and promoting co-operation and conciliation. Instead they push us into enclaves, divide us into tribes that cling ever more tightly to what separates rather than what unites. The algorithms discard honest debate and rational discourse in favour of emotional outbursts and denialism.

"a cursory glance at the tenor of cultural discussion online and in the media reveals an outsized level of anger, hyperbole, incivility, and tribalism"

Why? Well, simply because short, outrage-inducing pieces generate more 'engagement' than long, rational arguments do.

What is good for the individual, society and humanity at large is replaced by what will generate the most profits.

The algorithms are not trying to make the world a better place, not trying to benefit mankind. They are simply trying to maximise engagement and so maximise profit.

Maybe it is time to stop worrying about the symptoms of the Social Media malaise that affects us all (the collection, sale and exploitation of our private data). Perhaps we should rather worry about a culture that will unthinkingly maximise 'engagement' (and profits) without considering the broader impact these techniques have on us as individuals and on society as a whole.

“Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”


Some resources you can use on this topic:

  • Think Big video on brain hacking





Copyright Study Opportunities 2016 - 2021. All rights reserved.
