This Week in Tech
“Social media is the toilet of the internet.” - Lady Gaga.
Social media is a new phenomenon, unprecedented in its reach and adoption speed, and its influence is hard to measure or judge accurately. No living human has seen anything like it before. We are living through its birth and evolution - and trying to make sense of it - at the same time that we are expected to teach and guide our learners about its functions, uses and (as the syllabus is fond of saying) its advantages and disadvantages.
Whilst much of the fuss and furore about social media centers on the concept of privacy and how the social media companies exploit and mine our data for their own financial benefit, too little attention is paid to the even larger and more insidious problem of manipulative and addictive design. Think about this:
We live and work in an attention economy. In many ways, information is less important than user attention (also known as 'engagement').
Hundreds of thousands of hours of time, research and effort are therefore dedicated to figuring out how to make the 'service' something that you cannot live without - something you crave with all the intensity and harrowing, overwhelming need of the most enslaved drug addict.
Not because you really need it.
Not because it adds that much value and utility to your life.
Not because it makes you feel better.
Simply because it (the 'service') is purposefully designed to exploit every aspect of psychology, behavioural science, neuroscience and design finesse to hook you - so that a (very substantial) profit can be made from your addiction.
Whilst writing this article I checked on LinkedIn (i.e. on just one source) - there were 635 psychology-related jobs advertised at Facebook for the US alone.
There's a name for it. Persuasive Design. Books have been written about Persuasive Design. Courses created and presented at universities and online. Conferences arranged to help people and companies learn how to create and use it. Privacy is discarded and data gathered (with and without user consent) to be better able to implement it.
The biggest flaw of persuasive design is that we tend to focus on helping ourselves (the programmers / companies) rather than helping the users.
The motivation behind persuasive design is ostensibly to improve the user experience, to create a product that they enjoy using and want to use repeatedly.
The problem is economics. Creating, maintaining and hosting an online service is not cheap, and people want to make a profit - as big a profit as they can. Users, on the other hand, want to pay as little as they can, and subscriptions are not popular. The only solution anyone has come up with is advertising. Magazines and newspapers have done it for the longest time: they exist (economically speaking) not as a source of news or entertainment but rather as a way to collect eyeballs and attention so that their creators can make a profit by selling advertising. Their disadvantage is that they are not interactive - they cannot provide the immediate Pavlovian feedback that keeps users coming back for more. They are unable to leverage the
"...subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,”
Advertisers want to know that their messages are being seen. For that to happen Social Media sites need users to actively spend time on the site - time that they can measure and show as proof that adverts are being seen. Repeat Visitors and Time on Site are important metrics. They reflect an increased chance that the user is likely to see your advert multiple times which, in turn, increases the chance that they will respond to it or at least develop a familiarity with your brand or product - making it more likely they will seek it out when they need such a product in the future.
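These two metrics are straightforward to compute; here is a toy sketch (the visit log, user names and session lengths are made up for illustration) of how a site might derive them from session data:

```python
from collections import defaultdict

# Hypothetical visit log: (user_id, session_length_in_seconds)
visits = [
    ("anna", 120), ("ben", 45), ("anna", 300),
    ("carl", 60), ("anna", 90), ("ben", 200),
]

# Group session lengths by user
sessions = defaultdict(list)
for user, seconds in visits:
    sessions[user].append(seconds)

# Repeat visitors: users with more than one session
repeat_visitors = sum(1 for s in sessions.values() if len(s) > 1)

# Average time on site per session
avg_time = sum(seconds for _, seconds in visits) / len(visits)

print(f"Repeat visitors: {repeat_visitors} of {len(sessions)}")
print(f"Average time on site: {avg_time:.0f}s")
```

Real analytics systems track far more (scroll depth, clicks, return intervals), but the principle is the same: every second of attention is logged, aggregated and sold to advertisers as proof of eyeballs.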
“The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”.
We are all Experimental Subjects
It's not just all design and psychology theory though.
There are many millions of experiments being run on the internet every day to work out the best way to capture and keep user attention - and how to prod / tempt / guide / lure users into online actions that may not be in their own best interests.
How many of you / your learners know what A/B testing is?
How many know that every pixel of the Facebook interface is monitored and tested to see just how to maximise user engagement?
For example: If this 'Like' button is a little bigger and a slightly different shade of blue are users more or less likely to click on it? What about if the icon is bigger? Or if the icon is different? Or if we put the button above or below the article?
The best way to find out is to create multiple versions of a web page, show them to different people and measure their responses, then use statistical analysis to work out which version is better at getting users to do what you want them to do. This is what A/B testing is.
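The statistical analysis at the heart of an A/B test is often a simple two-proportion z-test: did version B's click rate really beat version A's, or could the difference be chance? A minimal sketch (the click and view counts are invented for illustration):

```python
import math

def ab_test(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test comparing two click-through rates."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the assumption of no real difference
    p = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p * (1 - p) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

p_a, p_b, p_value = ab_test(clicks_a=120, views_a=2400,
                            clicks_b=159, views_b=2400)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  p-value: {p_value:.3f}")
```

A p-value below 0.05 is the conventional threshold for declaring the difference real - at which point the winning button colour, size or position ships to everyone.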
And it happens all the time. Without our knowledge or consent. We are all part of one continuous experiment on how to make users do things to maximise engagement and profits.
There are many web services to help you do A/B testing. Google even offers one for free, called Google Optimize, which walks you through the steps of creating and running an A/B test. You can do it yourself. It's easy!
We are all slaves to the Algorithm
The final trick in the arsenal is that social media controls what we see. On the one hand, user engagement can be maximised by reducing 'friction'. It's simple - don't display content that disagrees with the user's (measured and tracked) world view, interests, likes, political viewpoint, etc. If the user is liberal, decrease or remove the number of conservative articles, posts, etc. in their feed. If they are conservative, then remove the liberal content.
Data is gathered and collected, and algorithms then process that data to narrow down the content presented to us. This is done in an effort to make the social media site a place we feel comfortable, relaxed and at home in, because our core identities are not being challenged by views that differ from our own.
Being comfortable and secure in our own identities helps to keep us online and 'engaging' for longer. Eli Pariser first popularised this concept in his book The Filter Bubble.
But algorithms don't just keep us in a safe protected filter bubble. Being in a safe, comfortable, familiar environment is not enough. The algorithms go further, seeking out ways to provoke us into response, into clicking and posting and forwarding and coming back later to carry on the 'engagement'.
For this the algorithms like to target our 'Lizard brain' - the most basic and primitive urges we all feel. Sure, a cute and happy post can make us feel good. But does it promote engagement? Not as much as something far more provocative.
"... anger is addictive—it feels good and overrides moral and rational responses because it originates from our primordial, original limbic system—the lizard brain"
"... anger makes people indiscriminately punitive, careless thinkers, and eager to take action. It colors our perception of what’s happening and skews ideas about what right action might be"
It contradicts common sense, but user engagement is maximised most effectively by content that plays on our fears and provokes outrage, misery, jealousy or despair. This content does not objectively examine or discuss opposing views but rather presents them as a threat or disparages them as ridiculous. Your identity is affirmed by creating an 'us vs them' scenario that provokes and outrages you without making the site feel like a less safe place. Rather, the site is your bastion, your place of security from which you can hurl abuse at your foes and be cheered on by like-minded people without ever having to listen to the voice of reason.
Complex issues are simplified to fit in a tweet or headline and the messages make us feel good, even while they make us mad. The simplification creates an illusion that problems are easier to solve than they are, indeed that all problems would be solved if only they (whoever they are) thought like us.
Algorithms maximise the spread of this type of content over longer, more rational arguments aimed at discovering the truth and promoting co-operation, conciliation and a shared understanding. They push us into enclaves instead, dividing us into tribes that cling ever more tightly to what separates rather than what unites us. The algorithms discard honest debate and rational discourse in favour of emotional outbursts and denialism.
"a cursory glance at the tenor of cultural discussion online and in the media reveals an outsized level of anger, hyperbole, incivility, and tribalism"
Why? Simply because short, outrage-inducing pieces generate more 'engagement' than long, rational arguments do.
What is good for the individual, society and humanity at large is sacrificed for what will generate the most profits.
The algorithms are not trying to make the world a better place, not trying to benefit mankind. They are simply trying to maximise engagement and so maximise profit.
Maybe it is time to stop worrying about the symptoms of the Social Media malaise that affects us all (the collection, sale and exploitation of our private data). Perhaps we should rather worry about a culture that will unthinkingly maximise 'engagement' (and profits) without considering the broader impact these techniques have on us as individuals and on society as a whole.
“Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
Some resources you can use on this topic:
Facebook never seems to stop putting its foot in its mouth. It's in the news again this week for, amongst other things, spreading fake news videos about the victims of the recent mass school shooting in the USA. China also features, with a concept sure to appeal to at least some of your learners - gaming schools! Then there's a whole batch of other interesting tidbits - including an in-depth article on how it is becoming more difficult to learn to program.
E-Sports - a career option?
E-Sports are a thing. There are competitions with significant prizes and even TV stations dedicated to covering people playing games such as StarCraft against each other. The Citizen has an interesting article on China's approach to the concept of learners playing games in school. Well worth a read...
|260 million people are already playing eSport games or watching competitions...|
|the eSport industry will be worth $906 million in global revenues in 2018|
China an example of future surveillance state?
Not really an article you can use in the classroom, but an interesting view of ways that the state can use technology to surveil its citizens. Engadget has the details.
Always Connected Windows - limits exposed
Remember a few posts back, when I mentioned the prospect of an 'Always Connected' Windows machine using ARM processors - and feared it would have the same kind of limitations as the failed Windows RT project? Well, DigitalTrends has an article detailing these limitations, taken from a support page that Microsoft briefly published (and then pulled). Spoiler: if you were expecting the full Windows experience, prepare to be disappointed.
YouTube, Facebook and Fake News
Recently 17 young people were killed in a school shooting in Florida. Or were they? Right-wing gun freaks claim it's all a hoax - and YouTube and Facebook spread their message... Business Insider has the details. Om Malik explains why Facebook will never change this kind of behaviour.
This YouTube channel has a set of lessons on computing theory that you could find very useful.
That's it! Hope it's useful.
Welcome to our 50th blog post. We hope that the blog has at least made one useful contribution to your teaching, classroom and / or learners. This week's news tends towards the lighter side and there are a couple of fun things you can show your learners to put smiles on their faces.
The first item on the agenda is MIT engineers proving that you don't need GPS and precise knowledge of location to improve an autonomous drone's ability to avoid obstacles. Instead they allow the drone to keep what they call a 'nano map' in memory, which the drone continually refers to. By comparing past images with the current image the drone can position itself relative to obstacles and take the appropriate evasive action. This is much closer to how we humans do things and reduces crash rates from 28% to 2%!
Weird Hardware Hack
Q: What do you get if you combine parts from a flatbed scanner, dot matrix printer and a hard drive, with some mechanical parts and a pencil?
A: The weird 'printer' below that uses a pencil to 'tap' out an image.
Useful? NO. Fascinating? Yes!
Robots continue their advance...
Wired has a story on how Boston Dynamics' Spot robot dog can now open doors (video below). Makes me think of the 'Metalhead' episode from Black Mirror season 4.
Or maybe not so much... The Winter Olympics provided the ideal opportunity for various robotics teams to show just how far robots still have to go. The narration is not in English, but the visuals are universally understandable.
5G and Wild Boars
More from the Winter Olympics. 5G is a specification that is only due to hit the mainstream in 2020. South Korea has been using the technology (capable of 10-gigabit-per-second data transmission) in various demonstrations throughout the Olympics. One of the uses is automated defences to keep wild boars from invading competition tracks. TechCentral has the details.
Recycling old computers into art
Zayd Menck has built a model of Midtown Manhattan (New York) from old computer parts...
In other news:
And that's it for this week. Enjoy!
This week has a lot of news in many mixed areas of interest. No space for an intro - just jump in and enjoy!
Amazon Go - 'Queue free shopping experience'
I'm not sure how it slipped past, but the last post was meant to include the new Amazon Go - 'Queue free shopping experience' shop that has just opened in Seattle.
Unfortunately there's a queue to get in...
RFID was always touted as the way that shoppers would be able to pile goods in their shopping cart and then simply walk out the shop and have the sensors automatically read the price of their goods and bill them without having to stand in a queue. That dream has not (yet) materialised - and is vulnerable to people doing things like removing the RFID tag from goods, swapping tags on expensive goods for cheaper ones, etc.
Amazon thinks they have a solution. A shop you can only enter by having your smartphone scanned, where many, many, many cameras track what you put into your basket so that the system can bill your credit card when you walk out. Several news outlets have tried shoplifting (and failed - here's Ars Technica's report on their attempt), but some YouTubers have claimed success.
There are some obvious countermeasures - shelves are designed to try to ensure that you can't put items back in the wrong place (to make it easier for the computers to identify them).
Here's Amazon's info page.
Think of the thousands of cashier jobs that will be lost if this technology proves a success (Forbes has).
UK Airport Security takes romance into consideration.
Digital Trends has the scoop - an amusing read.
Contactless (NFC) cards and security in SA
MyBroadband has an article where banks tout the safety of the system. No research, just spokespeople...
The value of Data
MyBroadband has an article on how Vodacom makes R2 BILLION per month on data alone.
Keeping fit... leaks info on military bases
Making data sharing an opt-out feature is always a bad idea. Sure, it lets companies be confident that they will be able to slurp up data from users who don't think about the fact that they are being tracked - or are too lazy (or ignorant) to turn off data sharing for the app. But even 'anonymised' data has its risks. This week it emerged that Strava, a fitness tracking service, has inadvertently spilled the beans on military and other secret installations around the world.
Users of products such as Fitbit go out for a run. Their route is tracked. The data is 'cleaned' and anonymised and uploaded. Strava thought it was a great idea to aggregate the data and display it on a global map so that fitness buffs could find popular places to run and exercise. Problem is, some of those routes are run by military personnel inside military bases... Read it at Hackaday and Nine.Com.au (some good graphics and explanations of consequences here).
More Privacy - G.D.P.R. and how tech companies are scrambling to prepare for it
This one is important. Europe has a new set of rules to protect privacy (the General Data Protection Regulation), which comes into effect on 25 May 2018. If your internet service breaches these rules then your company can be fined up to 4% of its annual global turnover. As you can imagine, big companies are working hard to make sure that they comply.
Often they take the easy way out - excluding privacy busting features of their products from the European market.
More Amazon - patent granted for wristband to track workers
Gizmodo has an article on a patent that has just been granted to Amazon. The patent is for a bracelet that workers will wear - and which will allow their hand movements to be tracked. This will allow the system to see if you are slacking off - or making mistakes. As the article points out, this is only a patent (at the moment) and probably serves as a way to treat human workers more like robots until robotics advances enough to replace them.
Cartoonist predicted the problem of intrusive cell phones - more than 100 years ago!
Boing Boing has more info on the cartoon and cartoonist.
Bitcoin miner uses oil to cool his rig
Submerging your computer in oil is an effective (if messy) way to keep it cool (oil does not conduct electricity but is good at dispersing heat). The really interesting thing about the article from Motherboard is some of the statistics it reveals about the cost of mining bitcoin. If you have been carried away by the soaring price of Bitcoin in the last short while, these stats will be of particular interest to you.
Bitcoin and TAX
If you have made some money from Bitcoin (or know someone who has) then read this. Hope you put aside the tax man's share...
MinION - Palm-sized DNA Sequencer
It took a group of scientists 13 years of work and cost $3 billion to map the human genome. Supercomputers and distributed computing techniques were needed to do the work. Now the MinION, the pocket-sized device in the video below, connects to your laptop or desktop using USB 3 (which also powers it) and can map a genome for as little as $1 000.
AR lets doctors see through your skin
Augmented Reality is so much more useful than Pokemon Go would make you think... Digital Trends has the low down on how researchers are displaying your insides on your outside to help doctors...
How much money (profit) do big companies make - per second?
Check out this interactive graphic to find out. Spoiler alert: Disney only makes $297 per second. Facebook makes $323 per second. Apple makes $1 445 per second!
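These per-second figures are just annual profit divided by the number of seconds in a year. A quick back-of-envelope check (the $45.6 billion input is illustrative, roughly in line with Apple's reported annual net income at the time):

```python
# Convert an annual profit figure to a per-second rate
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000

annual_profit = 45.6e9  # dollars per year (illustrative figure)
per_second = annual_profit / SECONDS_PER_YEAR
print(f"${per_second:,.0f} per second")
```

Running the numbers the other way round is a nice classroom exercise: multiply the graphic's per-second figure by 31,536,000 and see which company's annual report it matches.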
What's so special about this movie?
The entire feature-length movie was shot on an iPhone. No more excuses - you have the same camera tech in your pocket. Now go out and make a movie! More info available here at htxt.africa.
Paying for popularity
The New York Times has a great article on a company called Devumi that sells followers, tweets, retweets, etc. for people who need to boost their metrics to prove their popularity. Some of the followers they sell are automated bots based on real people - the product of identity theft.
|Devumi has more than 200,000 customers, including reality television stars, professional athletes, comedians, TED speakers, pastors and models. ...|
Devumi offers Twitter followers, views on YouTube, plays on SoundCloud, the music-hosting site, and endorsements on LinkedIn, the professional-networking site.
If you are still using Flash, it's time to stop!
Flash has been hacked again, with another zero-day vulnerability out in the wild. The Hacker News has the details.
That's it for this week....
Social interaction has always (in my mind) been humanity's Achilles' heel. It is here that our insecurities and fears are most exposed - and where our need to dominate and profit often rises above our more redeeming characteristics. The rise of mobile, always-on, always-connected computing has gone hand in hand with the rise of mega-companies that are little more than parasites - they ostensibly offer 'free' services that add value to our lives yet - leech-like - drain much of the good and decent and substantive from our lives and social interactions. It would seem there is no low they will not stoop to in order to maximise their own profits.
In recent weeks we have seen these giant corporations scrambling to explain how and why they sold adverts that influenced the American election; how and why they publish and promote fake news; how and why it is OK for the American President to spout divisive, bullying hate speech on their platform... I find myself viscerally sickened and repulsed by it all.
And yet their quest to inveigle themselves into our lives is ever more persistent, determined - and creepy. Two stories on Gizmodo this week particularly creeped me out:
Both stories deal with PYMK (People You May Know). Facebook wants you to make 'friends'. Their thinking (and research) is that the more friends you have, the more you will interact with their site (and the more money they will make from you). So they keep on suggesting people for you to connect with and be friends with. How they find these people is a closely guarded algorithmic secret (after all, other companies want you to connect to people using their network so that they can make money from you) and no one outside of Facebook really knows how it works.
PYMK uses '100 signals' to work out who to connect you to. Facebook refuses to say what these signals are. They deny that they use data bought from third parties or location data / location tracking in this mysterious algorithm. Yet they only vaguely describe around five of these 100 signals.
Both the articles describe extremely creepy connections that Facebook has made between users - connections that should not be possible.
Should any company have this kind of invasive power that they can wield at their own discretion without our having any recourse to prevent them?
The Reed Dance and social media
Facebook, Google and most other social media platforms try to block and censor nudity. But what if being bare-breasted is part of your culture?
The Mail and Guardian has an article on how local girls protest their bare-breasted photos from the Reed Dance being deleted from social media....
In case you missed it, Microsoft has discontinued support for Office 2007 (upgrade if you haven't already) and says that Windows 10 Mobile (and physical phones) is no longer a priority. The mobile space really belongs to Apple's iOS and Google's Android.
Kaspersky - Anti-Virus or Hacking tool?
If you use Windows then going without anti-virus software is like going into space without a spacesuit. It feels kinda suicidal. Of course, the fact that everyone needs anti-virus to protect themselves from the baddies who want to hack and steal data means that, well, the AV programs themselves are the perfect way to hack...
In the news this week is a complicated story of how Israeli intelligence hacked into Kaspersky AV to find proof that the Russians had hacked the AV software so that it would steal American spies' secrets. Sounds more complicated than a badly written Hollywood tech-spy thriller? Probably - but it is true nonetheless. Read it at Ars Technica (and many other places).
Technology and the future
MIT Technology Review has an interesting article on predicting the future of AI (and technology). It does a good job of explaining the limits of AI in its current forms (including the 'machine learning' that is a buzz concept today). Excellent, thoughtful and worth a read.
|We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run.|
|Roy Amara - Amara's Law|
A robotic massage
Digital Trends has an article on a massaging robot that has just started work in Singapore.
That's it for this week. Enjoy.
Copyright Study Opportunities 2016 - 2019. All rights reserved.