This Week in Tech
GPS. It lets us know where we are on the surface of our huge and amazing planet. Software allows us to combine our position with digitised maps and routing algorithms to find out how to get to a specified destination. Our location can even be used to draw up lists of nearby shops, attractions and facilities, so that even if we are new to an area we can easily know what destinations are around us.
But it is never quite so easy to let other people know where we are. Sure, we can share a location - if we are online and using the same app. Reading out latitude and longitude co-ordinates to tell someone where we are is tedious - and often inaccurate. What is needed is an easy-to-communicate global standard for describing a location on the planet.
Enter what3words. This amazing startup has divided the entire globe into a grid of 3m x 3m squares. Each square has been given a name made up of three words. 26 languages are supported (including isiZulu, isiXhosa and Afrikaans). These three-word addresses are easy to remember, easy to communicate and can be typed directly into a mobile app or online browser map to find the location they represent.
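The real what3words algorithm and word lists are proprietary, but the core idea - number every 3m square on the globe, then write that number as three 'digits' taken from a big word list - can be sketched in a few lines. This is purely a toy illustration with a made-up eight-word vocabulary, not what3words' actual scheme:

```python
# Toy three-word addressing sketch. Assumptions: a tiny invented word list
# and a flat-earth approximation of metres per degree. NOT the real algorithm.

WORDS = ["apple", "breeze", "cactus", "dune", "ember", "fjord", "grove", "harbour"]

CELL_METRES = 3.0            # each grid square is roughly 3m x 3m
METRES_PER_DEGREE = 111_320  # approximate metres per degree of latitude

def cell_index(lat, lon):
    """Map a latitude/longitude to a single integer cell number."""
    cols = int(360 * METRES_PER_DEGREE / CELL_METRES)  # cells around the globe
    row = int((lat + 90) * METRES_PER_DEGREE / CELL_METRES)
    col = int((lon + 180) * METRES_PER_DEGREE / CELL_METRES)
    return row * cols + col

def three_words(lat, lon):
    """Encode the cell number as three 'digits' in base len(WORDS).

    A real scheme needs a vocabulary large enough that
    len(WORDS) ** 3 covers every cell; eight words obviously does not,
    so this toy version wraps around and is not globally unique.
    """
    n = len(WORDS)
    idx = cell_index(lat, lon)
    w1, rem = divmod(idx, n * n)
    w2, w3 = divmod(rem, n)
    return WORDS[w1 % n], WORDS[w2], WORDS[w3]
```

The key property survives even in the toy: any two points inside the same 3m square get the same three words, and the mapping is deterministic, so the words can be read out over a phone and decoded anywhere.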
It's a unique idea, well worth pushing as a global standard. It has tremendous potential for businesses and customers to quickly and accurately communicate location. In the UK emergency services are adopting it as a standard and it is rapidly garnering support in many other places (including here in SA).
What if the company fails? What happens to your ability to convert locations into words and vice-versa? To quote from the site:
|If we, what3words ltd, are ever unable to maintain the what3words technology or make arrangements for it to be maintained by a third-party (with that third-party being willing to make this same commitment), then we will release our source code into the public domain. We will do this in such a way and with suitable licences and documentation to ensure that any and all users of what3words, whether they are individuals, businesses, charitable organisations, aid agencies, governments or anyone else can continue to rely on the what3words system.|
I'd really recommend installing the app, using it and telling as many other people about it as possible.
And the article's headline? That's one of my favourite places to camp.
GauGAN - the AI Artist
A short while ago I wrote about 'This Person Does Not Exist' - an AI project that creates realistic human faces from scratch. NVIDIA is experimenting with an app that can turn MS Paint style sketches into realistic-looking photographic images. The app is not generally available and relies on GPUs with dedicated AI hardware (Tensor Cores), so it is not something that you can rush out and try.
Some of the resulting images can look like bad uses of the cut, paste and clone stamp tools in Photoshop, but that even this much is possible is pretty amazing.
But the video is cool in a kinda awesome, breathtaking way. Well worth showing your learners.
Google - the serial app, product and services killer
I am a voracious reader of news. That's why I write this blog. I manage this by using RSS - and for a long time I relied on Google Reader as my go-to RSS reading tool. Seven years after creating it Google summarily cancelled Reader.
I also enjoy taking (and editing) photos. One of the best plugin suites for image editing is called the NIK Suite, of which Viveza is my favourite tool. Google bought the suite in 2012. It dropped prices drastically (from $500 to $130) and then, in 2016, started to give the suite away for free. In 2017 they decided to kill the NIK product line. Luckily DxO (a photography software company) bought the brand from them and has continued development.
The list of Apps and Services that have died at the hands of Google is long - and does not include examples such as the NIK photographic plugins (because they were bought out and so did not die). Many of these were not created by Google. They were bought; they had loyal, enthusiastic users who watched their favourite tools languish and die at the hands of a mindless behemoth that consumed them, used them up and excreted them on the dungheap of history.
How long is this list, you ask? Just take a look at KilledByGoogle.
Does that seem like the behaviour of a responsible digital citizen to you?
Talking of irresponsible: Facebook strikes again.
It might be a really good idea to change your Facebook or Instagram password. And any other password that is the same as your Facebook password (you naughty user you!).
Why? Because it turns out that Facebook kept hundreds of millions of users' passwords stored on internally accessible servers in plain text (i.e. in unencrypted format). That means any Facebook employee (or anyone else with access to the data) could look up the password of almost any Facebook user.
Likelihood that someone actually looked up your password: low. Change it anyway, to be safe. And think about just how irresponsible Facebook is when it comes to valuing / protecting your data and your privacy.
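For contrast, here is roughly what responsible password storage looks like: the service never keeps the plaintext at all, only a salted, deliberately slow hash. This is a generic sketch (using Python's standard-library PBKDF2), not a description of any particular company's system:

```python
import hashlib
import hmac
import os

# How responsible services store passwords: a random salt plus a slow,
# salted hash. Even an employee with database access sees only the digest.

ITERATIONS = 200_000  # deliberately slow, to frustrate brute-force attacks

def hash_password(password: str):
    """Return (salt, digest) to store instead of the plaintext password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def check_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Re-hash the attempt with the stored salt and compare."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
```

The point for the classroom discussion: even the site itself cannot recover your password from what it stores - which is exactly what was missing in Facebook's case.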
Malvertising vs Adware
CSO Online explains (includes a brief explanation of the use of steganography).
Fabian Fights Back - against Ransomware
Pay by Face
Not sure I'm ready for this. Apparently the Chinese are.
Follow up on Boeing 737 Max 8
Popular Science on software as part of aircraft design. ExtremeTech on how safety features that could have prevented the crashes were 'optional' (expensive) extras. CNN on how pilots with experience on other 737 models were 'trained' on the 737 Max 8 (with no reference to the new MCAS system in the course materials).
Profits over lives. Not looking good for Boeing.
I've known about people choosing to believe that the earth is flat for a while. What I have not known is the craziness of the world that these people inhabit. Ars Technica has an article that sums up the content of 'Behind the Curve' - a documentary screening on Netflix, Amazon and Google Play. Not really tech or IT related, but the article is worth reading and the documentary worth watching.
That's it for this week.
“Social media is the toilet of the internet.” - Lady Gaga.
Social Media is a new phenomenon. It is unprecedented in its reach and adoption speed. Its influence is hard to measure and judge accurately. No living human has ever seen anything like it before. We are living through its birth and evolution - and trying to make sense of it - at the same time that we are expected to teach and guide our learners about its functions, uses and (as the syllabus is fond of saying) its advantages and disadvantages.
Whilst much of the fuss and furore about social media centres on the concept of privacy and how the social media companies exploit and mine our data for their own financial benefit, too little attention is paid to the even larger and more insidious problem of manipulative and addictive design. Think about this:
We live and work in an attention economy. In many ways, information is less important than user attention (also known as 'engagement').
Hundreds of thousands of hours of time, research and effort are therefore dedicated to figuring out how to make the 'service' something that you cannot live without - something you crave with all the intensity and harrowing, overwhelming need of the most enslaved drug addict.
Not because you really need it.
Not because it adds that much value and utility to your life.
Not because it makes you feel better.
Simply because it (the 'service') is purposefully designed to exploit every aspect of psychology, behavioural science, neuroscience and design finesse to hook you - so that a (very substantial) profit can be made from your addiction.
Whilst writing this article I checked on LinkedIn (i.e. on just one source) - there were 635 psychology-related jobs advertised by Facebook for the US alone.
There's a name for it. Persuasive Design. Books have been written about Persuasive Design. Courses created and presented at universities and online. Conferences arranged to help people and companies learn how to create and use it. Privacy is discarded and data gathered (with and without user consent) to be better able to implement it.
The biggest flaw of persuasive design is that we tend to focus on helping ourselves (the programmers / companies) rather than helping the users.
The motivation behind persuasive design is ostensibly to improve the user experience, to create a product that they enjoy using and want to use repeatedly.
The problem is economics. Creating, maintaining and hosting an online service is not cheap, and its creators want to make a profit - as big a profit as they can. Users, meanwhile, want to pay as little as they can, and subscriptions are not popular. The only solution anyone has come up with is advertising. Magazines and newspapers have done it for the longest time: they exist (economically speaking) not as a source of news or entertainment but rather as a way to collect eyeballs and attention so that their creators can make a profit by selling advertising. Their disadvantage is that they are not interactive - they cannot provide the immediate Pavlovian feedback that keeps users coming back for more. They are unable to leverage the
"...subtle psychological tricks that can be used to make people develop habits, such as varying the rewards people receive to create “a craving”, or exploiting negative emotions that can act as “triggers”. “Feelings of boredom, loneliness, frustration, confusion and indecisiveness often instigate a slight pain or irritation and prompt an almost instantaneous and often mindless action to quell the negative sensation,”
Advertisers want to know that their messages are being seen. For that to happen Social Media sites need users to actively spend time on the site - time that they can measure and show as proof that adverts are being seen. Repeat Visitors and Time on Site are important metrics. They reflect an increased chance that the user is likely to see your advert multiple times which, in turn, increases the chance that they will respond to it or at least develop a familiarity with your brand or product - making it more likely they will seek it out when they need such a product in the future.
“The technologies we use have turned into compulsions, if not full-fledged addictions,” Eyal writes. “It’s the impulse to check a message notification. It’s the pull to visit YouTube, Facebook, or Twitter for just a few minutes, only to find yourself still tapping and scrolling an hour later.” None of this is an accident, he writes. It is all “just as their designers intended”.
We are all Experimental Subjects
It's not just all design and psychology theory though.
There are many millions of experiments being run on the internet every day to work out the best way to capture and keep user attention - and how to prod / tempt / guide / lure users into online actions that may not be in their own best interests.
How many of you / your learners know what A/B testing is?
How many know that every pixel of the Facebook interface is monitored and tested to see just how to maximise user engagement?
For example: If this 'Like' button is a little bigger and a slightly different shade of blue are users more or less likely to click on it? What about if the icon is bigger? Or if the icon is different? Or if we put the button above or below the article?
The best way to find out is to create multiple versions of a web page, show them to different groups of people and measure their responses, then use statistical analysis to figure out which version is better at getting users to do what you want them to do. That is what A/B testing is.
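The statistical analysis behind a basic A/B test can be as simple as a two-proportion z-test: did variant B's click rate differ from variant A's by more than chance would explain? Here is a minimal sketch (the click and view counts are invented for illustration):

```python
import math

def ab_test_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is B's click rate really different from A's?

    Returns the z-statistic; |z| > 1.96 is roughly 'significant at the
    5% level' under the usual normal approximation.
    """
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled rate under the null hypothesis that A and B perform the same.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    return (p_b - p_a) / se

# Did the bigger, bluer 'Like' button win? 200 clicks from 10,000 views
# for the old button vs 260 clicks from 10,000 views for the new one.
z = ab_test_z(200, 10_000, 260, 10_000)
```

With these invented numbers z comes out above 1.96, so the new button 'wins' and gets rolled out to everyone - multiplied across every pixel of an interface, this is how engagement gets optimised.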
And it happens all the time. Without our knowledge or consent. We are all part of one continuous experiment on how to make users do things to maximise engagement and profits.
There are many web services to help you do A/B testing. Google even offers one for free, called Google Optimize, and it walks you through the steps of creating and running an A/B test. You can do it yourself. It's easy!
We are all slaves to the Algorithm
The final trick in the arsenal is that social media controls what we see. On the one hand, user engagement can be maximised by reducing 'friction'. It's simple - don't display content that conflicts with the user's (measured and tracked) world view, interests, likes, political viewpoint, etc. If the user is liberal, reduce the number of conservative articles, posts, etc. in their feed. If they are conservative, then remove the liberal content.
Data is gathered and collected, and algorithms then process that data to narrow down the content presented to us. This is all in an effort to make the social media site a place we feel comfortable, relaxed and at home in, because our core identities are not being challenged by views that differ from our own.
Being comfortable and secure in our own identities helps to keep us online and 'engaging' for longer. Eli Pariser first popularised this concept in his book The Filter Bubble.
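The real ranking systems are vastly more complex and secret, but the filter-bubble mechanic itself is simple enough to show learners in a few lines. This is an assumed, toy feed ranker - score each post by how much it overlaps the user's tracked interests, then show only the most 'comfortable' ones - not any platform's actual algorithm:

```python
# Toy filter-bubble sketch (invented logic, invented posts):
# rank posts by tag overlap with the user's tracked interests and
# show only the top few, quietly dropping everything 'uncomfortable'.

def rank_feed(posts, interests, top_n=2):
    """posts: list of (title, set_of_tags). More shared tags = less friction."""
    scored = sorted(
        posts,
        key=lambda post: len(post[1] & interests),  # tags shared with the user
        reverse=True,
    )
    return [title for title, _ in scored[:top_n]]

profile = {"liberal", "climate", "football"}  # what the platform has inferred
feed = [
    ("Climate march draws thousands", {"liberal", "climate"}),
    ("Tax cuts explained", {"conservative", "economy"}),
    ("Derby ends in a draw", {"football"}),
    ("Border policy debate", {"conservative"}),
]
bubble = rank_feed(feed, profile)
```

Run it and the conservative posts simply never appear - the user is not told they were filtered out, which is precisely what makes the bubble invisible.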
But algorithms don't just keep us in a safe protected filter bubble. Being in a safe, comfortable, familiar environment is not enough. The algorithms go further, seeking out ways to provoke us into response, into clicking and posting and forwarding and coming back later to carry on the 'engagement'.
For this the algorithms like to target our 'Lizard brain' - the most basic and primitive urges we all feel. Sure, a cute and happy post can make us feel good. But does it promote engagement? Not as much as something far more provocative.
"... anger is addictive—it feels good and overrides moral and rational responses because it originates from our primordial, original limbic system—the lizard brain"
"... anger makes people indiscriminately punitive, careless thinkers, and eager to take action. It colors our perception of what’s happening and skews ideas about what right action might be"
It contradicts common sense, but user engagement is maximised most effectively by content that plays on our fears and provokes outrage, misery, jealousy or despair. This content does not objectively examine or discuss opposing views but rather presents them as a threat or disparages them as ridiculous. Your identity is affirmed by creating an 'us vs them' scenario that provokes and outrages you without making the site feel like a less safe place. Rather, the site is your bastion, your place of security from which you can hurl abuse at your foes and be cheered on by like-minded people without ever having to listen to the voice of reason.
Complex issues are simplified to fit in a tweet or headline and the messages make us feel good, even while they make us mad. The simplification creates an illusion that problems are easier to solve than they are, indeed that all problems would be solved if only they (whoever they are) thought like us.
Algorithms maximise the spread of this type of content over longer, more rational arguments aimed at promoting co-operation, conciliation and arriving at a shared truth. Instead they push us into enclaves, divide us into tribes that cling ever more tightly to what separates rather than what unites. The algorithms discard honest debate and rational discourse in favour of emotional outbursts and denialism.
"a cursory glance at the tenor of cultural discussion online and in the media reveals an outsized level of anger, hyperbole, incivility, and tribalism"
Why? - well, simply because short, outrage inducing pieces generate more 'engagement' than long, rational arguments do.
What is good for the individual, society and humanity at large is sacrificed for what will generate the most profits.
The algorithms are not trying to make the world a better place, not trying to benefit mankind. They are simply trying to maximise engagement and so maximise profit.
Maybe it is time to stop worrying about the symptoms of the Social Media malaise that affects us all (the collection, sale and exploitation of our private data). Perhaps we should rather worry about a culture that will unthinkingly maximise 'engagement' (and profits) without considering the broader impact these techniques have on us as individuals and on society as a whole.
“Facebook and Google assert with merit that they are giving users what they want,” McNamee says. “The same can be said about tobacco companies and drug dealers.”
Some resources you can use on this topic:
So we are IT / CAT teachers. By definition we encourage the use of screens and tech. How ambivalent does that make us feel when countless headlines from the media scream out reminders that 'screen time' is bad for kids and should be controlled / limited / eradicated completely? I know that I have felt the inner conflict at times. Surely so many pundits and experts can't be wrong? Is what I am doing actually bad for the children under my care?
(Here are some examples of the dire 'screen time is the apocalypse and is turning our kids brains to mush' warnings out there: New York Times "I am convinced the devil lives in our phones." and here (Oct 2018); Business Insider; Quartz; The Guardian; IOL; and many, many, many more.)
To add to this, I have taken things a step further. I believe that an online, interactive textbook is a better tool for our learners than a traditional textbook - and I have gone ahead and "put my money where my mouth is" to create just such a thing (check it out at LearningOpportuinities.co.za).
And yet I still have this nagging question inside me about whether I am only making things worse....
But, here's the thing. Deep inside me I know that this hysteria about screen time is wrong. Screen time is not the problem. It's how the screen is used and what is on the screen that are the cores of the problem.
Screen time is often used as a nanny / pacifier (dummy) by adults too busy and caught up in their own lives to become involved with their children on a meaningful level. The screen keeps the kids quiet and out of your hair for hours at a time. It's a miracle of modern technology! Give it to the kids and they go away and don't bother you.
The screen time most known and feared by concerned adults (parents, teachers, researchers and especially sensationalist media) is the passive, vegetative watching of meaningless video (YouTube), hours of gaming and other isolating, unproductive activities (which to my thinking should include use of social media).
Screens, especially the small screens we carry around with us all the time (smartphones and tablets), are technological incarnations of the Dr Jekyll / Mr Hyde dichotomy (free ebook here at Gutenberg.org). They are not all bad (and not all good). They can be used for reading (as an avid ebook reader since before the advent of the iPhone and tablet, I can and do sing the praises of the wonder of a library in my pocket). Not all videos are bad (there are many useful tutorial videos on YouTube as well as the mindless gunk). Some games are really great (if you have not tried - and made your learners play - Human Resource Machine then you need to stop reading this article now and do so; it's a great way to understand how a CPU works!).
So what do we do when confronted by people telling us that screen time is bad?
My response is to ask how the screen is being used.
Is the kid given a screen and expected to go away, shut up and keep themselves busy in an unsupervised way? Yes. That kind of screen time is bad.
Do you spend meaningful time with kids doing all sorts of activities (including outdoor activities, chores, sports, games and screen time) and so naturally keep a balance in their lives? Do you share screen time with them, discuss what is on the screen - and make sure that the things available on their screen are not all mindless dreck? Do you encourage the use of the screen to discover, explore and create new things? Do you encourage and foster independence and self-reliance by showing how the screen can be used to find solutions to problems?
These questions point to a realisation that handling the screen differently can transform what could be bad into something good.
It's about time we protagonists of tech took a stand and said that IT doesn't have to be this way!
This rant was prompted by finally seeing an article, "In defence of screen time", on TechCrunch, reading it and feeling that it does not go far enough...
This week's news links:
|First, the right to have personal data minimized. Companies should challenge themselves to strip identifying information from customer data or avoid collecting it in the first place. Second, the right to knowledge—to know what data is being collected and why. Third, the right to access. Companies should make it easy for you to access, correct and delete your personal data. And fourth, the right to data security, without which trust is impossible.|
|One of the biggest challenges in protecting privacy is that many of the violations are invisible. For example, you might have bought a product from an online retailer—something most of us have done. But what the retailer doesn’t tell you is that it then turned around and sold or transferred information about your purchase to a “data broker”—a company that exists purely to collect your information, package it and sell it to yet another buyer.|
|The trail disappears before you even know there is a trail. Right now, all of these secondary markets for your information exist in a shadow economy that’s largely unchecked—out of sight of consumers, regulators and lawmakers.|
|Let’s be clear: you never signed up for that. We think every user should have the chance to say, “Wait a minute. That’s my information that you’re selling, and I didn’t consent.”|
VR seems to be going the way of 3D TVs. You saw a lot of hype about it for a while but are seeing less and less as time goes on. Why?
That's it for this week. Happy teaching!
Blockchain. It's the new buzzword in tech. Everybody is talking about it. Everybody wants to use it. In some ways it feels like a solution in need of a problem. But boy, when that problem is identified... we might see blockchain add some order and accountability to the otherwise unruly wild-wild west of cyberspace.
What is blockchain?
When you hear the word blockchain you probably immediately think of cryptocurrencies like Bitcoin or Ethereum. Whilst blockchain is part of the technology that makes these currencies possible, it is not a currency itself. It is easier to understand what blockchain is if you set the idea of cryptocurrencies aside for a moment.
Blockchain is a public, distributed way of tracking transactions, secured by cryptography. Let's break this down:
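The core idea can be shown in a few lines of code: bundle transactions into a block, include the hash of the previous block, and hash the whole thing. Because each block's hash depends on its parent, tampering with any old transaction breaks the entire chain. A minimal sketch (toy data structures, no networking or mining):

```python
import hashlib
import json
import time

def make_block(transactions, prev_hash):
    """Bundle transactions with the previous block's hash, then hash the lot."""
    block = {
        "timestamp": time.time(),
        "transactions": transactions,
        "prev_hash": prev_hash,  # this link is what makes it a *chain*
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def chain_is_valid(chain):
    """Tamper check: every block must still hash correctly and link to its parent."""
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False  # block contents were altered after hashing
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # link to the previous block is broken
    return True

genesis = make_block(["genesis"], prev_hash="0" * 64)
chain = [genesis, make_block(["Alice pays Bob 5"], genesis["hash"])]
```

Change so much as one character in an early transaction and `chain_is_valid` fails - that, distributed across thousands of independent nodes, is the whole trick.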
Other resources and explanations:
The blockchain stuff in the news this week:
|In 2014, Maersk followed a refrigerated container filled with roses and avocados from Kenya to the Netherlands. The company found that almost 30 people and organisations were involved in processing the box on its journey to Europe. The shipment took about 34 days to get from the farm to the retailers, including 10 days waiting for documents to be processed. One of the critical documents went missing, only to be found later amid a pile of paper.|
A dose of reality amidst the hype:
Blockchain is designed to record transactions. That is all.
More data added to each transaction (e.g. notes, images, other fields in the record) means more data to hash, store and replicate across every node in the network. Also, each transaction is only valid once 'accepted' by more than 50% of the network, which can make validating a transaction much slower. As the list of transactions or 'blocks' grows, so does the computing power needed to manage the blockchain, and each transaction becomes slower to process. What incentive is there for people to keep on running their part of the distributed network with no reward but considerable cost?
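Much of that cost comes from proof-of-work mining: in Bitcoin-style systems, nodes race to find a 'nonce' that makes the block's hash start with enough zeros, and the only way to find one is brute force. A toy version (real networks use far higher difficulty, which is exactly where the energy goes):

```python
import hashlib

def mine(block_data: str, difficulty: int):
    """Brute-force a nonce whose SHA-256 hash starts with `difficulty` zero hex digits.

    Each extra zero multiplies the expected number of attempts by 16,
    so difficulty is a direct dial on how much electricity gets burned.
    """
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest  # proof found: cheap to verify, costly to find
        nonce += 1

# Difficulty 4 takes tens of thousands of hashes on average -- trivial here,
# but Bitcoin's real difficulty demands quintillions of hashes per second.
nonce, digest = mine("Alice pays Bob 5", difficulty=4)
```

The asymmetry is the point: anyone can verify the proof with a single hash, but producing it required the whole search - which is why the network's energy bill scales with its security.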
|"...last year it was claimed that the computing power required to keep the [Bitcoin] network running consumes as much energy as was used by 159 of the world’s nations"|
Here are some people raining on the parade...
|“Right now, Ethereum can process 17 transactions per second. Facebook can handle 175,000 requests per second. Visa, 44,000 transactions per second. So, if we really want to use cryptocurrencies as currencies, it would not be possible as of this moment.”|
|Justas Pikelis, co-founder of blockchain e-commerce platform Monetha|
|As of late 2016, it [Bitcoin] can only process about seven transactions per second, and each transaction costs about $0.20 and can only store 80 bytes of data.|
Hacks and Cracks
Robots and Robotics
Social media & social implications
That's it for this week. Hope you have a much better understanding of blockchain and are no longer stymied when the learners ask tricky questions about it!
Welcome back. Lots of news to catch up on...
Social media and Privacy
Hacks and scams
Wow. That's a lot to catch up on!
Welcome back & happy teaching....
Copyright Study Opportunities 2016 - 2019. All rights reserved.