This Week in Tech
Last year I repeatedly wrote about fake news - largely quoting articles that revealed the latest bit of fake news, or which provided tips on how to avoid becoming a victim of it. Towards the end of the year I reduced the mentions of fake news in the blog. This was more to avoid ‘ranting’ than because the amount of fake news had decreased in any way.
In the meantime I have spent quite some time thinking about why the problem of fake news exists. There are, of course, multiple factors that contribute to the phenomenon. Some of my conclusions are explained a little further into this post.
As teachers you don’t really have time for long, philosophical arguments and discussions of the topic. It is, however, covered in the syllabus under the section relating to validating information / web sites. When teaching about fake news the core of the matter boils down to:
All the aspects of fake news are covered in detail in the Data Communications section of my Grd 10 IT Theory textbook (find it at learningopportunities.co.za - a year’s subscription is only R100).
What I really want to do in this blog is explore the WHY of fake news.
NB: The usual list of links and news comes after this longer than normal piece. Just scroll down to get to them if you want to skip this.
ALSO: This is my personal take on the issue. I am sharing it and inviting comment, not trying to be arrogant (assuming I have all the answers) or trying to teach you something that you may already know. If you don't like it or feel I am patronising you, then just jump to the end - it's not worth raising your blood pressure over!
Fake news is nothing new. It has been around in many forms throughout the millennia. Its sudden elevation to a problem that should be of grave concern to any thinking person is due to its scale, the speed and ease with which it spreads through electronic media - and the inclination of large numbers of people to accept it as true without question (and even to defend it when it is questioned).
Unquestioning acceptance of (and belief in) fake news is the core issue.
Let's go back in time to before the internet...
‘Cost of entry’ not only made it difficult for anyone to publish their version of the news, it also limited the number of publications available. News was available, but not in the form of the information deluge that we have to deal with today.
Though news sources were never entirely impartial, it was often much easier in the past to detect the bias and editorial commitment to truth of a publication - and evaluate the likelihood of it publishing untrue, fake or unverified content. Publications had clear reputations. Some were respected. Some not. In apartheid South Africa you were far more likely to find accurate news in ‘The Weekly Mail and Guardian’ than in the government controlled (at the time) ‘Citizen’. The ‘New York Times’ was clearly much more reliable than ‘The National Enquirer’.
Just the source of the news helped give you an idea of whether it was likely to be fake or not.
On top of that, publications checked the veracity of their content to satisfy the requirements of the armies of editors and lawyers that vetted any controversial content for fear of legal consequences or censure from professional bodies.
The publishing revolution.
Then along came the internet. And with it came blogs and social networks and video sharing sites and micro-blogging and photo sharing and instant messaging - and so on. Publishing your message suddenly has no cost. Reaching an audience of millions has no cost. Suddenly publishing is available to anyone with a computer and the skills to put up a web site. Media is democratised. We have a brave new world where information is 'free' and no rich media moguls or governments can block uncomfortable truths from coming out.
On top of this there is money to be made - especially if your message is new and short and controversial enough to get millions of eyeballs to look at it. Millions of eyeballs = $$$ in advertising. And anyone can do it. Even those for whom the truth is unimportant as long as they make $$$. Even those who don't care about truth or money but have some other goal to achieve (such as discrediting a person or making sure someone gets elected).
Now we have news that anyone can publish whether they researched it or made it up. They can publish it and reach audiences of millions around the world. So the door to fake news opens up.
What happened to fact checking?
Most of those eyeballs that earn the advertising dollars will only look at something once.
The first with the news gets the advertising bucks.
Suddenly speed to publication is more important than accuracy. So checking facts before publishing goes out the window (too slow: your post won't be first, the eyeballs and advertising dollars will go to the first one to publish). It's easier to apologise and retract afterwards - even though people will only remember the original, sensational, incorrect content.
So we have news that is not checked before it is published. The door to fake news is opened even wider.
The insulation of the Filter Bubble.
The burgeoning of sites gives us too many choices, too many sources of information.
So we tend to settle on one source - preferably a source that gives us our news the way we like it. And we like it all in one place, served up on a platter. Lots of people should use this news source - because after all, lots of people can't be wrong / fooled...
Our news source should preferably only deal with the topics we want to read about.
What we end up with is a news source that caters to our prejudices and preferences and which keeps us in a nice, cozy filter-bubble - and so obviates the need to think and engage with anything that disturbs our world view.
Social media companies know this - and also know that conflicting content that requires some effort to resolve is off-putting for the average reader. Anything off-putting is likely to reduce the user's screen time (and so the money that the social media site makes). So they filter the content. They only let the user see what they expect and like to see (whether it is true or fake). A user that is not conflicted or unhappy will keep clicking and scrolling for longer - and earn them more money.
It's hard to think that something might be fake if it is the only version of the news that you see - and if you only see the same news repeated across multiple stories without contradicting articles. So the filter bubble that these sites create makes it more difficult to detect fake news (even Google is guilty of this - it generates its own filter bubble, so you are likely to see only search results that match up with the content of the news that you read).
Now people are only reading news that matches what they think they know to be right. Their ability to identify fake news decreases.
The problem of collation.
Often the simplest way out is to get our news on social media.
This way our news is all in one place and many millions share the same news source, so the news must be correct, mustn't it?
The internet floods us with information - too much information. From all types of sources. Good or bad. True or fake. Real or rumour / gossip / propaganda.
Social Media news at least seems to control this flood - but getting all the news through social media has another effect. To us, the end users, all the news comes from the same place - the social media site.
It's hard to use the tool of 'checking the quality of the source publication' when all the news seems to come from the same source (most people when asked where they get their news will answer 'Facebook' - not publication xyz through Facebook). It seems as if even recognising and acknowledging the real source of the content is too much effort for us.
The melding of all news sources into one means that media reputation (i.e. 'you can't believe that - everyone knows that publication X is junk') no longer applies. The true is published next to the fake in the same place. It's so much easier just to believe it all than to try to figure out the difference.
Fast and Furious
The tsunami of information and 'news' on social media has another consequence. We are overcome with a sense that information is a huge, daunting, unclimbable mountain that we shy away from. Our lives are too busy to 'read all that shit'... So we want our information doled out in bite sized, pre-digested, simplified chunks - which we only skim read in any case. This skimming forces headline creators to try all sorts of tricks to grab our attention - even if it means bending the truth or completely fabricating the story.
Only the sensational gets our attention.
And it is the sensational that gets shared.
We are far more likely to 'share' something short and sensational (or 'cute' or 'inspirational') than a meaty, in-depth discussion of any topic at all.
And the more sensational a news item is, the faster we are likely to share it - with as many people as possible. We also want to be 'first' with our shares. Often we share fake news without even stopping for a moment to think about whether it is true or not.
So it is that fake news spreads quicker than a measles epidemic in an anti-vaccination community.
Now we have news that spreads so fast and is shared by so many people that it becomes difficult to think that it might be fake.
Lazy / Partisan readers
Many people are too lazy to check the accuracy of news - that would mean reading multiple sources, comparing the differing facts, thinking - and then forming their own view. It's just much easier to accept what you've read as 'true' - after all, it was in the news.
The last piece of the puzzle ties in with the filter bubble mentioned earlier. Partisanship means that people cling to ideas (political and otherwise) that form part of their identity. They are unwilling to question news that affirms their ideas - and quick to reject any news that conflicts with their ideas.
The issue for them is not the accuracy of the news but their own perception of themselves and their view of reality. They will accept and defend any fake news that confirms them. They will reject any true news that threatens or contradicts them. The truth does not matter. Only what they believe in.
Often these are the people creating the fake news in the first place - for consumption by people who share their world view.
For them the truth will never matter.
Weekly news summary:
The following links provided courtesy of Claire Smuts
Hardware / Software
That's it for this week.
So we are IT / CAT teachers. By definition we encourage the use of screens and tech. How ambivalent does that make us feel when countless headlines from the media scream out reminders that 'screen time' is bad for kids and should be controlled / limited / eradicated completely? I know that I have felt the inner conflict at times. Surely so many pundits and experts can't be wrong? Is what I am doing actually bad for the children under my care?
(Here are some examples of the dire 'screen time is the apocalypse and is turning our kids brains to mush' warnings out there: New York Times "I am convinced the devil lives in our phones." and here (Oct 2018); Business Insider; Quartz; The Guardian; IOL; and many, many, many more.)
To add to this I have taken things a step further. I believe that an online, interactive textbook is a better tool for our learners than a traditional textbook - and I have gone ahead and "put my money where my mouth is" to create just such a thing (check it out at LearningOpportunities.co.za).
And yet I still have this nagging question inside me about whether I am only making things worse....
But, here's the thing. Deep inside me I know that this hysteria about screen time is wrong. Screen time is not the problem. It's how the screen is used and what is on the screen that are the cores of the problem.
Screen time is often used as a nanny / pacifier (dummy) by adults too busy and caught up in their own lives to become involved with their children on a meaningful level. The screen keeps the kids quiet and out of your hair for hours at a time. It's a miracle of modern technology! Give it to the kids and they go away and don't bother you.
The screen time most known and feared by concerned adults (parents, teachers, researchers and especially sensationalist media) is the passive, vegetative watching of meaningless video (YouTube), hours of gaming and other isolating, unproductive activities (which to my thinking should include use of social media).
Screens - especially the small screens we carry around with us all the time, smartphones and tablets - are technological incarnations of the Dr Jekyll / Mr Hyde (free ebook here at Gutenberg.org) dichotomy. They are not all bad (and not all good). They can be used for reading (as an avid ebook reader since before the advent of the iPhone and tablet, I can and do sing the praises of the wonder of a library in my pocket). Not all videos are bad (there are many useful tutorial videos on YouTube as well as the mindless gunk). Some games are really great (if you have not tried - and made your learners play - Human Resource Machine, then you need to stop reading this article now and do so; it's a great way to understand how a CPU works!).
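For teachers who want to show learners the idea behind the game, the fetch-decode-execute cycle that Human Resource Machine models can be sketched in a few lines of Python. This is a hypothetical toy interpreter for illustration only - the instruction names are inspired by, but not identical to, the game's actual command set:

```python
# Toy interpreter illustrating the fetch-decode-execute cycle
# (hypothetical mini instruction set, loosely modelled on
# Human Resource Machine; not the game's real commands).

def run(program, inbox):
    outbox = []
    hand = None          # the single value the "worker" carries
    pc = 0               # program counter
    while pc < len(program):
        op, *arg = program[pc]          # fetch and decode
        if op == "INBOX":               # execute
            if not inbox:
                break                   # no more input: halt
            hand = inbox.pop(0)
        elif op == "OUTBOX":
            outbox.append(hand)
        elif op == "JUMP":
            pc = arg[0]                 # jump back to a label
            continue
        pc += 1
    return outbox

# Copy every inbox item to the outbox - like the game's first level.
program = [("INBOX",), ("OUTBOX",), ("JUMP", 0)]
print(run(program, [3, 1, 4]))   # → [3, 1, 4]
```

Stepping through the loop on the board (or in a debugger) makes the program counter and the fetch-decode-execute rhythm visible in a way a diagram alone does not.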
So what do we do when confronted by people telling us that screen time is bad?
My response is to ask how the screen is being used.
Is the kid given a screen and expected to go away, shut up and keep themselves busy in an unsupervised way? Yes. That kind of screen time is bad.
Do you spend meaningful time with kids doing all sorts of activities (including outdoors activities, chores, sports, games and screen time) and so naturally keep a balance in their lives? Do you share screen time with them, discuss what is on the screen - and make sure that the things available on their screen are not all mindless drek? Do you encourage the use of the screen to discover, explore and create new things? Do you encourage and foster independence and self reliance by showing how the screen can be used to find solutions to problems?
These questions lead to the realisation that handling the screen differently can transform what could be bad into something good.
It's about time we protagonists of tech took a stand and said that IT doesn't have to be this way!
This rant is prompted by finally seeing an article "In defence of screen time" on TechCrunch, reading it and feeling that it does not go far enough...
This week's news links:
First, the right to have personal data minimized. Companies should challenge themselves to strip identifying information from customer data or avoid collecting it in the first place. Second, the right to knowledge—to know what data is being collected and why. Third, the right to access. Companies should make it easy for you to access, correct and delete your personal data. And fourth, the right to data security, without which trust is impossible.
One of the biggest challenges in protecting privacy is that many of the violations are invisible. For example, you might have bought a product from an online retailer—something most of us have done. But what the retailer doesn’t tell you is that it then turned around and sold or transferred information about your purchase to a “data broker”—a company that exists purely to collect your information, package it and sell it to yet another buyer.
The trail disappears before you even know there is a trail. Right now, all of these secondary markets for your information exist in a shadow economy that’s largely unchecked—out of sight of consumers, regulators and lawmakers.
Let’s be clear: you never signed up for that. We think every user should have the chance to say, “Wait a minute. That’s my information that you’re selling, and I didn’t consent.”
VR seems to be going the way of 3D TVs. You saw a lot of hype about it for a while but are seeing less and less as time goes on. Why?
That's it for this week. Happy teaching!
Welcome back. A question for you: Would you prefer fewer news links with more 'in depth', thought-provoking pieces - or do you like the flood of news that tended to happen towards the end of last year? Please let us know in the comments. To give you an idea of the alternatives: the last few blog posts of last year were in the 'list of items' category, and this post gives you a little sample of what a more 'in depth' approach might look like...
Automation. It's the big bogeyman of tech - the process of getting rid of the human factor in the name of increased efficiency and productivity at the cost of jobs for real people. We often only tend to think about it in abstract - as something that happens in big factories where robots stand side by side in long rows on assembly lines. The reality is that automation happens everywhere - even in small businesses and offices. A small example with significant consequences is the automation of parking payment systems in malls. There's even a field of study devoted to devising new ways to automate things - anyone want to be an Automation Engineer?
We talk a lot about the consequences of automation - the loss of jobs, the fact that people need higher levels of education and training to qualify for the new jobs available (because all the low skilled jobs have become automated). Does anyone think about the people who come up with the automation ideas - or have to implement automation in their workplace - and how it affects them?
Brian Merchant at Gizmodo has a piece on the topic well worth reading. It's called So you automated your coworkers out of a job.
The types of jobs that are lost to automation are also changing. Here's a take (from FastCompany) on jobs that will be at risk in the future. This is not the list of jobs that you typically associate with robots and automation - but is a trend that is emerging and is well worth paying attention to.
How many jobs are at stake? Well, according to this study from McKinsey, up to 73 million jobs in America alone could be lost to automation by 2030. That's after the over 500 million jobs already lost worldwide to date.
Maybe we should pay more attention to the discussion about the potential need for (and feasibility of) a GBI (guaranteed basic income)....
Sysadmins. The guys behind the scenes who make sure that your tech runs smoothly. The guys who manage your networks, keep your encryption up to date, make your backups... Also known as tech support or the IT dept.
Sysadmins have more insight, power and knowledge than you might be aware of. In fact, the biggest 'Gangsta' trial of this century - that of the drug kingpin El Chapo - hinges on information gained when his Sysadmin was convinced to become a witness for the prosecution. You can find details at Gizmodo (again).
Remember that the next time you are rude or inconsiderate to the IT guy.
For some fun videos on the trials and tribulations of life as a Sysadmin, look at this playlist on YouTube. You can also try to get hold of The IT Crowd (a Channel 4 comedy). It's also fair to warn your learners that this is the most mainstream career path for many IT professionals - and I'm pretty sure you can also entertain them with many stories from your own personal experience.
It might be a good idea to draw up a profile of skills needed by IT people (and a separate one for programmers) that we can use during subject choice discussions in grade 9 to help learners know how well suited they are for the subject.
Hope you like the new format - PLEASE leave some feedback in the comments section.
All the best for the new teaching year!
086 293 2702 or 012 546 5313
012 565 6469
Copyright Study Opportunities 2016 - 2019. All rights reserved.