The greatest threat to democracy & world order is the internet

PHIL FITZPATRICK

TUMBY BAY - There is at least one commentator on PNG Attitude who thinks I’m a conspiracy theorist, so I thought I’d throw this idea into the mix to see what sort of reaction I get.

The idea became apparent while I was reading Michiko Kakutani’s excellent little book, ‘The Death of Truth’ (William Collins, 2018).

And it’s all down to an otherwise innocent little tool called an algorithm.

An algorithm, as you are probably aware, is a kind of recipe or ordered set of steps that, if followed, will result in an answer to a problem.
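
To make that concrete, here is a minimal sketch in Python (the language and example are my choice, purely for illustration) of an algorithm in exactly that ‘recipe’ sense: a fixed, ordered set of steps that turns a question into an answer.

```python
def largest_number(numbers):
    """Find the largest value in a list by following a fixed recipe.

    Step 1: take the first number as the largest seen so far.
    Step 2: compare each remaining number against it.
    Step 3: whenever a bigger number turns up, remember that one instead.
    Step 4: once every number has been checked, the remembered value is the answer.
    """
    largest = numbers[0]
    for n in numbers[1:]:
        if n > largest:
            largest = n
    return largest

print(largest_number([3, 17, 5, 42, 8]))  # prints 42
```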

Computer programmers design algorithms for all sorts of reasons, including selling us stuff or influencing the way we think.

You are probably familiar with the advertisements that pop up on your computer or digital device screen while searching the web.

Believe it or not, an algorithm has been at work processing your previous searches and is presenting you with the options most likely to appeal to you.

Search engines like Google and sites like Facebook, Twitter and YouTube all use algorithms to provide information attractive to you. They do this based on earlier data they’ve collected about you. Many sites trade this information for monetary gain.

When you Google something, the information you receive might be completely different to the information received by someone else who has used the exact same words in their query.

If you believe in climate change, for instance, the information you receive will be quite different to that which a climate change denier receives.
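
The real systems are vastly more sophisticated than anything that could be shown here, but a toy sketch in Python (the ‘interest profiles’ and scoring rule below are invented for illustration, not how any particular platform works) conveys how two people typing the same query can be handed different results once their past behaviour is factored in.

```python
# Toy personalised ranking: the same query, different front-page results,
# depending on what the platform has already recorded about each user.
# The interest profiles and scoring rule are invented for illustration only.

results = [
    {"title": "Climate change: the scientific consensus", "tags": {"science", "climate"}},
    {"title": "Why the warming scare is overblown",       "tags": {"sceptic", "opinion"}},
    {"title": "Ten tips for cheap air travel",            "tags": {"travel"}},
]

def rank_for(user_interests, items):
    # Score each result by how many of its tags match topics the user
    # has engaged with before, then show the best-matching items first.
    return sorted(items, key=lambda item: len(item["tags"] & user_interests), reverse=True)

believer = {"science", "climate"}   # profile built from one user's past clicks
sceptic  = {"sceptic", "opinion"}   # profile built from another user's past clicks

print(rank_for(believer, results)[0]["title"])  # the consensus article comes first
print(rank_for(sceptic, results)[0]["title"])   # the sceptic article comes first
```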

Most of us believe that search engines are neutral affairs and therefore unbiased.

That’s unfortunately not true. Your computer or device is pandering to your personal biases and tastes. It is actually isolating you into an increasingly narrow frame of content. Kakutani calls these “content silos”.

This might be unsettling in itself, but it becomes particularly so when you look at the people behind the algorithms - not so much the programmers as the designers.

If algorithms are going to decide what information we are exposed to we need to be very careful about the intent and motives of the people controlling them.

Politicians have always played loose with truth and reality but the internet has given them a whole new means of making mischief.

In just a few short years they have managed to replace truth with opinion and the objective with the subjective. Truth has become ‘fake news’ and opinions have become ‘alternative facts’.

When the internet first appeared we welcomed it as the dawn of a new age connecting people everywhere and leading to creative solutions for many of our problems. If anything was going to democratise the world it would be the internet.

Algorithms have been with us for thousands of years and have been very useful. Modern technology has seen an incredible proliferation in their invention and use.

Little did we realise there would be a dark side.

As Kakutani says:

“The same web that’s democratised information, forced (some) governments to be more transparent, and enabled everyone from political dissidents to scientists and doctors to connect with one another – that same web, people are learning, can be exploited by bad actors to spread misinformation and disinformation, cruelty and prejudice”.

Conspiracy theories now flourish on social media. So do simplistic and inflammatory political messages, such as those used by Donald Trump and the Brexit advocates in Britain.

Politicians and others can now use social media algorithms to psychologically profile millions of potential voters. They have become the ultimate tool of Big Brother.

George Orwell may very well be chuckling in his grave. Or perhaps he is weeping.

Comments

Andy McNabb

I have a contact in ASIO (call sign Cornflakes). We meet regularly at the breakfast cereal aisle at Coles. We check behind the Kelloggs boxes for microphones.

He is the top man in Conspiracies and he is investigating a line that Kelloggs have been putting certain substances into Cornflakes which make men lose hair after middle age and in some cases become completely bald.

He penetrated deeply to find Kelloggs have shares in the Miracle Hair Replacement organisation.

Thank heavens we have ASIO to detect these conspiracies and we can keep our hair.

Bernard Corden

Neil Postman, Andrew Keen, Thomas Kuhn and Paul Feyerabend are essential reading.

Philip Fitzpatrick

I think we have to be mindful that with the development of any system, technological or otherwise, there will always be people prepared to subvert that system to their own advantage and there is not a lot we can do to prevent them.

Some of the earliest users of the internet, for instance, were pornographers, and pornography makes up a huge part of internet use. Whether that is a useful thing or not is a moot point.

I think you are right about the ethics issue but how can you impose and enforce ethical standards on something like the internet? At best you can only hope that individual internet companies enforce their own codes of conduct.

That said, and as recently demonstrated, some of these companies, like Facebook, cannot control what their users do no matter what standards they set.

Peter Quodling

Let me just preface this with some credentials - I have worked in the IT industry for around 40 years. One of my mentors was the person at Google who developed “Predictive Caching” in their search engine (the bit that seems to second-guess you).

He and I have discussed at length the failings of the technical model of most search engines - they are based on the numerical instances of any particular datum, not the qualitative aspects of the information gathering.

That said, the assumption by the likes of Donald Trump and others that search engines are innately biased is demonstrably wrong.

But the issue is not the technology - it is the implementation and utilisation of that technology without consideration of the ethics of its use.

A case in point is the Australian health care records system. As we age, this makes more sense, but (a) the government has a track record of data breaches, and (b) it has offered nothing in the way of assurances that personal information won’t be on-sold to insurance companies or the like, who may then use it against the interests of the individual.

Technologically, there are ways to counter these compromises of privacy, but they themselves need better understanding and a commitment to deliver them. I am currently considering a doctoral thesis on the subject.
