Tuesday 24 October 2017 at 12:44

World Without Mind by Franklin Foer

By Eric Antoine Scuccimarra

Back when I was in college I told someone "the problem with the internet is that no one has figured out how to make money off of it." At the time that was probably true (this was the mid-90s), although some people were working very hard on the problem. During the dot-com boom it seemed like that problem had been solved, but then the dot-com bust reopened the issue. Now that statement is so incorrect it could be a joke.

The first dot-com boom was largely about trying to actually create value with the internet - it was about creating new products and services that could be useful to consumers. Now the paradigm for big tech has completely changed. The users are no longer the consumers but the product, which is sold to advertisers and corporations in the form of data that lets them better target their advertisements. The more time people spend on websites like Facebook, Twitter and Google, the more data those companies can collect and the better the advertisers can target their ads. A key component of this paradigm is getting users to spend as much time as possible online, and this is done by making the products as addictive as possible. This is why Facebook rations "likes" - if someone "likes" something you posted you may not be notified about it immediately; instead the likes are stretched out to keep you checking back regularly. This eventually leads to the premise of this book - that big tech is becoming a threat to democracy and freedom of thought.
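
As a purely hypothetical illustration of the like-rationing idea (this is my own sketch, not Facebook's actual implementation, and the scheduling rule is invented), a notification system could hold back likes and release them a few at a time:

```python
from collections import deque
from datetime import datetime, timedelta

class LikeNotificationScheduler:
    """Hypothetical scheduler that drips out "like" notifications
    instead of delivering them all at once."""

    def __init__(self, interval_minutes=90):
        self.pending = deque()          # likes not yet shown to the user
        self.interval = timedelta(minutes=interval_minutes)
        self.last_release = datetime.min

    def record_like(self, liker, post_id):
        # Store the like rather than notifying the user immediately.
        self.pending.append((liker, post_id, datetime.utcnow()))

    def release_batch(self, now=None, batch_size=1):
        # Release at most batch_size notifications, and only if the holding
        # interval has elapsed, so likes get spread across repeat visits.
        now = now or datetime.utcnow()
        if not self.pending or now - self.last_release < self.interval:
            return []
        self.last_release = now
        return [self.pending.popleft()
                for _ in range(min(batch_size, len(self.pending)))]
```

The point of a scheme like this is that the delay itself is the product decision: every withheld notification is another reason to open the app again later.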

At the moment I am very interested in machine learning. When most people think of "artificial intelligence" they probably think of Skynet or HAL, but in reality it is really just about data. It is not all that difficult to take in large data sets and predict some variables in terms of others - this is essentially what machine learning is. Every time you get a new data point, you can update the model to get better predictions. I imagine this is what Facebook and Google do. Facebook uses its algorithms to predict which items you are likely to click on if placed in your news feed, and then it gets more data based on whether you actually click on them or not. Similarly, YouTube does the same thing to determine which video to play next, and then it gets a new data point by seeing whether you watch that video or not. From a machine learning perspective this is a dream scenario: every click or skip becomes a fresh labeled example, so you just set up the algorithms and they learn on their own, refining themselves constantly.
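
To make that concrete, here is a minimal sketch of the kind of incremental click-prediction loop I'm describing, using scikit-learn's SGDClassifier. Strictly speaking this is supervised online learning - each click or non-click becomes a label - and the features and function names here are invented for illustration, not anything Facebook or YouTube have published.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# A logistic-regression-style model updated incrementally: every time an item
# is shown, the observed click (1) or non-click (0) becomes a new label.
model = SGDClassifier(loss="log_loss", random_state=0)  # "log" in older scikit-learn
classes = np.array([0, 1])

# Seed the model once so predict_proba works before any real feedback arrives.
model.partial_fit(np.zeros((2, 4)), classes, classes=classes)

def item_features(item, user):
    # Invented features: topic match, poster affinity, recency, past click rate.
    return np.array([[item["topic_match"], item["poster_affinity"],
                      item["recency"], user["past_ctr"]]])

def rank_feed(candidate_items, user):
    """Order candidate items by predicted click probability."""
    scores = [model.predict_proba(item_features(it, user))[0, 1]
              for it in candidate_items]
    return [it for _, it in sorted(zip(scores, candidate_items),
                                   key=lambda pair: pair[0], reverse=True)]

def record_feedback(item, user, clicked):
    """Fold the newly observed data point back into the model."""
    model.partial_fit(item_features(item, user),
                      np.array([int(clicked)]), classes=classes)
```

Because the same model both chooses what to show and learns from what gets clicked, the loop optimizes for engagement by construction.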

In the past TV news operated differently - the networks were expected to run their news departments at a loss as a public service, in exchange for the right to broadcast and make money off advertising during entertainment programming. The news had no reason to bend the truth, or present partisan opinions or "alternative facts." With only three networks to choose from, the country had a set of common facts that everyone agreed on. Now people choose the news they watch based on what they want to hear: if you don't like what Fox News is saying you turn on MSNBC, and if you don't like what they are saying you turn on a different channel. This results in a feedback loop - rather than having incorrect assumptions challenged, they are constantly reinforced by the self-selection of news that agrees with your existing viewpoints and opinions.

In this book Mr Foer explains how big tech companies are doing the exact same thing and how it is detrimental to democracy. Facebook is not going to want to show you something that you are not going to like or agree with; while challenging people is good for public discourse and independent thought, it does not tend to engage most people. "Confirmation bias" is the tendency to search for, interpret, favor, and recall information in a way that confirms one's preexisting beliefs or hypotheses. When people are presented with information that contradicts their opinions they tend to downplay it, ignore it, or question its validity. People don't like to have their beliefs challenged, so to keep people engaged big tech is not going to try to challenge anyone.
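
To see why an engagement-driven feed drifts toward confirmation rather than challenge, here is a toy simulation - entirely my own construction, not from the book - where a feed learns a single preference weight from clicks. Because agreeable items get clicked more often, the weight quickly locks the feed into showing them almost exclusively:

```python
import random

def simulate_feed(user_lean=0.8, rounds=50, seed=0):
    """Toy model: items either agree (+1) or challenge (-1) the user's views.
    Returns the fraction of shown items that agreed with the user."""
    rng = random.Random(seed)
    weight = 0.0            # learned preference for agreeable content
    shown_agreeable = 0
    for _ in range(rounds):
        # Rank a mixed candidate pool by the learned weight and show the top item.
        candidates = [1, -1] * 5
        candidates.sort(key=lambda item: weight * item, reverse=True)
        item = candidates[0]
        shown_agreeable += item == 1
        # The user clicks agreeable items often and challenging items rarely;
        # each click (or non-click) nudges the weight.
        clicked = rng.random() < (user_lean if item == 1 else 1 - user_lean)
        weight += 0.1 * item * (1 if clicked else -0.5)
    return shown_agreeable / rounds

print(simulate_feed())  # typically close to 1.0: the feed shows mostly agreeable items
```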

Most people assume that we analyze the facts and data and then come up with an opinion on an issue. I believe the opposite is true. Most people already have their opinions, even on issues they have no information about, and then they choose facts and data to confirm what they already believe. Where do the beliefs come from then? Mostly from hearing opinions from other people - whether on TV news, in newspapers, or on social media. There are shortcuts we use to decide which opinions to hold in the absence of facts: if we hear a lot about something we tend to assume it is true, and if we hear something from someone we like we tend to assume it is true. The media, online and offline, have a huge influence on our opinions and beliefs, but they are not motivated by telling us the truth or improving society - they are motivated by advertising dollars. The goal of TV news is to keep people glued to the TV, not to keep them informed. When there is a big news story they will play it up as much as possible rather than reporting on things that may be more important but less sensational.

Feedback loops are just one of the many problems caused by big tech, albeit one that is quite relevant today since they played a large role in the last US election. Mr Foer provides a comprehensive overview of the ways big tech is harming society in the pursuit of profits. He uses the example of the New Republic, of which he was the editor, to illustrate how big tech is taking over people's minds. If you want details on this I suggest you read the book, but in a nutshell, the magazine went from publishing well-researched, in-depth articles on various subjects to adopting the currently prevalent click-bait model of journalism - basically a very short article with a catchy headline and photo, designed for the sole purpose of getting as many eyes on it as possible. Mr Foer says the change in online journalism is largely driven by social media, which now drives a huge amount of online traffic. People get their news from Twitter and Facebook now, and if an article isn't going to interest someone with a 140-character headline and a catchy photo, it probably isn't going to get read at all.

I personally avoid social media as much as possible, and I am becoming disillusioned enough with what the internet has become that I am thinking of changing careers. When I started working in web technologies, the internet was going to revolutionize the world by making huge amounts of information available at a click. Now the internet is mostly about getting people addicted and clicking back as much as possible, all so that corporations can collect the data they need to target advertisements at you and convince you to buy things you don't really need.

Labels: technical, books

