Friday, 11 March 2016

If You Were Told Minority Report Is Now A Reality, Would You Be Surprised?



How many years ago did you watch Tom Cruise in Minority Report and think, "Yeah, as if that'll ever happen!"? Well...
Police officers monitored by predictive algorithms?

We are in an age of big data, which is a fancy term for all the information we social networkers provide hourly to companies like Google and Facebook about our routines, our desires, our beliefs, our family lives, the foods we eat, and on and on. And what are Google, Facebook, and other companies like them doing with this data? All kinds of things, including building behaviour-predictive software.

It's simple: if you have large samples of people behaving in a certain pattern, it's probable that those patterns will be followed in the future. Better yet, you can have sophisticated machines rapidly sort through all that big data to make highly 'educated' predictions, or inferences, about that behaviour.
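To make that concrete, here's a minimal sketch of what pattern-based prediction looks like in practice. Everything in it, the feature names, the numbers, the "did the user come back" label, is invented for illustration; it is emphatically not how Google or Facebook actually do it.

```python
# A toy behaviour-prediction model. All features, numbers, and the
# underlying "pattern" are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical per-user features: daily logins, posts per week,
# minutes on site. Label: did the user return the next month?
n = 1000
X = rng.normal(loc=[1.0, 3.0, 30.0], scale=[0.5, 1.5, 10.0], size=(n, 3))
# Bake in a simple underlying pattern so the model has something to learn.
y = (0.8 * X[:, 0] + 0.3 * X[:, 1] + 0.02 * X[:, 2]
     + rng.normal(scale=0.5, size=n)) > 2.2

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
# The 'educated' prediction for one new user:
print("P(user returns):", model.predict_proba([[1.2, 4.0, 45.0]])[0, 1])
```

The point of the sketch is that nothing about it is mystical: past patterns go in, a probability about the future comes out.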

One example of this is the Data Science Machine, created by Max Kanter, an MIT graduate student in computer science, and his advisor, Kalyan Veeramachaneni, which can approximate human 'intuition' when it comes to data analysis. The article from Fastcodesign continues:

Using raw datasets to make models that predict things like when a student is most at risk of dropping a course, or whether a retail customer will turn into a repeat buyer, its creators claim it can do it faster and with more accuracy than its human counterparts.
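The core trick, as I understand it, is automating the feature engineering a human analyst would normally do by hand. Here's a hedged sketch of that idea, using an invented schema and a stock set of aggregations; this is not the Data Science Machine's actual code.

```python
# Mechanically derive candidate features from a raw event log, instead
# of having a human hand-pick them. Schema and data are invented.
import pandas as pd

# Hypothetical raw per-event data for an online course.
events = pd.DataFrame({
    "student_id": [1, 1, 1, 2, 2, 3],
    "minutes":    [12, 45, 30, 5, 8, 60],
    "quiz_score": [0.9, 0.7, 0.8, 0.4, 0.3, 1.0],
})

# Apply a stock set of aggregations to every numeric column; the
# result is one row of machine-made features per student, ready to
# feed into a dropout-prediction model.
features = events.groupby("student_id").agg(["count", "mean", "max", "sum"])
features.columns = ["_".join(col) for col in features.columns]
print(features)
```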

"If you have large samples of people behaving in a certain pattern, it's probable that those patterns will be followed in the future."

Turn this to individuals: we already have people claiming that one's mobile phone carries so much data that a clone of the user could be made simply by hacking into it. And with all the personal information posted on Facebook, and used in Google searches and other online activity, it isn't difficult to turn these algorithms toward individual human behaviour and predict a continuum of actions. In fact, that's precisely what the Charlotte-Mecklenburg Police Department is doing with a new program to predict when an officer may respond aggressively toward a civilian.

"Police will soon be watched by algorithms that try to predict behaviour," headlines an article from MIT Technology Review. It's an effort to improve relationships between police and citizens, but the process seems creepy at best. The Charlotte-Mecklenburg Police Department is working with researchers at the University of Chicago on algorithms that will supposedly forewarn against a spectrum of behaviours, from impolite traffic stops to fatal shootings.
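For a sense of what such "forewarning" might look like in code, here is a deliberately crude sketch of an early-warning flag. To be clear, this is not the University of Chicago team's model; the record fields, weights, and threshold are all invented.

```python
# A toy early-warning flag of the kind such a system might compute.
# NOT the actual research model; every field and weight is made up.
from dataclasses import dataclass

@dataclass
class OfficerRecord:
    complaints_last_year: int
    use_of_force_reports: int
    recent_stressful_calls: int  # e.g. domestic violence, suicides

def risk_score(r: OfficerRecord) -> float:
    """Weighted sum of warning signs; the weights are invented."""
    return (1.0 * r.complaints_last_year
            + 2.0 * r.use_of_force_reports
            + 0.5 * r.recent_stressful_calls)

def needs_intervention(r: OfficerRecord, threshold: float = 5.0) -> bool:
    """Flag an officer for supervisor review, not for discipline."""
    return risk_score(r) >= threshold

print(needs_intervention(OfficerRecord(2, 1, 4)))  # True: 2 + 2 + 2 = 6
```

Even in this cartoon version you can see the problem the rest of this post worries about: the flag is a statement about what an officer might do, not about anything he or she has done.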

"Police will soon be watched by algorithms that try to predict behaviour."

I remember when I first watched Minority Report back in 2002. The concept of police being able to predict a person's behaviour before it happened through advanced algorithms was stunning. How could such technology exist, I wondered. How would these predictions be verified? What would prevent someone from being wrongly accused, or arrested too early? At what point would such technology collapse the distinction between one's thoughts and one's actions?

These same questions remain. While there is no question that big data is being gathered on human behaviour, creating programs as spurious as behaviour prediction remains as concerning today as it was when millions were introduced to the idea by Tom Cruise. There remains a huge gap between what one could do and what one will actually do, regardless of how complex the algorithms are. Humans are not machines running meagre software, in which what happened yesterday is likely to happen today. We change. We have the power to grow, to modify our behaviour, to become better. In the same way, we have the ability to get into situations in which we act contrary to how we have responded in the past. We can fall prey to pressures and tensions that push us over the edge. This is the problem with such predictive algorithms: they foist the universal on the particular and turn it into a law. That is simple enough to do, but very difficult to verify, especially in court.

In spite of what you might believe about behaviour-predictive software, there remains something just as troubling: every day we continue to fill the coffers of Google, Facebook, and other tech giants with reams of data about our lives, and we're at the point now where we don't think twice about it. Having someone snoop through our purses or underwear drawers remains detestable, yet we hardly blink at third parties gathering data on everything from what we eat to whom we keep company with. We have wittingly or unwittingly wandered into the transparent society, and, barring a major catastrophe from North Korea or otherwise, there's no turning back.




