Algorithms rule us all – VPRO documentary – 2018

[Music] Algorithms define our lives: which political content you get to see on Facebook, who gets a job, who is being released from prison. With algorithms as the buzzword, an enormous efficiency operation is being conducted, taking decision-making away from humans and handing it over to the rules and laws of the computer.

"Evangelists of data are pushing this vision that we should actually sort of hand over our free will to the computers. That's freaking nuts."

But what are we surrendering to?

"When you're giving over control, it's not to an algorithm, it's to a company, and until you get clear on that, you're going to be taken advantage of."

What does the world look like when algorithms determine our personal future?

"I'm almost a year out, trying to change other people's lives. Had it been up to COMPAS, I would still be in prison."

"There's no button in the app that says: hold on, I have a problem with my bike. It's just on or off."

This is Backlight. Welcome to your digital future. [Music]

The role computers play in our lives is changing. Where previously we viewed computers as a tool to help us make decisions, we now allow the computer, with the help of big data and algorithms, to take over an ever-increasing number of decisions about our lives.

"The way I use the word algorithm, it's a predictive tool. It's a tool that predicts success." Cathy O'Neil is a mathematician and data scientist. She worked on Wall Street for a large hedge fund, which she quit during the financial crisis of 2008. Since then she has been conducting research into the role of algorithms in our lives.

"To build an algorithm you really only need two things: you need some kind of concept of historical information, like what happened in the past, and you need a very precise definition of success. So for example, on a daily basis, if you're deciding what to wear, you're going to care about what clothes you have in your closet, which of those clothes are clean, how much did they cost, how comfortable were they last time you wore them, do they fit you well, what's the temperature outside, what are you planning to do today. There's lots of data that you have on that day to think about, but really what you're doing is thinking back to the past: when I wore this outfit before, this is the problem I had with it, or this is the risk of this outfit, because sometimes when it's windy this skirt flips up. So you incorporate your different definitions of success, which are usually pretty complicated, as well as the kinds of risks you're taking in wearing a particular outfit. When you build an algorithm with a computer, you have to make everything absolutely precise, and it's typically a sort of watered-down toy version of something that you would do in your head."
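O'Neil's recipe, historical data plus a precise definition of success, fits in a few lines of Python. The following is a minimal sketch with an invented wardrobe history and a deliberately crude success definition; it is exactly the kind of "watered-down toy version" she describes, not any real system.

```python
# Cathy O'Neil's two ingredients, as a toy: (1) historical data,
# (2) a precise definition of success. All data and weights are invented.

history = [
    # (outfit,       comfort 0-10, compliments, skirt_flipped_in_wind)
    ("blue suit",    6,            3,           False),
    ("green skirt",  8,            5,           True),
    ("jeans, shirt", 9,            1,           False),
]

def success_score(comfort, compliments, flipped, windy_today):
    """A deliberately watered-down 'definition of success':
    reward comfort and compliments, penalize a known risk."""
    score = 0.5 * comfort + 1.0 * compliments
    if flipped and windy_today:   # "sometimes when it's windy this skirt flips up"
        score -= 5.0
    return score

def recommend(windy_today):
    """Think back over the past and pick the outfit that scored best."""
    return max(history,
               key=lambda o: success_score(o[1], o[2], o[3], windy_today))[0]

print(recommend(windy_today=False))  # -> "green skirt" (highest past score)
print(recommend(windy_today=True))   # -> "blue suit"  (the skirt's risk kicks in)
```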
These computerized decision makers are taking over large parts of our lives. The bike route we take and who we fall in love with are today often determined by computer algorithms, and for some this means they literally spend their days being guided by computer code.

"There are no dispatchers working at Deliveroo; it's the algorithm that decides where to send people." Basically, anytime you used to have a difficult decision made by a process that was complicated and hard, now you'll see an algorithm.

"The algorithm is a mystery for us. We don't know how it works; nobody explained it to us. [Music] And the problem is that I have absolutely no control over the delivery system. I'm just waiting for the deliveries to come. Deliveroo wants me to be there, to be on my shift, even if there are not enough orders to make a minimum wage. If there are no deliveries, for whatever reason, then I'm not getting paid. The problem is that the algorithm just thinks about the profit. The algorithm is not a human being; it has just been programmed to do one thing, and that's all it does. I think that we are now facing a huge change, and we will be getting more and more of these kinds of jobs, where people are pushed into precarious conditions, where all the costs, the insurance costs, and all the risk are pushed onto the people, but all the profit goes to the company."

"Getting a job, keeping a job, assessments, scheduling jobs: that's also a complicated and difficult process that people get upset about. Any time you can sidestep that part where people get upset, because they think they're being unfairly treated, you can say: let's turn this into an algorithm, and then we can tell people to talk to the machine — which is to say, talk to no one, because there is no way to talk to a machine."

"The algorithm doesn't have empathy. The algorithm doesn't have any other goal than to squeeze as much as possible out of our work."

"I think it's reasonable to think of it as sort of the automated version of a human process. It's not new, it is just automated. So in some sense an algorithm is a toy version, a very simplified, kind of stupid version of the human process it's replacing."

"That's called a flat tire. [Music] It is very problematic to put it on as well. Sometimes I haven't seen anyone from Deliveroo in, like, six months."

Can you tell the app that you have a flat tire?

"No, I just have to log out, and I will not get paid for this time, of course, or for the costs, for anything. There's no button in the app that says: hold on, I have a problem with my bike. It's just on or off."
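The couriers' complaint can be made concrete with a toy dispatcher. The sketch below is purely illustrative and not based on any real platform's code: it optimizes a single objective and represents nothing else about the courier's situation, which is the whole grievance.

```python
# Illustrative only -- not any real platform's code. A dispatcher that
# optimizes one thing (send the nearest courier) and knows nothing else.

couriers = {"ana": (52.52, 13.40), "ben": (52.50, 13.35)}  # id -> (lat, lon)

def dispatch(order_location):
    """Assign the order to the nearest on-shift courier, full stop."""
    def dist(pos):
        return ((pos[0] - order_location[0]) ** 2
                + (pos[1] - order_location[1]) ** 2) ** 0.5
    return min(couriers, key=lambda cid: dist(couriers[cid]))

print(dispatch((52.51, 13.38)))  # -> "ana"

# Note what is *not* here: no dispatch(..., flat_tire=True), no minimum
# earnings when there are no orders, no "hold on, I have a problem with
# my bike". A courier is either in `couriers` (on shift) or not (logged
# out, and unpaid) -- just on or off, exactly as the courier says.
```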
Now that algorithms have taken over work such as the management of bike couriers, the question remains whether they are doing a good job. Meike Zehlike is a data scientist who researches whether algorithms make good, objective decisions.

"People tend to think of an algorithm as being isolated and able to actually think, but this is not the case. It is not really intelligent in the sense that we would define intelligence for human beings. People tend to think that big data and artificial intelligence are going to save our lives. It was really a gold-digger atmosphere in the beginning, because people were like: we just have a ton of data, and we feed it into the algorithm, and then it will tell us whatever we want to know. And this turned out to be wrong. The data scientists and mathematicians especially were proclaiming that math is neutral, and because it's all just math, it has to be fair — certainly fairer than human beings, because there is no human being involved, right? There is just data and math, voilà, beautiful. So I think it was really hard for them to digest that their beloved algorithms could actually be worse than human beings. Data is not neutral in itself, because data is full of stereotypes, full of the concepts of our culture, or of a certain culture that we are living in. There are certain things the whole world seems to agree on, but there are also a lot of things it doesn't. Homosexuality is a very common example, because there are a lot of societies where people think it is a crime, that it is perverted, and there are also quite some societies nowadays that think it's just fine. And if you have criminal statistics that involve homosexuality, you would probably have a lot more men in there, and things like that. The trouble is: what do you do now? We have these biases in the data, and this is what my whole research area is about — to discuss and to find solutions, different approaches to deal with this, because we just have to live with it; we are not going to get rid of it." [Applause] [Music]

Algorithms, and the data with which they are fed, sometimes appear to make huge mistakes or to be blatantly racist. For instance, an Asian resident of New Zealand was refused a passport because in his photo his eyes were allegedly closed, and Google's advertising system showed well-paid jobs to men more often than to women. Why did these seemingly objective systems make such big mistakes?

"We're realizing that these algorithms are not in fact less biased than humans, for two reasons mostly. The first is that the data itself reflects human bias. If you're deciding who was successful at a company in the past, you probably defined success by who got raises, who got promotions, who stayed for a long time. But if you think through all those examples, they're all culturally defined. Who gets a raise, who gets promoted, who stays for a long time — those are all things where the answer could be very different depending on whether the culture is sexist or racist or something like that. So when we are asking those questions and answering them with data, we are embedding the bias. The other kind of data problem is just missing data: we simply do not have the data for some kinds of people. And I would argue that when we talk about crime data, we don't really have crime data; what we use as a proxy is arrest data. But we really don't have consistent policies for who we arrest for what crimes, and in particular we have a lot of missing arrest data. We don't arrest white people for smoking pot in this country nearly as much as we arrest black people for smoking pot. So the data itself is reflecting the human bias, either by the bias in the data or by the missing data." [Music]
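O'Neil's arrest-data point can be simulated directly. In this sketch the underlying behavior is identical across two invented groups; only the recorded arrest rate differs, and the resulting dataset faithfully reproduces the policing pattern rather than the behavior. All rates are made up.

```python
# Simulating O'Neil's point: we don't have crime data, we have arrest
# data. Same behavior in both invented groups; only the chance of
# being *recorded* differs.

import random
random.seed(0)

TRUE_RATE = 0.10                      # identical behavior everywhere
ARREST_RATE = {"A": 0.90, "B": 0.15}  # very different odds of arrest

def apparent_rate(group, n=100_000):
    arrests = 0
    for _ in range(n):
        offended = random.random() < TRUE_RATE
        if offended and random.random() < ARREST_RATE[group]:
            arrests += 1              # the dataset only ever sees this
    return arrests / n

for group in ("A", "B"):
    print(group, f"apparent 'crime' rate: {apparent_rate(group):.3f}")
# -> roughly 0.090 vs 0.015: group A looks six times more "criminal",
#    though the behavior was identical by construction. A model trained
#    on these labels learns the policing pattern, not the crime.
```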
Even though algorithms are not perfect, in the United States they are being used to make decisions previously made by judges, and it then becomes quite important to know how great the margin of error is.

"I was trained on data in finance, and in finance 'good enough' is a very, very low standard. The standard for good enough is: you're right 51% of the time. You can make money if you're right 51% of the time. But in the context of putting someone in prison, that's definitely not good enough. It depends on the stakes. If the stakes are low, then who cares, as long as it's better than guessing. But if the stakes are high, if somebody's human rights or constitutional rights can be violated if they're unfairly scored, it really matters whether you're wrong." [Music]

When algorithms make decisions about an individual's life, it can have far-reaching consequences. Glenn Rodriguez discovered this to his cost. A judge decided at his conviction that he would become eligible for early release after 26 years, but an algorithm disrupted this.

"Unfortunately, had it been solely up to the COMPAS risk assessment, I would still be in prison today. And here it is: I'm almost a year out, trying to change other people's lives."

"He's scheduled to be here today and I was wondering what's going on. I am a youth case manager. What I do is a lot like what a probation officer would do: I have court-involved youth that I am charged with the task of monitoring. It's pretty much just making sure that they're doing all the right things, and doing the right thing means going to school, behaving at home, attending the program, being home on time for curfew. If you miss a week straight, we're obligated to notify the judge, just like if you don't come here for two weeks straight. But you definitely have to start doing something different, because this is not going to work."

"So this was a picture of me. This was the sixteen-year-old Glenn, the Glenn who committed the offenses for which I served 26 and a half years. I committed a robbery, and in the process of that robbery a person lost their life, and so I was charged with murder, robbery, and criminal possession of a weapon, at the age of sixteen."

"Well, it could be a lot better, I'll tell you that much. I'll show you right now: the attendance here hasn't been the greatest. You've actually done a lot better in the past."

"I come from a very dysfunctional family background. My mother was murdered when I was a child, at the age of three, and my father committed suicide when I was four, so I was raised by my grandmother. Growing up I had a lot of self-esteem issues, and so this young man here was the young man who would do anything to prove himself to his friends and the people he felt were cool, and do anything to fit in, which is what led to my crime. This kind of reinforces that what I'm doing is the right thing at this point, because I actually work with youth, and oftentimes when I speak to them I see myself in a lot of them. In the very beginning, quite frankly, at that age, as a teen, I just didn't see the light. The way I saw things was: I'm going to die in prison, this is it. However, once I arrived at about the midway point of my sentence, I said to myself: you know what, if I made it this far, I can actually do this again; I can see myself doing the rest of this. So I started to get involved with a number of programs, and I developed such a love for the programs that I didn't want to risk losing them, and so I started taking gradual steps, taking corrective measures, to the point where I started adopting a lot of my own advice."

"After serving 26 and a half years you're eligible for parole, and what happens is you appear before a panel of three parole board members, and they get to decide whether at that particular point you are fit for society. They ask you a series of questions, and based on your answers it generates this bar chart that gives them a snapshot of who you are. At the time that I went to my first parole hearing, I hadn't had a misbehavior report in over a decade. However, because I had misbehavior reports in the nineties, that document for some reason portrayed me as a very hostile individual, very violent, and so I was actually denied parole because of that. So this is the bar chart that was generated as a result of all the questions. It lists me under prison misconduct as an 8 — so, high. Everything else is low: unlikely for this, unlikely for that, low for risk of committing another felony, low for arrest risk, low for risk of absconding, criminal involvement a 1, which is the lowest you can get. But when it comes to prison misconduct, it's an 8, and it all boils down to this section here:
question number 19: 'Does this person appear to have notable disciplinary issues?' For this person they noted: yes. In my conversations with other convicts, it turns out that some of them, for whom that question had been checked 'no', had received a lower score than I had — despite the fact that one had a stabbing incident just 26 months prior, as opposed to me, who hadn't had any incidents of misbehavior in over a decade. They checked yes, and I score an eight."

"All right, what else? Oh, and listen, they started that group. So yeah, I'm going to show you where the group is, it's down the hall, I've got to show you."

"In challenging the COMPAS risk assessment: there's no transparency as to what the algorithm is and how these numbers are calculated. No one has a clue as to how this software actually works, how much weight is attributed to which questions, and so whether or not there's actual bias built into it, no one would know. These algorithms belong to private companies that the government employs, but because they're private companies, the algorithm is considered proprietary information. It's secretive; everything is a secret; you have no access to it. And so my concern is that bias can actually be built into the algorithm, and we would have no way of knowing it. In this country, usually, when you have a trial or you're faced with criminal charges, you have the right to challenge the evidence that's presented against you. A parole board is no different from a trial: you're pretty much going there and trying to prove your case, and they have evidence against you and they use it against you, and the algorithm happens to be part of that evidence. So you should be able to challenge it, and in order to challenge it successfully, it's important to know how it's actually reaching the conclusions it's reaching." [Applause]

Cynthia Conti-Cook is a lawyer who often takes on cases where an algorithm plays a role in the court's decision. She is familiar with the case of Glenn Rodriguez.

"When a judge is making a decision, the judge has to make a record about what factors they considered in coming to a final decision, and if we think that the judge has made the wrong decision based on that record, we can appeal that decision and ask a higher court to review what the lower judge's thought processes were, and whether they were in line with other laws, like the constitution. With an algorithmic model that just pops out a yes-or-no answer, our attorneys don't have the opportunity, number one, to argue whether or not the tool came out with the right answer; and number two, in cases where the judges cannot override the decision made by the algorithmic model, the judges themselves don't understand what is influencing its final outcome. And so it makes the process of reviewing either the judge's decision or the algorithmic model's decision completely impossible for an appellate judge trying to decide whether or not, for example, the constitution was followed in a case like Glenn's."

Where would you say the error is?

"I think it's probably two things. The error would be in the way that data point exists: the general question, 'is this person a disciplinary problem', shouldn't even be included in there. The other part of the problem is that, based on a very rough survey of maybe half a dozen of these, we think that that question is more heavily weighted than other questions. So we think that whether that question is yes or no can be determinative of whether someone is low or medium risk, when other questions might not be as determinative."
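Conti-Cook's weighting concern is easy to illustrate. COMPAS's real questions and weights are proprietary and unknown; the weights below are invented solely to show how a single heavily weighted yes/no answer can dominate an otherwise clean record, as it did in Glenn's score.

```python
# Hypothetical weights -- COMPAS's real ones are proprietary and unknown.
# The point: one heavily weighted yes/no question can be determinative.

WEIGHTS = {
    "risk_of_felony":       1.0,
    "arrest_risk":          1.0,
    "risk_of_absconding":   1.0,
    "criminal_involvement": 1.0,
    "q19_disciplinary":     5.0,   # assumed dominant weight, for illustration
}

def risk_score(answers):
    return sum(WEIGHTS[q] * answers[q] for q in WEIGHTS)

# A decade with no misbehavior reports, but question 19 checked "yes":
glenn_like = dict.fromkeys(WEIGHTS, 0) | {"q19_disciplinary": 1}
# A recent violent incident, but question 19 checked "no":
other = dict.fromkeys(WEIGHTS, 1) | {"q19_disciplinary": 0}

print(risk_score(glenn_like))  # -> 5.0: scored high on a single checkbox
print(risk_score(other))       # -> 4.0: lower, despite recent incidents
```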
"I've been out now approximately, going on ten months now."

I'm sure you remember the day you were released?

"Yes, I'll never forget that day — the day I got the decision, because they give you the paperwork first and they tell you you're leaving. That day I was crying uncontrollably when I read that paper. They call you down to the counselor's office and they issue you a sealed envelope, and you don't know what's in there; it can just as easily be a denial as a release. So when I opened it — and I was kind of hesitant to unfold the sheets of paper — as I unfolded them and finally saw the result, I just started crying uncontrollably. I just couldn't contain myself, because the day that I had hoped for and wished for for so long had finally come." [Music]

The COMPAS algorithm, which determined whether Glenn Rodriguez could be released early, is under close scrutiny in the United States, thanks largely to research conducted by the journalistic nonprofit organization ProPublica, which investigates just how precise these risk scores actually are.

"This is nerd alley. This is our whole data-journalism nerd team — which they don't mind being called, I think. We're best known for our analysis of criminal risk scores. This is software that is used across the U.S. to give a score predicting whether somebody who has been arrested is likely to go on to commit future crimes in the next two years. It's software that had not been analyzed by anyone independently, so we went and obtained, through public records requests, the scores that were assigned to 18,000 people over two years in one jurisdiction in Florida. Then we went to see whether it was accurate: did it truly predict whether those people went on to commit crimes? And it was only about 60 percent accurate — a little bit better than a coin toss. But when it was wrong, it was wrong really differently: it was twice as likely to say that a black defendant was going to commit a crime when they actually didn't, and it was twice as likely to say a white defendant would not go on to commit a crime when they did. So it had a real disparity in the error rates, which was something that no one in that field had ever really noticed or recognized before, and it kicked off a nationwide debate in computer science about how to balance these types of error rates in the future."
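The disparity ProPublica describes can be reproduced in miniature. The records below are fabricated to match the published pattern, not real defendants: overall accuracy is identical for both groups, but the error types are opposite.

```python
# ProPublica's check in miniature: identical overall accuracy,
# sharply different error types.

def rates(records):
    """records: list of (predicted_high_risk, actually_reoffended)."""
    fp  = sum(1 for p, a in records if p and not a)   # wrongly flagged
    fn  = sum(1 for p, a in records if not p and a)   # wrongly cleared
    neg = sum(1 for _, a in records if not a)
    pos = sum(1 for _, a in records if a)
    acc = sum(1 for p, a in records if p == a) / len(records)
    return acc, fp / neg, fn / pos

black = ([(True, False)] * 40 + [(False, False)] * 60
         + [(True, True)] * 80 + [(False, True)] * 20)
white = ([(True, False)] * 20 + [(False, False)] * 80
         + [(True, True)] * 60 + [(False, True)] * 40)

for name, group in (("black", black), ("white", white)):
    acc, fpr, fnr = rates(group)
    print(f"{name}: accuracy {acc:.0%}, "
          f"false-positive rate {fpr:.0%}, false-negative rate {fnr:.0%}")
# black: accuracy 70%, false positives 40%, false negatives 20%
# white: accuracy 70%, false positives 20%, false negatives 40%
# Same "accuracy", but black defendants are wrongly flagged twice as
# often, and white defendants wrongly cleared twice as often.
```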
"I've audited algorithms in all sorts of fields: Facebook's algorithms, algorithms used by Amazon, algorithms used by the car insurance industry to set prices — anywhere I can find an algorithm that has really high stakes in the result."

And are there many places where you can find these?

"Oh, there are more algorithms than I could possibly tackle. We're in the process of automating everything in the world, right? And that has a lot of good things — maybe we'll never have to drive again — but it also means that we're turning over a lot of decision-making to computers, and a lot of that is really opaque, and we don't have independent regulators or independent ways of analyzing those decisions. So that's what I'm trying to do. I wish I had a name for it; I think it needs a name, this new field. I call it data journalism, but that means a lot of things to a lot of people. I think of it as truly driven by the scientific method: the idea that you come up with a hypothesis, you test it, find the data, see if there's enough evidence to support your hypothesis, and that you use teams to do investigations. We have a researcher, a programmer, me — the idea of an interdisciplinary team tackling some really hard questions. It feels a little bit like science."

"I've been looking at Facebook mostly because the question I was really wondering about was: what happened in 2016? What happened on Facebook? There are a lot of theories about whether Trump had better ads, or Hillary didn't have a good strategy, or whatever, but the truth is we don't know, because you can't see those ads. It's not like a TV ad or a print ad, where everyone in the world sees it; the only people who see the ad are the people to whom it was targeted. So I wanted to make sure that future elections weren't that opaque, and we built this tool to basically allow people to donate to us the ads that they see in their Facebook newsfeed. We have thousands of users who use this little tool, and whenever an ad shows up in their feed, it gets sent into our big database. Then we had to build an algorithm — we actually have to do algorithm reporting with algorithms — to determine which ones are political ads and which ones are not. We sort them out, and then we display the political ones to the public, so that the public can see in real time what kinds of ads are running on Facebook. Now, we don't have all the ads on Facebook, we have a small sample, but we're hoping that by extending this tool around the world we'll get a lot of them."
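The last step of that pipeline, deciding which donated ads are political, is a text-classification task. ProPublica's actual classifier was a trained model; this keyword-scoring toy, with invented ad texts, only shows the shape of the problem.

```python
# A toy stand-in for ProPublica's political-ad classifier. Their model
# was a trained text classifier; keyword scoring only shows the shape
# of the task: collect donated ads, classify, publish the political set.

POLITICAL_TERMS = {"vote", "senator", "election", "congress",
                   "candidate", "ballot", "campaign"}

def looks_political(ad_text, threshold=1):
    words = set(ad_text.lower().replace(",", " ").replace(":", " ").split())
    return len(words & POLITICAL_TERMS) >= threshold

donated_ads = [  # invented examples
    "Vote for Jane Smith for Senator this November",
    "50% off running shoes, this weekend only",
    "Tell Congress: protect the election from interference",
]

for ad in donated_ads:
    label = "political" if looks_political(ad) else "other"
    print(f"{label:9s} | {ad}")
# -> the first and third are flagged; the shoe ad is not.
```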
What percentage of a newsroom will be programmers and people writing code, rather than doing traditional journalism, do you think, in ten or twenty years?

"We have at least 10 or 15 people on that, so that's pretty sizable, considering there are 100 people max. But I do think in the future it's going to be more important, and I hope that newsrooms will get funded to do it, because that's the real question that's holding it back: it's not lack of interest, it's the lack of a business model." [Laughter]

Do you want to go to her desk?

"They do everything from writing bots that crawl around the internet and gather information, to pulling huge data sets from large filings. We're trying to understand from the outside what a lot of these companies are up to. I will tell you: the one algorithm that's really hard to analyze, that I think is really important, and that I'm frustrated I haven't gotten my hands on, is the Facebook newsfeed — how and why Facebook decides what to rank higher and lower. As you know, it's become a societal conversation, whether they're putting fake news too high, but the fact that that thing is so unanalyzable is an endless frustration to me."

The Facebook algorithm, seen by many as the most influential in the world, has been receiving a great deal of criticism recently, because the way algorithms function on social media does not always appear to correspond with the users' interests.

"We define success in a very narrow, biased way — in the way that works for us, but not necessarily for the people we're targeting, not necessarily for the public good. I think a good example is the Facebook algorithm. The Facebook newsfeed algorithm has been optimized for engagement, which is to say, for keeping us on Facebook, which is a proxy for their profit. But it has not been optimized for something that would actually be good for the public, namely truth, or civil dialogue, or just knowledge or information."

With their primary goal being to get us to click and spend as much time online as possible, Facebook's and Google's algorithms appear to have gone wild in recent years. Jaron Lanier, a tech pioneer and writer, observed with ever-increasing wonder several online protest movements and the backlashes against them.

"We can observe a pattern that's repeated multiple times, and I think we should accept it as a phenomenon that is real. The phenomenon is that somebody uses modern social media in a political way, as a social movement, as a political movement, to improve the world, and then somehow, in the same setting, other people come along — not just bad actors but horrible actors — and apply the same techniques to make the world worse, and much worse. So the backlash is more than one would expect. Throughout history there have been plenty of backlashes against social movements, and yet lately we've seen a series of backlashes that are really extraordinary. One example was the Arab Spring. When the Arab Spring started, in Silicon Valley there was all this self-congratulation: it's the Facebook revolution, it's the Twitter revolution, and all we have to do is let these people use our social media tools and there will be peace on earth and prosperity for everyone. But what happened afterwards is a much more difficult outcome, where — for various reasons, and it's different in each country, and it's not as bad in some as in others — we tended to end up with a reign of terror more than a rise of democracy and prosperity. And we're still not through that. [Music]

So the question is: why are the backlashes worse than the movements? The reason is that social media by itself doesn't do anything; it's waiting for energy coming in from people. It's people's attention and interest that is the fuel that allows the social media engines to make money through engagement. Oftentimes these social movements are started by young, idealistic people, who are the most engaged, the most tech-savvy, as they say, the most able to do so, and they start generating engagement — the young people in the Arab Spring, the young people in Black Lives Matter, and so forth. So suddenly they have input this fuel into the system. Now the algorithms have to decide what to do with this fuel. One option is to allow this fuel to only engage the people who started it — but that is not very efficient. What's more efficient is to ask: how can we use this fuel to engage other people? And let us be clear about something: there is an asymmetry which is tragic but very real, and the asymmetry is that negative emotions are more engaging, more powerful, easier to bring up, more efficient, more profitable, and more intense than positive emotions. So if you have all these positive emotions coming in, saying we want something better, we want to support these people, the algorithms — naturally, without any evil plan — will turn them into negativity, by finding the people who are annoyed, the anti-people, and milking and milking and milking, because negativity is more efficient, more profitable, more engaging. So if you want to have an anti-fascist message, that will be used to find all the fascists and introduce them to each other and put them into a cycle of more and more incitement. And that is the only way for these companies, under their present business models, to maximize their engagement and maximize their profits." [Music]
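Lanier's asymmetry requires no evil plan, only an objective. A feed ranked by predicted engagement, with invented numbers, does the rest:

```python
# Ranking by predicted engagement, with invented numbers. No editorial
# judgment, no evil plan -- just a sort on a proxy metric.

posts = [
    {"text": "Community garden opens downtown",     "p_engage": 0.04},
    {"text": "You won't BELIEVE what they did",      "p_engage": 0.11},
    {"text": "Local charity hits its funding goal",  "p_engage": 0.03},
    {"text": "Outrage: THEY are ruining everything", "p_engage": 0.15},
]

# The entire "editorial policy" is one line:
feed = sorted(posts, key=lambda p: p["p_engage"], reverse=True)

for p in feed:
    print(f'{p["p_engage"]:.2f}  {p["text"]}')
# If negative content reliably engages more (Lanier's asymmetry), it
# lands on top. The objective never asks what is true, civil, or good
# for the public -- only what keeps us here.
```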
"The key issue here is addiction, and the term within Silicon Valley for addiction is engagement — that's our sanitized term. Two of the biggest tech companies make almost all their money from addicting people to services and manipulating their behavior for pay, and in order to do that they must emphasize negative emotions like fear, resentment, and hostility over positive emotions, because that's more efficient. Those two companies are Google and Facebook, and they must change their business model, or else humanity will not survive. So when you're giving over control, it's not to an algorithm, it's to a company, and until you get clear on that, you're going to be taken advantage of."

And these companies appear to do far more than simply show us personalized advertisements. The British company Cambridge Analytica, for example, used Facebook data to analyze how American voters could be manipulated. Psychologist and data scientist Michal Kosinski designed a model to psychologically analyze people based upon their Facebook profile, but refused to sell his model to Cambridge Analytica. And yet the company still used Kosinski's method on the Facebook data of 87 million Americans.

Do you think they copied or stole your research?

"It doesn't really matter, because these days computers are doing science, which basically means that if you know something is possible, you don't need to know my magic sauce and my magic equation; you just ask your own computer to design your own. So it doesn't really matter. These days computers are doing science, and not humans."

Cambridge Analytica uses what Kosinski calls psychographics: a model in which an algorithm uses online data to determine someone's personality exactly. With this knowledge, the best way to influence people during elections can then be determined. Alexander Nix, the suspended CEO of Cambridge Analytica, explains: "Because it's personality that drives behavior, and behavior that obviously influences how you vote."

"If I'm a human psychologist, and my brain is just a human brain, I'm limited to systems or models that contain five variables, or maybe seven variables, maybe 10, maybe 16 — that's what humans can handle. Now, if you're an AI psychologist, if you're a computer algorithm, you do not have to reduce people to five numbers. From a statistical point of view, and from a predictive-power point of view, it's still actually worthwhile to reduce people somewhat, to fewer dimensions, but you would reduce them to 500 numbers, or 5,000 numbers. You could call those five thousand numbers five thousand psychodemographic traits — something akin to personality, but way more complicated. It allows you to capture way more variance in human behavior; it allows you to be way more accurate in predicting the future."
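Kosinski's point about dimensions maps onto standard dimensionality reduction. A sketch with a random stand-in "likes" matrix: truncated SVD compresses each user to k latent numbers, and more dimensions retain more behavioral variance.

```python
# Kosinski's dimensions, sketched. The "likes" matrix is random
# stand-in data, not real Facebook profiles.

import numpy as np

rng = np.random.default_rng(0)
likes = rng.integers(0, 2, size=(1000, 200)).astype(float)  # users x pages

# Truncated SVD: compress each user into k latent numbers --
# the "psychodemographic traits" of the quote.
U, S, Vt = np.linalg.svd(likes - likes.mean(axis=0), full_matrices=False)

for k in (5, 50, 200):
    traits = U[:, :k] * S[:k]                   # each user as k numbers
    kept = (S[:k] ** 2).sum() / (S ** 2).sum()  # share of variance retained
    print(f"k={k:3d}: {kept:.0%} of behavioral variance kept")
# A human psychologist stops at ~5 traits; an algorithm can keep 50,
# 500, or 5,000 -- capturing more variance, hence better predictions.
```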
"We should be afraid and we should be cautious, because algorithms and artificial intelligence, like any new technology, are first of all something we don't really understand very well. It definitely brings a lot of risks, as well as great, amazing advantages. We've seen it in the past with nearly any new technology that humanity came up with: there were advantages and disadvantages. In order to reap the advantages and avoid the risks, we have to talk about the negative side of things. We have to be worried about manipulation, about abuse of AI, and maybe about AI doing things that are undesirable. Computer algorithms are getting better and better at turning the digital footprints of behavior that we're leaving behind into very accurate predictions of our future behavior and of our intimate traits. On the other hand, we are leaving an increasing amount of digital footprints behind, because we are increasingly surrounded by digital products and services, and we want to walk around with our faces uncovered and talk to people freely. Just the data that we voluntarily leave behind is already enough for a good algorithm to create a very good prediction of who we are and what we are going to do in the future. Based on facial traits — or, as my recent research suggests, even based on a still image of your face — a computer algorithm can reveal a lot about our psychological and demographic, our psychodemographic, traits. Going forward, we are basically going to have no privacy whatsoever, and the sooner we start talking about how to make sure that a world with no privacy is still a habitable, safe, and nice place to live in, the larger the chances that we'll be able to survive this transition — and not only survive on an individual level, but also survive as societies, as democracies." [Music]

Do you believe that this is just a temporary phase, where we have these crude algorithms but they will work better in the future?

"Yeah, I do think that's true. I don't necessarily think that's good. Sometimes things work really well and they're evil, just to be clear. Things that predict poverty or predict wealth can work really well, and the better they work, the better they can cause poverty and cause wealth. I think the most important point is that algorithms don't just predict the future, they cause the future, because all algorithms are actually working in concert with each other, and they are separating the winners from the losers in similar ways. And as they get better and better at that job of separating winners from losers, they're causing winners and causing losers, because they're not saying: oh, you're losers, we're going to help you become winners. That's not what they do, because they're built by individual companies that are looking to profit. The larger point is that as they get better and better, that's a problem for us; that's when they get more efficient at increasing inequality — unless they're used for good. Look, I'm not saying that it has to happen; we could have very accurate algorithms that do good. But right now they're being used to commoditize us, and they're very good at it, and they're getting better at it. We are, in other words, no longer seen as human people who deserve dignity just by dint of being human, as citizens of the world; we are seen as potential purchasers, and if we are not high-value customers, then we do not exist. That's how we're sized up. So when you talk about algorithms getting really good at their job, it makes me worried. I actually kind of like that they're bad at it sometimes." [Music]
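O'Neil's "causing winners and losers" is a feedback loop. Below is a minimal sketch with invented thresholds and step sizes: a score gates an opportunity, and the decision feeds back into the next score.

```python
# A score gates an opportunity; the decision feeds back into the score.
# Thresholds and step sizes are invented.

def run(score, rounds=5):
    for _ in range(rounds):
        approved = score >= 600            # the algorithm's decision
        print(score, "approved" if approved else "denied")
        score += 20 if approved else -20   # the decision changes the data
    return score

print("borderline winner:")
run(610)   # 610, 630, 650, ... approved every round
print("borderline loser:")
run(590)   # 590, 570, 550, ... denied every round
# Two nearly identical people diverge permanently: the prediction
# manufactures the outcome it claimed merely to foresee.
```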
"I'd like to thank everyone for showing up. First of all, I greatly appreciate everyone showing up to try and get our message across to the folks, because there's a conflict between us and the algorithm. [Music] It's been programmed to bring the company the biggest profit possible, no matter the costs, and the algorithm doesn't care, because all the costs are pushed onto the workers. Foodora announced, one day before this protest — surprise, surprise — that they're going to start covering our repair costs. They said they're going to give us a quarter of a euro extra per hour worked, to cover our repairs, up to a maximum of 42 euros a month. Think about how much work you would need for that: you'd have to actually work 168 hours a month. Sorry, as you can tell, this makes me kind of angry. Just, you know, to make sure they get the message." [Music] [Applause]

How do you see the future? Do you see algorithms in every major life decision?

"I feel like that's already true. People talk about the singularity — the singularity is the moment when computers take over. I don't worry about that at all. I feel like we are handing ourselves over to the computers; we're doing it. It's our singularity. There are actually people, the evangelists of data from Silicon Valley, who are pushing this vision that the algorithms that are so good at predicting for us should be given more power: that we should listen to the algorithms to decide who to marry, and what career to have, and where to go to school. That we should actually sort of hand over our free will to the computers. That's freaking nuts."

"It is freaking nuts. Technology should be giving us more options, not fewer. We are becoming slaves to the technology; we are entering a vision of the world in which we are enslaving ourselves to this technology. It makes no sense to me." [Music]

Thank you for watching. For more on this subject, take a look at the playlist. You can also watch this recommended video. Don't forget to subscribe to our channel, and we'll keep you updated on our documentaries. [Music]
