
tv   Inside Story 2019 Ep 239  Al Jazeera  August 28, 2019 2:32pm-3:01pm +03

2:32 pm
The device is always listening. What's happened with these revelations is that it turns out that, in some cases, snippets of your audio were being reviewed by real people, third-party reviewers, who were transcribing the information and trying to check whether the digital systems were getting the right information out of your requests to the device. So it's not all-the-time listening, but there may be more of a human component than users realize.

Any idea what that might look like? Is it a home office full of people with headphones on, listening to me trying to put my shopping list together?

It very well could be. Google was discovered to have subcontracted out review of its Google systems, and a Flemish news agency reported on these contractors reading, listening to and transcribing these Google voice recordings. And so there could be quite
2:33 pm
a number, and we actually don't know how many there are; we're unable to quantify it.

So one might ask, then: why would anyone use these devices? In full disclosure, I love my Alexa. I don't use Google Home or Siri, though, and all of those are what people are talking about. Here is Heath on Twitter, who uses these devices. He says: "I use them for timers, calendars, reminders, news updates, trivia games, general knowledge questions, phone calls, smart home control, weather reports. The list goes on and on. Privacy is less an issue with these devices than functionality. I actually want these AI systems to become more conversational and less robotic. Asking Google Home for a search is no different to me than typing it into a browser window when it comes to privacy." Alex, what do you make of that last part of Heath's comment?

His comment is absolutely correct. It is the same as a text input to a machine. So all this information that we're handing over to companies like Google
2:34 pm
or Apple, Facebook, Amazon, it's all being reduced to bits, and the question is how comfortable we are sharing that information. Heath obviously has no problem with it, and I think most people frankly shouldn't have a problem with letting Siri know that they want to know what the weather is, or that they want to play their playlist. I think where you run into the margins, which is what Sam alluded to, is: what if there is really sensitive stuff in these conversations, and we're assuming that these systems aren't listening when in fact they are? Some of you have probably had some scary instances where, for example, maybe you were googling something about a medical condition and then ads might show up in your Gmail, which alludes to the fact that they know that you're doing this. Well, the same thing is at play for voice systems as well.

Let's just explain very simply how one of these voice command systems actually works. Let's look at the Amazon Alexa; it's really easy to understand. In order to train
2:35 pm
a voice-recognition artificial intelligence, you need a massive dataset of literally everything someone might say to their phone. When you speak to an Alexa device, the data is stored in the cloud. That means Amazon can do more to study that data: the system learns your habits and interests so it can better guess what you might search for in the future.

Meanwhile, some of the great things that come out of that device come from it actually listening to us: we talk to it and, as you say, it helps us out. Can you give us a couple of the stories you know where these devices have really been helpful in people's lives?

Yeah, I think that's the thing I always talk to people about when they are deciding whether to use these devices or not. I think it comes back to a sort of risk assessment, or cost-benefit analysis, for each person. For me, I don't usually use these devices or smart assistants, because I don't find them much more useful than typing something into my phone, or choosing
2:36 pm
when I want to receive information through a device like a laptop. But for other people, if they have mobility issues, if they're new parents, lots of things could be going on where it's not that easy to get to a device, or even get to a light switch, and just being able to speak a command is a massive quality-of-life improvement over doing some of these daily tasks. So in the case where you're getting a huge benefit out of it in your daily life, I think there's a potential, like that commenter was saying, to conclude: the trade-off is worth it to me. Maybe there's a privacy risk, but on the other hand I'm getting this enormous benefit as an individual user. Personally, I don't find that that trade-off makes sense for me right now, but I think if you're thinking about it that way, you'll be in a better position regardless of what happens next. If something were to happen to your data and you were getting a lot out of the device, you would feel better about that than if it was really out
2:37 pm
of left field and you'd never considered it.

I agree with Lily, but I also tend to think about it a little differently. The first point is that whether it's useful or not is a sort of personal decision; for many people it will be incredibly useful, and some people are scared about the security of it, about giving up their privacy rights. For me, really, the issue is that they're not telling you that they're doing this in the background, and I think in a lot of cases, if they just told you, if they just let you know, "hey, we're doing this," it would relieve a lot of these other concerns. A second point, which I think both Lily and Sam know about as well: that clip talked about AI systems needing massive amounts of data. That's not true anymore. They can actually train very effectively on much smaller datasets than they could even two or three years ago; you see this, for example, in some of the research on solving CAPTCHAs. And so I think that's almost
2:38 pm
a false choice: they don't need the same amount of data. You also see some systems that are opt-in, where, for example, Mozilla, the open-source organization, has a project called Common Voice, where people opt in to donate their voice under conditions where the person's identification is obscured. That allows them to create a trainable corpus that anybody could use, one that's not necessarily dominated by big tech companies.

Alex, I think that's an interesting point you made: first of all, that the users are not immediately told, they don't always know that this is happening, and then whatever is happening with that data. I want to share a video comment we got from a staff writer covering technology at The Atlantic. His name is Sidney, and here's his take on that point.

More and more of these products are coming into our lives, and we're figuring out way after the fact how they actually work and whether or not we want them. So what is happening right now, with all these different scandals, is we're learning how it works; the debt is coming due in terms
2:39 pm
of how these products actually affect our lives. And the thing we should ask is: if we had known this debt was coming, if we had known how these products actually work, would we have bought them in the first place? As it relates to intellectual debt, the debt is growing; we're buying more of these products, and because the products are becoming more sophisticated, what we're risking is that eventually these products will know more about us than we know about them.

Sam, what do you make of his point on the debt, and whether we would have bought the products if we had known? Although, again, full disclosure: I knew it would be a bit creepy to have a microphone in my home, and yet the utility is worth it for me. But what do you think?

I think Sidney is absolutely right. We have billions of devices connected to the internet; we have our digital companions, Alexa, Siri, Google Home, which are now embedded into appliances and
2:40 pm
home speakers and even cars, so it's becoming ubiquitous. And right now we have this reactionary approach to data privacy: when something is breached, we end up looking at all the laws and the rules and seeing, OK, what happened and how do we rectify it. But we should be addressing these things beforehand, so I think that's what's really missing at this point.

One of the reasons we're talking about this, Alex, just one moment, is that there are two sides to the story. One is that some of our audience wanted to hear more about devices listening; the other is the report that a whistleblower leaked more than 1,000 private conversations captured by Google Assistant. So there are real issues here. What do you have to say?

I was going to make two points. The first, to Sam's point: privacy really has been an afterthought, it's been reactionary, and I think that for the most part
2:41 pm
people don't really care about their privacy until they realize something bad has happened. It's almost like car insurance: you don't care until you've been in a crash. And I think that the tech companies probably could have done a better job of creating interfaces and ways for us to understand the implications, and I think that's part of their job, because they are making money off our privacy, make no mistake about that. The second piece is that I ask people to put this in perspective when I talk to them: compared to what happened with the Equifax leak, for example, where in the U.S. at least over 100 million people's very intimate financial data was essentially leaked or stolen, this is really small potatoes. I think that very often we get worked up about someone listening in on maybe some slightly personal talk, while the whole world has your most sensitive financial information and it's being sold on the dark web.

For a sense of perspective you're right, but I don't think it's really small potatoes, because of the point that people don't really understand what they're buying, and
2:42 pm
there isn't transparency from the tech companies about what is happening to the data. So yes, in theory a Social Security number is a lot more sensitive than what you're saying to your Amazon Echo or whatever it might be. But in practice, if people don't know what they're buying, and they don't understand that there's an exposure, or a potential exposure, and they're giving these devices as gifts to their family members at a holiday or a birthday, then it's just propagating more and more, and people aren't really aware that they could be saying their Social Security number out loud to their spouse in their home, not realizing that they also said the wake word and it's being captured. So I definitely think there's a major problem there that isn't unique to the uninitiated.

The wake word: tell us what that is, and how that part of it works, for people who don't have these devices.
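The wake-word gating the panel discusses here can be modeled as a small state machine: audio frames are discarded until a detector matches the wake word, and only then does the device begin capturing and uploading. Below is a simplified sketch over text "frames"; the function name and the stand-in wake word are invented for illustration, and a real device matches audio features rather than exact strings, which is exactly why near-misses can falsely wake it.

```python
def capture_after_wake(frames, wake_word="computer"):
    """Discard everything heard before the wake word; capture what follows.

    Real devices match audio features, not strings, so a near-miss
    ("computing", "commuter") can falsely wake them and capture
    audio the user never intended to send.
    """
    captured, awake = [], False
    for frame in frames:
        if awake:
            captured.append(frame)       # only post-wake audio is kept
        elif frame == wake_word:         # fuzzy matching here causes false wakes
            awake = True
    return captured

# Nothing said before the wake word is captured:
print(capture_after_wake(["private", "chat", "computer", "what", "is", "the", "weather"]))
# → ['what', 'is', 'the', 'weather']
```

The privacy failures described in this segment correspond to the `elif` branch firing on something that only sounded like the wake word.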
2:43 pm
For people who don't have one of these devices: they aren't actually listening to you all the time; they're only listening when you say a word to wake them up, so that they start listening then. And that's a privacy protection that was put in place, so it's not so creepy to have something that's always on. But one of the concerns with these devices has been that sometimes the device thinks it hears the wake word when it actually didn't, and that's a lot of the issue that's happened with the snippets of audio: someone committing a crime, performing an intimate act, something like that. They're not intending for the device to be listening in that moment, but it misinterpreted something as the wake word. So that's another area of this that's important to understand.

Sam, before you jump in, then I will, because I want to share this from Elizabeth: that this listening and data collecting turns
2:44 pm
people and their data into the product, "which feels a bit too Nineteen Eighty-Four for me." It's that point that I want to pivot off of, because we just got this tweet from Courtney, who says: "My privacy concerns are more about third-party sharing than what Amazon collects. I don't care if they record me for product improvement; I do care if they sell that data on, and I care about my data being stored indefinitely." So I think this is the crucial point for a lot of people. Sam, talk to us about who this data gets sold to. Who wants it? Who are these third-party organizations?

That's a great question, and that's the problem itself: we don't know who these third parties are. The United States, at least, lacks any federal data privacy law that requires that disclosure or transparency. Luckily in the E.U. you have the GDPR, but right now we don't know who they're sharing it with. And so, you
2:45 pm
know, a lot of times, as Lily alluded to, these companies may collect or hold our data, our voices, but often it extends beyond just these devices. It could be your apps, your browser extensions, and they can collect your pictures, they can collect data about your DNA, your travel itineraries and your personal documents, and then all of this may be captured by third parties and then shared with yet other parties.

I'm going to jump in: this really cuts to the core of a sort of market mismatch in the digital economy. We are accustomed to getting lots of valuable things for free, and the way that money is recouped is that they sell our information. It's not transparent; it's completely opaque as to where it's sold and how much our information is worth. But in general, at least to date, we haven't necessarily been willing to pay for things once we've been exposed to using the
2:46 pm
free version. So I think that this is almost like an original sin of the digital economy, going way back to the early days.

But we are paying for devices; we are paying good money for them, and we're paying money to use them, so it's not exactly free. Then let me shift to security. I'm just curious, Sam: when these new devices are rolled out, is there a process in the testing stage where they ask, OK, what are the ethical issues we need to examine? Or does that just happen later, after something has gone wrong?

Companies like Amazon, Google, Facebook, they do take these into consideration; they're working on it, but we need more stringent safeguards. They may not be looking at the whole economy of selling our data to somebody else. We don't know what that third party is doing with our data; they may tell us and say, OK, we gave it to company Y, but who is company Y sharing it with? And so there are all these
2:47 pm
degrees of separation that we don't really know about, and all of those really need to be looked into and explored more.

And this is where I think it comes back to the utility for each individual, because I do think that all the companies take pains to at least make their devices not seem creepy, but that's because they want to sell more devices. Even if they mean well, or don't mean anyone ill, their ultimate goal is just to sell the devices. So I think it comes back to what an individual is getting out of it, and not just trying to have something that's trendy or cool, because it is going to be opaque what they're doing with these things. The question is: are we as consumers getting value from them? Because otherwise you're just giving them your money and all your data, and you really need to be thinking about what you're getting back
2:48 pm
in deciding what you use.

Here's someone who would agree with you. This is Alvin Allen on Twitter, who says: "Ideally, people would get a fuller understanding of how these devices work and all the possible things that can happen with the recordings of their voice before deciding to purchase. Companies have made this difficult, though, in my opinion, because they know these details may hurt sales." Exactly what you were saying there. But I want to play a video comment from someone who is echoing an idea we brought up, but really honing in on it. This is Sarah Frier; she's a tech reporter at Bloomberg, and here's her take.

Companies need to be honest about the fact that when we give them data for artificial intelligence purposes, it's not just machines looking at that data. Often it's humans who are parsing through it, transcribing it, cataloguing it and testing the computer's efficiency.
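The human review Frier describes, checking "the computer's efficiency", is conventionally scored as word error rate (WER): the word-level edit distance between what the human reviewer heard and what the machine transcribed, divided by the length of the human transcript. A minimal sketch follows; the function name and example phrases are illustrative, not any vendor's actual tooling.

```python
def word_error_rate(reference: str, hypothesis: str) -> float:
    """WER = (substitutions + insertions + deletions) / words in reference.

    Computed via word-level Levenshtein distance; assumes a non-empty
    reference transcript.
    """
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # dp[i][j] = edit distance between ref[:i] and hyp[:j]
    dp = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        dp[i][0] = i
    for j in range(len(hyp) + 1):
        dp[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution (or match)
    return dp[-1][-1] / len(ref)

# What the reviewer heard vs. what the assistant transcribed:
print(word_error_rate("add milk to my shopping list",
                      "add silk to my shopping list"))
# 1 substitution out of 6 words ≈ 0.167
```

A pool of reviewers scoring sampled snippets this way is how a vendor could measure whether its recognizer is improving, which is precisely the human-in-the-loop work this whole segment is about.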

