
tv   Disinformation Democracy Discussion at Computer History Museum  CSPAN  June 16, 2020 6:48pm-8:02pm EDT

6:48 pm
6:49 pm
good evening everyone, thank you for coming out here tonight. my name is rachael myrow, i had to make sure by looking at the sheet of paper. i am the senior editor of our silicon valley desk here in san jose. i say here in san jose, but really here in silicon valley, because we are in mountain view. joining me on the stage are the director of internet ethics; alex stamos, former chief security officer and director of the stanford internet observatory; and the director of journalism and media ethics at the markkula center. this evening is co-presented with the
6:50 pm
applied ethics. this is part of the series on common ground. it is an initiative bringing people together for civil discourse, featuring journalists hosting provocative conversations about politics, policy, arts and culture, science and technology, reckoning with the fierce disagreement among us about how to face a future of economic, cultural and environmental uncertainty. this series asks: what are our shared responsibilities to the common good? next in the series, if you have an open calendar, this tuesday at 7 pm at the san francisco exploratorium, we will be looking at how we overcome the polarization of tonight's topics. on to tonight's topic. democracy is under attack worldwide. populism is on the rise, disinformation is tool number
6:51 pm
one, and social media is the platform of choice. what do we do about it? we can start by talking about it. and i take it you have some show and tell to offer us about the topic for today. perhaps you've already read in the new york times that russia is again attempting to influence the american election for president. >> yes, that is what we read in the times. that was a briefing that was given to the house subcommittee on intelligence. but there are no details. we don't know exactly what they mean by that. from my perspective there are five different kinds of interference, and it is not clear if they are running the same playbook or doing something different. >> show us a few examples of what we remember from the 2016 election. some of us don't remember it because we never saw it on our feeds. >> this audience was not
6:52 pm
the total target. but surprisingly, against conventional wisdom, you will find that a lot of the russian disinformation was aimed at both left and right. the two major, if i can get my slides up. the two major types of disinformation -- or information operations, the term we use for a concerted attempt to change an information environment -- had two big directions. the first was memetic warfare. it is about driving division by creating radical memes that are injected into political discourse. in this case there are three examples, from both sides, of fake profiles, fake personas that were created by a group that belongs to a russian oligarch.
6:53 pm
the one on the left is supposed to be a pro-lgbt group. in this case it is an lgbt coloring book for bernie sanders, a funny little thing that you might post with the goal of getting people to join your group and share content. most of the content had nothing to do with elections or politics; it was content like this, to draw people in, which would then allow them to inject messages. a lot was anti-immigrant sentiment. this was a twitter account that was supposedly the tennessee republican party, but it turned out the tennessee republican party's social media intern lived in st. petersburg, russia. not tennessee. here's one more from instagram. as you see, it comes from both sides. this one says 'all the tones are nude, get
6:54 pm
over it.' it comes from a fake account on instagram called blackstagram. a big topic was black lives matter; a big goal of theirs was to build african american support for these fake personas and then inject messages about hillary being racist. content was stolen from bernie supporters, as well as messages that might have been seen by conservatives as very radical. i will give you a second to look at this before i ask you some questions. check out this post. this is from a fake black lives matter group; it reads, black panthers were dismantled -- so, obviously, you would probably notice the strange diction. the english isn't perfect. this kind of work by the
6:55 pm
russians is being done not by intelligence specialists. the internet research agency is effectively staffed by millennials with english minors who could not find better jobs in russia. these folks are not professionals. the language will not be perfect. now that i'm a fake professor, i will ask you guys: is this, being posted by somebody in st. petersburg, russia, illegal? raise your hand if you think it is illegal. only one person. you are right. it is not illegal for somebody out of the country to have an opinion about the black panthers, even if they are lying about themselves. it is a violation of facebook's terms of service, but those do not have the force of law. is this fake news? raise your hand if you call this fake news. >> that is interesting. the thing here is they are not making any falsifiable claims. this kind of argument about the reason why the u.s.
6:56 pm
government prosecuted the black panthers is the kind of thing that you might find in a freshman studies class. this is an argument in political discourse that they are trying to amplify. >> remind us, the overton window. >> does one of the real professors want to talk about that? i'm not a political scientist. >> we are all fake professors. >> the overton window is the idea of what is the range of acceptable discourse -- the things that are allowed in any society, in this case american society. that window can shift back and forth based on people being on the extremes. this is a real email. want to guess who received it? john podesta. it was real tough for the russians to figure out his email. this is what he received, telling him that somebody had tried to sign in to his account. it was sent
6:57 pm
by the main intelligence directorate of the kremlin. we are talking real intel people, people that like to kill people overseas. their little joke here is that the person trying to break into the account is from ukraine. oh, those guys are hilarious. the link goes to a url shortener that sent him to a site called google-dash-accounts. it was a perfect-looking google login. he asked the it people at the dnc whether it was legit, and apparently that guy replied that it looked okay. but he meant to say it does not look okay. the most important typo in the history of the human race. he logs into that and gives his username and password, and they go and download all his email. they also broke into the dnc with more technically
6:58 pm
sophisticated work. when they had that information, they were not releasing fake information; they cherry-picked the emails that told the story they wanted to tell. that was the story that bernie sanders was ripped off in the dnc primary. they powered that message through real emails, where people were saying not-nice things about bernie sanders. they did it through fake profiles. that failed. so they tried to get it out through an organization they were pretending was a real leak site, like wikileaks. dcleaks reached out to a bunch of journalists and said, here are some documents from john podesta, and the journalists complied. politico ran a blog of what they were going through. and even the new york times ended up running with the
6:59 pm
stories over and over again that they wanted them to run. if you go to paragraph nine or ten it says, this might be part of a russian information operation. but it doesn't matter when that is your headline and that's what people are reading. some other examples of disinformation: these are two real whatsapp messages. in india, people use the internet differently. something like 400 million people have accounts on whatsapp. it is not like facebook, where you can post something and a million people see it; you can send a message to up to 200 people. folks in india are part of many groups -- family, school, work -- so messages get passed along by individuals copying and pasting messages that are injected in. the one on the left attacks the conservative political party in charge of india, possibly in support of the congress party.
7:00 pm
it is basically lying about the price of gasoline in other countries. the one on the right is racist propaganda. this disinformation looks a lot different, because when you look at this, it is saying, i am from this black lives matter group. when you see this disinformation, it is coming from your uncle, or your coworker. it is personal. it is harder to amplify, but what's happening in india is that you have these groups that work for political parties. the theory is that there are about 1 million people who have signed up to push disinformation. they believe it is the right news on behalf of political parties. they get a notification, and they copy and paste it, and send it out to 400 million people. we are still seeing this kind of russian-led activity around the world. this is a report that we
7:01 pm
wrote -- what we found was a disinformation network in africa run by russian groups tied to that same oligarch, through a company he owns that has paramilitary mercenaries -- people that go into countries to kill people on behalf of autocrats. they are supporting autocrats on the ground with guns and disinformation. and it is not just for foreign policy but for financial benefit. he has things like diamond mines and the like. he is backing two of the six people vying for control of libya to get gas and oil rights in the future. the interesting changes from 2016 to now are, one, these are no longer people sitting in st. petersburg. they have been hired in those
7:02 pm
countries, and they are reporting back to people in st. petersburg. one of the guys doing it posted a picture of moscow on his instagram account. that is kind of awesome. the people doing this work in sudan are actually sitting in sudan, so it is much harder to catch them. the cultural language is much better. it is multimedia. this is a whole newspaper; it seems like a legit newspaper. it is mostly not about russia. it is just a newspaper. and it is owned by him. he also owns the radio station. they are building the entire pipeline: they can manipulate the media, they can also create their own media, and amplify that media online. let's start with that. >> that is a little overwhelming. yes, thank you. there are so many different things to parse out in alex's presentation. but one of the first things that occurs to me is that question of whatsapp.
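the scale of the whatsapp operation described earlier -- roughly a million volunteers copy-pasting messages into chats capped at about 200 recipients -- works out to enormous reach with back-of-envelope arithmetic. in the sketch below, only the seeder count and the recipient cap come from the discussion; the idea that some fraction of recipients re-forwards each wave is a hypothetical assumption of mine.

```python
# back-of-envelope model of whatsapp-style fan-out. the 1,000,000 seeders
# and the ~200-recipient cap are the panel's figures; the resend fraction
# and the round-by-round cascade are illustrative assumptions.

def impressions(seeders: int, cap: int, resend_fraction: float, rounds: int) -> int:
    """total message deliveries after `rounds` of copy-and-paste forwarding."""
    total = 0
    senders = seeders
    for _ in range(rounds):
        delivered = senders * cap
        total += delivered
        # only a fraction of recipients bothers to forward the message again
        senders = int(delivered * resend_fraction)
    return total

# one wave alone: a million seeders x 200 recipients = 200 million deliveries
print(impressions(1_000_000, 200, 0.0, 1))  # 200000000
```

even without any organic re-forwarding, a single coordinated wave lands in the hundreds of millions of chats, which is why individual copy-and-paste can rival broadcast platforms.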
7:03 pm
there are so many people around the world who are on encrypted platforms. even though you can argue that journalists and regulators are not doing a very good job where the information is out in the open, they do an even less good job when the information is encrypted. >> if you take india's example, whatsapp is a particular case where there is cultural acceptance. whatsapp is used in india in restaurants to pass out menus, and it is used also to order from those restaurants. afterwards the restaurant owners will share more with you. there is an interpersonal acceptance of liberal privacy, in the sense that, i don't mind you sharing something with me even though i don't really know you other than through a transactional relationship. in the u.s. it is a different
7:04 pm
kind of sensibility. if i get a whatsapp message with a video from somebody i don't even know -- first of all, i may not have a whatsapp connection with people in that sense. there is a huge advantage that these actors have in places like india, where whatsapp is literally an interpersonal thing and a transactional thing at the same time. one of the things that will help is to understand what existed before: the centralized paradigm of ownership in the media, from print to radio. as long as media was owned by a few organizations, ten to 30, there was a cultural sensibility and acceptance of values built in. that all broke with social media. there is no longer such a thing as an accepted window for what is acceptable in a democracy for public speech.
7:05 pm
that is what is broken. >> part of the question, though, is who is responsible for the changes in the window? it was not that long ago that social media started. when it started, people were not sharing news articles. remember, the idea was to connect you with your friends and family. at some point that paradigm shifted. part of the responsibility lies with the social media companies, which have certain affordances. they guide you as to what you should be using it for. in 2013, mark zuckerberg came out and said, we want to be everyone's personalized newspaper. facebook was not something that people thought of as a newspaper, but suddenly there is this encouragement that you should be sharing news stories. suddenly, maybe your aunt is endorsing this message. that is how it comes across. some of the responsibility lies with the platforms themselves. some lies with us.
7:06 pm
i think one of the promises of the internet was that, with the advent of blogging, we are all journalists now, people said. it turns out, no, we are not. but with social media, we are loudspeakers for other people's messages. that is a different role than being journalists. we all kind of bought into this role and find ourselves doing it. when we talk about responsibility, we have to talk about these different layers. i found an interesting poll that was done last month by npr. they asked people who should have the main responsibility for addressing disinformation. addressing disinformation is vague -- do you mean to not do it, to not respond to it, to highlight it? but the numbers were these: 39% said the media have the main responsibility for
7:07 pm
addressing disinformation. 18% pointed to technology companies; that is half as many. 15% said the government, and 12% the public. of those who felt it was the media's primary responsibility, 29% of democrats assigned the main responsibility to the media, and 59% of republicans did. we are polarized even on who is responsible for doing something about this. >> as a journalist, people do not always want to accept the information they are receiving. you say, here is the information for the question that you had. but the response is, that is not what i believe. as if it is a matter of opinion. >> because some people are not looking for information; they're looking for confirmation, and they're looking to signal identity and be part of a certain group. so increasingly i'm reading about this, that people know they are sharing misinformation
7:08 pm
and they are okay with that, because that is not the point. the point is not to inform people; it is to show what you believe. i think what is really interesting for the rest of us is that we have these common human weaknesses that make us do the same thing, even if we don't mean to. what i have learned -- my colleagues will tell you that i have to check myself -- is that if i find something that is absolutely the best illustration of what i believe, like this just confirms everything that i believe, i have to sit on it. more often than not it is a setup. it is designed exactly for people like me to respond that way and share it with others in outrage, and then perpetuate the disinformation, which, as alex points out, is not an outright lie. it is out of context. or it is made to push a certain
7:09 pm
narrative. >> i kind of wonder. as recently as two years ago, i was among those who were keen to laugh at politicians and regulators as so behind the times, unable to tell their left hand from their right, of course in no position to craft laws -- laws that would be out of date as soon as the ink was dry. now, i do not know. the whole disinformation situation online is such a dumpster fire, i don't know if there's anybody who's on top of it, even if they did nothing but read facts all day. what is the prognosis for what we can do from a regulatory standpoint to try and control some of this? or is that hopeless? >> in the united states, we are extremely limited by the first amendment. yesterday i was in washington -- i may look like i'm asleep with
7:10 pm
my eyes open -- i was at an event by the department of justice which was about section 230, where they say that tech companies need to fix things. the vast majority of things people complain about under what is called section 230 are actually protected by the first amendment. political speech almost never carries any criminal or civil liability, even if it is false. the supreme court has said that if you lie intentionally, in most cases it is not a crime; it can't be a crime and cannot be punished. there are torts around it, but for political topics you can't. most of the stuff we are talking about is something that you would never be able to adjudicate as false, even if we were in a different country. regulators in the united states do not have a lot of options. even in other countries --
7:11 pm
the most effective regulation has been the netzdg law, which is a 27-letter german word that i cannot pronounce. that is a law that requires the tech companies to enforce german hate speech laws. but that is hate speech law; it is not about true or false. and even that has had some real issues when it starts to be applied to things like sarcasm and comedy. in the u.s., the place for regulation is not the content but the mechanisms by which people can do political advertising. i would like to see some rules, because right now everybody is guessing what the rules for political advertising are. the fec has not ruled on how these eighties and nineties laws apply to 2020. i would like to see something
7:12 pm
that has restrictions for political ads around the kind of targeting you can do. a minimum size -- you cannot target nra members in one town, or, let's not pretend, the people here; multiple people in here have given to pledge drives. you could limit it so you have to advertise to a much broader set of people, and have rules around transparency. my colleagues at stanford who study these issues believe that you can have such laws as long as they're content-neutral and apply to the actions of the platform. to actually regulate the speech of individuals will never happen in the u.s. >> any thoughts on that? >> i want to take a step back and recall this slide of the
7:13 pm
different examples. one thing that is common is what they are trying to do with our behavior. in my mind, i have ended up having to ask three types of questions. first, what actors are behind these posts -- good actors or bad actors? a different vocabulary for this is the supply side or the demand side. where are you? are you on the demand side because you are a reader? if so, who is supplying the information? it could be people doing anything; journalists are not the only people posting. any organization can post. the third thing is, what kind of behavior is actually going on? they are expecting you to instinctively believe them because they are appealing to your biases. there is an
7:14 pm
invocation of what is called fast thinking. there is a fairly famous book, thinking, fast and slow. the easy way to understand this is that they are making you think fast. when we think fast, we act on our biases. we are not going to make decisions with any more deliberation. the idea overall on social media is click, click and click -- really quick activity. that is the invocation. millions of shares can happen before people even know or care if it is true or false, because it plays to what they want to share. the approach to that is to slow people down. if we ask ourselves as the public, in the midst of all the other ideas that we have, of
7:15 pm
what should the social media companies do, journalists, government -- what should we all do? the only thing that i can ask for is: if you see something that is too good to be true, or you feel a feeling creeping up, feel something happening, then you know that feelings drive behavior, and that is where you make a mistake. the first thing we all have to do is slow down. that is one thing. and there is one line that came out of a famous south indian film about corruption. it essentially says, for the truth to win you need evidence, you need authority, proof. for the lie to win you just need to sow confusion. confusion is fast; the truth is slow. all good things in democracy take time. if you look at the antarctica example, there was a report
7:16 pm
saying antarctica hit 69.3 degrees. but if you read the story in the post and elsewhere, they took great pains to say, this weather station gave a valid number, but we want to verify against all the other weather stations. while they took the record high -- and nobody has ever heard of antarctica hitting that high -- they are saying, we are not certifying this yet. this is what it takes for scientists to even say something. it is slow, painstaking. it is very easy to go out there and say these things. i don't know if you heard of ted cruz's tweet, which was actual disinformation, but he deleted it after that. and that should be any politician's approach: when you post a tweet that is wrong, you should not just delete it. post a corrective tweet that says what was wrong and here is
7:17 pm
what happened. and then hope that that gets shared. those types of behaviors are hard in our fast, click-based world. one, slow down; two, identify the actors; and three, look at the supply chain. if you are on the demand side, then you have to regulate your own behavior. if you are on the supply side, there are codes of ethics. >> i will challenge you. in the olden days, when i flipped open the newspaper, i would encounter ideas that i disagreed with. now i am in a silo, even if i don't want to be in a silo, because the platforms put me in a silo. they have algorithms that make sure that nothing encourages me to move my eyeballs off that site. there is filtering happening before i encounter it. >> on this point, if you have algorithms and personalization,
7:18 pm
the way we encounter news feeds is not like e-commerce. if i am looking for shoes on amazon, i don't mind personalization based on shoes i've ordered earlier. but journalistic content is a public good. you cannot take a public good, in the form of posts that other people share, and try to personalize it with algorithms; you are most certainly going to get it wrong. that is the reason people start seeing a value-based expression that accumulates. these algorithms do not know the difference between fact and fiction. they just don't know. there is a lot of artificial intelligence now, but that is only coming in now. there are technological issues. first of all, they can't tell what is fake and what is not until they are trained.
7:19 pm
but two, even if there were no fake news at all, personalizing a public good like news forces everyone to be on one side. you will see a left value system, and everybody sees only what they want to see. the alternative to that is asking us to diversify our facebook friends. but that is hard to do. for me to befriend a conservative just because i want to do that is not something that would actually happen. it can happen incidentally. i'm not going to do it just because i want to see the news that they see. i'm going to do it because i met a friend on the road who happens to have a different point of view. the way that people use social media for news -- we should take it out of the picture. why should we share -- all of those things are good,
7:20 pm
and that is what social media is good for. that mix-up is a big mix-up. in the design, they haven't even thought about what news is. the minute you start saying what is news, you have to start whitelisting all these organizations. how do you define them? what is news? what is journalism? those types of definitions have not been taken into account. and when they come in, there is also free speech. >> i'm not sure we're going to define or say what news is. we are talking about a lot of different things at the same time. i will say that, with disinformation, the easy and old answer is that you need media literacy. you need to understand what the right sources are, and how do you analyze the message. it turns out that researchers are now pointing out that the language of the
7:21 pm
media literacy and critical thinking itself has been weaponized and used in order to push disinformation. think for yourself. don't believe what the mass media tells you. go out and find your own sources. those are the things that used to be taught, to question established media. now we are finding that is what sends people off into these outlier sources and increasing radicalization. how do we respond to that? there's a researcher who has written about this, a little bit in despair; she, like many of us, believes in the notion of media literacy being helpful. i will say -- and i'm giving my own terms to what she proposes -- i think we need two additional things. we need social media literacy, an understanding of how social media works and the impact of what happens when news is
7:22 pm
shared by friends and family as opposed to some outlet. we also need self-literacy. we need to understand our own cognitive limitations and blind spots, and how we can be manipulated. the only way to respond to and prevent disinformation is to be aware of all three of those layers. they are different layers: how the media works, how social media works, and how we work. it is interesting to me that we haven't mentioned so far bots or deepfakes. are the challenges we are finding new and different from 2016? they are still there, but the social media platforms are getting much better at identifying them. we have already many stories
7:23 pm
about those efforts out there. that is one of the things we saw with black lives matter. i hope we all learn to distrust anything that makes people from the other side look bad -- if you see something that makes them look egregious and extreme, it is probably the russians. there are some real ones; they will find those couple of people who are that extreme and try to make it seem like that whole side is that way. it makes it much harder to have those friends on facebook and have those conversations. it is just mixing all kinds of subjects together. some of it is the nature of social media. it is very rare that you find yourself in a room that includes coworkers, grandparents, people you met 20 years ago in school, maybe somebody you met at a party. >> it sounds like a funeral. >> but often that is social media and the environment.
7:24 pm
what we found over time is what researchers call context collapse: you don't know how to talk to all these people at the same time. and with context collapse, and with conflict, when people who do not know each other start to argue, you tend to shut down. there is something called the spiral of silence. in the face of this disorganized conflict with people you don't know how to talk to, people tend to shut down. researchers found that people would argue less on social media, and then even at the dinner table would talk less about some of these conflicting issues. that leads to polarization, because we are not talking to each other; we don't understand each other's perspectives. something in the design of those platforms and in the kind of interactions they
7:25 pm
created, led us to where we are now. one of the things that is important is that we all learn how to have constructive conflicts with each other -- how to talk and disagree, so that not everything has to end up in a flame war, not everything has to end up with exaggerations of each other's positions or ad hominem attacks. how we talk to people who really come from different perspectives than ours is another thing we have to figure out. >> it is interesting that you say that. i feel i am experiencing more of this in in-person conversations with men and women -- this kind of desire to broadcast, to keep talking, people not really interested in what you are saying, even in response to what they are saying. i'm going to ask each of you to help me stop this spiral of despair and resignation.
7:26 pm
kick us off. >> you picked the wrong people here, i'm just gonna say. this is not the happy-future crowd. >> point out a good actor, somebody who is doing something clever or effective. it could be government, journalism -- it could even be a platform technologist. who is doing something that we could look at and say, they are headed in an interesting direction? >> on governments, nordic and scandinavian governments have launched social media literacy campaigns. just like everything else, it is rough to say that what the danish are doing would work in the u.s., even on health care. but we will find out about that. i am a little -- look, their numbers are looking good. they are happy, and the feedback is good. they have really high trust in
7:27 pm
their government and media. there are a number of countries in which the bbc, the abc -- these government-supported media -- are independent enough from the government but also constrained by their boards of governors. so they haven't chased the clicks. it is very difficult to think of a media organization in the u.s., even those that have maintained journalistic principles, that hasn't become very marked in its political leanings. >> certainly, others have looked at the fox news model and want to make money like that, and have found it a very profitable model -- being like
7:28 pm
fox was to obama, but for trump. i think part of it is, it would be great to get back to a world in which there are media outlets that are seen as being neutral arbiters. i don't know if that is a genie you can put back in the bottle. i feel that there are a lot of non-empirical preferences here. it is an easy pivot to say it is the algorithms that are making us crappy human beings. but it is the human beings. people go on social media to be reinforced in their own beliefs. they search out the information that says who they are. that is true for everyone. you have to be really careful about saying things like, you want the algorithm to expose people to other views -- they mean other people, not themselves. this audience would never be okay with facebook saying, we noticed you've read too many atlantic articles, now you have
7:29 pm
to watch a half hour of alex jones. there have been a number of empirical experiments that show it is more mixed than conventional wisdom. there is one showing the quality of people's news feeds goes down if you turn off the algorithm. it is because of the ratio of how much people post of crappy news posts rather than writing something up about their kids or a birthday -- it is a ten-to-one ratio. even in 2016 -- we have to be careful. there are studies that show the youtube algorithm can lead to radicalization, but there are other studies opposing that. there are ways that people are trying to change the experience, maybe slowing people down. slow down the speed at which
7:30 pm
information will flow. instagram did an experiment, less about fake news and more about making people nicer to each other: slowing people down as they respond to somebody in a way that sounds mean. it is saying, do you really want to respond this way? there are a number of experiments like that that might be helpful. >> i notice some of you have cards, and you may have questions that you want to ask. we have some folks in the audience collecting those cards. while we are talking, take this moment to finalize that thought and make sure it gets into the hands of our card collectors.
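the instagram experiment described above -- pausing a user before a mean-sounding reply goes out -- can be sketched roughly like this. the phrase list and threshold below are illustrative stand-ins of mine, not anything instagram has published.

```python
# minimal sketch of a pre-post "nudge": score a comment for hostility and,
# above a threshold, ask for confirmation instead of posting immediately.
# the phrase list and threshold are illustrative stand-ins.
HOSTILE_PHRASES = {"idiot", "stupid", "pathetic", "shut up"}

def needs_nudge(comment: str, threshold: int = 1) -> bool:
    """true when the comment should trigger a 'do you really want to
    respond this way?' prompt before it is published."""
    text = comment.lower()
    hits = sum(1 for phrase in HOSTILE_PHRASES if phrase in text)
    return hits >= threshold

def submit(comment: str) -> str:
    if needs_nudge(comment):
        return "prompt: do you really want to respond this way?"
    return "posted"

print(submit("love this photo"))   # posted
print(submit("you are an idiot"))  # prompt: do you really want to respond this way?
```

the point of the design is the extra second of friction, not the classifier: even a crude check slows the "fast thinking" the panel describes.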
7:31 pm
>> in terms of solutions, i feel that one example of a good organization working in the real world and trying to help is first draft. they run simulations where they ask news organizations, other actors and individuals to act out a scenario: a deepfake has emerged, it looks like it is from an election in some country, what will you do? all of the big news organizations get into a tizzy. they don't have the time to look across multiple sources. they want to compete with each other over who is going to go first with the story. that is a fundamental ethical issue. because first draft is asking news organizations to do these simulations ahead of time, it is helping news organizations realize
7:32 pm
what the problem is with the journalistic process. you are relearning the process, and you're not the gatekeepers anymore. the process by which you build stories has not really changed. we have a finite supply of tweets to source from. some of those are just lies. in effect, there is a call to reinvent the way a story is built. that is all for the good. the way we are building stories has to change. as far as personalization goes, i agree with what you said. it is behavior that is giving signals to the algorithms about what you want to see. but the design of the platforms is to keep us engaged around methods and paradigms that say click, click, click. if you ask facebook, what is your metric for engagement that
7:33 pm
is not based on clicks alone, not based on shares -- what if it is engagement based on what people have learned from a post? what people have comprehended about a particular thing? they say, we are not a news organization, but with that distribution power there is a fundamental question about what people are doing to each other with the algorithms. it is making our worst angels play out. we also have better angels that come up when we slow down. that is why we have the emphasis on going slow on the distribution side of design. but i would ask you to ask of these firms: when you see something that fact-checking organizations have said is false, why are the social media platforms urging you to share it? why are they urging you to like it? the feature called like is an affirmation. it is universally offered on all of the posts regardless of
7:34 pm
whether or not they themselves are marking it as false. there is a need to think about what features we offer and when we offer them. >> there are researchers tracking how information travels across the network, who the key nodes are and why it is being distributed in a certain way. kate starbird was speaking in santa clara years ago about how her lab found the same accounts were sharing black lives matter hashtag tweets and blue lives
7:35 pm
matter tweets, highlighting the sharpest aspects in either direction. these researchers are doing really important work; if some of the companies had listened to them years ago, we would be in a better position today. i do want to put the responsibility on the companies as well. you don't want to get me started on the facebook algorithm, because i am big on choosing for myself. the idea that somebody will know better what i should see really bugs me. i am big on autonomy. but i think they really do encourage certain behaviors. ethics talks about what we can do ourselves to build our own practices and our own habits, and one of them is self-control. i was saying earlier, be aware when something really seems like exactly the story that confirms everything you believe, and don't cherry-pick. but i will go broader than
7:36 pm
that. if you have any doubts about a particular story, don't share it. your friends are not going to be less informed just because you're not broadcasting that story on some social media. if we learn some of the self-control techniques, which are made harder by the fact that companies are putting in place various affordances to make us share quickly, we are going to stop playing a part in this whole misinformation ecosystem. >> do we have some cards? let's get started. >> i just have to say, we have to be careful, talking about solutions, how much power we put in the hands of very powerful companies. after 9/11 we asked intelligence agencies to keep us safe with no countervailing considerations. that was bringing all these agency heads in and yelling
7:37 pm
at them about 9/11. that is how we got the iraq war. that is how we got all the nsa disclosures. we have a tendency in the u.s. to feel, when something bad has happened, that we are going to yell at these powerful entities to fix it. if we put them in charge of fixing it, we might want that power back later, but we never will get it. let's be very careful. it is emotionally satisfying to say that they are to blame, that they have to control the speech of other people. and nobody ever means themselves; they always mean other people. just be careful what we ask for. >> my concern is, government can't do it in the u.s., and if companies don't do it, then who? are we supposed to just accept the status quo? because of hate speech and disinformation, entire groups
7:38 pm
are driven out of the conversation. >> when you say they should do x, it has to be followed with what the limit is and why. something like turning off the ability to like or re-share a post because it was marked as disputed starts to make me a little uncomfortable about the amount of control. >> let's start with the first question: i'm a highly skeptical facebook employee. mark zuckerberg has been hitting talking points that we should not be the ones setting ground rules. what is your take on facebook's ultimate role? >> this is pointed at you. he has just changed his mind on this this week. i saw his interview where he said we should be regulated somewhere between the phone company and the newspaper, but we are now something new.
7:39 pm
then that was followed with an op-ed. it is a super hard thing. they are getting tired of everyone complaining. you can censor too much or too little. someone has to make a decision. but they also want to wipe out their competition. they want the standard to be: you must be this high to carry speech by people online. and that standard of how high you have to be is one inch below where facebook is. this is where google is, this is where snapchat is, this is where tiktok is. something that would be really great for them would be legal requirements to do content moderation, because others can't afford to do it. so i think he has changed his mind on that. >> i would add that facebook already does a lot of what we are saying it should do or should not do.
7:40 pm
it already moderates content in all kinds of ways; it amplifies some voices and suppresses others. it is not like this would be a new task for them. it just may be applied to different speech than it is now. but they are doing it already. as we have seen with something like hate speech, if you drive it off platforms like facebook, you will drive it onto smaller platforms where they are not doing this kind of control. >> that is fine. >> you can make rules around hate speech that are controversial, but they can be based around risk. ruling on true and false in political speech is the most sensitive thing you can do. they are doing it, but all of the arguments from the left are that they need to do more censorship. we need to be careful about having them make judgments about first amendment protected
7:41 pm
speech. how many people think facebook should have taken down that pelosi video? that is something that was done on the jimmy kimmel show to donald trump over and over again. he would post these videos. should jimmy kimmel be kicked off? i don't think so. i don't think facebook should censor people. i think we have to be super careful about corporate censorship of speech where you're making fun of politicians. the idea that that should be controlled by a trillion-dollar intermediary that pays a ton of taxes, has a bunch of regulatory exposure and can come under the control of the incredibly powerful executive branch at any time -- it is kind of nuts to say that we want them to do that kind of work. >> i think one of the core debates that does not happen often enough -- alex is not new to this -- is the difference
7:42 pm
between amplification power and taking something down or keeping it up. journalists have to find the truth and actually post it out. their job is being made harder by social media organizations. these actors will create confusion in our minds, and then we don't trust whether the news is true or not anymore, especially if it comes from a value system that is not ours. if i don't trust fox, then even when they are right, i will not trust fox as a brand anyway. likewise, on the right, if they don't trust msnbc, they will not trust an individual exposé by that outlet. the biggest question here is, if something is actually a lie and
7:43 pm
it is being posted -- let's reduce this down to political ads. if you ask the question, should lies be allowed -- what is happening is the goal of the advertiser is to target a bunch of people to make them believe that this is actually true. if you keep running these kinds of ads, 1,000 times, 5,000 times, you are going to make a reality emerge in a bunch of people's minds and they will not see any other kind of reality. you have a situation where you will have 5,000 people who believe one thing -- and another group of 5,000 elsewhere that will believe that thing did not happen. that can happen if you start putting lies in ads. setting the law aside, the question from an ethical standpoint is, this is a democracy. you cannot have democracy without truth. you cannot. these companies, including news companies and media organizations, put lies in their headlines.
7:44 pm
both are similar in this space. they are born out of a democracy. the entire capitalistic engine that led to these successful companies -- and lots of good has come from this -- if you are born and run under this kind of government, there is more responsibility than just the first amendment. we have to care about what the truth is. if we keep repeating that we have to be neutral, that is coming from a different angle. it is actually coming because one part of this country, the conservative part, finds that if they apply neutrality, then they will have to take down a lot more speech on one side and less on the other side, the left. then there is proportional takedown. there are also some
7:45 pm
questions here about why social media companies don't want to use truth as a lever. i think those questions have to be talked about more than they are, not just in a private company context. that is a journey that america is experiencing now. >> one more thing. if we are interested in this conversation, that means we have been pretty aware of some of the debates. but there are a lot of people out there for whom all they get is each side accusing the other of fake news and disinformation. journalists wrote about this years ago, about how this was done in russia long before it happened to us here. what the technique does is just muddy the water to the point where people don't know at all what to believe or what is going on. it creates a paralysis that is
7:46 pm
not good in a democracy. if we want people to vote, have opinions about things and participate, we can't live in that soup of: i do not know what reality is, every source is biased, a pox on all of their houses. we have to figure out how to have some kind of measured skepticism but also trust in certain outlets and in certain groups -- enough to keep the ability to function as citizens of a democratic society. >> one of the groups most vulnerable to disinformation is older adults who are less tech savvy. they vote. how can senior citizens be educated to be less gullible? >> that is an excellent question. almost always you hear that young people today don't know how to read the media. so i actually think we need to
7:47 pm
do a reverse stream of education. first, let's get to young people in school so they understand what is going on. all these things that we are talking about can be explained to young people. but then, seriously, have them go and talk to the older folks. i think we should not assume that the older folks are not interested in finding out about this. maybe engaging across generations is the way to do it. maybe the grandkids have something to teach the grandparents. >> i have a response to that. one of the problems that won't get addressed in this conversation is journalism itself. the reason older people get tricked is because there is some other sense of alienation, discontent, uncertainty -- all kinds of stresses that people have. older people are under stress.
7:48 pm
when older people are stressed, they will not have their usual skepticism toward things they see in their feed. when you ask what local journalists should be doing in their actual communities -- the necessary community building that journalism used to do is not happening as often in areas where there are not as many news organizations. usually that is somewhere a bad actor is working. people used to talk about this and say, did you see that post? i was about to share it -- what happened? if those conversations are not happening anymore, it is because there is no journalistic presence there. then you will have people just posting away. it makes them outraged that
7:49 pm
they have to see something. there is a deeper problem in the u.s., which you may know about, that can't be handled at the social media level. it is upstream of that. that has to be included as well. >> we are talking about the weaponization of cynicism. do we have time for another question? this is about labeling news content. there has been a push by platforms to label content -- you know, saying that this is russia-funded, or a tweet with an orange heart saying it is harmful. this is not content neutral, but it is gaining traction. is this a good and sustainable solution? do you want to tackle that? >> it is reasonable, but you have to be careful with these labels. a number of experiments show that if you tell somebody that something is false, they are more likely to engage with it. people don't want to be told what to think. >> reverse psychology.
7:50 pm
>> even if they know it is truly false. not a big thing that flashes -- something more subtle. you have to be very careful, using real research, about the language you use. there are two different things there: context and identity. labeling the identity of groups that are known to be actors or sponsored by specific governments is a different thing. the companies have been building up teams to ferret out that activity, to look at coordination between different groups, and there is actual collaboration with the intelligence community in the u.s. to understand who the actors are and what their tactics have been looking like. that has been very effective for covert propaganda. for overt propaganda from a variety of different groups, that labeling is fine and it should not be that controversial.
7:51 pm
you have to be careful. >> are there any software tools to identify bots and disinformation? are facebook and twitter implementing any tools like this? any flagging? >> those are two different questions. a bot means it's a computer program. twitter has done a ton of work on this, and in fact their numbers are a lot better now on how much automated posting there is. that was never a problem on facebook. the majority of it -- all that stuff you saw pushed, that was people doing it. you don't need robots in a lot of cases because these actors are operating in low-cost environments. some of the biggest domestic propaganda groups that were shut down by facebook were outsourced to vietnam and pakistan. they could get lower cost
7:52 pm
talent to create the content and push it. so there is a bunch of work around detection: is this a new account? were there 10,000 tweets in the last week? you could be on a real meth binge, or this could be a professionally shared account. there has been work around that and that will continue. but there are also hybrid accounts -- a mix between humans and bots taking over, to make it harder for them to be detected. >> i want to go back to where we started at the beginning, with a sense of urgency. i feel like we are still responding to the things we learned about in 2016. we have an election coming up; we have all of these things acting to shape our behavior right now. if labeling will play a part, let's do it. it might not be perfect. we will have to do a lot of other things on different levels. but we need to do that now. we can't talk about it as if it is a theoretical discussion
7:53 pm
about what might happen in the future. if democracy is being undermined right now, we won't get a chance to do those things in the future. >> since we are speaking about twitter, i would like alex to react. they announced this week that they are going to start labeling lies. six months ago, twitter did a consultation; they asked everybody, and now they have come up with this approach. it is labeling. if i lie to my 20 followers, it will get labeled. what do you think? >> i think we have to see. this is the problem: there is a lot of guessing about what people think works. the empirical evidence actually looks different. i don't know if twitter has done any testing. we did a ton of testing at facebook and found the exact
7:54 pm
wording was really key to get people to engage with the alternate information. my hope is that twitter has done the same testing. i can't speak to it, though. >> we have to wrap up now. we saw this week the democratic presidential hopeful michael bloomberg spending money on paying individual californians ahead of the primary to post positive things about him and to text their friends with positive statements. is this taking things into a whole new realm, or is this just more of the same? and we have also seen facebook saying they're not going to take down such posts -- that they are accepting this as a new normal in how campaigns work. >> that's not totally true. >> they are allowing it on instagram. the problem with
7:55 pm
what they are doing now is that none of it is marked. i'm not sure how facebook is supposed to know, if somebody posts that they like michael bloomberg, whether they got 25 dollars for it. there is no guidance for this; the laws have not been updated. it is not clear whether this is legal or not. >> i agree with alex. the federal election commission needs to do much more. right now i don't think we have enough commissioners to do anything. >> thanks to the fact that republicans will not fill the board. >> we can hope for new rules, but that is not likely to happen. so what do we do in the meantime? to your point, yes, i don't think we had run into this type of payment to individuals at this scale before. >> no. the russians had to do the fake stuff because they can't afford to pay people 3,000 dollars a pop. it is hard to categorize this thing. is it disinformation? it is just bizarre.
7:56 pm
the problem with instagram influencer culture -- >> there are a lot of things that led to this. can we do mass shaming of anyone who would participate in this? i speak as someone who hates that; i would never advocate it. but i am worried about these new things coming up, and about us sitting on our hands and saying, that's an interesting new phenomenon, i wonder what it will do to our country. we have to do something. >> i think there is an opportunity to learn from the organic labeling of food that has happened for a while, from farm to table, and apply some of that to the content that we see. for example, the bloomberg case is an example of inorganic activity. it is an actor spending money to get 1,000 people to go and share this type of content. that is not organic content. so we need a label that says
7:57 pm
this is inorganic. but that requires vocabulary. we need label literacy. you have to see that social media has a supply chain of information, and you have to have a way of saying that the content i see in my news feed has these labels. when we go into the supermarket, we see the labels; we see that it is organic, or whatever. if we can learn our way to do that, there is a responsibility there -- a responsibility of actors, of people. we are at the point where this thing is so deep and fundamental that there has to be a deeper evolution across the whole supply chain. some labels are important, but just shipping a label out that says lies, without experiments, will not work.
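the label vocabulary described here could be sketched as a small provenance record attached to each post, in the spirit of the food-label analogy. the field names and values below are illustrative assumptions, not any platform's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

# Toy sketch of a supply-chain provenance label for a post.
# Fields and values are illustrative assumptions only.

@dataclass
class ContentLabel:
    organic: bool                    # False if paid/coordinated ("inorganic")
    sponsor: Optional[str] = None    # who paid for distribution, if known
    fact_check: str = "unreviewed"   # e.g. "unreviewed", "disputed", "false"

    def summary(self) -> str:
        """A human-readable line, like the label on a food package."""
        origin = "organic" if self.organic else f"inorganic (paid by {self.sponsor})"
        return f"{origin}, fact check: {self.fact_check}"

label = ContentLabel(organic=False, sponsor="a campaign")
print(label.summary())  # prints: inorganic (paid by a campaign), fact check: unreviewed
```

the design point is that the label travels with the content through the supply chain, so the reader sees provenance at the point of consumption, the way a shopper sees "organic" on the shelf.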
7:58 pm
i would like labels that would slow people down. if you have any idea that will slow us down, please tell me. >> one of the things that is happening is that there are journalists who are examining these practices and letting us know about them. >> the one good thing about the bloomberg action is that the only chance of fixing these problems is a bipartisan belief that we need to change the laws. the only way is if the republicans lose this year, honestly. the only way we end up in 2021 having any bills around changing the rules, changing campaign finance, fixing our election security and having rules around online ads is if we have a democratic president partially elected by a billionaire oligarch, doing it from the inside, because then we will finally have some bipartisan support that this is not how our politics should work. >> we will have to leave it
7:59 pm
there, thank you so much for coming out tonight. please thank our panelists. (applause)
8:00 pm
coming up on american history tv: presidents and first ladies. we start with our series, first ladies: influence and image, and a look at the life of dolley madison, the country's fourth first lady. then a conversation on elizabeth monroe and louisa adams, the fifth and sixth first ladies. after that, a look at the relationship between james and dolley madison, and then a discussion about john adams and his son, john quincy adams.
8:01 pm
dolley was both politically adept and savvy. >> madison was not a lot of laughs, but she was his best friend, and she compensated. >> it was aaron burr who brought a letter that james madison wished to meet her. >> she carved out a space

