
tv   Samuel Woolley The Reality Game  CSPAN  February 22, 2020 3:30pm-4:16pm EST

3:30 pm
discuss the origins of online disinformation. the national african-american read-in, an event that promotes literacy during black history month. later, author christopher looks back at the millions of people who moved to florida in the 1940s. it all starts now on book tv. >> i would like to welcome you all. we are thrilled to have samuel woolley with us here tonight, celebrating a very important new book called "the reality game: how the next wave of technology will break the truth." it's from our friends at public affairs books. doctor woolley is a writer and researcher specializing in the study of a.i., emergent technology, politics, persuasion and social media. he's an assistant professor in the school of journalism and program director for propaganda
3:31 pm
research at the center for media engagement at the university of texas at austin. professor woolley founded and directed the digital intelligence lab at the institute for the future, a 50-year-old think tank based in the heart of silicon valley. he also cofounded and directed the research team at the computational propaganda project at the oxford internet institute at the university of oxford. he's written on political manipulation and technology for a variety of publications, including wired, the atlantic monthly, vice, techcrunch, the guardian and many others. his research has been featured in publications such as the new york times, washington post and "wall street journal," and he has made appearances on the today show and frontline and presented to members of nato, the u.s. congress and the uk parliament. it's a great honor to have him here with us tonight to discuss this incredible and important research.
3:32 pm
give him a warm welcome. [applause] >> hi, everyone. great to be here. i think my bio sounds kind of fake. crazy. this is the last leg of my tour, and i'm happy to be here in san francisco, specifically here at city lights. thank you, everyone, for having me. i can't think of a better place to end this tour and talk about what is really, at the end of the day, a book on democracy and the way we rebuild technology. a lot of people assume when they talk to me about my work that i am a computer scientist. that is emphatically not true; nothing could be further from the truth. i kind of thought maybe i should try to play the game and be like, yeah, i took a few classes, i know a little bit about html output. but at the end of the day, i'm the kind of person who studies by talking to people.
3:33 pm
i spend time in places, i spend time with people, and i go deep. i go deep on subjects. for the better part of the last decade, i've been going deep on the subject of what i call computational propaganda, a fancy term for the ways in which automation and computer code, algorithms and things like that, get used to manipulate public opinion. in the last four or five years, with the u.s. election and the brexit referendum, and in india the recent problems caused by whatsapp that led to offline violence and massacres, we've seen social media become a tool for manipulation and disinformation. a lot has changed. in the early 2000s, we had a perspective that social media was going to be something that would be the savior of democracy in many ways.
3:34 pm
that was shown through google's kind of totemic phrase, "don't be evil." it was also showcased by a lot of the work that came out about digital utopianism and cyber-libertarianism. that's not where we are now, but we are not lost. everything's not lost yet. this is not just a book about how screwed up everything is and how scary the world is. it's a book about solutions; every chapter in this book ends with solutions. after spending nearly a decade working on this, i realized there are a lot of things we can do, and i will end the talk today on those things. first, let's talk about storytelling and what it means to be somebody who studies technology by talking to the people who build technology. tonight i'm going to introduce you to four people, four places and four ideas that i learned in
3:35 pm
the last several years. these people, places and ideas have been instrumental in this book and in how i've been thinking about technology. the first person is named phil; he's who the book is dedicated to. he was my advisor; he's now the director of the oxford internet institute at the university of oxford, and he took me under his wing when i came to the university of washington in seattle. phil, at the time, had been in tunisia studying the people who were using technology to communicate about democracy, to organize protests, to do all of these things. he had written a book called the digital origins of dictatorship and democracy. the discussion in that book was all about the ways in which the internet, from its beginnings in various countries, played a role both in facilitating democracy, helping people to
3:36 pm
realize freedom, but also in helping regimes to realize control. so phil obviously was thinking about these kinds of things very early on. i had just come from being a fellow on the 2012 obama campaign, and while working on the campaign i had become enthralled with the ways they were making use of data. i was blown away by how sophisticated the data operation was. there's a lot of excitement about the community organizing aspects of the campaign, but really, none of that would have been anything without the connections and data the obama campaign had. what they did was marry massive amounts of data with personal stories, humanizing the data in a way that was able to reach people with a resounding message of hope and
3:37 pm
things like that. when i met phil, he taught me something really important, something i had kind of known but which i'm really here to say to all of you: technology and politics today are inherently connected. you can't have one without the other. to some extent, if you think of technology simply as tools or media, as the means by which we communicate with people, this has always been the case. but in today's world, technology and politics are very much intertwined. the campaigns that tend to do the best around the world these days are the campaigns with the most technological savvy, because the reality is, if you have a lot of data and you can marry it to a sophisticated a.i. system, you can do hyper-specific targeting of people and
3:38 pm
speak to them in a way they would like to be spoken to. so phil, in seattle at the university of washington, taught me that people and technology are intertwined. the next person i want to introduce you to is named andrew. i met andrew in england. i had taken a job at the university of oxford: we had gotten grant money, in about 2013 from the european research council, to study computational propaganda. they wanted to know the ways in which russia was using social media to influence democracies. so i got offered the job; phil said, would you like to go to oxford with me? yeah, twist my arm, of course i want to come. one day i was at a conference, standing around, and this guy
3:39 pm
approaches me. conspiracy theorists often want to talk to you in this line of work; they want to know who you are. i thought, is he going to start talking about aliens or anti-vaccine stuff? do i have to carry on a conversation with him? i didn't want to talk about that stuff. but andrew said to me, hey, i make and build automated profiles on social media in england. i was like, what? he said, i control 100,000 accounts on twitter, and i do it on behalf of the labour party. they don't pay me to do it or anything; i believe in it. i said, wow, this is pretty crazy. we got to talking and struck up an unlikely friendship, and he really taught me a lot about the ways in which people can use technology to amplify their
3:40 pm
voices online. a lot of what i talk about is bots on social media: accounts that look like people but are not actually people; they are automated profiles. one person could have many, many accounts online. you can use the accounts to retweet messages and drive up likes and, with the evolution of machine learning, even talk to people in a more sophisticated fashion. andrew taught me that there's always a person behind the technology; the technology doesn't exist on its own. social media firms today would have you believe the algorithms are apolitical, that they don't have values, that they make decisions in a way no one designed or decided upon. but if you follow the work of the researchers who run something called the social
3:41 pm
media collective, which does fantastic open work, you know that algorithms and software and technology always have human values in them. the people who build these things and code them have their own beliefs. for instance, to train a machine learning tool you have to go through the process of tagging data, and you have people do that. if all the people who tagged the data are white men, the algorithm can end up being racist. think, hypothetically, of a tool deciding where buses should go: neighborhoods of color might not get as many buses coming through, or as many free bus passes. it's an algorithm or tool with a set of values. when you amplify information, when you use social media, it
3:42 pm
tends to replicate those values. the algorithms that prioritize certain information, there's politics there, and decisions go into that process. if i had a dollar for every time someone from these companies said "we are not arbiters of truth," i would have $10,000, because they don't want you to think that they arbitrate it. i'm here to tell you that's not the case. algorithms serve information to people; they prioritize the things you see. for the longest time, and even today, organizations like facebook and twitter have made decisions about how to prioritize news to people. think about that. that matters. andrew taught me that you need to look at the person behind the tool. it's not enough to do quantitative research; we often need to know who is
3:43 pm
building these tools, what they are building, and who they are doing it for. you might think it's savvy political campaigns doing the work, but it turns out, when you dig down deep, you find shadowy pr and marketing organizations that say, we'll grow your social media profile and get you 10,000 followers in the next few weeks. surprise, surprise: they are using fake profiles and fake information. it's a whole weird world out there. the third person i want to introduce you to is a woman named marina. she was my boss at the institute for the future. i met her a day or two after trump won the 2016 election, the first time i'd ever been there; they flew me out with a bunch of research scientists and politicians who were really concerned about the weaponization of a.i. a.i. hadn't really been weaponized to
3:44 pm
change people's minds about politics yet; it was more that a.i. was being used behind the algorithms. but marina listened to all of the speeches and, in her way, said at the end of the talks: this is a continuation of russian stuff. she grew up in ukraine in the former soviet union, and she said maybe what we need to think about isn't that this propaganda is new, because it's not. the tactics aren't new; they are continuations of things that have been done for a very long time. it's the technology and the way it's being leveraged that's making it potent. with the automation we see, we get a problem of scale. quantum computing, all of these issues, are things that we need to consider. we also need to consider who the
3:45 pm
people are behind it. marina taught me we need to look at history, at the way propagandists have done it in the past, and to understand not just the way propaganda is used but the technology being used to spread it. what we say matters. right now, there's sort of an epidemic in this country of using the term fake news. it's been weaponized. if you want to ask what you can do as an individual: don't use the term fake news. use the term disinformation, which means purposely spread false information, or misinformation, which means mistakenly spread false information, or just junk news. the terminology matters. it's no surprise to me that the people who
3:46 pm
spread these lies are taking on the terminology and turning the exact same arguments back around: you say they are doing this, and they say you are doing it. the goal isn't necessarily to change minds; it is to create confusion and generate apathy. it makes people mad and polarized. we think there's sophistication in the sense that they're coming to talk to us and persuade us, but that's not what they are doing. they are there to make you not want to engage in democracy, to make you think the system is broken so that you are so angry you vote for somebody who speaks to your anger. it's right that this comes out
3:47 pm
of the soviet playbook, but you should know the russians are not the only people who do propaganda. in fact, they benefit from our speaking about them and talking about it all the time, as if it's a thing that only comes from them. propaganda and what we call influence operations happen all over the world these days. research suggests that during elections in 70 or 80 countries, this stuff has been weaponized by governments and campaigns. with the democratization of propaganda, almost anyone can do it these days. my next book is more scholarly, so probably even more boring. it's called manufacturing consensus, a play on manufacturing consent, and the idea
3:48 pm
behind the book is that people use these technologies to manufacture the popularity of things. the more you make something look popular, the more it seems like a viable idea. one more person. the last person. last idea. kathleen. she's my boss at the university of texas at austin. she was a dining editor and, before that, a sports reporter. i don't know how you make that transition. she's fantastic. when i went to ut, i had kind of lost hope because of everything going on. why was i spending my time thinking about the ways it's all broken? i was at the journalism school, and kathleen is the director. kathleen taught me that we need to place faith in the institutions we already have. we don't need to create
3:49 pm
brand-new things: we have the federal election commission; we don't need a federal misinformation commission. we don't need one more commission doing things in washington. more specifically, we need to invest in journalism. journalism in this country has done amazing things. there are so many people who work for great publications around the united states who want to do good work and want to protect democracy. i talk about the ways in which journalism has been challenged not just by the digital era but by the organizations that dominate it: google, google news, youtube, facebook and twitter massively benefit from the work of journalists without getting money to them. when youtube faced misinformation, what did it do? it started pointing people to wikipedia
3:50 pm
articles. wikipedia is a nonprofit; youtube was using it as the resource youtube would send people to for misinformation. google news boils articles down to snippets. when people started researching it, the research showed that no one read the full article; they just read the little piece. journalists put all the work into doing the investigation and writing the full article, a snippet gets posted, and no one actually reads it. and we wonder why the news industry is failing, why it is having a hard time. maybe failing is the wrong word, but we can reinvigorate it. i'm going to tell you exactly what my argument is.
3:51 pm
the technology firms should have to put $10 billion or $20 billion into a public trust, and let it be overseen by groups of people who have a stake in making sure the money is spent wisely and well. google has committed $250 million or so to its news initiative, but google gives out that money; google makes the partnerships with organizations and makes the decisions about who gets it and who doesn't. and when google has experienced backlash over policies and algorithms that don't prioritize full articles, it has made tweaks to how it prioritizes news. that's not good enough. the technology companies have helped create this problem, and they have admitted to it. we've seen mark zuckerberg sitting before congress saying, yeah, the cambridge analytica stuff, we have a handle on that, it's kind of our fault. but they
3:52 pm
haven't really given back. they haven't really systematized their response to this problem. they've done some things, and they've been working hard in many ways, but it's not enough. these are multibillion-dollar companies, among the richest in the world. they get treated more like nation-states these days than like regular companies. so kathleen taught me to reinvest in journalism, to be skeptical of what we see today, and not to think that journalism has failed; journalism is a lot better than that. with all these things in mind, we have an interesting picture, and we have this book, and this is a book about the future. i've spent a lot of time talking about what we've been through. the next wave of technology is deep-fake videos, virtual reality, automated voice systems that sound just like
3:53 pm
a person. the book is in large part about the ways in which this next wave of technology will be an even more potent vehicle for misinformation. the subtitle is provocative for a reason; it's supposed to scare people. best case, you will prove me wrong: you will not let the technology break the truth. i'm going to do a little reading from the conclusion, and then i'll end with some solutions. with democracy and human rights in mind, finding solutions to the problems of misinformation and political manipulation is a daunting task. the disinformation landscape is vast, and it extends beyond our current ability to contain it effectively. moreover, the internet grows larger every day. a 2017 report on the state of
3:54 pm
the software industry, which combined research from several companies, found that we create 2.5 quintillion bytes of data every day. i don't even know what that number means. moreover, the number of internet users grew by 1 billion, to 3.7 billion active users, in the five years previous to that report: from 2012 to 2017, the internet grew by 1 billion users. a 2018 forbes article says 90% of the online data available in the world was generated in the previous two years. ninety percent, in two years. this means people working to game public opinion or carry out social and political oppression using online tools have almost unimaginable amounts of data available, with new information streaming out to them every millisecond. that's on top of access to potential targets, and they can leverage online anonymity and automation to remain
3:55 pm
untrackable. important ethical and legal considerations, along with the improbability of finding a skillful operative, make prosecution a poor strategy against propaganda. instead, we've got to fix the system. it is time to build, design and redesign the next wave of technology with democracy at the forefront of our minds. in thinking about responses to rising computational propaganda, i find it helpful to break them down into the short term, medium term and long term. because of the nature of technology today, i consider tool- or technology-based responses to be the shortest term of all. many of these efforts are approaches focused on triage, help for the most egregious issues associated with web 2.0, the internet of social media: tweaks to social media algorithms or software patches to existing tools.
3:56 pm
they include new applications for identifying false news or plug-ins that track political advertising. they are useful as far as they go, but the problems are constantly evolving. what works to detect bots on twitter today may not be useful years from now. in fact, many of the applications that have been built for such purposes quickly become dysfunctional, whether from changes made by the social media firms, lack of funding or upkeep, or propagandists finding a simple way to game them. there are useful products of this kind, like botcheck, which tracks propaganda on twitter, and browser plug-ins that flag fake news sites. these programs need to be constantly updated and translated to other platforms to stay relevant and useful. they present a promising start for tools that alert users
3:57 pm
to misinformation, but they must be combined with action from technology firms, governments, news organizations and others to be truly effective. another example of an anti-propaganda tool is the dashboard from the alliance for securing democracy at the german marshall fund, which was built to track russian-linked twitter accounts. although it's important to identify and report automated social media traffic, and equally important to notify users that they may be encountering false news reports, these efforts are too passive and too focused on user-level responses to propaganda. also, it's important to remember that a good deal of research shows fact-checks do not work. social media firms are engaged in a constant battle against new types of innovative, human-based influence operations. more than anything, i want to communicate that all is not lost. it's not only researchers and policymakers around the world fighting back but also the technology firms themselves.
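the bot-tracking tools and dashboards discussed here generally score accounts on behavioral signals, such as how often an account posts and how lopsided its follow relationships are. as a purely illustrative sketch (the signals, thresholds and weights below are invented for this example, not taken from any real detector), a toy scoring heuristic might look like this:

```python
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    tweets: int      # total tweets since creation
    age_days: int    # account age in days
    followers: int
    following: int

def bot_score(acct: Account) -> float:
    """Toy heuristic combining posting rate and follow ratio.

    Sustained high posting rates and following far more accounts
    than follow back are two commonly cited behavioral signals;
    real detectors use many more features than this.
    """
    rate = acct.tweets / max(acct.age_days, 1)     # tweets per day
    rate_signal = min(rate / 50.0, 1.0)            # 50+/day saturates
    ratio = acct.following / max(acct.followers, 1)
    ratio_signal = min(ratio / 20.0, 1.0)          # 20:1 saturates
    return 0.7 * rate_signal + 0.3 * ratio_signal  # weighted blend in [0, 1]

def looks_automated(acct: Account, threshold: float = 0.5) -> bool:
    return bot_score(acct) >= threshold

human = Account("casual_user", tweets=1200, age_days=2000,
                followers=300, following=280)
amplifier = Account("retweet_cannon", tweets=90000, age_days=400,
                    followers=15, following=900)

print(looks_automated(human))      # False
print(looks_automated(amplifier))  # True
```

real systems rely on dozens of features and trained classifiers rather than two hand-tuned signals, which is also why, as the reading notes, such tools decay quickly once propagandists learn to game the signals.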
3:58 pm
they are fighting digital propaganda on their own platforms. employees of facebook have managed to dismantle misinformation campaigns, from deceptive advertisements to messaging about which candidate should get your vote, and they have confronted operations run by militaries and governments. it's also clear to me that today's large tech firms have to get real with themselves. they are now media companies, yes, even arbiters of the truth. they owe a debt to democracy as well as to the free market, and serving the latter doesn't mean they can ignore the former. one of the things you might not know, and you probably don't, why would you? the english edition of the book is subtitled "how the next wave of technology will break the truth and what we can do about it." for some reason, the second half didn't make the american version, an editorial decision, but i like it.
3:59 pm
so i will tell you a few things. we talked about the short term, medium term and long term. in the short term, the best thing you can do is pretty simple: read the whole article. i think that i myself, the other day, caught myself tweeting an article i hadn't read all the way through. i was like, what am i doing? why am i doing this? i should know better than this. one of my closest friends, who has a phd from the university of washington and who is brilliant, got a message saying that they, i won't say who, had shared a known piece of russian disinformation on twitter. this is a person who studies this stuff and knows it really, really well. if they can be fooled by it, if i can be fooled by it, then we all can be fooled by it.
4:00 pm
we have to read the whole article and think very carefully before we share what we share. what we are seeing right now isn't the proliferation of deep fakes; it's not sophisticated a.i. making donald trump appear to say something he never said. that's a weapon that's coming, and we'll see more of it. what we are seeing is regular people sharing videos that are edited online: videos edited to make joe biden look like he's racist, or nancy pelosi look like she's drunk. ... a reporter had his credentials revoked and then reinstated over an edited clip worthy of an episode of mr. magoo. so we have to be careful about what we share. the other thing people can do is talk to the people that they love and care about. i just wrote a report for the national endowment for democracy in d.c.,
4:01 pm
it is about the demand for disinformation. the main takeaway from the report, and my colleague katie joseff deserves the credit for the work, is that the only way to really get people to change their minds, given these issues and the polarized state of the united states at the moment, is for them to talk to the people they care about and love. you don't change your mind based upon the conversation you have on facebook, the argument you have with someone you do not know. it needs to be a conversation that is civil and about a topic that you care about. that needs to be conveyed. those are the short-term things. in the medium term, we need policy, we need regulation. i am so sick of hearing from people that this is a user issue and that self-regulation will resolve it. you can't let google, facebook, twitter and the powers that be make decisions internally about what they will do.
4:02 pm
what ends up happening when that's the case is that the companies tell researchers like me: your research is flawed, it's not scientific, you did not have all the access to the data that we have, your sample wasn't representative or quantitative enough. and i say, maybe you could share the whole data set and i'll redo the analysis, how about that? yeah, we'll figure that out. there's social science one, which is supposed to do that, but it's not happening. there's no regulation that holds them accountable for what's going on on their platforms; there's no oversight. in the early 2000s, the federal election commission made a decision that basically it was not going to look at any political communication online during elections. they look at tv, radio and magazines, and they make sure campaigns aren't doing illicit, messed-up things in those media, but in my
4:03 pm
mind, they don't look at online communication at all. that is hugely problematic. the government has a huge role in this, and the government needs to do something. currently, nothing is getting done. there is policy that has been created and is waiting to go before a highly polarized congress, policy that i'm very happy with and think is well-informed, but right now there's not much appetite for it. however, we can look at other countries and other places throughout the world, some countries in south america for instance, to figure out the ways in which they're dealing with the problem. and then there is the long-term fix. the book's tagline is that you've got to design with democracy and human rights in mind. my belief, and it's not much of a belief, it's a fact, is that the platforms we use today were designed for engagement; they were
4:04 pm
designed for scale, to grow their user bases. we don't have to accept that. it's not something we asked for; it's something that exists and has become infrastructural in society. facebook has over 2 billion users. these platforms were designed to make money at the end of the day, and what happens when that's the case? i think we get what we have now. we have gotten what they built, and what they built was a system that did not prioritize high-quality information or civility; they built systems that prioritize the opposite of those things. so i believe it's possible to create technology in the interest of democracy and human rights, and we're seeing it to some extent, and we can see more of it. jane, who guided me on this book, is a designer and author, and she created a toolkit
4:05 pm
with foundation funding that you can find for free online. it gives technologists a bunch of provocations about how to think about the problems that can happen with the technology they're building; it's a toolkit for thinking through things. introductory computer science courses at stanford use it, and we've talked about other ways in which people can leverage it. it's really exciting because there seem to be more questions we can ask. the long term is more of a challenge. at a simple level, we have to reinvest in critical thinking in public schools; we have to reinvest in media literacy in public schools. it's not fair to expect that this is a user fix and a people
4:06 pm
base problem, as we have been told, when half this country doesn't get critical thinking education until college, and a lot of us don't get there. it's time for us to create a robust system of public interest technologists. in the 1950s, a bunch of foundations came together in the united states to create public interest law, because if you were poor you could not get a lawyer. you might say to some extent that's still true today, and i would say you are right, but we have come a long way. we need the same thing for technology. we cannot always have the best and brightest going to google and facebook to make money because they think they will do something good; that's how they are sold on it. we need the best going to nonprofits and universities, we need people who understand technology to help build legislation, and we need people to understand how technology informs propaganda. a bill on bots that was put before the
4:07 pm
senate a year or two ago was laudable and also not feasible at all, because i don't think technologists had really looked at it; the bill was written in a way that did not understand that automation is structural to the web. over half of all internet traffic comes from automated accounts. that probably would not happen if we had public interest technologists. when we studied germany, we thought we would find a lot of people sharing disinformation, but we did not, and we were wondering why the heck that was. we knew the far right in germany had become very powerful, but germany has a really robust public media system. they have a robust program for
4:08 pm
critical thinking in public schools. after world war ii, given what germany experienced with the nazis, they made it illegal to promote white supremacy and similar things in public. in this country we are very afraid to think about regulating speech, and rightfully so in some ways, but we cannot always treat the first amendment as if it's at odds with the right to safety and the right to security. we have got to figure out a better way. i begin the introduction to this book with a quote from betty reid soskin, who is from the east bay, in her late 90s a park ranger at the rosie the riveter museum, and she's amazing. my wife took me to see her give a talk. i was reluctant to go, and i found myself in tears at the end, because she's one of the most inspirational speakers i've ever heard. she's so amazing. i recommend looking her up on
4:09 pm
youtube. at the end she lets you ask questions, and she says: every generation i know of has had to re-create democracy in its time, because democracy will never be fixed. it was not intended to be. it's a participatory form of government, and we all have the responsibility to form that more perfect union. [applause] >> now i think we can do q&a. i'm happy to manage it. don't be shy, any questions, anyone? >> i have a comment, then a question. first off, it's interesting how you brought in the historical component, like when you brought up the kgb in the 50s; we've seen this stuff before. but to me, when i think about emergence, i think that as you scale some of these things up you get more than just the sum of the parts.
4:10 pm
we do see emergent effects of some of this stuff at scale that do render it a new beast in a lot of ways. and i think about the detection stuff: for a few years we'll be able to do it for sure, five years maybe, ten years optimistically. but ultimately a lot of the techniques are asymmetric in favor of the attacker, and i think that is something really important that gets lost in the discussion. the other question i wanted to ask was about something you brought up in the book: a lot of the blame gets put on the people and the algorithms, and i totally agree about the libertarian thinking of those building the platforms and algorithms. but ultimately, to me, those systems are just answering to a board and to shareholders, and to me
4:11 pm
all of this is symptomatic of capitalism driving things at the base. i guess that gets to the heart of it. >> on the first point: we see these campaigns seeded to the public, and often the disinformation is hard to trace once it begins. when i was at jigsaw, it was interesting; it was an experience that really shook me. what they called it was seeding and fertilizing: you basically plant the seed, then fertilize it, and then you let other people do the work and spread the propaganda. it is difficult to track; there is a problem where you can't
4:12 pm
figure out where the snake's mouth begins and the tail ends. on the second question: yes, this is a problem of capitalism and the free market; there's no way of getting around it. when you prioritize for profit, you get systems that take advantage of people. not to out myself, but my master's work was steeped in critical theory, and i spent a lot of time thinking about these questions. i believe serious changes have to be made, but i don't think we have to throw the baby out with the bathwater. capitalism is what we have right now; it's the best system we have. but the quote gives me hope, because it's the idea that democracy can look different in different generations. i don't quite know what the answer is; i cannot fully answer your question. i wish i had the answer; then i could act like i know
4:13 pm
it all and probably just leave. but we can rebuild democracy in ways that interact with capitalism differently, and we need to speak the language of the market. i've been talking to people at facebook and google: maybe it'll be good business for you to actually do something beneficial for society. think about that. how do we market that as good business? other questions? no? that's great, short and sweet; that's the sweetest of all of them. thank you very much for having me. this has been great. [applause] >> here is a look at books being published this week. historian erik larson looks at
4:14 pm
winston churchill's leadership of britain during germany's attack on the country in world war ii. in "the nation city," a former chicago mayor argues that local government is the catalyst for innovation and national change. in "revolution," a former deputy national security advisor in the trump administration provides an inside look into the white house. also being published this week: a new york times columnist argues that america's success has led to national and personal discontentment in society. a former watergate prosecutor reflects on her legal career, which included the watergate case and serving as an assistant special prosecutor. in "breaking hate," christian picciolini, a former white supremacist, offers his thoughts on confronting hate speech and violence. a talk radio host argues that the politics of outrage is endangering democracy. and in "facebook," a wired writer recounts the history of the largest social media
4:15 pm
platform. look for these titles in bookstores this coming week, and watch for many of the authors in the near future on book tv on c-span2. and tonight on book tv in "primetime": a colonial historian looks at the revolutionary war; the host of the good ancestor podcast addresses white supremacy and racial injustice; a george mason university professor examines the possibility of secession from the union; and pulitzer prize-winning journalists look at issues facing the working class and rural america, arguing against focusing only on what is bad. it all begins at 6:45 p.m. eastern; check your program guide for more information.

