
tv   Meet the Press  NBC  January 2, 2023 2:00am-3:00am PST

2:00 am
>> this sunday, democracy disrupted. >> stop the steal! >> social-media platforms are reshaping our politics. >> we try to do what's best for our community and the world, acknowledging that there are difficult trade-offs. >> the algorithms of silicon valley are now some of the most powerful forces fighting for our attention. >> we approach our work without political bias, full stop. >> it's technology that critics say is fueling misinformation and polarizing content for clicks and profit. >> a safer, free-speech respecting, more enjoyable social media is possible. >> this morning, we're going to look at how the technology companies built platforms connecting the world that are now challenging the very foundations of democracies. >> i'm very scared about the upcoming generation.
2:01 am
>> frances haugen, a data scientist who became known as the facebook whistle-blower, joins me to discuss the solutions. plus... >> social media is out of control. >> big tech now faces that big tobacco jaw-dropping moment of truth. >> congress is concerned about how big tech controls what content we see. >> you have used this power to run amok. you have used it to silence conservatives. >> we do have national-security concerns, at least from the fbi's end, about tiktok. >> democratic senator amy klobuchar of minnesota and republican congressman mike gallagher of wisconsin will discuss what congress can do to regulate social media. joining me for insight and analysis are new york times technology reporter cecilia kang, former republican congressman carlos curbelo, former homeland security secretary jeh johnson, and elizabeth nolan brown, senior editor at reason. welcome to sunday and a special edition of "meet the press."
2:02 am
>> from nbc news in washington, the longest-running show in television history -- this is a special edition of "meet the press" with chuck todd. >> good sunday morning. happy new year. 2023 is here. this morning, we're going to take a deep dive into the social-media platforms that profit from grabbing onto and monetizing our attention to the tune of billions of dollars a year and with almost no regulation. a majority agree social media's influence on our democracy and our national security is a big problem. sixty-four percent of americans believe social media has been a bad thing for our democracy -- two-thirds of the country -- creating polarization and division and eroding civility in our politics. all of this according to a new pew survey. it's an attention economy whose business model depends simply on persuading you that you and your way of life is somehow under attack in order to buy your time and attention, as whistle-blowers from these companies have come to capitol hill to warn us about.
2:03 am
>> i'm reminded of one conversation with an executive when i said, "i am confident that we have a foreign agent," and their response was, "well, since we already have one, what does it matter if we have more? let's keep growing the office." >> and rather than address the serious issues raised by its own research, meta leadership chooses growing the company over keeping people safe. >> during my time at facebook, i came to realize a devastating truth. almost no one outside of facebook knows what happens inside of facebook. >> eighty-five percent of americans say social media makes it easier to manipulate people with misinformation. we've seen it from russian efforts to influence the presidential election, to qanon. in fact, one 2019 report tracking a dummy social-media account set up to represent an anonymous conservative mother in north carolina found that facebook's recommendation algorithms led her to qanon in less than a week. and then there's the thriving antivaccine groups that the president himself called out last year.
2:04 am
>> on covid misinformation, what's your message to platforms like facebook? >> they're killing people. >> facebook was used by members of myanmar's military in a systematic campaign as a tool for genocide, and social-media platforms from facebook to twitter poured gasoline on the fire of the capitol attack on january 6th. a whopping 79% of americans say the internet and social media have made americans more politically divided. growing shares of both republicans and democrats say members of the other party are more immoral, dishonest, and close-minded than other americans. perhaps it's because they only hear about the other party via social media and not normal interactions, like we used to have in the pre-social-media world. and social-media companies are profiting off of americans' anger online. starting in 2017, facebook's ranking algorithm treated angry emoji reactions as five times more valuable than likes. why? well, anger generates clicks, and clicks generate profit.
2:05 am
what's happening on social media is the equivalent of using the same pipes for your drinking water and the sewer system. >> the better you are at innovating a new way to be divisive, we will pay you in more likes, followers, and retweets. has partisanship in television and radio preexisted social media? yes. have we ever wired up the most powerful artificial intelligence in the world pointed at your brain stem to show you the most enraging content on a daily basis? and the longer you scroll, the more you get? we have never done that before. >> we are experimenting on brains, america, and the business has never been bigger. when pew began tracking social-media adoption in 2005, just 5% of american adults used at least one of these platforms. now the number is 72%. eighty-two percent of americans are on youtube. seventy percent are on facebook. and ready for this? four of these companies have more than a billion worldwide users. that's more than the population of every country in the world but two.
2:06 am
none of these companies has a financial incentive to change. social-networking sites in the united states brought in more than $72 billion last year. >> the reality is our country is deeply divided right now, and that isn't something that tech companies alone can fix. >> for us, it's much more important to sort of look at, like, the big ideas that might influence the way that tech evolves in the future, and, more importantly, to build a strategy that does not rely on government intervention for our success. >> twitter has become kind of the de facto town square. so, it's just really important that people have both the reality and the perception that they're able to speak freely within the bounds of the law. >> so, we invited meta, twitter, google, snap, and tiktok onto this broadcast to defend their practices and simply have a conversation about the future of their platforms and what can be done here. all of them declined. we did receive a statement from tiktok, and we got links to previously written blog posts
2:07 am
from the other companies. the last real legislation that spells out who is legally responsible for content on the internet was signed into law 27 years ago, the last century. in 1996, only a fifth of americans had ever booted up the world wide web. section 230, as it's known, says, "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." in other words, these companies are not to be held liable for harmful or inaccurate content a user posts on their sites and can't be sued. of course, the question is the minute they use an algorithm, do they actually become a publisher? this law was written before the algorithms had taken hold. now, in washington, section 230 is under more scrutiny than ever, with more than 30 bills proposed on social media during the last congress. but despite all those bills proposed, none of them have passed. twenty-four years ago, 46 states and big tobacco reached the largest settlement of civil-litigation claims in u.s. history,
2:08 am
and tobacco companies changed their marketing practices and paid states more than $200 billion in restitution. when we realize products are toxic for us, we pass laws to change them, or we hold companies accountable in a court of law to force the change. facebook whistle-blower and former data scientist frances haugen became one of the greatest sources this century when she turned over thousands of confidential company documents, sharing them with regulators, journalists, and with lawmakers. >> when we realized big tobacco was hiding the harms it caused, the government took action. when we figured out cars were safer with seat belts, the government took action. and when our government learned that opioids were taking lives, the government took action. i implore you to do the same here. >> and frances haugen joins me now. welcome to "meet the press." >> thank you. >> we want to start with why facebook is so afraid of any government intervention. and i say this because they have
2:09 am
helped kill -- all those bills i showed that none of them came into law? they -- and they're not alone -- but they had a lot of lobbyists kill those bills. what are they afraid of? >> when you look at the history of facebook's stock price -- and i did this before i came out -- over the course of the five years before the facebook disclosures began to become public, facebook stock only declined versus the nasdaq by more than 5% about 25 times, 27 times. overwhelmingly, those events, when the stock price declined, were when something came out that demonstrated facebook was going to have to spend more money on safety. facebook is scared that if we actually had transparency, if we actually had accountability, they would not be a company with 35% profit margins. they'd be a company with 15% profit margins. >> is that so bad? >> no. they'd be one of the most profitable companies in the world. >> fifteen percent profit margin is pretty good. >> it's amazing for a company this size. >> a 15% return in any savings would be amazing. >> i know. >> so, it's just simply it wouldn't be as profitable. >> it wouldn't be as profitable. >> so, you took this job in the civic integrity team.
2:10 am
>> mm-hmm. >> you had a specific motivation to do it. tell me about it. >> back in 2016, i had a very close friend who had helped me relearn to walk. so, i was very ill. i was paralyzed beneath my knees. and this person was originally an assistant who became a dear friend. and over the course of the middle of 2016, after bernie sanders lost the primaries, he fell down a rabbit hole where he became more and more radicalized. and watching him drift away, at the same time, i was working on the algorithms at pinterest. you know, i was the lead product manager for ranking at pinterest. and as i began to look at the internet, particularly at facebook, i would see these glaring deficits. they would have these things like a carousel under every post that would show you other posts, and you could tell that those posts were ranked based on clicks because they were always the most extreme version of whatever you saw. so, this is things like you'd click on an article about the election. the carousel shows you a post saying, "the pope endorses donald trump." >> right. >> and it turned out that
2:11 am
there was a whole macedonian misinformation factory going on. there was a cottage industry of these little blogs that would make these fake news stories. and facebook was just asleep at the wheel. and so, when i got offered a chance to work on civic misinformation, i thought back on that experience of watching these deficits, of watching this person who i really cared about spiral into a world of alternate facts. >> right. >> and i said, "this is my chance to do something." >> so, let's talk about an algorithm. let me put up a definition here, facebook algorithm... and what your disclosures found is how often they change the algorithm. >> mm-hmm. >> and to me, it shows you they know what's happening. like, they can do this, but... you were in civic integrity. did you feel as if they wanted you to succeed? >> when i initially was hired,
2:12 am
i came in with a lot of optimism. you know, facebook had built this center of excellence inside the company, actually one of the best civic-responsibility units that was available in the industry. and it wasn't until they dissolved that unit immediately after the 2020 election that i realized the company wasn't committed to this enterprise, that if you want to have successful change in an enterprise, you have to appoint a vanguard. you have to have executives say, "these people are the future. they're going to lead us in the right direction." and when facebook dissolved civic integrity, i saw that they weren't willing to make that commitment anymore. >> you said something in your statement to congress. you said you saw facebook "repeatedly encounter conflicts between its own profits and our safety. and facebook consistently resolved these conflicts in favor of its own profits." give me an example. >> one of the most effective things for reducing misinformation is a very simple intervention. it's actually free-speech respecting. it's if you look
2:13 am
at a chain of reshares -- so, alice writes something. her friend bob reshares it. her friend of friend carol reshares it. it lands in dan's newsfeed. alice doesn't know dan. dan doesn't know alice. alice could be a misinformation agent. you're outside of that social context. if you said, "hey, dan, if you want to share this, you can. but you're gonna have to copy and paste. you know, we're gonna gray-out that reshare button. you have to make a choice. you can't just like knee-jerk reshare this." >> they have to make a little bit of an effort. >> yeah. yeah. be intentional, intentionality in sharing. that has the same impact on misinformation as the entire third-party fact-checking system. and it doesn't choose which ideas are good or bad. it just says, "let's have humans make choices, not just reflexes make choices." but, at the same time, that reduces the amount of content spread in the system. it decreases profits very slightly. and facebook declined to make that choice. >> all their business model is about you and me spending time, right? there is no other business
2:14 am
it really has other than selling advertising based on how much time i spend on it. is that right? >> yeah. >> so, if this was taken away or held back, this would massively change the company. >> the way to think about safety on social-media platforms is there's lots of very small choices where you make them, and you lose 0.1% of profit, 0.2% of profit. the problem is these industries are so sensitive to growth that when they don't grow at the level that the market expects, their stock price crashes. and so they're afraid to take even these small actions because they will decrease the profitability of the company, even if by a little bit. >> i want you to react to nick clegg. he's done a lot of writing. >> ah, nick clegg. >> and this is one where it feels like he might as well have used the shrug emoji. >> [ laughs ] this seems to be the reaction of facebook on everything... >> oh, yeah. >> ...which is "hey, this isn't on us. this is society.
2:15 am
we're just the mirror." >> nick clegg wrote an amazing blog post in march of 2021. he said, "it takes two to tango. you know, you're blaming us for our algorithms, but you chose your friends. you chose your interests." and yet in march of 2021, they had already run the same study at least four times, where they took a blank account. so, this is an account that doesn't have any friends. it doesn't have any interests. and they followed some center-right, center-left issues. and then all they did was click on the content facebook gave or follow groups facebook suggested. and in two weeks they went from, you know, center topics, things like fox news, to white genocide just by clicking on the content. the algorithm pushed them in a more extreme direction. it's true there are many forces in society, but our information environment does have consequences. >> at least we compartmentalized radio. we could compartmentalize tv. you can turn that off. this has been much harder to compartmentalize. >> well, it's also a question of when the tv is on, if you tell a falsehood,
2:16 am
other people can see it and respond. on radio, everyone has the same airwaves. when it comes to social media, you can spread lies, and they're invisible. and facebook has resisted even minimal efforts at transparency that might allow us to begin to reconverge on a single-information environment. >> so, what should government regulation look like? >> i'm a big proponent of transparency as the first step. i think people aren't aware of how far behind we are. social-media companies for 20 years -- and remember there were social-media companies before facebook -- have all been very intentional except for maybe twitter. twitter has something called the fire hose, where you can see a stream of the tweets that are being posted. most social-media companies have resisted even simple efforts to bring more people to the table to ask questions and find solutions. so, things like the platform accountability and transparency act, which was recently proposed, i think is a great first step. >> when you say transparency, should the government have to approve an algorithm?
2:17 am
>> we are at such a basic level of understanding right now. like, i really want to emphasize this. this is like we're back in 1965. you know, we don't have seat-belt laws yet, and we're just opening the pages of "unsafe at any speed" and saying, "oh, my goodness, there's all these ways we could have safer platforms." we're at that level of, like, nascent understanding. and so, but we have to have transparency so we can have enough people have conversations about how we move forward. >> should government be focused on user protection, consumer protection more than trying to regulate the companies? >> that's a great question. other industries are kept safe because there is something i call the ecosystem of accountability. there's lawyers that understand what a cut corner looks like. there's investors that understand how to manage for long-term returns. remember, facebook's stock price is down, like, 70% right now. that's informed citizens like mothers against drunk driving. that's legislative aides that understand what's possible. right now, that entire ecosystem
2:18 am
is missing because the social-media companies hid the information. and so, when we talk about should we be protecting users, we are so far at the beginning that it is difficult to even put everyone at a table and say, "this is the menu of what's possible. let's negotiate what the floor looks like." >> i know you don't have as much insight into other tech companies, but should we assume that this opaqueness on algorithms and how things work is similar at twitter, at tiktok, and at youtube? >> a hundred percent. so, one of the most important things that elon musk could do to prove that he wants to have the public square is he could publish the algorithms. >> yeah. open source. >> yeah, open-source it. he'd have more help. it would be cheaper for him. he'd be more profitable. but companies like tiktok have the exact same problems, if not more so, because tiktok is a company that is designed around being censored. you know, it comes from china. it's designed to amplify things so much that only a few pieces of content make up
2:19 am
80% of all of our feeds. and they manually screen those. we deserve to know what those policies are because they're influencing what information we get to see. >> alright. well, i'm going to get a couple lawmakers on here and see what they have to say. frances haugen... >> thank you so much. >> ...the facebook whistle-blower, thanks for coming in and sharing your perspective. >> happy to be here. >> when we come back, what can congress do to regulate social media? democratic senator amy klobuchar of minnesota and republican congressman mike gallagher of wisconsin both have some ideas, and they're both here.
>> though lawmakers in washington have talked about a moment of truth for social-media companies, they seem to have lacked some urgency. each party is worried about a regulation hurting their side and benefiting the other. joining me now is democratic senator amy klobuchar of minnesota, who has introduced lots of legislation to attempt to regulate online political advertising, address problems in social-media algorithms, and to restore competition online by reining in big tech companies. and republican congressman mike gallagher of wisconsin, he recently introduced a bill to ban tiktok, which he calls digital fentanyl, due to its parent company's ties to the chinese government.
2:22 am
welcome to both of you. senator klobuchar, i want to start with you. i want to start with something just how powerful the social-media lobby is in this town. look -- i put up that list of legislation there. a week ago, you thought you had a bill that was at least going to -- it was designed to help journalistic organizations, both big and small, to get properly funded by facebook and all that, a similar law to australia. you thought it was a done deal, and it was gone in 24 hours. how powerful is this tech lobby? >> so powerful that you literally can have a bill that got through the judiciary committee with strong bipartisan support. you can get promises from leaders that it's going to be a major end-of-year bill. and then, within 24 hours, it's gone. it's vanished because one company, two companies, in this case facebook and google -- by the way, google made
2:23 am
$66 billion in one quarter in advertising while we are going to lose one-third of the nation's newspapers by 2025. we had such strong support for this bill. but these guys just make a few calls and they just say, "hey, you know, this is going to hurt us," just like they did in australia. the difference was in australia, their government stood up and said, "no, we're going to do this, and we're going to say, 'you've got to negotiate with these news organizations to get fair price for their content.'" and it happened. and they have a better system in place. right now in the united states of america, these companies have basically started dominating our thought processes. and i think the work frances has done is incredible. it is about going after these algorithms, making for them transparency. that's one of the bills we have. and it is about getting compensation for our news organizations. and then, finally, it's about getting rid of archaic law, section 230, that gives them immunity from lawsuits. >> i want to talk about this.
2:24 am
so, section 230, we'll put it up again here... this law was passed in the '90s. >> mm-hmm. >> this law was passed when it was message boards, and it was forums. i was somebody that used to go on college-sports forums. and yeah, there was some crazy stuff in there. it makes sense for that. this was pre-algorithm. this was pre, sort of pre, before the iphone. we didn't know what was coming. can this be amended rather than gotten rid of? >> yes, you can amend it and focus on certain kinds of speech -- misinformation, disinformation. and all you're saying is we know people are going to put stuff on your alleged "town square," which has become really a communications company. your network, other news organizations have limits in place and standards. an argument's going to be if you start making money off of it, if you amplify it, that's a whole different thing. your angry emojis and all these things you're doing is to make money.
2:25 am
>> if you've changed the newsfeed, your newsfeed versus my newsfeed, hasn't the company become a publisher? >> they are a publisher. and let's just start facing the facts and stop pretending they're some little company in a garage. maybe one day they were. but now they are mega companies, and this is starting to happen all over the world. we are lagging behind, and it is time for 2023. let it be our resolution that we finally pass one of these bills. we have gotten through the committee some of the first bills since the internet began to finally take them on. and so, it's not just that we've done nothing. we've gotten bipartisan agreements. we pushed these bills onto the senate floor. >> let me ask you about this issue of polarization. i mean, this seems to be what holds back a good bill in congress is that there are -- a democrat might think, "is this going to hurt our status online?" and vice versa on the republican side? >> it's actually even more insidious. when we had a bill that said you can't self-preference
2:26 am
your own products -- amazon at the top all the time over small businesses; support from small businesses all over the country for this -- they ran over $150 million and much more in ads all across the country. so, what that said to the members -- there were red-state ads, angry guys with pickup trucks. there were blue-state ads. they ran these ads, and it said to the members, "hey, if you start getting on a bill like this, or if you support it, we're going to come. this is kid's stuff compared to what we're going to do." >> it's almost like you're being extorted. >> this is how it's working. and so, it is only going to change. and i thought frances' work on kids would change it. not yet, with the children's privacy bills. it's only going to change if the people of this country say, "this is enough. this is corrupt. you got to do something. put some rules in place when it comes to social media." and they've got to be liable when you've got situations where literally deranged people are believing their stuff and going in and taking a hammer and bludgeoning the husband
2:27 am
of the speaker of the house or hundreds of thousands of blog posts that are allowed to go through with maps of the capitol that they used to create an insurrection. at some point, when they can't control their own platforms, while they're raking billions of dollars in from the american people, and over, as you point out, two-thirds of americans say it's hurting our democracy, come on, congress. stop hiding behind this and get something done. >> look -- let's be realistic. the tobacco companies changed after being hit with hefty lawsuits, more than government regulation. after the lawsuits came more government regulation. >> right. >> do you think if you open these companies up to lawsuits -- and, by the way, facebook is being sued overseas for its role in an ethiopian civil war and some other places. do you think their behavior would change? >> yes, because then in order to continue, they'd have to put safety measures in place instead of sending out sweet, little notes about all the good work they're doing. they would actually
2:28 am
have to do something. and so, that's why changing section 230, which was developed for a whole different moment in the internet, is an answer. the other is taking on monopolies so you can allow competitors to come into being that would have different bells and whistles on privacy and the like, regulating online political ads, which they're still escaping despite some work from the federal election commission. this is a bill i have with senator lindsey graham. there are many, many things we could do here, but we need floor time, we need votes, and people need to say where they are. are they going to side with these companies, or are they going to side with the people of this country? >> would you like to see a law that was similar to the eu's digital services act, which would target online ads that basically say the companies cannot use online ads targeting ethnicity, religion, or sexual orientation? >> i would like to see major work -- i'd want to look at that exact bill -- on ads. and this bill that we have requires disclosures, disclaimers so you know who's paying for them, which is a major problem.
2:29 am
but in this last election, and maybe this is going to help with my colleagues, over 30% of americans said the number-one reason they voted democratic, including a whole bunch of independents and moderate republicans, was democracy. and it was voter suppression, yes, but it was also about this kind of misinformation and just the fomenting of the lies on both sides sometimes that have caused people to do what they've done. and i just think it's a major issue. i'm not giving up. i'm not giving up. i'm going into 2023 ready to go. >> i was just going to say your passion comes through in a big way. senator amy klobuchar, thanks for coming on and sharing this and happy new year. >> thank you. >> let me bring in congressman mike gallagher. and i want to start with tiktok. you call it digital fentanyl. tiktok, which is owned by a company named bytedance, which is based in china, essentially reporting to and/or owned by the chinese government, however you want to look at it. but explain why you call it
2:30 am
digital fentanyl. >> well, it was fcc commissioner brendan carr who originally called it digital fentanyl. i think the comparison is apt for at least two reasons. one, it's highly addictive and destructive, and we're seeing troubling data about the corrosive impact of constant social-media use, particularly on young men and women here in america. it's also digital fentanyl in the sense that, as you alluded to, it ultimately goes back to the chinese communist party. tiktok is owned by bytedance. bytedance is effectively controlled by the ccp. so, we have to ask whether we want the ccp to control what is on the cusp of becoming the most powerful media company in america. that is very troubling. and so, i was glad to see my colleagues in the senate pass in unanimous fashion a ban of tiktok on government devices. i think we should do the same in the house and expand that ban nationally. >> look -- i want to put up tiktok's statement. they did give us one, and this is what they say.
2:31 am
but this is not about misinformation with tiktok. i want to get more at your concern, and i'm curious. are you more concerned about the chinese government having our data, which one might argue they already do, or are you more concerned about the fact that the algorithm we know nothing about, and they can turn it on or turn it off to say what they want to say in any moment to billions of users around the world? >> well, i'm concerned about a few things. i am concerned about the chinese government effectively compiling dossiers filled with our data. but you're right to suggest that predates tiktok, right? i remember getting a letter after the opm hack because my military records potentially would have been compromised. that gives them enormous leverage. for example, any time an american is operating in china
or if there's something our intelligence community needed to do, i'm concerned about that. i'm concerned about tiktok's ability to track your location, track your keystrokes, track what websites you're visiting, even when you're not using the app. i'm concerned about the lack of transparency around the algorithm, which is addicting kids. but i think what's most pernicious is the fact that since a lot of young men and women in america increasingly turn to tiktok to get news, what if they start censoring the news, right? what if they start tweaking the algorithm to determine what the ccp deems fit to print? that's incredibly dangerous. that's as if in 1958, we allowed the kgb and pravda to buy the new york times, the chicago tribune, and the washington post. that actually probably understates the threat. i think it's a multipronged threat we need to look at. >> is this something with tiktok that you think, "can they create an american version?" or do you think there's just no way to split this company up to protect americans?
>> i think one acceptable outcome -- and this would be allowed in the bill that i have, which is a bipartisan bill with my colleague raj krishnamoorthi. so, here you have not only a democrat and a republican working together, but a bears fan and a packers fan working together on this issue, chuck. it would allow for a sale to an american company. that option was explored during the trump administration. oracle explored a version of that, microsoft. ultimately it fell through. i think there is a workable solution there. what we don't want is some quasi-solution where there's a data center in singapore, but the ccp and bytedance effectively retains control. so, the devil is in the detail. but i'm open to having that discussion with tiktok, and i really want to have that discussion with the biden administration. i don't think this should be a partisan issue. i want to work with them. and i think the senate vote that we were talking about earlier is evidence that this isn't a partisan issue. >> what's your level of concern about the russian investment into telegram and the saudi government's investment into twitter?
>> well, i guess my broad concern, of which both of those are part, is where we see authoritarian governments exploiting technology in order to exert total control over their citizens. and that's really the concern with the ccp. they seem to be perfecting this model of techno-totalitarian control. it's most prominently and perversely expressed in xinjiang province, but they're exporting that throughout the rest of the country. they're using it to shut down the protests that we're seeing in china right now. and, ultimately, it's my belief that that's a model that will not stay in china. that's a model they're going to export around the world. you know, another thing we could do here, chuck, that our social-media companies could do, is insist on basic reciprocity. for example, chinese wolf-warrior diplomats, they're propagandists, are all over twitter and facebook, pushing propaganda, attacking america, when, at the same time, of course, american citizens aren't allowed access. i mean, chinese citizens aren't allowed access
to those apps in china. there's no level playing field. i think a simple rule, and i actually asked jack dorsey when he ran twitter to do this, if your government doesn't allow your citizens access to the platform, we're going to deny your government officials access to that same platform. i think that would be a useful step that would apply not only to china but also to russia. >> look -- you're not somebody that wants to see a lot more government regulation. but what is the best way to regulate social media? is it to make them -- is it to get rid of 230 and, you know, let the courts have at them, like, you know, no more special protections? is it a new law? is it algorithm transparency? what is, in your view, acceptable regulation? >> well, i do think transparency around the algorithms is necessary. and i liked what senator klobuchar was saying on that front. my only concern with the 230 repeal is that it might accidentally increase censorship on social media.
in other words, if these platforms are now liable for what people that use them say, would they not just kick people off proactively? so, i think a better framework might be to mandate data portability across platforms so you're able to bring your network to whatever platform has the best content-moderation policy and best transparency that you like, combined with something called "neutrality in the stack," where in contrast to twitter or facebook, which are private companies that can have different content-regulation strategies, amazon web services, for example, kind of the infrastructure of the internet, couldn't deny someone access to their services just because they don't like their political belief or what they think about this or that political issue. i think that's kind of the framework i have in mind, but i'm open to having that conversation. i listened to your interview with senator klobuchar closely, and i'd love to go over across the hill and talk to her about her ideas. >> just to put this -- is this a case of regulating the companies or protecting consumers? like, should the focus be more on consumer protection or more on trying to regulate
the companies? >> for me, it's the latter. it's consumer protection. and, you know, one thing that we don't really think about is, you know, these complicated user agreements that we all just click automatically. perhaps it's unreasonable for us to expect your average american citizen to read a 5,000-word... >> there's a great "south park" spoof about that. >> that's right. but we in congress should do a better job of understanding that and translating that and communicating it to the american public and to our constituents in a way that they understand. so, i think there are a variety of things we can do. when it comes to our kids, i think -- you know, listen. the government can't raise your kids, can't protect your kids for you. i have two young daughters. it's my responsibility to raise them into healthy adults. but there are certain sensible things we can do in order to create a healthier social-media ecosystem. >> well, one thing to do on those service agreements: we made credit-card companies cut down the amount of paragraphs they do. i think we could make social-media companies and tech companies do the same. congressman mike gallagher --
go, pack, go -- republican from wisconsin, appreciate you coming on and sharing your perspective. >> thank you. >> when we come back, social media is seen as mostly good for democracy across the globe, but not here in the united states. here, it's seen as much more destructive. we're going to go inside those numbers next.
. ♪♪ >> welcome back. "data download" time. social media's impacts are felt all around the world. but pew research center's new global attitudes survey shows that the united states is actually an outlier in how americans perceive social media's impacts on democracy. bottom line, americans are a lot more skeptical. social media's impact on democracy -- a bad thing? thirty-five percent around the world say it's a bad thing. but here in the united states, 64% say it's been a bad thing. how about whether it's made us more divided? this, there's a little more continuity -- 65% around the world, 79% here in america. are we now less civil? forty-six percent globally agree with that. sixty-nine percent believe that here in the united states. and how about social media -- does it make you more informed? globally, people think it does, 73% versus americans at 64%.
but despite our skepticism, as americans, we're using this more and more. look at this. in 2012, about half of americans were using social media. now essentially three-quarters. and look at this, by age group. 18 to 30 -- this won't surprise. i'd like to know who the 16% who don't use social media are, but 84% of folks under 30, 81% 30 to 50, and even 60% of those folks 50-plus. so, we may not like it, but we're becoming more addicted to it. up next, social media has already changed us and our politics. so, what is the best path forward? our panel is here.
. >> back now with a terrific panel. new york times technology reporter cecilia kang, coauthor of "an ugly truth: inside facebook's battle for domination," former homeland security secretary jeh johnson, former republican congressman carlos curbelo, and senior editor at reason magazine, elizabeth nolan brown. welcome to all of you. cecilia, i want to start with you because you've written this book on facebook. it does -- kara swisher has a sort of a take on facebook and even frances haugen. mark zuckerberg didn't necessarily know what he was creating when he started creating it. and that's what it feels like with all these social-media companies. they were started with sort of some good intentions, and it got out of -- they lost control. >> yeah, i mean, there's two guiding forces. they wanted companies that would grow and grow fast and grow big and scale to the point where they're global and they would have historic and lasting impact. and the second thing is, we have to remember, these are businesses. these are companies that are motivated by profits.
and the business model is built on the idea of getting eyeballs and attention to serve up to advertisers and to make money that way. and that -- i think if you miss those two important points, you sort of miss what's happening here. >> but they never -- what i want to get at is, they never thought, "hey, we're going to be trafficking misinformation." >> they never did. >> that seemed to shock them. >> yeah. and i think a lot of these companies are built oftentimes by actually sort of young, male, idealistic entrepreneurs in silicon valley with big dreams. and they got lots of funding. and their idealism is what really attracts the funding and attracts a lot of interest by engineers who want to work there because they do want to change the world. but they're not motivated to look around the corners for potential problems because their eye is always on growth, and their eyes have always been on really growing that business model. >> elizabeth, would you argue, is the problem the companies, or is the problem us? >> i think that the problem, a lot of things that get blamed
on social media are just human problems, and social media just makes them more visible, yes. >> more visible, but do you accept the idea that social media may ramp it up, accelerate these issues? >> i think that social media -- i mean, things that people don't think about algorithms is that they actually help suppress a lot of the bad content. without algorithms, we would be seeing a lot more hate speech. we'd be seeing a lot more offensive content. i think that they actually do a lot of good for your average person online. >> jeh, when you were at homeland security, you were just beginning to deal with this misinformation issue. how did you tackle it then? and, looking back, how would you be tackling it now? >> looking back at the 2016 election, and i should mention that as a lawyer, i have clients. >> yes. you work for a firm that does, paul, weiss, that does represent some of these social-media companies we're talking about. >> made that disclosure. >> yes. >> this was the trojan horse of 2016. 2016, i, as secretary
of homeland security, was very focused on potential cyberattacks on election infrastructure. and just before i left office, i declared the election infrastructure to be critical infrastructure. we were worried about, you know, ballot counting, reporting, and so forth. turned out that was not the issue. hasn't been the issue. wasn't the issue in 2020. wasn't the issue in 2022. the trojan horse was the extent to which the russian government invaded our american conversation, and it's spelled out in the mueller indictment. but this is an issue that we've really yet to get our arms around because it does implicate free speech. >> carlos, i feel like your national political career sort of encompassed this moment where we went from social media good to social media problematic. is that fair? >> yeah, and i think, chuck, one of the big problems, especially on the right, is that social media has sown a lot of distrust because the right feels under attack by big institutions.
obviously, these big companies represent mainstream america. and because we don't really know exactly how these companies operate, it breeds a lot of conspiracy theories. it makes people paranoid. so, one of the reasons elon musk has become so popular on the right is because this concept that he's unveiling -- right? -- raising the curtain on everything twitter did throughout all these years. so, yes, it has hurt our democracy. why? because it has diminished trust in society. >> go ahead. >> social media, in my view, accentuates the point that our greatest strength as a free and open society is also our greatest vulnerability. it's no coincidence that at the same time as the rise of social media, more people are participating in the american political process. 1992, 105 million people voted. >> it's an important point. i mean, our voter turnout in the last decade has been going up, up, up and up. >> sixty-six percent. >> one might argue that voters in this country start voting when they're worried that
the democracy is -- >> well, there is that, too, yes. and you had this person called trump that got everybody's attention. but through social media, social media raises political awareness. you can mobilize a movement behind social media. look at how effective president zelensky in ukraine has been using social media. but this is also our vulnerability, too, obviously, the echo chamber, fake news. >> elizabeth, i want to start with you. if we're going to regulate this, should we come at it consumer first or company first? >> i think we definitely need to consider consumers first. i don't think that regulating it just for the sake of keeping these companies being big is a good thing. you know, we already see facebook, twitter, their power is diminishing. like, new things are coming along, tiktok. mastodon was getting thousands of users an hour recently. i think that the market will take care of unseating these dominant companies if we don't overregulate, because when there's too much regulation, only facebook can keep up with it, and it entrenches facebook's power. >> well, it is -- look.
none of these companies want regulation. i mean, i was talking about it. i mean, it is astonishing how fast facebook killed that bill. it was a done deal in the defense-authorization bill. it was done, and then it got stripped out. she didn't know how it happened. >> yes. you know, i've seen dozens of bills be proposed on regulating technology companies, and none of them have passed. and i really don't think that regulation is where we're going to see accountability first. i agree with elizabeth. i think consumers will vote with their feet. we're already seeing facebook see that users are not using the site as much and not visiting as often. and we're seeing them go to -- and innovation is winning in that other sites, be they good or bad, are attracting new users, like tiktok. the other way we're going to see real accountability is probably in the legal system. we're probably going to see lawsuits. and we saw that. >> so, you would like to see 230 gone or amended so that you saw more lawsuits? >> well, i think regulating speech is going to be hard, in general. i think getting rid of 230
is going to be very difficult. i think republicans and democrats agree that it needs to be revised in some ways, but for very different reasons. they come at it from completely polar-opposite reasons. but as far as 230 goes and speech, i think what you're going to see is more individual lawsuits. you saw, for example, the lawsuits against alex jones by sandy hook families. you saw dominion voting systems. the company had to sue fox news -- >> and this is not holding the company, the tech companies. it's going after the individual that used the tech platform. >> and so i think users -- that will probably create bottlenecks potentially in the spread of disinformation, the creation and the spread of disinformation. >> algorithm transparency -- it's easy to say. i don't know what that would look like. do you know what that would look like? >> well, but that's the key because this is the mystery. how is information getting boosted? how is information getting suppressed? that's what makes people paranoid. >> i'd like to make the choice. you and i were talking about this earlier. i want to make the choice. hey, i want some use of the
algorithm, but i want to know. i want to pick and choose when i let you decide, a.i. computer. >> it's great when they let you choose to see things in chronological order still. i wish all tech companies would do that. but, again, i think that's something that users need to demand. it can't come from a top-down mandate. >> last point. >> you asked frances should the government approve algorithms? i think that comes perilously close to regulating speech. >> yeah. >> very interesting exercise -- if you put into your phone a well-known right-wing commentator and the words "great replacement" you get a site from the anti-defamation league arguing that he should be deplatformed. >> an interesting way you can come to that. this was a terrific panel. thank you. and when we come back, a special "meet the press" minute looking back at the first time this program looked into this emerging technology.
. >> welcome back. before there was facebook, twitter, or tiktok, blogging was the earliest way for people to share content and build communities online. but in 2004, blogs were still a novelty in the world of campaign politics, at least here in washington. in my first-ever "meet the press" appearance, tim russert asked me to explain how this new social media was being used by presidential candidates that cycle. >> and here to help us is chuck todd of national journal's hotline.
what is a blog? >> well, blog -- so the actual term itself, by the way, is short for "web log." and it's, you know, you drop the "w," and you get the blog. i'll just describe what howard dean's blog is, since it's the one that has the most traction and the most attention. it's essentially like a digital bulletin board saying, "hey, look. this is what we're up to today. this is our message today. these are some of the things we're doing today." and then it allows a section to comment about what's going on during the day. and this is where you find out who the bloggers are. yes, i used to have quite a big head of hair. that's all for today. thanks for watching. happy new year. we'll be back next week because if it's sunday, even in 2023, it's "meet the press."

