
Snapchat, TikTok, YouTube Executives Testify on Kids Online Safety - CSPAN - December 2, 2021, 4:22pm-7:32pm EST

4:22 pm
>> up next, a hearing on the effects social media platforms have on children. executives from snapchat, tiktok and youtube testify before a senate subcommittee, addressing issues from deadly viral challenges to eating disorders. this is three and a half hours.
4:23 pm
>> welcome to this hearing on protecting kids on social media. i want to thank the ranking member, senator blackburn, who has been a very, very close partner in this work, as well as chairman cantwell and ranking member wicker for their support. we are joined by senators amy klobuchar and ed markey. they've all been extraordinary leaders in this effort, and today
4:24 pm
is the first time that tiktok and snap have appeared before congress. i appreciate your and youtube's testimony this morning. it means a lot. our hearing with the facebook whistle-blower frances haugen was a searing indictment of a gigantic corporation that puts profits ahead of people, especially our children. there has been a definite and deafening drumbeat of continuing disclosures about facebook. they have deepened america's concern and outrage and have led to increasing calls for accountability, and there will be accountability. this time it's different.
4:25 pm
accountability to parents and to congress, accountability to investors and shareholders, and accountability to the securities and exchange commission and other federal agencies, because there is ample, credible evidence to start an investigation there. but today we are concerned about continuing to educate ourselves about how we can face this crisis. what we learned from ms. haugen's disclosures and the reporting since then is absolutely repugnant and abhorrent: instagram's algorithms create, as facebook's own researchers said, a downward spiral that is harmful to teens, that fuels hate and violence, and that prioritizes profits over the people it hurts. the algorithm
4:26 pm
amplifies depression, anger, hate and anxiety because those emotions attract and hook kids and others to their platforms. in effect, more content, and more extreme versions of it, is pushed to children who express an interest in online bullying, eating disorders, self-harm and even suicide. that's why we now have a drumbeat of demands for
4:27 pm
accountability to match the drumbeat of disclosures. but we're hearing the same stories and reports of the same harms about the platforms that are represented here today. i've heard from countless parents and medical professionals in connecticut and elsewhere around the country about the same harms on snapchat, youtube and tiktok. in effect, the business model is the same: more eyeballs means more dollars, and everything that you do is to add users, especially kids, and
4:28 pm
keep them on your apps for longer. i understand from your testimony that your defense is: we're not facebook. we're different, and we're different from each other. being different from facebook is not a defense. that bar is in the gutter. what we want is not a race to the bottom but a race to the top, and we want to know from you what specific steps you're taking to protect children. we want to know what research you have done, similar to the studies and data that have been disclosed about facebook, and we want to know whether you will support real reforms, not just the tweaks and minor changes that you recounted in your testimony. the picture that we've seen, an endless stream of videos automatically selected by sophisticated algorithms, shows that you, too, subscribe to the same model: find something that teens
4:29 pm
will like and then drive more of it to them. if you learn that a teen feels insecure about his or her body, it's a recipe for disaster, because what starts out as dieting tips will have the algorithm raise the temperature and flood that individual with more and more extreme messages, and after a while all of the videos are about eating disorders. it's the same rabbit hole on every platform, rabbit holes created by algorithms that lead kids to dark places and encourage more destructive
4:30 pm
violence. we've done some listening. i've heard from nora in westport, who allowed her 11-year-old daughter avery on tiktok because she thought it was just girls dancing. avery wanted to exercise more with the shutdown of school and sports, so like most people she went online. nora told me about the rabbit hole that tiktok and youtube pulled her daughter into. she began to see ever more extreme videos about weight loss. avery started exercising compulsively and ate only one meal a day. her body weight dropped dangerously low, and she was diagnosed with anorexia. avery is now luckily in treatment, but the financial cost of care is an extreme burden, and her education has suffered. we heard from parents who had the same experience. so we not only listened, we checked ourselves. on youtube, my office created an account as a teenager, like avery. we watched a few videos about
4:31 pm
extreme dieting and eating disorders. they were easy to find. youtube's recommendations began to promote extreme dieting and eating disorder videos each time we opened the app. all too often, the videos were of teens starving themselves, as you can see from this poster. the red is eating disorder related content. the green is all of the other videos; one chart is before and the other is after the algorithm took hold. you can see the difference. we also received these recommendations each time we watched other videos, mostly with eating disorder content. there was no way out of this rabbit hole. another parent in connecticut wrote to me about how their son was fed a constant stream of videos on tiktok related to eating disorders and calorie counting after looking up athletic training. as scientific research has
4:32 pm
shown, eating disorders and body comparison also significantly affect young men, and on social media, young men often feel compelled to bulk up to look a certain way. again, i heard about this all too often. i created an account on tiktok. troublingly, it was easy to go from men's fitness to steroid use. it took us only one minute, one minute, to find tiktok accounts openly promoting and selling illegal steroids. we all know the dangers of steroids. the research and disclosures send a message to america: you cannot trust big tech with
4:33 pm
your kids. america cannot trust these apps with their children. and big tech cannot say to parents, you must be the gatekeepers, you must be the social media co-pilots. parents should not have to bear that burden alone. we need real transparency and real accountability, and i want a market where the competition is to protect children, not to exploit them, a competition for the top. we have said that this moment is
4:34 pm
for big tech, a big tobacco moment, and i think there's a lot of truth to that contention, because it is a moment of reckoning. the fact is that despite knowing its products can be harmful, facebook sought to associate itself with celebrities, fashion, beauty and everything that appeals to a young audience. and like big tobacco, facebook hid from parents substantial evidence that instagram could have a negative effect on teens. but the products are different. big tech is not irredeemably bad like big tobacco, whose products harm even when used the way the manufacturer intended. our
4:35 pm
goal is not to burn facebook to the ground. it's to bring out the best and to impose accountability. as ms. haugen said, we can have social media we enjoy, that connects us, without tearing apart our democracy, putting our children in danger and sowing ethnic violence around the world. we can do better, and i agree. thank you, and we'll turn now to the ranking member. >> thank you, mr. chairman, and thank you to our witnesses for being here today. we do appreciate your presence. mr. chairman, i thank your staff for the work that they have done to facilitate these
4:36 pm
hearings on holding big tech accountable. we appreciate that. today's conversation is something that is much needed, and it is long overdue. for too long, we have allowed platforms to promote and glorify dangerous content to kid and teen users. in the weeks leading up to this hearing, i've heard from parents, teachers and mental health professionals who are all wondering the same thing: how long are we going to let this continue? and what will it take for platforms to finally crack down on the viral challenges, the illicit drugs, the eating disorder content and the child sexual abuse material? we find this on your platforms, and teachers, parents and mental health professionals cannot figure out why you allow it to happen. it seems like every day i hear stories about kids and teens who are suffering after interacting with tiktok, youtube and snapchat. kids as young as 9 have died
4:37 pm
doing viral challenges on tiktok, and we've seen teen girls lured into inappropriate sexual relationships with predators on snapchat. parents ask, how can you allow this? how can you allow this? i've learned about kids and teens who committed suicide because of what they suffered on these sites, and about the platforms' refusal to work with law enforcement and families to stop the harassment when asked. if it were your child, what would you do to protect your child?
4:38 pm
does it not matter to you? my staff has viewed abusive content featuring minors and videos of people slitting their wrists on youtube. it's there. yet all the while, kids and teens are flocking to these sites in increasing numbers, and the platforms love it, as they know that youth are a captive audience, one which will continue to consume the content that is fed to them through these algorithms even if it puts them in danger. they're curious. they get pulled down the rabbit hole. they continue to watch, and these platforms are getting more and more data on our children. but do we know what they're doing with that data? in the case of facebook, we know they're using it to sell their product to younger and younger children, those who cannot legally use their services. you all know you have children on your platforms who are too young to be there, and you allow it to
4:39 pm
continue because it's money. money in the bank. it is money in your paycheck, and obviously, money trumps everything. with some of our witnesses today, we have real reason to question how they're collecting and using the data that they get from american children and teens. i've made no secret of my concern that tiktok, which is owned by bytedance, is paving the way for the chinese government to gain unfettered access to our children and teens on tiktok. their vague assurance that they, and i'm quoting, store data outside, end quote, has not alleviated my concerns in the slightest. in fact, earlier this year they changed their privacy policy to allow themselves to collect even more data on americans. now they want your faceprints and voiceprints in addition to your data and your keystroke
4:40 pm
patterns and rhythms. they also collect audio that comes from the devices connected to your smartphone, like smart speakers. they collect face and body attributes from videos, as well as the objects and the scenery that appear in those videos. this might make sense to some degree for creating videos, but china uses this kind of technology to surveil its own citizens. why should we assume that tiktok isn't doing the same to us? most americans have no idea this is happening, and while they hope, on the face of things, to reassure us by creating u.s.-based offices for their high-priced lobbyists and marketing personnel, it just is
4:41 pm
not enough. we see through it, and as we get through today's hearing, tiktok's own policies give them an out to share data with the chinese communist party. yet most americans have absolutely no idea that the chinese communist party is getting their information. the time has come to secure american consumers' private data. as a mother and a grandmother, i know this is doubly important when it comes to our children. as this hearing will show, we can't afford to wait on that. so thank you to the witnesses. we look forward to your cooperation. thank you, mr. chairman. >> thanks, senator blackburn. jennifer stout is the vice president of global public policy at snapchat. she
4:42 pm
spent most of her career in government, working first for then-senator joe biden and later at the department of state. michael beckerman is the vice president and head of public policy at tiktok. he joined tiktok in february 2020 and leads its government relations, and he was the founding president and ceo of the internet association. we are also joined by leslie miller, vice president of public policy at youtube, who leads public policy for youtube at google. why don't we begin with your testimony? thank you. >> thank you, mr. chairman.
4:43 pm
>> chairman blumenthal, ranking member blackburn and members of the subcommittee, thank you so much for the opportunity to be here today. my name is jennifer stout, and i am the vice president of global public policy at snap. i've been in this role for nearly five years, after spending almost two decades in public service, more than half in congress. i have tremendous respect for this institution and the work that you are doing to ensure that young people have safe and healthy online experiences. to understand snap's approach to protecting young people on our platform, it is helpful to start at the beginning. snapchat's founders were part of the first generation to grow up with social media. like many of their peers, they saw that while social media was capable of making a positive impact, it also had certain features that troubled them. these platforms encouraged people to broadcast their thoughts permanently. young people were constantly
4:44 pm
measuring themselves by likes and by comments, trying to present a perfect version of themselves because of social pressures and judgments. social media also evolved to feature an endless feed of unvetted content, exposing individuals to a flood of viral, misleading and harmful information. snapchat is different. snapchat was built as an antidote to social media. from the start, there were three key ways we prioritized privacy and safety. first, we decided to have snapchat open to a camera instead of a feed of content. this created a blank canvas for friends to communicate with each other in a way that was more immersive than text. second, we embraced strong privacy principles and the idea of ephemerality, so that there isn't a permanent record of conversations online; in real
4:45 pm
life, friends don't walk around with a tape recorder or document every conversation. third, we focused on connecting people who were already friends in real life by requiring that both snapchatters opt in in order to be friends and communicate, because in real life, friendships are mutual. we have worked hard to keep evolving responsibly. understanding the potential negative effects of social media, we made proactive choices to ensure that all of our future products reflected those early values. we were influenced by the existing regulatory frameworks that govern broadcast and telecommunications when developing the parts of our app where users have the potential to reach a large audience: discover, which is our closed
4:46 pm
content platform and features content from professional media publishers and verified users, and spotlight, where creators can submit creative and entertaining videos to share in order to get reach. our design protects our audience and makes us different. we have adopted responsible design principles and rigorous processes that consider the privacy and safety of new features right from the beginning. we take into account the unique sensitivities of young people. we intentionally make it harder for strangers to find minors by not allowing public profiles for users under 18. we've long deployed age-gating tools to prevent minors from viewing age-regulated ads. we make no effort and have no plans to market to young children. individuals under the age of 13 are not permitted to create snapchat accounts, and if we find them, we remove them. additionally, we are
4:47 pm
developing new tools that will give parents oversight over how their teens are using snapchat. protecting the well-being of our community is something we approach with both humility and determination. over 500 million people around the world use snapchat, and 95% of our community says that snapchat makes them happy because it connects them to their friends. we have a moral responsibility to take into account the best interests of our users in everything we do, and we understand that there is more work to be done. as we look to the future, we believe regulation is necessary, but given the speed at which technology develops and the rate at which regulation can be implemented, regulation alone
4:48 pm
can't get the job done. technology companies must take responsibility to protect the communities they serve. if they don't, government must act to hold them accountable. we fully support the subcommittee's approach to investigating these issues, and we welcome a collaborative approach that keeps our young people safe online. thank you again for the opportunity to appear today, and i look forward to answering your questions. >> thanks, ms. stout. mr. beckerman? >> chairman blumenthal, members of the subcommittee, my name is michael beckerman, and i lead public policy for tiktok. i am also the father of two young daughters, and i am passionate about our children staying safe online. i joined tiktok after a decade representing the internet industry at large because i saw an opportunity to help tiktok responsibly grow as a platform. tiktok is not a social platform, and not an app where you watch what friends are doing. you create on tiktok. the passion, creativity and community on tiktok have sparked new cultural
4:49 pm
trends, chart-topping artists and businesses across the country. it has been a bright spot for families who create together, and even members of the senate and your staffs know how joyful, fun and entertaining tiktok content really is. i'm proud of the work that our safety teams do every single day, and that our leadership makes safety and wellness a priority, particularly when it comes to protecting teens. in the context of today's hearing, we seek feedback from experts and stakeholders to constantly improve. when we find areas or flaws where we can do better, we hold ourselves accountable and find solutions rather than turning a blind eye to areas where we can improve. most importantly, we strive to do the right thing by protecting people on the platform. when it comes to protecting
4:50 pm
minors, we work to create age-appropriate experiences for teens throughout their development. we have proactively built in privacy and safety protections with teens in mind. for example, people under 16 have their accounts set to private automatically. they can't host live streams, and they can't send direct messages on our platform. and we don't allow anyone to send videos, images or links via direct messaging. these are perhaps under-appreciated product choices that go a long way to protect teens. and we made these decisions, which are counter to industry norms and our own short-term growth interests, because we're committed to doing what's right and building for the long term. we support parents in their important role of protecting teens. that's why we built parental controls that empower a parent to link their tiktok account, in a simple way from their own device, to their teenager's account to enable a
4:51 pm
range of privacy and safety controls. and i encourage all the parents who are listening to the hearing today to take an active role in your teen's phone and app use. if they're on tiktok, please check out family pairing and visit our youth portal. our tools for parents are industry-leading and innovative, but we're always looking to add and improve. it's important to note that our work is not done in a vacuum. it's critical for platforms, experts and governments to collaborate on solutions that protect the safety and well-being of teens. that's why we partner with common sense networks. we also work closely with the national center for missing and exploited children, the national parent teacher association, the digital wellness lab and our u.s. content advisory council. tiktok has made tremendous strides to promote the safety and well-being of teens, but we also acknowledge and we
4:52 pm
are transparent about the room that we have to grow and improve. for example, we're investing in new ways for our community to enjoy content based on age appropriateness or family comfort, and we're developing more features that empower people to shape and customize their experience in the app. but there is no finish line when it comes to protecting children and teens. the challenges are complex, but we are determined to work hard to keep the platform safe and create age-appropriate experiences. we know trust must be earned, and we're seeking to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve. thank you for your leadership on these important issues. i look forward to answering the committee's questions. >> thanks. ms. miller, i hope you're with us. please proceed. >> sorry. i think i'm having a bit of technical difficulty.
4:53 pm
can you hear me okay? >> we can hear you, and now we can see you. thanks. >> okay, wonderful. chairman blumenthal, ranking member blackburn and distinguished members of the subcommittee, thank you for the opportunity to appear before you today. my name is leslie miller, and i'm the vice president of public policy at youtube. as more young people spend more time online, and given their changing needs as they grow up, it's crucial to put in place protections that allow them age-appropriate access to information. we do this by investing in the partnerships, technologies and policies that create safer environments, that allow children to express their imagination and curiosity, and that empower families to create the right experiences for their children. our teams work closely with the product teams to ensure product design reflects an understanding
4:54 pm
of children's unique needs and abilities and how they evolve over time. the advice of trusted experts informs our ongoing improvements to youtube kids and our child safety policies. our child safety policies, which i describe in greater detail in my submitted testimony, prohibit content that exploits or endangers minors on youtube. using a combination of machine learning and human reviewers across the globe, we commit significant time and resources to removing this harmful content as quickly as possible. between april and june of this year, we removed nearly 1.8 million videos for violations of our child safety policies, of which about 85% were removed before they had ten views. we're constantly working to improve our safeguards. we have also invested significant resources to empower parents with greater control over how their children view
4:55 pm
content on youtube. in 2015 we created youtube kids as a way for kids to more safely pursue their curiosity and explore their interests while providing parents more tools to control and customize the experience for their families. videos on youtube kids include popular children's videos, diverse new content and content from trusted partners. after we launched youtube kids, we heard from parents that tweens have different needs which weren't being fully met by our products. so we developed a solution for these parents, which we call supervised experiences. we launched this earlier this year on the main youtube platform. parents now have the option of choosing among three different content settings: content generally suitable for
4:56 pm
viewers age 9 plus, age 13 plus, or most of youtube. the most of youtube option excludes all age-restricted content on the main platform. we want to give parents the controls that allow them to make the right choices for their children. on youtube kids, and on youtube for all users under 18, autoplay is off by default. in the coming months we'll be launching additional parental controls in the youtube kids app, including the ability for a parent to choose a locked default autoplay setting. our take a break reminders and bedtime settings are also on by default in these experiences. youtube treats personal information from anyone watching children's content on the platform as coming from a child, regardless of the age of the user. this means that on videos classified as made for kids, we limit data collection and use, and as a result we restrict or
4:57 pm
disable some product features. for example, we do not serve personalized ads on this content on our main youtube platform, and we do not support features such as comments or live chat. to be clear, we've never allowed personalized advertising on youtube kids or youtube supervised experiences. there is no issue more important than the safety and well-being of our kids online. i would again like to thank you for the opportunity to appear today, and i look forward to answering your questions. >> thanks, ms. miller. we're going to do five-minute rounds. we have votes at 11:00. we have three votes, but i think we'll be able to juggle the questioning and the testimony, and if necessary we'll take a short recess. let me begin. as you may know, in august i
4:58 pm
wrote to facebook asking whether they had done any research, whether they had any facts that showed harm to children. in effect, they denied it. they dodged the question. they disclaimed that instagram is harmful to teens. let me ask you, and i think it's pretty much a yes or no question, the same question that we asked facebook: have any of your companies conducted research on whether your apps can have a negative effect on children's or teens' mental health or well-being, and whether your apps can produce addiction-like use? have you done that research? >> senator, we have conducted research. much of our research is focused on our products and how we can improve our products and services to really meet the needs of our users and our
4:59 pm
community. and as i mentioned in my opening testimony, some of the research that we did shows that 95% of users say that snapchat makes them happy. >> would you make that research available? >> yes, senator, we would. >> we believe that research should be done in a transparent way, and for us, we partner with external experts and stakeholders to get their feedback. additionally, we've actually supported passage of the camra act, which would provide additional funding at nih, and we'd love to see this done in a transparent way. >> now, i've asked whether the research has been done that could show negative effects or addictive-like impacts. you've all indicated that you've
5:00 pm
done that research. and ms. stout has indicated her company will make it available. i'm assuming, mr. beckerman, your company will. you're nodding. and ms. miller? >> yes, senator, we have published some research, and we would make additional research available. >> let me ask about the black box algorithms. as you know, these algorithms exist. they function to drive toxic content at kids, more of it and more extreme versions of it. the consequences are potentially catastrophic. but the companies are in effect grading their own homework. they're evaluating their own effects on kids when it comes to addiction and harms.
5:01 pm
will you provide external independent researchers with access to algorithms, data sets and data privacy practices? in other words, if an academic researcher in child psychology comes to you and wants to determine whether one of your products harms teen mental health or causes addiction, will they get access to raw data from you without interference? ms. stout? >> so, senator, i think it's important to remember that on snapchat very little of our content is sorted algorithmically. >> i'm going to interrupt just to say the question is about access for independent research on those algorithms that you do use, and there's no question that you have algorithms, correct? >> correct, senator. we do have algorithms, but they just operate differently. to your question whether we've
5:02 pm
had any requests from outside researchers or mental health specialists to come and access that, to my knowledge we have not, senator. >> but you would provide access to it? >> yes. it's important to understand that the algorithms for us just operate differently, so to compare them against different platforms is -- >> that's one of the facts an independent or external researcher could verify. >> indeed. >> whether they're different and how they're different. >> yes, senator, we believe transparency for the algorithm is incredibly important. we're one of the first companies to publish publicly how the algorithm works, and we invite you and your staff to come in and see exactly how the algorithm works. in your feed on tiktok, if you're not interested in a video, you can indicate that.
5:03 pm
>> external access, okay. ms. miller? >> senator, we're very transparent as it relates to the way in which our machine learning works. for example, our quarterly transparency report summarizes the videos and channels we removed based on violations of our community guidelines. earlier this year we rolled out an additional statistic. >> the question was whether you provide external independent researchers with access to your algorithms and data sets. >> i'm sorry, senator? >> do you provide that access? >> we regularly partner with experts, for example, in child development and mental health -- >> if somebody independent came to you and wanted that access,
5:04 pm
yes or no, would you permit it? >> senator, it would depend on the details, but we're always looking to partner with experts in these fields. >> i'm going to cite the difference between your response and mr. beckerman's and ms. stout's, which indicates certainly a strong hesitancy if not resistance to providing access. let me ask you, ms. miller, i think one of the issues here really relates to the claim that these sites are transparent and truthful, which is belied by our actual experience, and the claim that they favor regulation. in the case of facebook, they mounted armies of lawyers and spent millions of dollars to fight regulation, whether it's section 230 or
5:05 pm
privacy legislation or requirements to be more transparent about algorithms. according to details in a multi-state antitrust case, google has, quote, sought a coordinated effort to forestall and diminish child protections, end quote, through the ftc and legislation. that filing described attempts to push facebook and microsoft to fight privacy rules and to back down on advocacy for legislation, including a particular meeting where that exchange occurred. this disclosure made news, but everybody in d.c. really knew it was true. what was new was that google's hypocrisy was finally called out. the fact is that google and youtube have been fighting against privacy behind the scenes for years.
5:06 pm
it's hidden in plain sight. it's an open secret. you spent vast sums of money fighting california's privacy rules. i want to ask you, ms. miller, what work has youtube done to lobby against congress strengthening online protections for children, and is that report and that claim by the multi-state plaintiffs accurate? >> senator, i understand that the material you're referencing was regarding our point of view on e-privacy legislation in europe. our ceo has regularly called for comprehensive privacy legislation in the u.s., and on behalf of youtube, i am not aware of any efforts other than being involved in conversations in a
5:07 pm
multi-stakeholder way as it relates to any legislation or bills that are being introduced regarding the oversight or regulation of companies such as youtube. >> so you are saying that these reports about political pressure and lobbying against children's privacy and safeguards are just totally false? >> i think we work with lawmakers such as yourself regularly to have conversations to share what we are doing on the platform, the updated protections we are putting in place but also to hear your concerns and to work with you as you contemplate new regulations. >> will you commit that you'll support privacy legislation as has been proposed? >> senator, i'm not deeply involved in the details of any specific privacy legislation, but i'll commit we'll work with you and partner with you on
5:08 pm
federal privacy legislation. >> would you support a ban on targeted advertising to children and young teens? >> senator, at youtube we prohibit personalized advertising on youtube kids as well as in supervised experiences. >> would you support a ban? >> we have not waited for laws in order to put those types of protections in place. >> well, i hope that you will and that you'll be perhaps more forthcoming in the next round of questioning. i'll turn to the ranking member. >> thank you, mr. chairman. mr. beckerman, i want to come to you first. in the past tiktok has said that it has never nor would it ever share and provide user data to the chinese government even if asked. yet your privacy policy says you can disclose data collected to respond to government inquiries. it also says you share data you collect with your parent
5:09 pm
companies and affiliates and that you transmit user information to servers and data centers overseas. and earlier this year the chinese communist party acquired an ownership stake and a seat on the board of bytedance. so does tiktok share user data with its parent company bytedance? >> thank you, senator. this is an important question and i'm glad you're asking. >> quickly, please. >> yes, senator. we do not share information with the chinese government, and i'd like to point you to a citizen lab report -- they are among the most well-respected national security experts -- where it said in their testing tiktok did not contact any servers in china, and the report goes on, senator, that tiktok does not pose a threat to national security. i'd be happy to submit that
5:10 pm
report for the record. >> please submit the report for the record. do any bytedance employees have access to any tiktok user data or any role in creating the algorithm? >> senator, u.s. user data is stored in the united states. our backups are in singapore, and we have a world-renowned u.s.-based security team. >> i understand you say you store it in singapore. tell me about programmers, product developers and the data teams. are they housed in china? >> senator, like many technology companies we have engineers in the united states and around the world. >> and so they have access to algorithms and data? so that answer is yes. what about douyin? do those employees have any access to tiktok user data or input into the algorithm? >> senator, that's a completely
5:11 pm
different app from tiktok. >> no, it's a related company. you might want to check that. if the chinese communist party asked you for u.s. user data, what is to stop you from providing it since they have a seat on the board of bytedance and they have a financial stake in the company? >> senator, that's not accurate. one, they do not have a stake in tiktok at all. >> yes, they do. that happened in august. that is bytedance, and we can clarify that for the record. but the record is that the chinese communist party acquired a stake in bytedance in august, and they now have a seat on the board. so let's talk about tiktok's privacy policy. it says you collect and keep a wide variety of information including biometric data such as face prints, voice prints, geolocation information, browsing and search history not just on tiktok but on other apps, and keystroke patterns and
5:12 pm
rhythms. why do you need all of this personal data, especially on our children, which seems to be more than any other platform collects? >> senator, many outside researchers and experts who look at this have pointed out tiktok actually collects less data than many of our peers. and on the keystroke issue -- >> outside researchers that you're paying for? >> no, senator. >> would you submit that to independent outside researchers? because what we're seeing with all of this biometric data and the keystroke patterns is that you are exceeding that. so what do you do with this? are you creating a virtual you of the children that are on your site? >> senator, i don't know what you mean by virtual you. >> well, a virtual you is you and your presence online. it's like a virtual dossier. i'm sure you understand that
5:13 pm
term. and what do you need with all of this information? do you track children's viewing patterns? are you building a replication of where they go, their search history, their voice, their biometrics? and why do tiktok and bytedance and douyin need that information on our children? >> senator, tiktok is an entertainment platform where people watch and enjoy and create short form videos. it's about uplifting, entertaining content. people love it, and i disagree with your characterization of the way -- >> that is it from the positive. but there's also a negative. and the negative is that you are building a profile, a virtual you of our children because of the data that you're collecting. you've mentioned the family
5:14 pm
pairing provision that you have. so when you have a parent that goes on that, are they opening their data to tiktok? and is tiktok following them or following and capturing their search history? see, mr. beckerman, when you capture all of this data and you hold all of this data, then you are invading the property, the private -- the privacy of individuals that are on your site. and that applies to you, to ms. stout, to ms. miller. because you are -- you say because they are using the platform we can do this. but in essence what you're doing is making our children and their data -- you're making that the product, because you turn around and you sell it. and then basically it becomes
5:15 pm
weaponized against their users. mr. chairman, i'm over time. i have several questions for ms. stout and ms. miller, and we'll do those in a second round. >> we'll have a second round. senator klobuchar? >> welcome to both of you. reports indicate nearly half of kids 9 to 12 and a third of kids age 7 to 9 use social media on platforms like facebook, insta, snap, tiktok, youtube. i don't think parents are going to stand by while our kids and our democracy become collateral damage to a profit game. and i heard last night mark zuckerberg's words at his earnings report. and while he may be out there acting as a victim at his $29 billion quarterly earnings report meeting, the true victims are the
5:16 pm
mom in duluth who can't get her kid off facebook to do her homework, the dad mourning the loss of a child to a snap speed filter that measured the kid going 123 miles per hour trying to beat the filter, or a child exposed to content glorifying eating disorders on tiktok. so i have had a case right in my state -- two cases, actually, of young people who got drugs through snap, and i wanted to first start out with that with you, ms. stout. this is a story -- there's two kids. devon, he was suffering from mental pain at the beginning of the pandemic. he'd been given a percocet
5:17 pm
before, and a classmate said he had a percocet. well, what this young man did not know is that this percocet was laced with fentanyl, and he died just like that. as his mom said in a letter to me, all of the hopes and dreams we as parents had for devon were erased in the blink of an eye. a group of parents including devon's mother, bridget, demanded answers and accountability from snap on this issue in a letter to you in september. ms. stout, i want to know what the answers are. will you commit to providing more information about the automated tools snap uses to proactively search for illegal drug related content, as the parents ask? >> senator, i very much appreciate you raising this issue because it has been a devastating crisis that's been afflicting our young people. i want to make clear we are absolutely determined to remove
5:18 pm
all drug dealers from snapchat, and we've been very public about our efforts in this space. first of all, we've stepped up our operational efforts. and my heart goes out to the family. i met with bridget. i heard from her and other families to understand what is happening to them, their experience and also what's happening on snapchat. we have stepped up, and we have deployed proactive detection measures to get ahead of what the drug dealers are doing. they are constantly evading our tactics not just on snapchat but on every platform. we've also stepped up our work with law enforcement. just last week we had a law enforcement summit where we gathered over 2,000 members of law enforcement across the country so that we can understand what they're dealing with and to find out best practices for how we can get them the information they need. finally, senator, this is so important. we have deployed an education and awareness campaign, because what is happening on our platform and all across social media and
5:19 pm
technology platforms is that young people who are suffering from mental health issues and stress induced by the pandemic and other issues are reaching for substances, oftentimes pills and opioids. but these substances are laced with fentanyl, enough fentanyl to kill them. >> here's my problem. if a kid had just walked into, say, a pharmacy he wouldn't be able to buy that or get that. but in this case they can get on your platform and just find a way to buy it. and that is the problem. and i guess i want to know are you going to get your drugs -- i appreciate everything you said. i appreciate you meeting with the mom. are you going to get drugs off snapchat when you have all these kids, half the kids in america, looking at these platforms? >> i assure you this is such a top priority for our entire company. and senator, it's not just happening on our platform. it's happening on others. so therefore we need to work
5:20 pm
collectively with other platforms, the other companies here today, and work together. >> thank you. i think there's other ways to do this, too, creating liability when this happens so maybe that will make you work even faster so we don't lose another kid. mr. beckerman, a recent investigation by "the wall street journal" found that tiktok's algorithm can push young users into content glorifying eating disorders, drugs, violence. have you stopped that? >> yes, senator, i don't agree with "the wall street journal." with that said, we have made a number of improvements in the way people can have control over the algorithm and age appropriate content on tiktok. >> what are those changes? are they completely protected now from this content? >> senator, the content that relates to drugs, as you're pointing out, violates our community guidelines. over 79% of violative content is removed proactively. of course we want to get to
5:21 pm
100%, and that's something we're constantly working on. >> are you aware of the research your company has conducted about apps pushing content related to eating disorders to teens? >> no, senator. >> did you ask for any studies on eating disorders before testifying? >> not that i'm aware of, but we do work with outside experts to understand. i'd love to see the camra act passed so we can have additional research in the public domain that all of us can learn from and improve. >> i'll save my questions for ms. miller for the next round. thank you. >> thanks, senator klobuchar. and again, i would remind everyone you committed to provide the research you just referred to in your responses to senator klobuchar -- all of you committed to provide that research. and we'll look forward to receiving it within days or weeks, not months. and i particularly appreciate
5:22 pm
the reference to creating liability as a strong incentive, which would involve reform of section 230. >> and mr. chair, if i could put this letter from the parents into the record. >> without objection. >> thank you. >> we've been joined by senator cantwell remotely. >> mr. chairman, i would defer to my colleagues, senator markey and senator baldwin. >> thanks very much. senator markey? >> thank you, mr. chairman, very much. the problem is clear. big tech preys on children and teens to make more money. now is the time for legislative solutions to these problems. and that starts with privacy. i've introduced bipartisan legislation to give children and teens a privacy bill of rights for the 21st century. today a 13-year-old girl on
5:23 pm
these apps has no privacy rights. she has no ability to say no. no, you can't gobble up data about me. no, you can't use that data to build algorithms that push toxic content towards me. no, you can't profile me to manipulate me and keep me glued to your apps. no. you have no rights. a 13-year-old girl in the united states of america in the year 2021. my bipartisan children and teens' online privacy protection act gives 13, 14 and 15-year-olds the right to say no. to each witness, do you support my legislation to update the children's online privacy protection act to give that 13, 14 and 15-year-old control of their data? >> ms. stout? >> senator markey, i just want to say that we absolutely
5:24 pm
support a federal privacy proposal, and we've worked very hard with members of this body to try to -- >> do you support my child protection, my teen protection law? do you support it? >> so, senator, we agree there should be additional protections put in place for young people to protect them further from -- >> so you've had a chance to look at the child online privacy protection update i've introduced. do you support it or not? >> i think, senator, we'd love to talk to you -- >> this is what drives us crazy. we want to talk, we want to talk, we want to talk. this bill has been out there for years and you still don't have a view on it. do you support it or not? >> i think there are things we'd like to work with you on, senator. >> mr. beckerman, do you support the child online privacy protection act being updated the way my legislation does? >> senator, thank you for your leadership on this issue. it's been very important. yes, we agree it needs to be
5:25 pm
updated, particularly as it relates to age verification, which happens across the internet. it's an area i think has not been given as much attention as it deserves. we do believe it needs to be updated. >> do you support my legislation to update it? you've had plenty of time to look at it. >> we like your approach. however, i think a piece that should be included is a better way to verify age across the internet and apps. and i think with that improvement it'd be something we'd be happy to support. >> okay. ms. miller? >> senator, we also support the goals of updated comprehensive privacy legislation. on your specific bill, i know we've had conversations with your staff in a constructive manner, and i would welcome continuing to do that. >> it's going to happen soon because this is a crisis that --
5:26 pm
this has just surfaced in a way that's clear we don't have any more time. we've got to get this finished. among young teens 49% say they're on tiktok, 52% say they're on snapchat. 81% say they're on youtube. and those 13-year-olds, they deserve the right to say, no, you can't track me. do you agree with me, ms. stout? do you agree with that? >> yes, i agree with that. >> do you agree with that, mr. beckerman? >> yes, senator. >> do you agree? >> yes, senator, we have tools for all our users to handle and control and make choices as it relates to the information that is gathered. >> the bill also would ban targeted ads to children. apps should never be allowed to track a 10-year-old's browsing history and bombard him with ads based on that data.
5:27 pm
do you agree that congress must ban targeted ads to children? >> senator, i defer to you and your peers. but, again, we've not waited for laws like this. >> would you support it to make sure there's a uniform ban on this practice? if you've already adopted it as a company, would you support that being the standard that we have for all platforms across the country? >> as you describe it, it's consistent with the approach we've already taken. >> and you would support it? is that what you're saying? >> senator, again, we are already doing this -- >> no, we're trying to draft a law here. would you support that provision being in a law to prohibit it? >> senator, yes, as we already prohibit targeted advertising.
5:28 pm
>> -- so that we can legislate. same question, should we ban targeted ads to kids? >> we've worked on that already, where kids can opt out. >> would you support it as a national law that we ban it? >> an example has been the age appropriate design code, and we're looking at exactly that model -- >> do you support it as a law this body passes this year to prohibit it? >> we offer those protections and we agree with the approach so -- >> so, yes, you do support it? yes or no? >> we agree with your approach so -- >> we're now trying to say if you support it we want a law to prohibit anyone else from doing it. yes? >> i think we're very close, senator. >> mr. beckerman? >> yes, senator. and i would say we should go beyond that.
5:29 pm
and certain categories of ads shouldn't be shown to teenagers and young adults at all, and i think that should be part of the approach as well. >> we need to go beyond privacy and tackle the design features that harm young people, like like buttons. senator blumenthal and i have a bill, the kids act, which would ban these and other features that quantify popularity. the research is clear, these features turn apps into virtual popularity contests and are linked to feelings of rejection, low self-worth and depression. even youtube kids has acknowledged this problem and does not have like buttons. to each witness, should congress ban features that quantify popularity for kids? yes or no? >> senator, as i mentioned in my opening statement, we don't have those metrics. we've never had a like button or comments because we don't think it should be a popularity
5:30 pm
contest. so we support that. >> so you would support it. >> senator, this is one that's a little more complex and we'd have to have a conversation. but we have implemented much of the age appropriate design code here in the united states and would encourage similar measures. >> so you -- i don't know there was any answer in there. you said it's complicated. do you support banning it or not? >> on banning likes? >> yes. >> again, if you want to set it by age, that's something we could look at. >> ms. miller? >> senator, as you noted we already prohibit things on youtube kids such as being able to comment, and we would support working with you on regulation in this area. >> okay. you would support working with us, but would you support banning likes? >> yes, senator. again, we already do not allow for this on the youtube kids platform. >> and again, the american academy of pediatrics just
5:31 pm
declared a national state of emergency for children and teen mental health. we need to outlaw the -- the question we have to answer ultimately is whether or not, for example, we're going to ban autoplay for kids. with this feature, when one video ends, another quickly begins. kids stay glued to their phones so apps collect more data and make more money. today 82% of parents are worried about their kids' screen time. to each of you today, do you agree that congress should ban autoplay for kids? yes or no? ms. miller, we'll start with you this time. >> senator, each of the items you're outlining we already prohibit. we do -- we have auto -- excuse me, the default set to autoplay off on youtube kids as well as
5:32 pm
for supervised experiences. >> okay, so you would support that being legislated? >> yes, sir. >> okay, mr. beckerman. >> senator, we have take-a-break videos and autoplay tools. >> would you support passing legislation that would ban autoplay? >> we'd be happy to look at it and talk to you about it. >> you don't do it. >> again, i think it's important to look at age appropriate features for teens as something we build into tiktok proactively. but, again, as we look at legislation i do think a first step is something around age verification across apps. >> again, this is the historic problem. ms. stout, would you support it? >> senator markey, i don't believe we have autoplay on snapchat, so i would defer and say that's something we need to look at more closely. and i'm not familiar with that piece of the proposal in your
5:33 pm
legislation. >> okay, great. we have a lot of work to do, and we have to telescope the time frame, i think, mr. chairman. >> telescoping the time frame. >> thank you, mr. chairman. >> thank you. thanks, senator markey. thanks for all your good work. senator baldwin. or actually i think senator cantwell is here. oh, sorry. >> thank you, mr. chairman. we all know social media offers a lot of benefits and opportunities, but as has been expressed this morning, i have concerns about the lack of transparency online and limited accountability of big tech companies. and one of the major problems of social media, increasingly concerning, is social media platforms' use of algorithms to shape and manipulate users' experience, resulting in individuals being trapped in what we call the filter bubble. the filter bubble can be particularly troubling for
5:34 pm
younger users. for example, a recent wall street journal article described in detail how tiktok's algorithm serves up sex and drug videos to minors. i have a bill, the filter bubble transparency act, and another bill called the pact act, that would make significant strides in addressing the lack of transparency online. and importantly the filter bubble transparency act would give consumers the option to engage with platforms without being manipulated by opaque algorithms. let me ask you, do you believe consumers should be able to do that? >> yes, senator, we agree there needs to be transparency in the way algorithms work and additional choice for individuals as they're using them. >> ms. miller? >> senator, we do provide transparency in the way our systems and practices work. >> ms. stout?
5:35 pm
>> senator, it's important to understand that what we apply algorithms to is a very small set of content. and users get to select interest categories that then determine the kind of content they're served. but it's not an unlimited list or set of user generated content. it's quite narrow. >> but i don't know that you or ms. miller really answered the question. that is, should consumers who use these social media platforms be able to use them without being manipulated by algorithms? >> senator, yes, i agree with you. >> ms. miller? >> yes, senator. >> mr. beckerman, what's your response to "the wall street journal" article that described in detail how tiktok's algorithm serves up sex and drug videos to minors? >> senator, thank you for the question. sex and drugs violate our guidelines and have no place on
5:36 pm
tiktok. we disagree with that being an authentic experience that an actual user would have. >> your platform is perhaps driven by algorithms more so even than facebook. unlike facebook, tiktok's algorithm is not constrained by a user's social network. tiktok's ceo kevin mayer wrote, and i quote, we believe companies should disclose their algorithms and data flows to regulators, end quote, and offered to make tiktok's source code accessible to regulators. has tiktok disclosed its algorithms, moderation policies and data flows to any federal or state regulators? >> senator, yes. as we pointed out we do have these transparency centers. and we've done i think over 100 tours with members of the senate,
5:37 pm
their staff and others in the u.s. government. and we'd be happy to continue to be transparent in how that works. >> i think senator blumenthal maybe touched on this, but in keeping with tiktok's disclosure practices announced in july 2020, would you commit to providing data flows to this committee so we may have independent experts review them? >> yes, sir. >> thank you. ms. miller, does youtube engage in practices to influence its users in any way? >> senator, when users come to youtube they come to search and discover all types of content. for example, how to bake bread, to watch a church service or to do exercise. and as a result they are introduced to a diversity of content that isn't based on a particular network that they are a part of.
5:38 pm
in so doing there may be additional videos that are recommended to them based on some signals. but those signals will be overridden to make sure that we are not recommending harmful content. >> back to mr. beckerman. all chinese internet companies are compelled by china's national intelligence law to turn over any and all data that the government demands, and that power is not limited by china's borders. has tiktok provided data to the chinese government on chinese persons living in the united states or elsewhere outside of china? >> no, senator. tiktok is also not available in china. and as i'd like to point out, our servers with u.s. data are stored in the united states. >> does tiktok censor videos of tank man, the famous video of a young man who stood his ground in front of a procession of chinese army tanks during the
5:39 pm
1989 tiananmen square protests in beijing? >> no, senator. >> there are a number of things we need to address. congress needs to be heard from in this space, particularly with respect to the use of algorithms and the way that users are manipulated, and as you've already pointed out, particularly young people. so i hope that we can move quickly and directly and in a meaningful way to address this issue. thank you. >> thank you. i think we have strong bipartisan consensus on that issue. thank you, senator thune. senator baldwin. >> thank you, chairman blumenthal. i would like to just note that this series of hearings really began with the revelation that internal research at facebook revealed the negative impacts on
5:40 pm
teenagers' body images from using the company's instagram platform. and we learned, based on research by the chairman's staff, how quickly somebody on instagram can go from viewing content on healthy eating to being directed toward postings that focus on unhealthy practices, including glorifying eating disorders. i know we don't have facebook and instagram before us today, but i'm particularly concerned about the impact that type of content could have on young users. i recently joined senators klobuchar and capito, with whom i sponsored the anna westin act, legislation to support training and education on eating disorders, in a letter to facebook and instagram seeking more details about how they handle this
5:41 pm
issue. but i want to ask each of you, can you briefly outline the steps your companies are taking to remove content that promotes unhealthy body image and eating disorders and direct users to supportive resources instead? and in particular how are you focusing on this issue with regard to your younger users? why don't we start with mr. beckerman and tiktok. >> thank you, senator. i myself have two young daughters, and this is something i care a lot about and our teams at tiktok care a lot about. one, i want to assure you we do aggressively remove content like you're describing that would be problematic for eating disorders. second, we direct people that are seeking help. one thing we've heard is that people struggling with eating disorders or other weight loss issues come to tiktok to express themselves in a positive way, so
5:42 pm
it has been more of a positive source. and lastly, we don't allow ads that target people based on weight loss and that kind of content. >> ms. stout? >> thank you, senator baldwin. i want to make clear that the content you describe, content that glorifies eating disorders or self-harm, is a complete violation of our community guidelines. also, as i described earlier, we don't allow unvetted, unmoderated content to be served up to our users. discover, which is our media publisher platform, which we partner on with people and publishing companies like "the wall street journal" or nbc news -- all of that content is vetted and moderated ahead of time. but specifically -- >> can i interrupt and ask, is that done through ai or humans? >> no, these are hand picked partners that snapchat has selected to say, in this closed garden of content, which is discover, we will allow certain
5:43 pm
publishers and media companies to provide news, entertainment content -- espn or cmt or, you know, "the washington post," in fact. so users can come look at that content. it's all premoderated and curated. so it's not an unlimited source of user-generated content where you could go down a rabbit hole, perhaps, and access that kind of hurtful, damaging content on body image. but i think you raised a very interesting question. what are the products you're surfacing? how are we helping users find positive resources? and as a result we did conduct research about the mental health effects of body image and self-harm, and we created a product called here for you. this was created in 2020 at the height of the pandemic. when users are in snapchat and they search anorexia or eating disorder, instead of perhaps being led to content that could be harmful, that content which is against our guidelines, we
5:44 pm
now surface expert resources that show content that can help that user. maybe it helps them, maybe it helps their friends. so this is a redirection of that kind of search for potentially hurtful and harmful content that then steers the user to resources that may help them or a member of their, you know, circle of friends. >> ms. miller? >> senator, we take a comprehensive and holistic approach. we prohibit content that promotes or glorifies things such as eating disorders. it has no place on the platform. we also realize users come to share their stories about these experiences or find a community, as well as to find authoritative sources, which is what we raise up in searches like this. in addition, we also roll out programs and initiatives such as
5:45 pm
the with me campaign, whereby we are encouraging users to spend their time, particularly during covid, pursuing healthy habits. so we look at this in a very holistic way to make sure that youtube is a platform people come to and have a healthy experience, and we, again, prohibit the type of content that glorifies or promotes these issues such as eating disorders. >> and just if i could follow up similarly, when you remove content, how are you filtering that out? are you using artificial intelligence? are you using a team of humans who are -- people who are looking at that content and deciding whether to remove it or not? >> it's a mix, senator. so it's a mix of, when we develop content policies we rely on experts to inform the
5:46 pm
development of these policies. and then we have machine learning to help us capture this type of content at scale. you'll see in our quarterly transparency report that more than 90% of content that violates our community guidelines is flagged originally by machines, and then there's a mix of human reviewers. >> thank you. i want to thank senator blumenthal and senator blackburn for holding this subcommittee hearing. and as the witnesses can see, our colleagues are well-informed and anxious to get legislative fixes to things they think are crucial to protecting individuals' and people's privacy, so i want to thank them for that. yesterday motherboard vice had an article whose headline was basically that gps location data from apps is given out even when people have opted out.
5:47 pm
so basically they're -- i'm going to enter this for the record unless there's objection. but, quote, the news highlights a stark problem for smart phone users, that they can't actually be sure if some apps are respecting their explicit preferences around data sharing. the data transfer presents an issue for the location data companies themselves. so basically these companies are reporting information about location even when people have explicitly opted out. and so they're continuing to collect this information. that is what the report of researchers and motherboard found. so i have a question. do you believe that location data is sensitive data and should be collected only with consumers' consent? all the witnesses, please. >> yes, senator, we agree. >> yes, senator, we agree. >> yes, senator. and for users, they have access
5:48 pm
to their account under my activity in my account and can modify their settings, delete their history and things of that nature. it's all just one click away. >> so any federal privacy law should make sure that that's adhered to. i see a nod. is that -- >> yes, senator. >> yes, senator. >> yes. >> okay, thank you. do any of you share location data with the company that's in this article? it's i think h-u-k, they're a major data company. >> senator, i've never heard of that company and i'm not aware. >> senator, i'm not aware of the company, but we also don't collect gps data. >> would you be affiliated with them in some way? i mean, they're getting this information anyway. so, i'm sorry, the last witness if you -- >> senator, i'm also not aware of the company.
5:49 pm
>> maybe you can help us for the record on this so that we know. but this is exactly what the public is frustrated about and concerned about, particularly when harm can be done, that, you know, they go to a website, they say to the website i don't want my sensitive information to be shared, and then there's this conglomerate of data gathering on top of that that's not honoring those wishes as it relates to the interface on those apps. this is exactly why i think we need a strong privacy law and why we should protect consumers on this. in the facebook hearing we had, we had this discussion about advertising and the issue of whether advertisers knew exactly what the content was they were being advertised next to. i guess we're also seeing a migration of major companies like procter and gamble and others moving off the internet because they're like, i'm done with it.
5:50 pm
i don't want my ad because it's now run on a system -- i don't want my ad just appearing next to certain kinds of content. but what was more startling is there may be actual deceptive practices, saying, oh, this content is this when in reality it is something else, and in some of these cases we discussed with facebook objectionable hate speech and content that we don't even think should be online, and yet that's what the advertisers knew, so on your websites, do advertisers know what content they're being placed next to? >> so, senator, i can respond to your question. yes, our advertisers do know where they're advertising, where their advertisements show up. i mentioned discover, the closed, curated garden; those advertisers appear next to publishers and verified users we have hand selected to allow to
5:51 pm
appear, so on a platform like snapchat, there is no broadcast disinformation or hate speech, and that's why i think snapchat is, in fact, a very appealing place for advertisers, because they know their advertisements will be placed next to safe content. >> mr. beckerman? >> yes, senator. advertisers come to tiktok particularly because our content is known for being so authentic and uplifting and fun, and, you know, we see ads that are very much like tiktok videos, with the same themes. >> ms. miller? >> senator, we have worked with our advertising partners over the years to make sure that they have trust in the fact that advertising on youtube is safe for their brands, in the same way that we have worked significantly to make sure that users themselves have a safe experience on the platform, and the advertising associations have recognized the work that we have done in this space so that
5:52 pm
their brands are safe on the platform. >> thank you. i'll probably have a follow up on this, but senator lee. >> thank you, madame chair. ms. miller, i would like to start with you, if that's all right. i want to ask you a particular question regarding youtube's app age rating. now, google play has the app rating set at teen, meaning 13 and up, while the apple store has it rated as 17 and up. can you tell me why this disparity exists. if apple determined that the age rating for youtube ought to be 17 and up, why did google determine that its own app should be rated as teen, meaning 13 and up? >> senator, i'm unfamiliar with the differences that you've just outlined but i would be happy to follow up with you and your staff once i get more details on
5:53 pm
this. >> okay. yeah, i'd love to know about that. it is just a simple question, and i understand you may not be able to answer it right now. as it sounds, you don't have the information, but i would just like to know why that difference exists and whether you agree or disagree with the fact that google has rated its own app as 13 and up while apple has rated it 17 and up. i'm happy to follow up on that in writing or otherwise. i want to address a similar issue with regard to snapchat. snapchat is rated 12 and up on apple, and it's rated teen on the google play store. any idea why there is that disparity there? >> senator, that's a very good question, and i -- for some reason i've heard somewhere the reason why apple lists it at 12 and up; it's an app that's intended for a teen audience. >> right. why is there a disparity between
5:54 pm
the age rating and the content that's available on that platform? >> senator, the content that appears on snapchat is appropriate for an age group of 13 and above. >> yeah, let's talk about that for a minute because i beg to differ. in anticipation of this discussion and this hearing, i had my staff create a snapchat account for a 13-year-old -- for a 15-year-old child. now, they didn't select any content preferences for the account. they simply entered a name, a birth year, and an e-mail address. and then when they opened the discover page on snapchat with its default settings, they were immediately bombarded with content that i can most politely describe as wildly inappropriate
5:55 pm
for a child, including recommendations for, among other things, an invite to play an online sexualized video game that markets itself to people who are 18 and up, tips on quote why you shouldn't go to bars alone, notices for video games that are rated for ages 17 and up, and articles about porn stars. now, let me remind you that this inappropriate content that has by default been recommended for a 15-year-old child is something that was sent to them by an app just using the default settings. so i respectfully but very strongly beg to differ with your characterization that the content is, in fact, suitable for children 13 and up as you
5:56 pm
say. now, according to your own web site, discover is a list of recommended stories, so how and why does snapchat choose these inappropriate stories to recommend to children? how does that happen? how would that happen? >> so senator, allow me to explain a little bit about discover. discover really is a closed content platform, and yes, indeed, we do select and hand select partners that we work with. and that kind of content is designed to appear on discover and resonate with an audience that is 13 and above. i am unfamiliar, and i've taken notes, about what you have said that your account surfaced. i want to make clear that our content and community guidelines require that any online sexual video games be age gated to 18 and above, so i'm unclear
5:57 pm
why that content would have shown up in an account that was for a 14-year-old, but these community guidelines, and publisher guidelines on top of those guidelines, are intended to provide an age appropriate experience for a 13-year-old. >> i understand that. you have these community guidelines, and they note that advertisers and media partners in discover agree to additional guidelines. what are these additional guidelines? and i mean, i can guess only that they permit these age-inappropriate articles to be shared with children. how would that not be the case? >> senator, so these additional guidelines on top of community guidelines are things that suggest they may not glorify violence, that any news articles must be accurate and fact checked, there's -- >> i'm sure the tips on why you
5:58 pm
can go to bars alone are accurate and fact checked, but that's not my question. it's about whether it's appropriate for children ages 13 and up as you've certified. >> absolutely. and i think this is an area where we're constantly evolving, and if there are instances where content is inappropriate, it will be removed from the platform. >> do you review them? what kind of oversight do you conduct on this? >> we use a variety of human review as well as automated review, and i would very much be interested in talking to you and your staff about what kind of content this was, because if it violates our guidelines, that kind of content would be taken down. and just, senator, one last thing: while i would agree with you that tastes vary when it comes to the kind of content that is promoted on discover, there is no content there that is illegal. there is no content there that is hurtful. i mean, it really is intended to be a closed ecosystem where we have better control over the type of content that surfaces.
5:59 pm
>> now, madame chair, i just have one follow up question. i'll be brief. >> go ahead; our colleague senator is patiently waiting. >> so snapchat has assured users it doesn't collect identifying data on them for advertising. how does snapchat then decide what content is pushed to the top of their discover page? >> so senator, if you go into your snapchat account, you have the ability to select preferences -- interest categories -- and there are several interest categories that a user can select or unselect if they wish. that could be that they like, you know, to watch movies, or they enjoy sports, or they are fans of country music. at any point, it is completely transparent, and a user has the ability to go in and select what they like, and that determines the kind of content that's surfaced to them. if there's any content they don't like, they can uncheck or check, and that really generates
6:00 pm
the kind of content that a user in discover would see. >> my time has expired. thank you so much, madame chair. we have to get to the bottom of this. these are inappropriate. we know there's content on snapchat and on youtube, among many other places, that's not appropriate for children ages 12 or 13 and up. >> i thank senator lee, and to my line of questioning, it's not appropriate to tell advertisers their ads are not located next to content that is inappropriate. senator luhan. >> you mentioned that all content on snapchat, on the spotlight page, is human reviewed before it can be viewed by more than 25 people. yes or no, does human review help snapchat reduce the spread of potentially harmful content? >> yes, senator, we believe it does. >> and i appreciate snapchat's approach to this problem.
6:01 pm
more platforms should work to stop harmful content from going viral. however, far too often, we find companies say one thing to congress, and then, once attention is diverted and the public is distracted, they go around and do the very thing they were warning us against. can i hold you to that? will snapchat continue to keep a human in the loop before content is algorithmically promoted to large audiences? >> senator, this is the first time i have testified here before congress, so please hold me to it. but at snapchat, we have taken a very human-moderation-first approach, not just on spotlight, but across our platform, so yes, indeed, human moderation will continue to play a huge part in how we moderate content on our platform and how we keep our users safe. >> i'm glad to see the importance of platforms taking responsibility before they amplify content and especially publish it to a mass audience,
6:02 pm
and it's something many of us share. it's why i introduced the protecting americans from dangerous algorithms act as well. online platforms must be responsible when they're actively promoting hateful and dangerous content. ms. miller, i'm grateful that youtube is making an effort to be more transparent regarding the number of users that view content in violation of community guidelines. i'm concerned with one trend. earlier this year i wrote a letter to youtube with 25 colleagues on the crisis of non-english misinformation on the platform. we need to make sure all communities, no matter the language they use at home, have the same access to good, reliable information. ms. miller, will youtube publish its violative view rates broken down by language? >> senator, thank you for your question, and what you're referring to is this latest data point that we shared earlier this year, in which for every
6:03 pm
10,000 views on youtube, 19 to 21 of those views are of content that is violative. and we apply our content policies at a global scale across languages. we do not preference any one language over another, and this includes the violative view rate. >> and ms. miller, i don't believe that's good enough. when we don't break algorithms down by their performance across different groups of people, we end up making existing gaps, existing biases, worse. we've seen this with facial recognition technology that unfairly targeted communities of color, and according to reports, we're seeing this happen right now on youtube, so i'll ask again, will youtube publish its violative view rate broken down by language? >> senator, i would be happy to follow up with you to talk
6:04 pm
through these details. as i said, for all of our content policies and the enforcement therein, and the transparency we provide, it is global in scope and it is across languages. >> i definitely look forward to following up and working with you in that space. mr. beckerman. >> thank you. >> before launching tiktok for younger users, did tiktok do any internal research to understand the impact it would have on young children? >> thank you, senator. i'm not aware. for tiktok for younger users, the content is curated with common sense networks, and it's an age appropriate experience. i'm not aware of any specific research. >> i would like to follow up on that as well. products like tiktok can lead to addictive behavior and body image issues in young children, and it's critical platforms work to understand these problems before they take place. this is a very serious issue, and it's one that's finally getting the attention that it deserves with the revelations and whistleblowers that have come
6:05 pm
forth. i urge you to take this opportunity to begin a transparent assessment of the impact your product is having on young children. and in the end, i just want to follow up on something that many of us have commented on leading up to these important hearings, and i appreciate the chair's attention to this; the ranking member -- both of them have authored legislation. our chair and ranking member of the subcommittee have also partnered on legislative initiatives. it's critically important that we continue moving forward, that we mark up legislation and get something adopted, and i'm certainly hopeful here in the united states we're paying attention to what's happening in other parts of the world. again, europe is outpacing the united states in being responsible with legislative initiatives surrounding protecting their consumers. there's no reason we can't do that here as well. and i just want to thank chairman cantwell for the work she has been doing in this
6:06 pm
particular space, and i look forward to working with everyone to make sure we're able to get this done in the united states. i yield back. >> senator luhan, you have reintroduced that bill in the senate, is that right? thank you. very much appreciate that, and appreciate your leadership. we're awaiting the return of senator blumenthal so i can go and vote. if he doesn't come in the next minute or so, we'll take a short recess. we're way past time to get over there. i want to thank the members who have participated thus far because we have had a very robust discussion today. you can see that this is a topic that the members of this committee feel very, very passionately about, and obviously believe that there's much more that we need to be doing in this particular area, so i appreciate everybody's attendance and focus, and i want to thank senator blumenthal and senator blackburn for their leadership in having both of these
6:07 pm
hearings, and for the larger full committee, we had planned to move forward on many of these agenda items anyway, but we're very appreciative of the subcommittee doing some of this work, in having members have a chance to have very detailed interactions on these policies that we need to take action on, so very much appreciate that. so i see senator blumenthal has returned. thank you so much, and i'll turn it over to you. >> thank you, chairman cantwell, and thanks for your excellent work on this issue. i'd like to ask some additional questions on legislative proposals. one of the suggestions that senator klobuchar raised was
6:08 pm
legal responsibility and liability, which as you know is now precluded by section 230. let me ask each of you, would you support responsible measures like the earn it act, which i have proposed, to impose some legal responsibility and liability, cutting back on the immunity that section 230 affords? >> senator, we agree at snapchat that there should be an update to the intermediary platform liability law, cda 230, and in fact, the last time this body addressed a reform, which was sesta, snapchat was a company that actively participated in that and helped draft legislation. so we would welcome another
6:09 pm
opportunity, senator, to work with you on that. >> thank you. would you support the earn it act? as you know, senator graham and i proposed it; it imposes liability and affords victims the opportunity to take action against platforms that enable child pornography and related abuses. >> of course, senator, we completely prohibit that kind of activity, that illegal activity, and we actively look for it, and when we find it, we remove it. if you would allow me to get back to you -- it's been a while since i have looked at the earn it act. i do recall when you and senator graham introduced it, but i believe that the spirit of your legislation is something we would very much support. >> you had the opportunity before to say whether you supported it; so far, you haven't. will you commit to supporting it? >> senator, again, my memory is failing me a little bit, but i do believe that the provisions in the earn it act
6:10 pm
were many of the provisions we'd support. i would be happy to come back with a more specific answer for you. >> we do agree there needs to be a higher degree of accountability and responsibility for platforms, particularly as it relates to content moderation; that needs to be done in a way that allows all platforms to moderate in an appropriate and aggressive way to make sure that the kind of content that none of us want to see on the internet or on any of our platforms is able to be removed. >> do you support changes in section 230 to impose liability? >> there absolutely can and should be changes, but again, in a way that would allow companies like ours that are good actors, that are aggressively moderating our platform in a way that we think is responsible, to be able to continue to do so. >> will you support the earn it act? >> again, we agree with the spirit of it. we would be happy to work with you and your staff on that bill. >> well, the bill again was
6:11 pm
reported unanimously out of the judiciary committee in the last session. it hasn't changed significantly. did you support it then? >> i think the concern would be unintended consequences that would lead to hampering the ability to remove violative content on platforms. >> is that a yes or no? >> it's a maybe. >> well, so far we have two maybes. ms. miller? >> senator, i'm aware of a number of proposals regarding potential updates to 230, and me and my team as well as the other teams across google have been involved in the conversations regarding these various proposals. i would just like to say, though, that we see 230 as the backbone of the internet, and it is what allows us to moderate content, to make sure that we are taking down content that potentially leads to eating
6:12 pm
disorders, for example, that we were talking about here earlier, and self-harm. we want to make sure we continue to have protections in place so we can moderate our platforms and they are safe and healthy for users. i am aware of the earn it act, and i know again that our staffs have been speaking, but i understand i think there's still ongoing discussions regarding some portions of the proposal. but we also very much appreciate and understand the rationale as to why this was introduced, particularly around the area of child safety. >> well, again, is that a yes or no? do you support it? >> we support the goals of the earn it act, but there are some details that i think are still being discussed. >> well, you know, as senator markey has said, this is the talk we have seen again and again and again: we support the goals. but that's meaningless
6:13 pm
unless you support the legislation. and it took a fight, literally a bare-knuckle fight, to get through legislation that made an exception to the immunity for human trafficking. just one small piece of reform. and i join in the frustration felt by many of my colleagues that good intentions, support for goals, endorsement of purposes is no substitute for actual endorsement. i would ask that each and every one of you support the earn it act but also other specific measures that will provide for legal responsibility, and i think i know what ms. miller means by
6:14 pm
the claim that section 230 provides a backbone, but it's a backbone without any real spine right now, because all it does is afford virtually limitless immunity to the internet and to the companies that are here. i'm going to interrupt my second round and call on senator cruz. >> thank you, mr. chairman. mr. beckerman, thank you for being here today. i understand this is the first time that tiktok is testifying before congress, and i appreciate you making the company available to finally answer some questions. in your testimony you talked about all the things you say tiktok is doing to protect kids online, and that's great, but i want to discuss the broader issue here, which is the control the chinese communist party has over tiktok, its parent company bytedance,
6:15 pm
and its sister companies, like beijing bytedance technology. now, tiktok has stated repeatedly that it doesn't share the data it collects from americans with the chinese communist party and that it wouldn't do so if asked. it has also stated that, with regard to data collected on and from americans, that data is stored in virginia with a back up in singapore. but these denials may, in fact, be misleading. a quick look at tiktok's privacy policy, in fact, just last night, shows there's a lot more than meets the eye. for example, in the quote, how we share your information section, one blurb reads, quote, we may share all of the information we collect with a parent, subsidiary or other affiliate of our corporate group. interestingly, in june of this year, the privacy policy was updated to state that tiktok quote may collect biometric
6:16 pm
identifiers and biometric information as defined under u.s. laws, such as face prints and voiceprints. mr. beckerman, does tiktok consider bytedance, the parent company of tiktok, which is headquartered in beijing, to be a part of tiktok's quote corporate group as that term is used in your privacy policy? >> thank you, senator. this is an important question. i would just like to take an opportunity first to clear up misconceptions around some of the accusations that have been leveled against the company. i would like to point to independent research. i understand that trust needs to be earned. >> i get you may have broader points you want to make. my question is simple and straightforward. does tiktok consider bytedance, the parent company, headquartered in beijing, to be
6:17 pm
part of tiktok's corporate group? >> access is controlled by our u.s. teams, and as independent researchers and experts have pointed out, the data is not of national security importance and is of low sensitivity. but again, we hold it to a high standard. >> we're going to try a third time, because the words that came out of your mouth have no relation to the question you were asked. your privacy policy says you will share information with your corporate group. i'm asking a very simple question. is bytedance, your parent company headquartered in beijing, part of your corporate group, yes or no, as you use the term in your privacy policy? >> senator, i think it's important that i address the broader point in your statement. >> so are you willing to answer the question yes or no? it is a yes-or-no question: are they part of your corporate group or not? >> yes, senator, it is. >> yes, it is. so under your privacy policy, you're explicitly stating you
6:18 pm
may be sharing data with them, including biometric data -- face prints, voiceprints -- is that correct? >> no, senator. it says if we were to collect biometric information -- which we do not collect biometric data to identify americans -- we would provide an opportunity for consent first. >> but you also say we may share all of the information we collect with a parent or subsidiary of our corporate group -- bytedance, headquartered in beijing. >> under u.s. access controls, sir. >> secondly, what about beijing bytedance technology, which media reports from earlier this year showed beijing took a minority stake in through a state-backed chinese internet investment entity, and on the board of which now sits wu shugang, a ccp official who spent most of his career with a stint at the online opinion bureau under the cyberspace
6:19 pm
administration of china, china's internet regulator. would you consider beijing bytedance technology to be a part of tiktok's corporate group with whom tiktok could share all of the information it collects? >> i want to be clear that that entity has no affiliation with tiktok. it exists for domestic licenses of the business in china and is not affiliated or connected to tiktok. >> so are you saying, no -- yes or no, as to whether beijing bytedance technology is part of your corporate group as the privacy policy defines it. it says we may share all of the information we collect with a parent, subsidiary, and other affiliate; presumably that's where it would fall, other affiliate of our corporate group. is beijing bytedance technology an affiliate of your corporate
6:20 pm
group? >> senator, that entity deals with domestic businesses. >> you're having a hard time. you're answering questions i'm not asking. again, it's a yes/no. is beijing bytedance technology a quote other affiliate of your corporate group as your own privacy policy defines it? >> i'm trying to be clear to answer your question. that entity is based in china for the chinese business and is not affiliated or connected with tiktok. >> twice you've answered that way; last time you did it on the third try. >> the answer is the same, senator, as what i just said. that entity is -- >> what you just said did not answer the question. let me repeat the question again. is beijing bytedance technology a quote other affiliate of our corporate group as your privacy policy defines it, yes or no? >> senator, as i stated, that entity does not have any
6:21 pm
relation to the tiktok entity. >> so i'll point out it took three questions to get you to answer about your parent. you finally answered yes, that you can share all of your information with your parent company based in beijing. i have asked you three times about the sister company that is obviously another affiliate. you have refused three times. that may be revealing. often, as sherlock holmes observed about the dog that did not bark, it may be revealing that the chinese propaganda -- you're refusing to answer whether they fall under your privacy policy. that reveals a great deal. >> with all due respect, i'm trying to be accurate here. there are a lot of accusations that are not true, and i want to make sure it's clear -- >> i'm going to give you one more chance; my time is over. in baseball, three strikes, you're out. tonight the astros are going to begin winning the world series. let's see if, on a fourth strike, you can answer the question.
6:22 pm
it's a simple yes/no: is beijing bytedance technology a quote other affiliate of our corporate group as your privacy policy defines that term? >> as i pointed out before, my answer is the same. >> yes or no. you didn't answer. >> senator, i appreciate your trying with gotcha questions. i'm trying to be truthful and accurate about -- >> are you willing to answer the question yes or no? >> i answered the question. >> you have not answered the question. is it another affiliate, yes or no? >> senator, i stated a number of times that that entity is a domestic entity within china for licenses there and is not connected to tiktok. >> is it another affiliate as defined under your privacy policy, yes or no? >> senator, i answered -- >> you're here under oath. >> i answered the question. >> are you instructed not to answer the question? you're just refusing to answer because you do not want
6:23 pm
to. >> it is not affiliated with tiktok. that's your question; that's the answer. >> so your answer, i want to be clear, because you're under oath -- your answer is that beijing bytedance technology is not a quote other affiliate of our corporate group as your privacy policy uses that term. this is a legal question with consequence. >> senator, i understand the question. as i pointed out, tiktok is not available in china. that is an entity that is for purposes of a license of a business in china that is not affiliated with tiktok. >> for the record, you're refusing to answer the question. >> i believe i answered your question. >> yes or no, tell me what it is, yes or no. >> senator, i answered the question. >> you're not willing to say yes or no. >> it was not a yes-or-no question. i want to be precise. i want to be -- >> is this company another affiliate as defined in your privacy policy? that is binary; there's not a maybe. it's yes or no. >> senator, the way i answered, that is the answer to the question.
6:24 pm
>> so you're refusing to answer the question. that does not give this committee any confidence that tiktok is doing anything other than participating in chinese propaganda and espionage on american soil. >> senator, that's not accurate, and again, i would point you to -- >> if it were not accurate, you would answer the questions, and you have dodged the questions more than any witness i have seen in my nine years serving in the senate. that is saying something, because witnesses often try to dodge questions, but you answer non sequiturs. in my experience, when a witness does that, it's because they're hiding something. >> senator moran. >> mr. chairman, thank you very much. let me turn to ms. stout and ask a question about data privacy. so senator blumenthal and many others have been working on a
6:25 pm
consumer data privacy bill now for the last several years. i have introduced a bill that includes an appropriately scaled right for consumers to correct and erase data that is collected or processed by covered entities, including social media companies. ms. stout, i understand that snap currently allows users the right to correct or delete their user data. would you please explain snap's decision to provide this service? >> thanks for the question. we applaud the committee and your leadership on this issue and fully support a federal comprehensive privacy bill, and look forward to continuing to work with you and your staff on that. to address your question, senator, snap has been designed with a privacy centric focus from the outset. we don't collect a lot of data. we believe in data minimization and short data retention periods, and, to the point that you just
6:26 pm
made -- as you pointed out, we don't store content forever, which includes giving users the ability to delete their data if they wish, or the ability to download their data. we have a tool within the app where users are able to download their data, which gives them essentially everything they have agreed to share or post or put on their snapchat account. >> other platforms may not take the same position that snap has. tell me what it is you give up. what is it you give up? >> we make tradeoffs every day that sometimes disadvantage our bottom line. there are no rules or regulations that require snap to have short retention periods. that's why federal privacy legislation is so critical, or
6:27 pm
why we choose to voluntarily have a data minimization practice. oftentimes that means that advertisers find other platforms perhaps more enticing, because those platforms have everything that's ever been searched or shared, or location data that's ever been provided, and that is not the case on snapchat. so that's a tradeoff, senator, that we make because we believe in being more private and more safe. >> because if you did it otherwise, what would you gain by doing so? >> we just believe we have a moral responsibility to limit the data that we collect on people. >> you're answering it fine, but i'm curious to know, are you giving up -- can you generate money? what do you generate by keeping that data and in some other way using it? >> yes, i think we limit ourselves and our ability to optimize those advertisements and make more money, and we're a company that has not turned a profit. we invested every dollar back
6:28 pm
into our company, and we're here for the long game, and our ultimate desire is to make a platform that's safe for our community. >> thank you. and mr. beckerman, what responsibility do platforms have to prevent harmful social media trends from spreading, and how can tiktok improve its algorithm to better comply with tiktok's own terms of service that prohibit, quote, content that promotes or enables criminal activity? >> thank you, senator. we have a responsibility to moderate our platform in line with our community guidelines and do that in a transparent way. >> where does that responsibility come from? >> it comes from doing the right thing. for us, we want to be a trusted platform. we want to be a platform where people have a joyful experience and like coming to the app, and that's what we're seeing, and that starts with our community guidelines. certain content, like you mentioned, illegal activities, misinformation, other categories that you wouldn't allow on the platform -- we work really hard. i think our content moderation teams and safety teams are often the unsung heroes of the
6:29 pm
companies that are working every day, 24/7, to ensure that community guidelines are met and the platforms stay positive and joyful. >> the nature of my second question was leading, because it suggests you're not complying with your own terms of service that prohibit content that promotes or enables criminal activity. and my question was, how can you improve your algorithm to better accomplish that? maybe you want to discount the premise of the question. >> we release regular transparency reports. 90% of removals are done proactively, within 24 hours, before there are any views. we want to strive to get to 100%, and that's something we fight for and work on every single day. >> thank you. thank you, chairman and ranking member, and i think the series of hearings the subcommittee is
6:30 pm
having are hugely important to the future of the country. >> thank you, senator moran. senator blackburn. >> ms. miller, i would like to come to you. you talk about moderation as a combination of machine learning and human review. it seems that youtube has no problem pulling down videos that question abortion, global warming, vaccine mandates, but child abuse videos remain on your site. so i'm interested to know more about the specific inputs that you use in these reviews, who establishes these inputs, and who oversees them to make sure that you get them right? >> senator, thank you for your
6:31 pm
question. so we heavily invest in making sure that all of our users, and particularly kids on the platform, have a safe experience, and the way we do this is with a number of levers. for example, we have content policies, so not putting minors into risky situations in videos on the platform. we have policies -- >> let me jump in and ask you, then, if you're saying you don't want to put children into risky videos. there is a world of self-harm content on your site, and a few searches come up with videos such as, and i'm quoting from searches that we have done, songs to slit your wrists by. vertical slit wrist. how to slit your wrist. and painless ways to commit suicide. now, that last video, painless ways, was age-gated, but do the
6:32 pm
self-harm and suicide videos violate youtube's content guidelines? if you're saying you have these guidelines. >> senator, i would certainly welcome following up with you on that video you may be referencing, because we absolutely prohibit content regarding suicide or self-harm. >> ms. miller, i have to tell you, we have pulled these down in my office. our team has worked on this, because i think it's imperative that we take the steps that are necessary to prevent children and teens from seeing this content. and i just can't imagine that you all are continuing to allow children to figure out how to do this on your site, how to carry out self-harm. so yes, why don't you follow up with me for more detail, and i would like that response in writing, and i also
6:33 pm
talked to a film producer friend this morning about the film trailer for i'm not ashamed, which was based on the story of rachel scott. she was the first victim of the columbine attacks. and the film focused on her faith and how it helped in her life. so why would you remove this film trailer and block its distributor from being on your site? and you did this for eleven months. you did not put the trailer back up until "hollywood reporter" called and said, why have you done this? do you have an answer on that one? >> i'm sorry, but i'm not familiar with that specific removal. >> then let's review this, because we can submit the documentation. i had it sent back over to me. ms. stout, i want to come to you. we had an issue in memphis with
6:34 pm
a 48-year-old man who was on your site. he raped a 16-year-old memphis teen, and he claimed to be a music producer. he lured her into this relationship, and one of the news articles recently called snapchat the app of choice for sexual predators. this is something that is of tremendous concern, and much of it, from what i understand from talking to moms and talking to grandmoms, is they use the snap map location service, and i know you're probably going to say only your friends can follow you, but somehow people are getting around that. and these sexual predators who are following young people are using this map to get to their location. so we had this in memphis with the rape.
6:35 pm
we have another child, in the middle part of the state, whose predator followed her there, and she tried to commit suicide because she knew her family was going to find out. so are you taking steps? do you want to give me a written answer as to the steps you all are taking? how are you going to get a handle on this? this is endangering young women. >> senator, i'm more than happy to give you a detailed written answer, and you have written to us in the past, and i appreciate your leadership in following up on this issue. i want to make crystal clear that the exploitation of minors is deplorable. it's our highest priority to prevent this type of event from happening on the platform. but with respect to the map, yes, indeed, location -- appearing on the map is off by default for everyone, not only for minors but everyone. so in order to appear to someone on the map and share your
6:36 pm
location, you must be bidirectional friends with that person. with respect to grooming, this is an area where we spend a tremendous amount of time and resources to try to prevent it. snapchat makes it intentionally difficult for strangers to find people that they don't know. we do not have open profiles. we do not have browsable pictures. we don't have the ability to understand who people's friends are and where they go to school. i would be more than happy to follow up in writing and provide more details. >> let's do that, so we get more detail. one question for all three of you, and you can answer this in writing if you choose. you've all talked about the research work that you do. do you get parental consent when you are doing research on children, and can you provide us a copy of the parental consent form? we asked facebook for this, and they punted the question repeatedly. and ms. miller, i think you need to provide the committee
6:37 pm
clarity. you said that you all had never spoken out against online privacy. i think the chairman and i may question that a little bit, so my question to you, and you can come back to us with a little bit more depth on this, did you fight it as a part of the internet association as they were fighting privacy on your behalf? and just one thing to wrap up, mr. chairman, going back to mr. beckerman, with the confusion there seems to be around the ownership of tiktok: their parent company is bytedance, and the ccp does have a seat on the board. they have a financial stake in bytedance. their douyin is an affiliated entity, and we know there's a
6:38 pm
relationship with the chinese communist party through all of this. and then i checked, and i know that tiktok user data is stored in the u.s. and singapore, and until quite recently singapore data services were run by alibaba, which is another chinese company. so what we need to have from you, mr. beckerman, is some clarity on the chain of ownership and the transmission and sharing processes that are around u.s. consumer data, especially the data and information of our children, and with that, i will yield back. thank you, mr. chairman. >> thanks, senator blackburn. we're going to go to senator sullivan, we're going to finish the first round, senator sullivan, senator lummis. i'm going to go vote. i should be back by the time that senator lummis finishes, and in the meantime, senator blackburn will preside.
6:39 pm
thank you. >> thank you, mr. chairman, and mr. beckerman, i know there have been questions -- i certainly have concerns about sharing data with the chinese communist party, given the ownership or at least board influence. senator blackburn was just talking about that. senator cruz raised the issue. it's what i refer to as kowtow capitalism. kowtow capitalism. and i think that you guys are exhibit a of kowtow capitalism. what is kowtow capitalism? american executives of american companies censoring americans' first amendment rights in america so as not to offend the chinese communist party or gain access to the chinese market. we see it on wall street. we see it in the movie studios.
6:40 pm
we see it with the nba. so let me ask a couple of questions related to that. a tiktok user could put up a video that criticizes the chairman of the committee, the ranking member, any senator, president biden, former president trump, couldn't he? i mean, not like some horrible violent suggestion, but just a criticism of an elected official? >> yes, senator. >> is that common? >> actually, tiktok really isn't the place for politics. we don't allow political ads, and so political content is not what people come for. it wouldn't be a violation of our community guidelines, as long as it's not mis- or disinformation or something hateful. >> good, that's free speech. i would hope you would answer that way. could a tiktok user put up a video criticizing xi jinping? i
6:41 pm
know he's sensitive about being compared to winnie the pooh. could a tiktok user put up videos that kind of make fun of him, maybe with references to winnie the pooh? i don't know why he doesn't like winnie the pooh, but for some reason you can't put winnie the pooh up anywhere in china. can a tiktok user do that? >> yes, senator. >> really? >> our community guidelines are done for the united states market by our team in california, and our moderators are here, and that wouldn't be a violation of our community guidelines. >> so what about in 2019, when you admitted to censoring videos mentioning tibetan independence. can a tiktok user mention tibetan independence? >> yes, senator. >> what happened in 2019, when you guys admitted to censoring a video related to that? >> i'm not familiar with that
6:42 pm
incident, but that would not be a violation of community guidelines. it would be allowed on our platform. >> what about -- i think there was a tiktok public policy director who in 2020 admitted that tiktok had previously censored content that was critical of the ccp with regard to forced labor of uyghur muslims. is that true? >> that's incorrect. that would not be a violation of community guidelines. that content would be permitted. >> okay. so you're saying that your videos have not been censored by the chinese communist party on any matter? i'm just reading from this. maybe these are all wrong. >> i can assure you that our content moderation teams are led by americans. our moderation guidelines are public and transparent. and content that's critical of any government, frankly, as long as it meets our community
6:43 pm
guidelines, not mis- or disinformation, and i would encourage you to search for a number of the examples today. >> the forced labor issues were not censored? i have wrong information on that? >> not currently. >> were they previously censored by anyone at tiktok? i'm reading here, 2019, 2020, you admitted -- somebody at tiktok admitted doing that, so that didn't happen? >> i'm not aware of that. i can say that's not a violation of guidelines now, and that's not how we moderate content. >> listen, madame chair, or mr. chairman, and the ranking member, i think this issue of kowtow capitalism, where american companies are censoring americans, is an issue this committee should be looking at, because the chinese communist party of course can crush freedom of speech in their own country, but shouldn't be able to crush it in
6:44 pm
this country. and mr. beckerman, i'm glad that you're denying any of this. i look forward to seeing videos somewhere, somehow, on tiktok that are critical of the chinese communist party. i'm not holding my breath. maybe it's true. maybe bytedance and ccp board members are fine with videos criticizing president xi jinping and other members of the communist party. you're saying that is totally fine and completely acceptable policy for tiktok users? >> yes, senator, and just to be perfectly clear, there's not involvement from the chinese communist party in moderation of tiktok, and it's all done by americans from within the united states. >> great. thank you. >> thank you. >> thank you, senator sullivan, and i can -- i will let you -- we are going to continue looking at these issues: the chinese communist party's influence into u.s. companies and into technology and the
6:45 pm
silencing and censoring of free speech of u.s. citizens online. >> it's a really important issue. i appreciate that. >> senator lummis, you're recognized. >> thank you. i'm going to start directly with questions, and if i have any time left, i would like to read a statement into the record. but i'll start with a question for ms. miller. youtube has implemented several features, such as autoplay, that have proven to make the platform difficult to stop using. what mechanisms has youtube employed to ensure that children specifically have tools to counteract these design decisions, and do you believe those controls are sufficient? >> thank you for your question. autoplay is default off on youtube kids as well as in supervised experiences.
6:46 pm
we do allow the default to be changed to allow for autoplay -- for example, if a family is in a car and the parents have decided they want autoplay to continue -- but we have set it to default off. >> and do you believe that's sufficient? >> i think it's one of a number of tools that are important to make sure that kids have a healthy and safe experience on the platform. another is that we do not deliver targeted advertising on youtube kids. another is that we age-gate content to make sure that minors do not see age-inappropriate material. so it's only with a number of tools and protocols in place that we think we are meeting the bar we've set for ourselves, that parents expect of us, and that experts in the field, such as
6:47 pm
child development, advise us on, to make sure, again, kids are having a safe experience on the platform. >> thank you. mr. beckerman, after my staff reviewed your privacy policy, i want to list some of the items that tiktok will automatically collect from one of its users: that person's location, the device model of their phone, their browsing history outside and inside of tiktok, the content of all messages sent on tiktok, their ip address, their biometric identifying information, and information from their phones, such as keystroke patterns and other apps. do you believe this sort of mass data collection is necessary to deliver a high-quality experience to your users? >> senator, i thank you for that question. some of those items that you listed off are things that we're
6:48 pm
not currently collecting, and we stated in the privacy policy that if we were to, we would notify users and get their consent. >> which of those that i named have that condition? >> as it relates to biometrics -- i didn't write down every single thing, but i would be happy to go through it with you and your team on that. >> perfect. we will follow up with you. >> my question is, regardless of which ones require consent of those i mentioned, why should any member of this committee feel comfortable with the vast amounts of data your company collects on our children, especially since tiktok has a relationship to the chinese communist party? >> senator, first off, as it relates to data, tiktok collects less in many categories than many of our peers, and some of these things, as you mentioned, keystroke -- that's not collecting what people are
6:49 pm
typing. that's an antifraud, antispam measure that measures the cadence of typing. a bot, for example, would behave differently than a human. it's not collecting what people are typing; it's an antifraud measure. >> which of your competitors or companies that you're aware of collect more information? >> i probably would point to facebook and instagram, for example. >> well, i'll ask the same questions of them. thank you. this is for all of our witnesses. are your platforms specifically designed to keep users engaged as long as possible? do you want to start, mr. beckerman? >> senator, we want to make sure that people are having an entertaining experience, you know, like tv or movies. you know, tiktok is meant to be entertaining. we do think we have a responsibility, along with parents, to make sure that it's being used in a responsible way. we have take-a-break videos, time management tools, and
6:50 pm
family pairing is another tool where parents can help limit the time that their teenagers are spending on the app. >> is the length of engagement a metric that your company uses in order to define success? >> there's multiple definitions of success, senator. it's not just based on how much time somebody is spending. >> is that one of them? is length of engagement one of the metrics? >> i think overall engagement is more important than the amount of time being spent. >> but is it one of the metrics? >> it's a metric that i think many platforms check, on how much time people are spending on the app. >> thank you. ms. stout, same question, are your platforms designed to keep users engaged as long as possible? >> so senator, when you open up snapchat, you don't open up snapchat to a feed of other people's content designed to keep you consuming more and more content. you open to a blank camera, which is a blank canvas. it's a place where users come to talk to their friends in videos
6:51 pm
and pictures. >> is it a metric your company incorporates into the definition of success? >> i believe we see success if the platform is facilitating real live conversations and connections with friends. snapchat is a place where friends come to talk to each other. >> is it a metric? do you measure success in any way, shape or form by how long people stay on your site? is that one of multiple driving measures of success? >> i think the way i can answer that question is, it is one of many metrics. >> okay. thanks. ms. miller, same question. are your platforms designed to keep users engaged as long as possible? >> senator, our platforms are designed to allow users to search and discover all types of content. it is intended for them to have an enjoyable experience.
6:52 pm
>> i'm asking, is this one of the metrics by which you define success? >> we have a number of digital well-being tools designed directly -- >> is this one of the metrics? is it one of them? there could be numerous metrics, but is this one of them? >> yeah, to the specific question that you're asking, we do look at, for example, if a video was watched through its entirety. that helps us determine whether or not that was a quality video relative to the search that the user had. we look at those data points to inform us as it relates to the experience the user has had on the platform. >> thank you. madame chairman, do i have time to enter an opening statement? >> thank you, madame chairman. this generation of children will grow up under a level of surveillance well beyond any previous one, and although the recent "wall street journal" reports focused on the problematic harms of facebook, we know that the problem is endemic
6:53 pm
among our youth and bigger than facebook alone. children are impressionable. they are easily manipulated by advertising targeted to them, and they are readily influenced by the highly sophisticated algorithms that often serve age-inappropriate content to youth users. these invisible algorithms continuously nudge our children in different directions, which can impact their development without their knowledge and without their parents' knowledge. these algorithms on these platforms were designed with adults in mind, not children. only a tiny fraction of children understand the harm that can come from sharing sensitive information, pictures or opinions that become part of their permanent digital record, but what is most alarming is that none of them can fully understand how the content fed to them by algorithms will shape their world view during these
6:54 pm
formative years. so more must be done to promote responsible social media use. we must educate parents on how to teach their children to avoid the pitfalls of using these platforms, and more importantly, we must hold these platforms accountable for the effects that their design decisions have on our children. mr. chairman, thank you for the opportunity to add that opening statement to the record, and thank you for your indulgence. i yield back. >> thank you. and thank you for your leadership on these issues and your very insightful questions. we do have to protect kids in our country. you've just put your finger on it. so let me ask this, just following up on senator lummis, and senator blumenthal, everyone on the panel here today. these kids are constantly
6:55 pm
posting content, and data is being tracked, stored and monetized, but we know that young users lack the cognitive ability to grasp that their posts are going to live online forever. to each of our witnesses: do you agree that congress should give children and teens, but more importantly their parents, the ability, the right, to erase their online data? mr. beckerman. >> yes, senator. >> ms. stout. >> yes, we do, senator, but i would say that content on snapchat does not appear permanently. >> again, i appreciate that. and to you, ms. miller. >> yes, senator, and users have the ability to delete their information as well as having auto-delete tools. >> so they should have the right to delete it. do you agree with that, ms. miller?
6:56 pm
>> yes, senator. >> okay. great. today apps collect troves of information about kids that have nothing to do with the app's service. for example, one gaming app that allows children to race cartoon cars with animal drivers has reportedly amassed huge amounts of kids' data unrelated to the app's game, including location and browsing history. why do apps gobble up as much information as they can about kids? well, it's to make money. congress, in my opinion, has to step in and prevent this harmful collection of data. ms. miller, do you agree that platforms should stop data collection that has nothing to do with fulfilling the app's service? >> senator, we do limit the data that we collect, and this is particularly true, for example, on the youtube kids app. we limit the data collection to only rely on what is necessary
6:57 pm
to make sure that the platform runs. >> so do you agree that that should become a law, that all platforms have to do the same thing? >> senator, i don't want to speak to whether or not it should become a law and/or the details of any proposed legislation, but at youtube we have not waited for a law to make sure that we have these protections. >> i appreciate that. i appreciate that. it's just time to make up your mind, yes or no, on legislation. we need to move. mr. beckerman. >> yes, senator, we do need legislation. i think we're overdue on strong national privacy laws. >> great. ms. stout. >> yes, senator, we absolutely collect less data, and it sounds as though collection of data that is irrelevant to the performance of the app does not appear to be within scope. >> okay.
6:58 pm
today, popular influencers peddle products online while they flaunt their lavish lifestyles to young users. influencer marketing videos of online child celebrities opening new toys get millions of views, but they're inherently manipulative to young kids, who often cannot tell that these are really paid advertisements that their heroes are pushing, that the hero is getting a monetary kickback from. my bill with senator blumenthal, the kids act, would ban this type of promotion, of influencer marketing, to kids. to each of the witnesses: do you agree that congress should pass legislation to stop apps from pushing influencer marketing to children in our country? yes or no, ms. miller. >> senator, again, we've actually moved in this direction, whereby we have a set of quality principles regarding the type of
6:59 pm
content that's made available on the youtube kids app, and in so doing, we make sure that we're not providing content that, for example, would have a significant prevalence of that type of material. and we also limit the types of ads that can be delivered on the app. i apologize, but i don't know the details -- i'm sorry? >> should we make it illegal, so that people out there who might be trying to influence children know that there's an enforceable penalty that -- >> i absolutely think it's worth a discussion. i would need to stare at the details of such a bill. >> again, it's been around for a long time. mr. beckerman. >> yes, senator, we already limit the kinds of advertisements that can be served to teens, but we do agree there should be additional transparency and additional privacy laws passed. >> ms. stout. >> senator, i would agree that i think for young people there
7:00 pm
should be additional protections placed, so yes, we would be happy to look at that. >> and by the way, we ban it on television, because we know that we can't have the hero just holding the product on saturday morning and say, hey, kids, tell your parents to buy this. we ban it there. we have to ban it online, so thank you. and finally, push alerts. studies show that 70% of teenagers report checking social media multiple times a day, which can contribute to depression, anxiety and feelings of isolation. the last thing apps should be doing is using methods like push notifications, automated messages that nudge users to open an app, to make young users spend even more time online. to each of the witnesses: do you agree that congress should pass, again, the law that senator blumenthal and i are trying to move, which would ban push alerts for children? >> ms. miller? >> i agree that additional
7:01 pm
protections should be in place regarding push alerts. >> we already limit push notifications. >> should we ban push notifications? >> i think that would be appropriate, but we've already done that proactively. >> ms. stout? >> yes, senator, snapchat does not utilize push notifications or nudges, as the uk age-appropriate design code pointed out. >> thank you. thank you, mr. chairman. >> thanks, senator markey. i understand that senator klobuchar had a few more questions, and while we are waiting for her, i have a few as well. so, i appreciate your patience. let me begin by acknowledging
7:02 pm
-- i think you would acknowledge as well -- that the reason you made many of the changes that you have is the uk's child safety law, the age-appropriate design code. i think we need an american version of the british child safety law. and i want to ask about some of its provisions. will you commit to supporting a child safety law that obligates companies to act in the best interest of children? it establishes, in effect, a duty of care that could be legally enforceable. ms. stout? >> senator, we were very
7:03 pm
privileged to be able to work with the information commissioner's office in the uk in their design of the age-appropriate design code. we of course comply with the code as it's come into force. as i mentioned to senator markey, we are looking at that code to see how we can apply it outside the uk market and to other markets. with respect to a child safety law that obligates companies to think about the safety of children, that's something that snap has done without regulation. but we would be happy to work with the committee. >> but the point is -- and we would love to be in a world where we could rely on voluntary action by platforms like yourselves. but, in effect, you've sacrificed that claim to voluntary action, or reliance on voluntary action. whether it is facebook or your
7:04 pm
companies, in various ways, i think you have shown that we cannot trust big tech to police itself. and so when you say, we already do it -- well, you may decide to do it, but there is no legal obligation that you do it, and there's no way to hold you accountable under current law. that's what we need to do. that's why i'm asking you about a law. i am hoping that your answer is a yes. that your answer would support it, as a duty of care, as a matter of law. i hope that's a yes. >> yes, senator, and that was very much a part of my testimony and opening statement. because of the time it takes for regulation to be implemented, we don't believe we should have to wait for that regulation. we are going to take the -- >> but will you support it? >> yes, senator.
7:05 pm
>> senator, we've already voluntarily implemented much of the age-appropriate design code here in the united states. but i do agree that companies can do more. i was struck by your comments in your opening statement about a race to the top. and that is very much the approach that we are trying to take at tiktok, to do more and go above and beyond, to a place where it is seen that we are putting the wellness of teens and the safety of teens in particular ahead of other platforms. >> let me see if i can answer the question as i would if i were in your seat -- yes, we strongly and enthusiastically support that kind of child safety law. we are already doing more than we would need to do under the law. >> yes, senator, but additionally, i do think, as it relates to age verification, measures like that and an update to coppa are long overdue --
7:06 pm
>> ms. miller? >> senator, i actually respectfully disagree that we only wait until we have legal obligations for these systems and protocols to be in place. we rolled out youtube kids in 2015 to make sure that, as kids were trying to be on the main platform, we created a safe place for them. we rolled out a number of -- >> i think you misinterpret me. i'm not suggesting -- >> oh, sorry. >> i'm suggesting that you do it now, on the contrary. i think that's perfectly well understood, my question: would you support that law? yes or no? >> i would support looking at any details as it relates to additional legal protections
7:07 pm
for kids in the u.s. as you may know, that age-appropriate code went into effect in the uk. so it's still early days. but we had already rolled out a number of protections, and we had rolled them out globally. >> is that a yes or no? >> yes, i would be happy to work with you and your staff -- >> would you support a version of the uk child safety law? >> i would need to study the details of any specific bill. but i certainly support expansions of child safety protections. >> i'm going to yield to senator klobuchar and she is ready to ask her question. >> thank you, thank you very much for what's going on here today, thank you for taking it remotely for the second round. so one of the things that i've tried to do at all of these hearings, including in the
7:08 pm
judiciary, and the subcommittee that i chair, is to take the veil off this idea that this is just the web, everyone has fun, that's what it is. some of that is true. but these are huge profit-making ventures. and when you look at it that way, as the most successful and biggest companies that the world has ever known, the big tech platforms in terms of money -- you have to ask, why haven't we done anything about privacy law? why haven't we done anything about senator markey's law? why haven't we put in place some rules about transparency and algorithms or, mostly, from my perspective, done anything about competition policy and how to get alternatives? so let me start with this question,
7:09 pm
which i have asked many of the larger platforms. ms. stout, snap reported that its advertising revenue per user in north america for the second quarter of 2021 was $7.37. how much of snap's revenue came from users under the age of 18? >> senator, i don't have that information for you but i'd be happy to -- >> okay, very good. i've been trying to get that from facebook, just to give you a sense. facebook's revenue, from their own user documents, is $51 per user, for the u.s., per quarter, just to put it in some perspective. mr. beckerman, tiktok is a privately held company, so we don't have public documents on your advertising revenue per user.
7:10 pm
what is your best estimate of advertising revenue per u.s. user for the last quarter? >> i don't have those numbers but i'd be happy to check with your team. >> do you think you can provide us that? >> again, we are not a public company but i will see what we can find for you. >> okay, and again, i'm trying to figure out the percentage from users under the age of 18, for us to get some perspective on how much of the business, and the future growth of the business, is in kids. ms. miller, youtube reported that its advertising revenue overall in the second quarter of 2021 was seven billion dollars. how much of youtube's revenue came from users under the age of 18? >> senator, i don't know the answer to that, and i'm not sure if we look at data internally that way, so i would also be happy to follow up with you. but i would like to note that
7:11 pm
as a company, we've long shared our revenue with our creators. and so over the last three years alone, we have paid out more than 30 billion dollars to creators. >> okay. my work with the antitrust subcommittee has some related paths. and recently, we introduced the bipartisan american innovation and choice online act, with senator grassley, and there are several people, including senator blumenthal, who are co-sponsors. it's focused on a gnarly problem, which is that you have platforms that are preferencing their own stuff at the top. they are taking, in some cases, data that they uniquely have on other products, and then making knockoff products and underpricing the competitors.
7:12 pm
roku says that youtube has made unfair demands in negotiations, including demanding roku give preference to youtube over other content providers -- exactly what this legislation aims at -- and give youtube access to non-public data for roku users. did you demand preferential search results in negotiations with roku? >> senator, i am not involved in negotiations with roku. i know we have been having discussions with them for several months. and we are trying to come to a resolution that is good for users and companies. but i'm not involved in the negotiations. >> okay, i will put this on the record for others in the company. because i'd like to know more generally if youtube has ever demanded non-public data or preferencing in search
7:13 pm
results in discussions with other providers. it gives you a sense of the dominance of the platform, by far, in the area of search, to be able to have that power over people who are simply trying to be on that platform. so i think that that really gets at the core of what we are trying to do. i also just had a follow-up on youtube banning all vaccine misinformation, which i commended you for at the time. how much content have you removed related to this policy change since you banned all anti-vaccine misinformation? and have you seen a change in the viewership rate? >> senator, i feel bad -- this is a question i would absolutely love to be able to answer with detail. but i know that we have removed so much content -- >> okay, well we will put it in writing and get the answer. >> thank you. my last question, ms. stout,
7:14 pm
mr. beckerman, i just mentioned this bill that senator grassley and i introduced, ensuring that dominant digital platforms don't use their power to thwart competition. do you support some of the competition reforms in the bill? have you faced any challenges when it comes to competing with the largest digital platforms? that will be my last question. >> senator, we are aware of the bill. as a non-dominant platform, we very much appreciate your legislation and the work you are doing in this space. as a smaller platform, it is an incredibly competitive arena for us. we compete every day with companies that collect more information on users and store that information to monetize it. so any efforts that this body has undertaken, and especially the legislation that you have undertaken, to create a level
7:15 pm
playing field so that it is a competitive atmosphere for platforms like snapchat -- that is welcome. >> okay, thank you. mr. beckerman? >> likewise, we appreciate your efforts and the work being done to promote and spur competition. it's something that i think we all benefit from, and on the specific challenges that we face, i will be happy to meet with you. >> thank you very much, thank you everybody. >> senator, i found the answer, if you don't mind -- >> okay. >> on covid misinformation, we've removed over 1 million videos since we started enforcing our covid misinformation policies, as it relates to covid vaccine misinformation. so it's an area with a lot of resources behind it, to make sure our platform isn't allowing this type of content. >> okay, thank you, thanks everybody. thanks senator blumenthal and blackburn.
7:16 pm
>> thank you senator klobuchar. i seem to be the last person standing, or sitting, between you and the end. but i do have some questions. ms. stout and mr. beckerman, over 10 million teens use your apps, and these are extremely impressionable young people with highly impressionable minds. and you make probably hundreds of millions of dollars from them. you have heard the term snapchat dysmorphia, describing the depression and other ailments associated with your app if you use the filters that are offered, which create destructive, harmful expectations, as the filter does. have you studied
7:17 pm
the impact of these filters on teens' mental health? i assume you studied the impact of these filters before you put them in front of kids. ms. stout? >> yes, senator, the technology you are referring to is what we call lenses. these lenses are augmented reality filters that we allow users who choose to use them to apply over the top of selfies. and for those who are familiar, they are the, you know, opportunity to put on a dog face or -- >> i've seen them. but they also change one's appearance, potentially to make one thinner, different colors, different skin tones. >> so these filters, senator, are created both by snapchat
7:18 pm
and our creator community. there are over 5 million of these filters, and a very small percentage of those filters are what you would call beautification filters. most of them are silly, fun, entertaining filters that people use to lower the barrier of conversation. when you are using snapchat, you are not posting anything permanently for likes or comments. you are using the filters in a private way to exchange a text or video message with a friend. so it really kind of creates this fun, authentic ability to communicate with your friends in a fun way. and those filters are one of the ways in which friends love to communicate with each other. >> do you study the impact on kids before you offer them? have you studied them? >> we do a considerable amount of research on our products. and there are competitive pieces of research that we've reviewed,
7:19 pm
and they show that the lenses or filters on snapchat are intended to be fun and silly. it's not about -- >> well, we all know as parents, something that's intended to be fun and silly can easily become something that is dangerous and depressing. and that is the simple fact -- some of the filters, maybe not all of them, but some of them, can do this. and i'd like you to provide the research done by you and others supporting what you have said. you don't do the research before you provide it? >> no, that's not what i'm saying. with respect to this question, i'm not able to answer as to the kind of research we've done, simply because i'm not aware. but i will go back and look and try to get you an answer to your question. >> for eight years, snapchat
7:20 pm
had a speed filter. it allowed users to add their speed to videos, as you know. the result was that it encouraged teens to race their cars at reckless speeds. there were several fatal and catastrophic crashes associated with teens using the speed filter, and warnings from safety advocates, and multiple deaths, until snapchat finally removed that dangerous speed filter. it was silly, it was maybe fun for some people. it was catastrophic for others. and i want to raise what happened to carson bride. his mother, kristin, told me about how carson was relentlessly bullied through anonymous attacks on snapchat. and after desperate pleas for help, he took his own life. how
7:21 pm
can parents protect kids from the relentless bullying that follows kids home from school? as ms. haugen said so movingly, it no longer stops at the schoolhouse door. it's 24/7, it comes into their homes, before they go to sleep. what are you doing to stop bullying on snapchat? >> this is an incredibly moving issue for me as a parent as well. bullying is unfortunately something we are seeing more and more happening to our kids, and this is not just in the online community. they face it at school and at home. we have zero tolerance on snapchat for bullying. as a platform that reaches so many young people, we see this
7:22 pm
as a responsibility to get in front of this issue and do everything we can to stop it. because snapchat is designed differently, you don't have the multiple abuse vectors for bullying in a public sort of way on snapchat. you don't have public, permanent posts where people can like or comment, thereby introducing additional opportunities for public bullying or shaming. that's not to say that bullying doesn't happen, both online and offline. so we have reporting tools where users can anonymously and quickly report bullying or any other harmful activity. our trust and safety teams work around the clock, and they actually move to remove this content in an average of less than two hours; usually it is far more quickly than that. but i want to assure you, senator -- thank you for raising the bullying issue -- in addition to combatting this, we do a lot to prevent it. and we have more opportunities to raise awareness on the effects of
7:23 pm
bullying and the effects that people's words have on other people. and we will continue to raise awareness. >> we've heard from tiktok that it's a safe environment. at the same time, we see challenges. the blackout challenge, to take one example, where, in effect, teens and children have actually died emulating and recording themselves following blackout and choking challenges, including a nine-year-old in tennessee who saw one on tiktok. despite apparent efforts to discourage certain trends, this content is still being posted and viewed widely by kids online and followed and emulated, whether it's destruction of
7:24 pm
school property or other kinds of challenges. a mother who lost her child in the choking challenge shared questions with me, and they deserve answers from you and others who are here today. i'm just going to ask her question. how can parents be confident that tiktok, snapchat and youtube will not continue to host and push dangerous and deadly challenges to our kids? >> thank you, senator. as it relates to this specific challenge, and particularly things that can be dangerous and deadly for teens, it's sad and tragic. i remember myself, when i was in grade school, i had a classmate that we lost from something very similar. i know it's something that touches many of our lives and it is
7:25 pm
awful. as it relates to tiktok, this is not content we've been able to find on our platform; it's not content we would allow on our platform. independent fact-checkers have looked. but it's important that we all remain vigilant and ensure that things that are dangerous or even deadly, particularly for teenagers, don't find their way onto platforms. it is important we all have conversations with our teenagers, to make sure that they stay safe online. it's a responsibility that we take. i think it's important that we distinguish between content on platforms encouraging things, versus cases where the content doesn't actually exist on the platform. >> but this content exists on your platform. >> we haven't been able to find evidence of the blackout challenge on tiktok at all. it would violate our guidelines. it's something that we search for with ai and monitor.
7:26 pm
we found no evidence of it. it's something that we have conversations with parents and others about all the time, around things that could be dangerous. but it's important that we have conversations about this. >> other challenges -- you are saying none of them? >> anything that is illegal or dangerous or violates our guidelines, our team is very aggressive, acting very quickly when things pop up on the platform. as it relates to our transparency reports, over 90% is removed automatically. dangerous challenges have no place on tiktok. >> i understand that you react by taking them down, but they existed for the time they were there. i guess my question is, what can you do to prevent those challenges from being there in the first place? >> we are often able to be proactive in blocking things from coming on, when they are
7:27 pm
found. we block searches, we remove content. but unfortunately, something we've seen recently are press reports about alleged challenges that, when fact checkers and others look into it, they find never existed on tiktok in the first place, and in fact were hoaxes that originated on other platforms. so i think it's important that all of us, parents and teachers alike, look at the facts and look to see what actually exists rather than just spreading rumors about alleged challenges. >> well, i have to tell you, mr. beckerman, we found pass-out videos. we found them. so, i have a lot of trouble crediting your response on that score. let me ask you about another
7:28 pm
instance. someone in connecticut wrote to me about how their 13-year-old daughter was inundated on tiktok with videos about suicide, and eating disorders, and self-injury. i've heard similar stories across the country. we did our own research. my staff made a tiktok account, as we were describing earlier, and spent hours scrolling through its endless feed of videos -- at first, videos of dance trends. within a week, tiktok started promoting videos with suicidal ideation. we didn't seek out this content. we can't show these videos in this room because they were so disturbing and explicit. and finally, another tiktok account we created as a 13-year-old, that's a 13-year-old, was flooded with nothing but sexually obscene videos.
7:29 pm
how do you explain to parents why tiktok is inundating their kids with these kinds of videos of suicide, self-injury and eating disorders? this is stuff occurring in the real world. >> senator, i can't speak to what the examples were from your staff, but i can assure you that is not the normal experience that teens or people who use tiktok get. those kinds of content violate our community guidelines, but i'd be happy to go and sit down with you and your staff and go through the examples. -- >> will you bar features that lead to addictive use of apps? -- >> we've already done a number of things proactively, in the form of take-a-break videos, in terms of time management features, and in terms of not having direct messages for
7:30 pm
users under 16, and our family and parent tools. it's important that we look at these issues and foster conversations with our teenagers. we make it easy for parents -- i know it can be daunting for parents with teenagers to have all these different tools and features to protect them. but we have built this in a way where parents can have additional control over the safety and privacy and time limits for their teens. >> i have to go vote and i don't have anyone here to preside for me, so i'm just taking a five-minute recess. i will have some final questions if you'd be willing to wait. and then we can close the hearing. thank you.
7:31 pm
