Snapchat, TikTok, YouTube Executives Testify on Kids Online Safety - CSPAN, December 3, 2021, 4:34am-7:44am EST
sen. blumenthal: welcome to this hearing on protecting kids on social media. i want to thank the ranking member, senator blackburn, who has been a very, very close partner in this work, as well as chairman cantwell and ranking member wicker for their support. we are joined by senator thune and senator amy klobuchar and senator ed markey. they have all been extraordinary leaders in this effort. and today, i should note, is the first time that tiktok and snap
have appeared before congress. i appreciate you and youtube for your testimony this morning. it means a lot. our hearing with facebook whistleblower frances haugen was a searing indictment, along with her documents, of a powerful, gigantic corporation that puts profits ahead of people, especially our children. there has been a definite and deafening drumbeat of continuing disclosures about facebook. they have deepened america's concerns and outrage, and have led to increasing calls for accountability. and there will be accountability. this time is different. accountability to parents and the public, accountability to congress.
accountability to investors and shareholders, and accountability to the securities and exchange commission and other federal agencies. because there is ample credible evidence as we start an investigation here. but today, we are concerned about continuing to educate the american public and ourselves about how we can face this crisis. what we learned from her disclosures and reporting since then is repugnant and abhorrent: instagram's algorithms creating a perfect storm, in the words of one of facebook's own researchers. as that person said, it exacerbates downward spirals, harmful to teenagers, fueling hate and violence, prioritizing profits over the people that it
hurts. in fact, the algorithms push emotional and provocative content, toxic content that amplifies depression, anger, hate, and anxiety, because those emotions attract and hook kids and others to their platforms. in effect, more content, and more extreme versions of it, are pushed to children who express an interest in online bullying, eating disorders, self-harm, even suicide. and that is why we now have a drumbeat of demands for accountability, along with the drumbeat of disclosure. we are hearing the same stories and reports of the same harms about the tech platforms represented here today.
i have heard from countless parents and medical professionals in connecticut and elsewhere around the country about the same phenomenon on snapchat, youtube, and tiktok. in effect, the business model is the same: more eyeballs means more dollars. everything that you do is to add users, especially kids, and keep them on your apps for longer. i understand, from your testimony, that your defense is: we are not facebook. we are different, and different from each other. being different from facebook is not a defense. that bar is in the gutter. it is not a defense to say you are different. what we want is not a race to the bottom, but a race to the
top. we want to know what steps you are taking to protect children, even if it means forgoing profits. we want to know what research you're doing, like some of the studies and data that have been disclosed about facebook. we want to know whether you will support real reforms, not just the tweaks, the minor changes that you suggested and recounted in your testimony. the picture that we have seen from an endless stream of videos that is automatically selected by sophisticated algorithms shows that you, too, strive to find something that teenagers will like and drive more of it to them. if you learn that a teenager feels insecure about his or her body, it is a recipe for disaster. you start out on dieting tips, but the algorithm will raise the temperature, flood that individual with more and more extreme messages, and after a while, all of the videos are about eating disorders. it is the same story, kids driven down rabbit holes created by algorithms that lead to dark places and encourage more destructive behavior and violence. we have done some listening. i heard from nora in westport, who allowed her 11-year-old daughter avery on tiktok because she thought it was just girls dancing. avery wanted to exercise more with the shutdown of school and sports, so like most people she
went online. nora told me about the rabbit hole that the tiktok and youtube algorithms pulled her daughter into. she began to see ever more extreme videos about weight loss. she started exercising compulsively and ate only one meal a day. her body weight dropped dangerously low, and she was diagnosed with anorexia. avery is now in treatment, but the financial cost of care is an extreme burden, and her education has suffered. we heard from other parents who had the same experience. so, we not only listened, we checked ourselves. on youtube, my office created an account as a teenager. like avery, we watched a few videos about extreme dieting and eating disorders. they were easy to find. the youtube recommendation algorithm began to promote extreme dieting and eating disorder videos each time we
opened the app. these were often videos about teenagers starving themselves, as you can see from this poster. the red is eating disorder related content; the green is all the other videos. one is before, and the other after, the algorithm kicked in. you can see the difference. we received these recommendations each time we watched other videos. it is mostly eating disorder content. there was no way out of the rabbit hole. another parent in connecticut wrote to me about how their son was fed a constant stream of videos on tiktok related to disordered eating and calorie counting after looking up athletic training. as scientific research has shown, eating disorders and body comparison also significantly affect young men on social
media -- young men often feel compelled to bulk up, to look a certain way. again, i heard about this pressure all too often. we created an account on tiktok. troublingly, it was easy, searching on tiktok, to go from men's fitness to steroid use. it took us only one minute, one minute, to find tiktok accounts openly promoting and selling illegal steroids. we know the dangers of steroids, and they are illegal. all of this research and fact-finding and these disclosures send a message to america's parents: you cannot trust big tech with your kids. parents of america cannot trust these apps with their children.
and big tech cannot say to parents, you must be the gatekeepers. you must be the social media pilots. you must be the app police. because parents should not have to bear that burden alone. we need stronger rules to protect children online. real transparency, real accountability. i want a market where the competition is to protect children, not to exploit them -- not a race to the bottom, but a competition for the top. we have said that this moment is, for big tech, a big tobacco moment. there is a lot of truth to that contention.
because it is a moment of reckoning. the fact is that, like big tobacco, big tech has lured teens with celebrities, fashion and beauty, everything that appeals to a young audience. and like big tobacco, facebook hid from parents and the public the substantial evidence that instagram could have a negative effect on teen health. the products are different. big tech is not irredeemably bad like tobacco. big tobacco's products, when used by the customer the way the manufacturer intended, actually killed the customer. as ms. haugen said, our hope is not to burn facebook to the ground, it is to bring out the
best, to improve and impose accountability. as she said, we can have social media we enjoy that connects us without tearing apart our democracy, putting our children in danger, and sowing ethnic violence around the world. we can do better, and i agree. thank you, and we will turn now to the ranking member. >> thank you, mr. chairman, and thank you to our witnesses for being here today. we appreciate it. mr. chairman, i thank your staff for the good work they have done to facilitate these hearings on holding big tech accountable. we appreciate that. and today's conversation is something that is much needed, and it is long overdue.
for too long, we have allowed platforms to promote and glorify dangerous content to their kids and teenage users. in the weeks leading up to this hearing, i have heard from parents, teachers, and mental health professionals who are wondering the same thing: how long will we allow this to continue? and what will it take for platforms to finally crack down on the viral challenges, illicit drugs, eating disorder content, and child sexual abuse material? we find this on your platforms, and teachers and parents and mental health professionals cannot figure out why you allow this to happen. it seems like every day, i hear stories about kids and teenagers who are suffering after interacting with tiktok, youtube, and snapchat. kids as young as nine years old have died doing viral challenges on tiktok. we have seen teenage girls lured
into inappropriate sexual relationships with predators on snapchat. you are parents. how can you allow this? i have learned about kids and teenagers who committed suicide because of bullying that they suffered on these sites, while the platforms refused to work with law enforcement and families to stop the harassment when asked. if it were your child, what would you do to protect your child? does it matter to you? my staff has viewed abusive content featuring minors, and videos of people slitting their wrists, on youtube. it is there. yet all the while, kids and teenagers are flocking to these sites in increasing numbers, and
the platforms love it. they know that youth are a captive audience, one which will continue consuming content fed to them through the algorithms, even if it puts them in danger. they are curious, they get pulled down the rabbit hole, they continue to watch. and these platforms are getting more and more data about our children. but do we know what they are doing with it? in the case of facebook, we learned they are using it to sell their products to younger and younger children -- those who cannot legally use their services. you all know you have children on these platforms who are too young to be on them, and you allow it to continue. because
now, they want your face prints and voice prints in addition to your geolocation data and your keystroke patterns and rhythms. they also collect audio that comes from devices connected to your smartphone, like smart speakers. they collect face and body attributes from videos, as well as the objects and scenery that appear in those videos. this makes sense, to some degree, to create videos, but given china's propensity to surveil its own citizens, why should we trust that tiktok, through bytedance, isn't doing the same to us? most americans have no idea this is happening. and while they hope, on the face of things, to reassure us by creating u.s.-based offices for their high-priced lobbyists and marketing personnel, it is not enough.
we see through it all. as we will get into today, tiktok's own privacy policies give them an out to share data with bytedance and the chinese communist party. yet most americans have absolutely no idea that the chinese communist party is getting their information. the time has come where we must focus on what congress can do to secure american consumers' personal data. as a mother and a grandmother, i know this is doubly important when it comes to our children. as this hearing will show, we cannot afford to wait on that. thank you to the witnesses. we look forward to your cooperation. thank you, mr. chairman. sen. blumenthal: we have with us this morning ms. jennifer stout, vice president of global public policy at snapchat.
prior to snapchat, she spent most of her career in government, working for then-senator joe biden and at the u.s. department of state. mr. michael beckerman, vice president and head of public policy, americas, at tiktok. he joined tiktok in february of 2020 and leads their government relations. he was previously the founding president and ceo of the internet association. he is joined by ms. leslie miller, remotely, vice president of government affairs and public policy at youtube. she leads the youtube public policy team. previously, she served as acting head of global policy for google. why don't we begin with your testimony, ms. stout. thank you.
ms. stout: thank you, mr. chairman. chairman blumenthal, ranking member blackburn, and members of the subcommittee, thank you for the opportunity to be here today. my name is jennifer stout, and i am the vice president of global public policy at snap. i have been in this role for nearly 5 years, after spending almost two decades in public service, more than half in congress. i have tremendous respect for this institution and the work you are doing to ensure young people are having safe and healthy online experiences. to understand snap's approach to protecting young people on our platform, it is helpful to start at the beginning. snapchat's founders were part of the first generation to grow up with social media. like many of their peers, they saw that social media was capable of making a positive impact, but it also had certain features that troubled them. these platforms encouraged people to broadcast their thoughts permanently. young people were constantly measuring themselves by likes and comments, trying to present
a perfect version of themselves because of social pressures and judgment. social media also evolved to feature an endless feed of unvetted content, exposing individuals to a flood of viral, misleading, and harmful information. snapchat is different. snapchat was built as an antidote to social media. from the start, there were three key ways we prioritized privacy and safety. first, we decided to have snapchat open to a camera, instead of a feed of content. this created a blank canvas for friends to visually communicate with each other, in a way that was more immersive than text. second, we embraced strong privacy principles and the idea of ephemerality, making images delete by default. in real life, friends do not break out their tape recorder to
document every conversation. third, we focused on conversations and connecting people who were already friends in real life, by requiring both to opt in to being friends in order to communicate. because in real life, friendships are mutual. we have worked hard to keep evolving responsibly. understanding the potential negative effects of social media, we made proactive choices to ensure we reflected those early values. where users have the potential to reach a large audience -- discover, our closed content platform, which features content from professional, verified users, and spotlight, where users can submit creative and entertaining videos to share -- both are structured in a way that does not allow
unvetted content to reach a large audience. our design protects our audience and makes us different. when it comes to young people, we have made intentional choices to apply additional protections to keep them safe. we have adopted responsible design principles and rigorous processes that consider the privacy and safety of new features from the beginning. we take into account the unique sensitivities of young people. we intentionally make it harder for strangers to find minors, by not allowing public profiles for users under 18. we have long deployed age-gating tools to prevent minors from viewing age-regulated content in ads. we make no effort and have no plans to market to young children. individuals under the age of 13 are not permitted to create snapchat accounts, and if we find them, we remove them. additionally, we are developing tools that will give parents more oversight of how their teenagers are using snapchat. over 500 million people around
the world use snapchat, and 95% of our community say snapchat makes them feel happy because it connects them to their friends. we have a moral responsibility to take into account the best interests of our users in everything we do. and we understand there is more work to be done. as we look to the future, we believe regulation is necessary, but given the different speeds at which technology develops and the rate at which regulation can be implemented, regulation alone cannot get the job done. technology companies must take responsibility to protect the communities they serve. if they don't, government must act to hold them accountable. we fully support the subcommittee's approach to investigate these issues, and we welcome a collaborative approach
to problem-solving that keeps our young people safe online. thank you for the opportunity to appear here today. i look forward to answering your questions. >> thank you. >> chairman blumenthal, ranking member blackburn, and members of the subcommittee, my name is michael beckerman, and i am the vice president of public policy for the americas at tiktok. i am also the father of two young daughters, and i am passionate about making sure our children stay safe online. i joined tiktok after nearly a decade representing the internet industry at large because i saw an opportunity to help tiktok grow from a young startup to a trusted entertainment platform. tiktok is not a social network based on followers or social graph. it is not an app that people check to see what their friends are doing. you watch and create on tiktok. the passion, creativity, and diversity of the community has fueled new cultural trends,
chart-topping artists, and businesses across the country. it has been a bright spot for american families to create videos together. i have heard from countless friends and family, and even members of the senate and your staff, about how joyful and fun and entertaining and authentic tiktok content truly is. i am proud of the hard work our safety teams do every day to safeguard our community, and that our leadership makes safety and wellness a priority, particularly to protect teenagers on the platform. being open and humble is important to the way we operate at tiktok. we seek out feedback from experts and stakeholders to constantly improve. when we find areas and flaws where we can do better, we hold ourselves accountable and find solutions. turning a blind eye to areas where we can improve is not part of our dna. most importantly, we strive to do the right thing: protecting people on the platform. when it comes to protecting minors, we work to create
age-appropriate experiences for teenagers throughout their development. we have built privacy and safety protections with this in mind. for example, people under 16 have their accounts set to private automatically. they cannot host livestreams and cannot send direct messages on our platform. we do not allow anyone to send off-platform videos, images, or links via direct messaging. these are perhaps underappreciated product choices that go a long way to protect teenagers. we made these decisions, counter to industry norms and our own short-term growth interests, because we are committed to doing what is right and building for the long term. we support parents in their important role to protect teenagers. that's why we have built parental controls, called family pairing, that empower a parent to link their own tiktok account, in a simple way from their own device, to their teenager's account to enable a range of privacy and safety controls. i encourage all the parents that
are listening to the hearing today to take an active role in your teenager's phone and app use. visit our youth portal and read through the guardian's guide in the safety center. our tools for parents are industry-leading and innovative, and we are always looking to add and improve. it is important to know our work is not done in a vacuum. it is critical for platforms, experts, and governments to collaborate on solutions that protect the safety and well-being of teenagers. that is why we partner with networks to help us ensure we provide age-appropriate content. we work closely with the national center for missing and exploited children, the national pta, the digital wellness lab at boston children's hospital, and our u.s. content advisory council. tiktok has made tremendous strides to promote the safety and well-being of teens. we are transparent about the room we have to grow and
improve. we are investing in new ways for our community to enjoy content based on age appropriateness or family comfort. we are developing more features that empower people to shape and customize their experience in the app. there's no finish line when it comes to protecting children and teens. the challenges are complex. we're determined to work hard to keep the platform safe and create age-appropriate experiences. we do know trust must be earned. we are seeking to earn trust through a higher level of action, transparency, and accountability, as well as the humility to learn and improve. thank you for your leadership on these important issues. i look forward to answering the questions of the committee. >> thank you, mr. beckerman. ms. miller, i hope you're with us. please proceed. >> sorry. i think i am having a bit of technical difficulty. can you hear me okay? >> we can hear you, and now we
can see you. >> wonderful. thank you. chairman blumenthal and distinguished members of the subcommittee, thank you for the opportunity to appear before you today. my name is leslie miller, and i'm the vice president of public policy at youtube. as young people spend more time online, and given their changing needs as they grow up, it is crucial to put in place protections that allow them age-appropriate access to information. we do this by investing in partnerships, technologies, and policies that create safer environments, that allow children to express their imagination and curiosity, and that empower families to create the right experiences for their children. our internal teams include experts who come from child development, child psychology, and children's media backgrounds. they work closely to ensure that product design reflects an understanding of children's unique needs and abilities and how they evolve over time.
the advice from trusted experts informs youtube's child-specific policies. the policies, which are described in greater detail in my submitted testimony, prohibit content dangerous to children on youtube. we commit significant time and resources to remove harmful content as quickly as possible. between april and june of this year, we removed roughly 1.8 million videos for violations of our child safety policies, of which about 85% were removed before they had 10 views. we are constantly working to improve our safeguards. we have also invested significant resources to empower parents to have greater control over how their children view content on youtube. in 2015, we created youtube kids
for kids to more safely pursue their curiosity and explore their interests while providing parents more tools to control and customize the experience for their families. videos on youtube kids include popular children's videos, diverse new content, and content from trusted partners. after we launched youtube kids, we learned from parents that teens have different needs that were not fully met by our products. we have worked with parents and experts across the globe in areas related to child safety, child development, and digital literacy to develop a solution for these parents, which we call supervised experiences. we launched this earlier this year on the main youtube platform. parents now have the option of choosing among three different content settings: content generally suitable for viewers ages 9+, content generally suitable for viewers ages 13+, and most of youtube.
the most of youtube option excludes all age-restricted content on the main platform. we want to give parents the controls that allow them to make the right choices for their children. on youtube kids, and even on youtube for all under-18 users, autoplay is off by default. in the coming months, we will launch additional parental controls in the youtube app, where parents can choose a locked default autoplay setting. take a break reminders and bedtime reminders are also included in these experiences. youtube treats personal information from anyone watching children's content on the platform as coming from a child, regardless of the age of the user. this means that on videos classified as made for kids, we limit data collection and use. as a result, we restrict or disable certain features. for example, we do not serve
personalized ads on this content on the main youtube platform, and we do not support features such as comments or live chat. we have never allowed personalized advertising on youtube kids or on supervised experiences. there is no issue more important than the safety of kids online, and we are working closely to address these challenges. thank you for the opportunity to appear here today, and i look forward to your questions. sen. blumenthal: thanks, ms. miller. we will have five-minute rounds. we have votes at 11:00. we have three votes. i think we will be able to continue the questioning and testimony. if necessary, we will take a brief recess. let me begin. as you know by now, in august, senator blackburn and i wrote to facebook asking whether they had done any research, whether they
had any facts that showed harm to children. in fact, they denied it and dodged the question. they explained that instagram is not harmful to teens. let me ask you, and i think it is pretty much a yes or no question, the same question we asked facebook. have any of your companies conducted research on whether your apps can have a negative effect on children's or teens' mental health and well-being, and whether they promote addiction? have you done that research? >> senator, we have conducted research. much of our research is focused on our products and how to improve products and services to meet the needs of users in our community. as i mentioned in my testimony, some of the research we did shows that 95% of users say that snapchat makes them happy.
sen. blumenthal: will you make that research available to the subcommittee? >> yes, we would. sen. blumenthal: mr. beckerman. >> thank you for the question. we believe that research should be done in a transparent way, and we partner with external stakeholders to get their feedback. we think it is something that everybody can work together on. we have supported passage of the camra act at nih, and we would love to see this done in an external and transparent way. sen. blumenthal: ms. miller. >> we work with experts to leverage their insights and research to make sure the product and policy decisions are up to date, based on their insights. sen. blumenthal: i asked if research has been done that could show negative effects or addiction-like impacts. you've all indicated you have done the research, and i am
assuming, mr. beckerman, that your company will make the research available. ms. miller? >> we have published some research and will make additional research available. sen. blumenthal: let me now ask about the black box algorithms. as you know, these algorithms exist and function to drive sometimes toxic content to kids, more of it and more extreme versions of it, and the consequences are potentially catastrophic. yet the companies are evaluating their own effects on kids when it comes to addiction and harms. let me ask a similar question. do you provide external independent researchers with access to your algorithms, data sets, and data privacy practices?
in other words, if an academic researcher comes to you and wants to determine whether one of your products causes teen mental health issues or addiction, will they get access to the raw data from you without interference? >> so, senator, it's important to remember that on snapchat, algorithms work very differently. very little of our content is sorted algorithmically. sen. blumenthal: i want to apologize. i am going to interrupt, because the question is about access for independent research on those algorithms that you do use. and there's no question that you do have algorithms, correct? >> correct, we do have algorithms, but they operate differently. to your question on whether we have had requests from outside researchers or mental health specialists to access that, to my knowledge, we have not.
sen. blumenthal: but would you provide access to that? >> yes. it is important to understand that algorithms for us operate differently. to compare them to different platforms is to compare different things. sen. blumenthal: that is one of the things that an external researcher would verify. >> yes, senator. we believe that transparency is important. we were one of the first companies to publish publicly a deep dive on how the algorithm works. we invite experts, and you, senator, and your staff, to come see how the algorithm works. additionally, it's important to give choice to people. with tiktok, in your feed, you can indicate you are not interested, and we are trying to give transparency to individuals. sen. blumenthal: so, external access. ok. ms. miller. >> senator, we are very
transparent about the way our machine learning works. for example, our quarterly transparency report summarizes the videos and channels we removed for violating our community guidelines. earlier this year, we rolled out an additional statistic, the violative view rate. sen. blumenthal: i apologize for interrupting. the question is whether you provide external independent researchers with access to your algorithms and data sets. do you allow that? >> i'm sorry? sen. blumenthal: do you provide that access? >> we regularly partner with experts in child development and mental health. sen. blumenthal: those are experts chosen by you. if somebody independent came and wanted that access, yes or no, would you provide it?
>> senator, it depends on the details, but we always look to partner with experts in these important fields. sen. blumenthal: well, i will cite the difference between your response and mr. beckerman's and ms. stout's, which indicates hesitancy, if not resistance, to providing access. let me ask you, ms. miller. one of the issues here really relates to the claims by these sites that they are transparent and truthful and that they favor regulation, claims that are belied by experience. facebook has mounted armies of lawyers and has paid millions of dollars to fight regulations, whether it is responsible reforms to section 230 or privacy legislation or requirements to be more transparent about algorithms. according to the details made public last week in a multistate
antitrust case, google has engaged in a coordinated effort to forestall and diminish child privacy protections in the proposed regulations by the ftc and by legislation. that filing describes attempts to encourage facebook and microsoft to fight privacy rules and back down on advocacy for legislation, in a particular meeting where that exchange occurred. this disclosure made news, though everybody in d.c. already knew it was true. what was new is that google's hypocrisy was finally called out. the fact is that google and youtube have been fighting against privacy behind the scenes for years. it is hidden in plain sight, and it is an open secret. you have been lobbying the ftc
to weaken the existing privacy rule. you have spent vast sums of money fighting california's privacy rules. i want to ask, what work has youtube done to lobby against congress strengthening online protections for children? is that report and that claim by the multistate plaintiffs accurate? >> senator, i understand the material you are referencing was regarding our point of view on e-privacy legislation in europe. our ceo has regularly called for comprehensive privacy legislation in the u.s., and on behalf of youtube, i am not aware of any efforts other than to be involved in conversations in a multi-stakeholder way as it relates to any bills that are introduced regarding the
oversight or regulation of companies such as youtube. sen. blumenthal: the reports of political pressure and lobbying against children's privacy and safeguards are totally false? >> i think we work with lawmakers such as yourself regularly to have conversations. to share what we are doing on the platform and the updated protections we are putting into place, but also to hear your concerns. to work with you as you contemplate new regulations. sen. blumenthal: will you commit that you will support privacy legislation, as has been proposed? >> senator, i'm not deeply involved in the details of any specific privacy legislation. but i commit we will work with you and partner with you on federal privacy legislation. sen. blumenthal: would you
and earlier this year, the chinese communist party acquired an ownership stake and a seat on the board of bytedance. does tiktok share data with its parent company, bytedance? >> this is an important question and i am glad you are asking. >> quickly, please. >> we do not share information with the chinese government, and i would like to share a citizen lab report from a respected global security expert. they said research shows there is no data transmission to the chinese government. the report goes on to state that tiktok does not pose a threat to national security, and i would be able to submit that for the record. >> please submit that for the record. do any bytedance employees have access to the data? >> the data is stored in the
united states and the backups are in singapore. we have a world-renowned u.s.-based security team that handles access. >> i understand you say you store it in singapore. tell me about programmers and product developers and the data teams. are they housed in china? >> like many technology companies, we have engineers in the united states and around the world. >> so they have access to algorithms and data? >> we have engineers in the united states and we have engineers -- >> so the answer is yes. what about -- bytedance says they are fully separate. what about douyin employees? >> that is a completely different app from tiktok. >> if the chinese company asked
platform collects? >> senator, many outside researchers and experts that have looked at this have proven that tiktok collects less data than its peers on the keystroke issues. >> outside researchers you are paying for? >> no, senator. >> would you submit that to outside independent researchers? what we are seeing, with all of this biometric data and the keystroke patterns, is that you are exceeding that. so what do you do with this? are you creating a virtual you of the children on your site? >> senator, i don't know what you mean by virtual you. >> a virtual you is you and your presence online. like a virtual dossier. i'm sure you understand that term. what do you need with all of this information? do you track the viewing patterns of children? are you building a replication of where they go? their search history, their voice, their biometrics? why does tiktok and bytedance need that information on our children? >> senator, tiktok is an entertainment platform where people watch and enjoy short-form videos. it is about uplifting and entertaining content and people love it. i disagree with the characterization. >> that is it from the positive side, but there's also a negative. and the negative is that you are building a profile, a virtual you of our children, because of the data you are collecting. you have mentioned the family parent provision that you have. so, when you have a parent that goes on that, are they opening
their data to tiktok? is tiktok following them? following and capturing their search history? when you capture all of this data and you hold all of this data, you are invading the property and the privacy of individuals on your site. and that applies to you and to ms. stout and ms. miller. because you say that because they are using the platform, you can do this. in essence, what you are doing is making our children and their data the product, because you turn around and you sell it. and then basically, it becomes weaponized against the users. mr. chairman, i'm over time, and i have several questions for ms.
stout and ms. miller and we will do that in a second round. sen. blumenthal: we will do that in a second round. senator klobuchar. >> thank you to both of you. reports indicate that half of kids 9-12 and those over 13 use snap, facebook, tiktok, youtube. i don't think parents are going to stand by while our kids and our democracy become collateral damage to profiteering. i heard last night mark zuckerberg's words in his earnings report. while he may be out there acting as a victim at his $29 billion quarterly earnings report meeting, the true victims are the mom in duluth who cannot get her kid off facebook to do her homework.
the dad who is mourning losing a child to a speed filter that measured the kid going 123 miles an hour, trying to beat the filter. or a child exposed to content glorifying eating disorders on tiktok. i have had a case right in my state, actually two cases of young people who got drugs through snap. i want to first start out with ms. stout. there are two kids. devon was suffering from dental pain at the beginning of the pandemic and he could not see the doctor. he had been given a percocet before and a classmate said he had a percocet. what the young man did not know was the percocet was laced with
fentanyl, and he died, just like that. as his mom said in a letter to me, all the hopes and dreams that we as parents had for devon were erased in the blink of an eye. a group of parents, including devon's mother, bridget, demanded answers and accountability from snap in a letter to you in september, ms. stout. i want to know what the answers are. will you commit to providing more information about the automated tools that snap uses to proactively search for illegal drug-related content, as the parents asked? >> senator, i very much appreciate you raising this issue of a devastating crisis impacting our young people. i want to make clear, we are determined to remove drug dealers from snapchat, and we have been public about our efforts in this space. first of all, we have stepped up
operational efforts. my heart goes out to the families. i met with bridget back in april and i heard from her and other families to understand their experience and also what is happening on snapchat. we have deployed proactive detection measures to get ahead of what the drug dealers are doing. they are constantly evading our tactics, not just on snapchat, but on every platform. we have also stepped up our work with law enforcement. just last week, we had a law enforcement summit, where we gathered over 2,000 members of law enforcement from across the country to understand what they are dealing with and to find the best practices to get them the information they need to help with their investigations. senator, this is so important. we have deployed an education and awareness campaign. because what is happening on our platforms and across social media and technology is that young people suffering from mental health issues and stress induced
by the pandemic and other issues, they are reaching for substances. oftentimes, pills and opioids. but these substances are laced with enough fentanyl to kill them. >> here is my problem. if a kid walked into a pharmacy, they would not be able to buy that or get that. but in this case, they can get onto your platform and find a way to buy it. that is the problem. i want to know -- i appreciate you meeting with the mom, but are you going to get the drugs off snapchat when you have other kids in america looking at these platforms? >> i assure you, this is a top priority for our company. senator, it is not just happening on our platform, but it's happening on others. we need to work collectively with the other platforms and companies here today to work together. >> that's good. thank you. i think there are other ways to
do this, such as creating liability when this happens, so that might make you work even faster so we don't lose another kid. mr. beckerman, a recent investigation by the wall street journal found that tiktok's algorithms can push young users to content glorifying eating disorders, drugs, and violence. have you stopped that? >> yes, senator. i don't agree with the way the wall street journal went around that. we have made improvements to the way people can have control of the algorithm and have age-appropriate content on tiktok. >> what are those changes? are kids completely protected from this content? >> the content related to drugs, as you are pointing out, violates our community guidelines. over 97% of violative content is removed proactively. we want to get to 100% and that is something we are constantly working on. >> are you aware of research
your company has conducted about pushing content promoting eating disorders to teens? >> no, senator. >> did you offer studies before testifying? >> not that i'm aware of, but we work with outside experts to understand the issues. i think it should be done in a transparent way. i would like to see the act passed so we can have additional research in the public domain that we can all learn from and improve. >> i will save my questions for ms. miller for the next round. thank you. >> thank you, senator klobuchar. i would remind everyone, you have committed to provide the research and we look forward to receiving it within days or weeks, not months. i particularly appreciate senator klobuchar's reference to creating liability as a strong incentive, which would involve reform of section 230.
>> mr. chair, if i could put this letter from the parents into the record. sen. blumenthal: without objection. >> thank you. sen. blumenthal: we have been joined by senator cantwell remotely. >> mr. chairman, i defer to my colleagues. senator markey. senator baldwin. sen. blumenthal: thanks very much. senator markey. >> thank you, mr. chairman, very much. the problem is clear: big tech preys on children and teens to make more money. now is the time for legislative solutions to these problems. and that starts with privacy. i have introduced bipartisan legislation to give children and teens a privacy bill of rights for the 21st century. today, a 13-year-old girl on these apps has no privacy rights. she has no ability to say no.
no, you cannot gobble up data about me. no, you cannot use the data to power algorithms that push toxic content towards me. no, you cannot profile me to manipulate me and keep me glued to your apps. no, you have no rights. a 13-year-old girl in the united states of america in the year 2021. my bipartisan children and teens online privacy protection act gives 13, 14, and 15-year-olds the right to say no. to each witness, do you support my legislation to update the children's online privacy protection act to give that 13-year-old, 14-year-old, and 15-year-old that control of their data? ms. stout? >> senator, i want to say i
absolutely support a federal privacy proposal and have worked hard with members of this body. sen. markey: do you support my child and teen privacy protection bill? >> senator, we agree there should be additional protections to protect young people. sen. markey: so you have had a chance to look at the child online protection update that has been out there for years? >> we would like to talk more about the issues. sen. markey: this drives us crazy. do you back it being updated? >> thank you for your leadership on the issue. we agree it needs to be updated as it relates to the way age
verification happens across the internet, an area that has not been given as much attention as it deserves, and we agree it needs to be updated. sen. markey: you support my legislation? you have had plenty of time to look at it. >> we like the approach, but the piece that should be included is a better way to verify age across the internet and apps rather than the system in place now. with that improvement, we would be happy to support it. sen. markey: ms. miller. >> we also support the goals of updated, comprehensive privacy legislation. we have had conversations with your staff in a constructive manner and i would welcome continuing to do that. sen. markey: it will happen soon. this is a crisis -- thank you, senator blumenthal. this has surfaced in a way that makes it clear we do not have time.
we have to get this finished. among young teens, 49% say they're on tiktok and 52% are on snapchat. 81% are on youtube. those 13-year-olds deserve the right to say, you cannot track me. do you agree with that? >> yes, i agree with that. >> yes, senator. sen. markey: do you agree with that? >> yes, and we have tools for users to handle, control, and make choices as it relates to the information that is gathered. sen. markey: the bill would ban targeted ads to children. companies should never be allowed to track a 10-year-old's browsing history and bombard him with ads based on the data. ms. miller, you said that the youtube kids platform prevents targeted ads to children. do you agree that congress must ban targeted ads to children?
>> i defer to you in terms of what you would want to move forward. we have not waited for laws like this. sen. markey: would you support a uniform banning of the practice you have adopted as a company? would you support that standard for all platforms across the country? >> it is consistent with the approach we have taken. sen. markey: you would support it? >> we are already doing this. sen. markey: we are trying to draft a law. would you support it being in? >> we already prohibit targeted advertising. sen. markey: so we can legislate it.
same question. should we ban targeted ads for kids? >> we offer those tools already where kids can opt out. sen. markey: would you support that as a national law? >> an example has been the design code we adhere to, and i can tell you we are looking at that model. sen. markey: do you support it as a law this body passes? should we prohibit it? >> we agree with the approach. sen. markey: you support it? yes or no? >> we agree with the approach, so we are applying it. sen. markey: if you support it, would you support the law that would prohibit anyone else from doing it? >> yes. >> i think we are close, senator. >> we should go beyond that and take certain
categories that we do not show to young adults or teenagers. sen. markey: so we need to go beyond privacy to tackle the design features that harm young people. take the like buttons. senator blumenthal and i have a bill, the kids act, which would ban these and other features that quantify popularity. the research is clear this turns apps into virtual popularity contests. they are linked to feelings of rejection, low self-worth, and depression. even youtube kids has acknowledged the problem and does not have like buttons. should congress ban features that quantify popularity for kids, yes or no? >> as i mentioned in my opening statement, we have never had a like button, and so we would support that. we do not think it should be a popularity contest. sen. markey: you would support that? >> this is more complex and i am
happy to have a conversation. we have implemented much of the design code in the united states and would encourage similar measures. sen. markey: i don't know that there was an answer in that. you said it is complicated. do you support banning it? >> if you want to set it by age, that is something to look at. sen. markey: ms. miller? ms. miller: we already prohibit being able to comment on youtube kids and we would support working with you on regulation. sen. markey: you would support working with us, but would you support banning likes? ms. miller: senator, again, we do not allow for this on the youtube kids platform. sen. markey: again, the american academy of pediatrics just declared a national state of emergency for children and teen mental health.
we need to outlaw the online features that exacerbate this crisis. the question that we have to answer, ultimately, is whether or not, for example, we are going to ban autoplay for kids. the feature where, when one video ends, another begins. kids stay glued to their phones so the apps gather data and make more money. 82% of parents are worried about their kids' screen time. to each of you, today, do you agree congress should ban autoplay for kids? yes or no? ms. miller, we will start with you this time. ms. miller: senator, each of the items you are outlining, we already prohibit. we have the default set to autoplay off for kids as well as for supervised experiences. sen. markey: would you support
that being legislated? ms. miller: yes, sir. sen. markey: -- >> we have take-a-break videos and time management tools, and for autoplay, you have to swipe to the next video. sen. markey: would you ban autoplay? >> we would be happy to talk to you about it. sen. markey: you don't do it. >> i think it is important as we look at the features for teens, it is something we built into tiktok proactively, but as we look at legislation, i think a first step is around the age verification process. sen. markey: this is a historic problem. would you support it? ms. stout: we do not have autoplay on snapchat, so that is something to look at more closely, and i'm not familiar with that piece, the proposal in your legislation.
sen. markey: we have work to do and we have to telescope the timeframe. chair blumenthal: thank you for your good work. senator thune. sen. thune: thank you, mr. chairman. we all know social media offers a lot of benefits and opportunities, but as has been expressed, i have concerns about the lack of transparency online and the limited accountability of big companies. a major problem in social media that is increasingly concerning is social media platforms' usage of algorithms to manipulate the user experience. users get trapped in a filter bubble, which can be troubling for younger users. a recent wall street journal article described in detail how
tiktok serves up sex and drug videos to minors. i have a bill, the filter bubble transparency act, and another that would make strides in addressing the lack of transparency online. importantly, the filter bubble transparency act would give users the option to not be manipulated by opaque algorithms. do you believe consumers should be able to use platforms without being manipulated by algorithms designed to keep them engaged on the platform? mr. beckerman: we agree there needs to be transparency in the way algorithms work, and individual choices for those who use them. ms. miller: we provide transparency in a way that works. ms. stout: it is important to understand that what we apply
algorithms to is a small subset of content. we provide transparency to users, who get to select interest categories that determine the content they are served. it is not an unlimited list or set of user-generated content. it is narrow. sen. thune: i don't know if you answered the question. should consumers who use these social media platforms be able to use them without being manipulated by algorithms? ms. stout: senator, yes, i agree with you. ms. miller: yes, senator. mr. beckerman: sex and drugs are violations of our community guidelines and have no place on tiktok. as it relates to the article, we disagree with that being an
authentic experience. sen. thune: your platform is more driven by algorithms than any other social media platform available today, more so even than facebook. unlike facebook, tiktok's algorithm is not constrained by a social network. on july 19, 2020, the former ceo wrote that we believe all companies should disclose their algorithms, moderation policies, and data flows to regulators. tiktok also states on its website that it makes tiktok source code available for testing and evaluation at its transparency and accountability center. has tiktok disclosed its algorithms, moderation policies, and data flows to federal or state regulators? mr. beckerman: as we pointed out, we have transparency centers and we have done maybe over 100 tours with members of the senate and staff and others in the government and would be
happy to be transparent about how that works. sen. thune: i think maybe senator blumenthal touched on this, but in keeping with tiktok's disclosure practices announced in july 2020, would you commit to providing your algorithms and moderation policies to the committee so we may have independent experts review them? mr. beckerman: yes. sen. thune: thank you. does youtube engage in efforts to change the attitudes and behaviors of or influence its users in any way? ms. miller: when users come to youtube, they come to search and discover content, like how to bake bread, watch a church service, or do exercise. as a result, they are introduced to a diversity of content that is not based on a network they are a part of. in so doing, there may be additional videos recommended to
them, but those signals will be overridden to make sure we are not recommending harmful content. sen. thune: back to mr. beckerman. all chinese internet companies are subject to china's national intelligence law and must hand over data the government demands. that power is not limited by china's borders. has tiktok provided data to the chinese government on chinese persons living in the united states or elsewhere outside china? mr. beckerman: tiktok is not available in china, and i would like to point out our servers are stored in the united states. sen. thune: does tiktok censor videos of tank man, the famous video of the man who stood his ground in front of chinese army tanks during the 1989 tiananmen square crackdown in beijing? mr. beckerman: you can find that content on tiktok if you search for it. sen. thune: mr. chairman, i
would suggest that, as has been pointed out, i think there are a number of things we need to address. congress needs to be heard from in this space, particularly with respect to the use of algorithms and the way users are manipulated, particularly young people. i hope we can move quickly and directly and in a meaningful way to address this issue. thank you. chair blumenthal: i think we have strong consensus on that issue. senator baldwin. sen. baldwin: i would like to note that this series of hearings began with a revelation that internal research at facebook revealed the negative impact on teenagers' body images from
using the company's instagram platform, and we learned, based on research by the chairman's staff, how quickly someone on instagram can go from viewing content on healthy eating to being directed toward posts that focus on unhealthy practices, including glorifying eating disorders. i know we do not have facebook and instagram before us today. i am particularly concerned about the impact that type of content can have on young users. i recently joined senators klobuchar and capito, who cosponsored the anna westin act to support training and education on eating disorders, in a letter to facebook and instagram demanding more details about how they handle this issue. i wanted to ask each of you, can
you outline the steps your companies are taking to remove content that promotes unhealthy body image and eating disorders and direct users to supportive resources instead, and how are you focusing on this issue with regard to your younger users? why don't we start with mr. beckerman and tiktok? mr. beckerman: thank you, senator. i have two young daughters and this is something i care about. we aggressively remove content promoting eating disorders. we work with outside groups and direct people that are seeking help, and one thing we have heard is people struggling with eating disorders or other weight loss issues come to tiktok to express that in a positive way, so it has been a positive source
and we do not allow ads that target people based on weight loss and that kind of content. ms. stout: thank you, senator baldwin. i want to make clear that the content you described, content that glorifies eating disorders or self-harm, is a complete violation of our community guidelines. as i described earlier, we do not allow unvetted, unmoderated content to be served up to our users. our media publisher platform, discover, which we partner on with publishers like people and publishing companies like the wsj and nbc news, is vetted and moderated ahead of time. sen. baldwin: can i interrupt you and ask, is that through ai or humans? ms. stout: these are hand-picked partners that snapchat has selected to say, in this closed garden of content, discover, we will allow publishers and media companies to provide news,
entertainment, content -- espn, the washington post. users can come look at the content. it is pre-moderated and curated, so it is not an unlimited source of user-generated content where you can go down a rabbit hole and access that kind of hurtful, damaging content on body image. you raise an interesting question. what are the products surfacing? how are we helping users find positive resources? as a result, we did conduct research on the effects of body image and self-harm and we created a product called here for you. it was created in 2020 at the height of the pandemic. when users search anorexia or eating disorders, instead of being led to content that is harmful and against our guidelines, we now surface expert resources to show content to help them or maybe their friends.
it's a redirection of that search for potentially hurtful content that then steers the user to resources that may help them or a member of their circle of friends. ms. miller: senator, we take a comprehensive and holistic approach on topics like these. we prohibit content that promotes or glorifies things such as eating disorders. it has no place on our platform. we also realize users come to share stories about these experiences or to find a community, as well as to find authoritative resources, which is what we raise up on searches like this. in addition, we roll out programs and initiatives such as the with me campaign.
we encourage users to spend their time during covid pursuing healthy habits. we look at this holistically to make sure youtube is a platform people come to and have a healthy experience. we prohibit the type of content that glorifies or promotes these issues such as eating disorders. sen. baldwin: if i could follow up the same way as with ms. stout, when you remove content, how? do you filter that out? do you use artificial intelligence or a team of people who are looking at the content and deciding whether to remove it? ms. miller: it is a mix. when we develop content policies, we rely on experts to inform the development of the policies, and then we have machine learning to
help us capture this type of content at scale. you'll see in our transparency report that more than 90% of content that violates community guidelines is flagged by a machine, and there is a mix of human reviewers. sen. baldwin: thank you. sen. cantwell: i want to thank senators blumenthal and blackburn for holding this subcommittee hearing. as witnesses can see, our colleagues are well-informed and very anxious to get legislative fixes to things they think are crucial to protecting individuals and protecting people's privacy. i want to thank them for that. yesterday, vice had an article about location data companies getting gps data from apps, data that is given even when people have opted out. basically, i'm going to enter this for the record unless
there is objection. the news highlights a stark problem: smartphone users cannot be sure if some apps are respecting their preferences around data sharing. data transfer presents an issue for the location data companies themselves. these companies are reporting information about location even when people have explicitly opted out. they are continuing to collect this information. that is what the reporters and researchers and the board found. i have a question. do you believe location data is sensitive and should be collected only with consumers' consent? >> we agree. >> agree. >> yes, senator. for users, they have access to their account under my activity
and my account and can modify their settings, delete their history, and things of that nature. sen. cantwell: any federal privacy law, we should make sure that is adhered to. i see a nod. >> yes, senator. >> yes, senator. sen. cantwell: thank you. do any of you share location data with the company that is in this article, huq? they are a major data company. >> i have never heard of that company and i'm not aware. mr. beckerman: i'm not aware of the company, but we do not collect gps data. sen. cantwell: whether you would be affiliated with them in some way, or they are getting this information in any way. >> i am also not aware of the company. sen. cantwell: maybe you can help us for the record on this
so that we know. but this is exactly what the public is frustrated about and concerned about, particularly when more can be done. people go to a website and say, i don't want my sensitive information to be shared, and then there is a conglomerate of data gathered on top of that that is not honoring those wishes as it relates to the interface. this is exactly why we need a strong privacy law and why we should protect consumers on this. in the facebook hearing, we had a discussion about advertising and the issue of whether advertisers knew exactly what the content was they were being advertised next to. we are also seeing a migration of major companies like procter & gamble and others moving off the internet because they are saying, i'm done with it. i do not want my ad appearing
next to certain kinds of content. what was more startling is that there may be actual deceptive practices here, where people are saying this content is one thing when in reality, it is something else, and in some of these cases, objectionable hate speech, as with facebook, and content we do not think should be online. that is how the advertisers knew. on your websites, do advertisers know what content they are being placed next to? >> i can respond to your question. our advertisers know where their advertising shows up, and as i mentioned, discover, the closed, curated garden -- those advertisements appear next to publishers and verified users we have chosen to allow to appear. on a platform like snapchat,
there is no broadcasting of disinformation and hate speech, and that is why i think it is a very appealing place for advertisers, because they know where their advertisements will be placed. mr. beckerman: advertisers come to tiktok because our content is known for being so authentic and uplifting and fun. we see ads that are very much like tiktok videos. ms. miller: senator, we have worked with advertising partners over the years to make sure that they have confidence that advertising on youtube is safe for their clients, in the same way we have worked significantly to make sure users have a safe experience, and the advertising associations have recognized our work in the space, so their brands are safe on the platform. sen. cantwell: senator lee.
sen. lee: ms. miller, i would like to start with you, if that is all right. i want to ask you a question regarding youtube's app age rating. google play has the rating set at teen, meaning 13 and up, while the apple app store has it rated at 17 and up. tell me, why this disparity? if apple determines the age rating for youtube ought to be 17 and up, why should google rate its own app as teen? ms. miller: i am unfamiliar with the differences you outlined, but i would be happy to follow up with you and your staff once i get more details. sen. lee: i would love to know about that.
it is a simple question, and i understand you may not be able to answer it right now because you do not have the information, but i would like to know why that difference exists and whether you agree or disagree with the fact that google has rated its own app 13 and up while apple rated it 17 and up. i am happy to follow up on that in writing or otherwise. i want to address a similar issue with regard to snapchat. snapchat is rated 12 and up on apple and teen on the google play store. any idea why there is a disparity? ms. stout: that is a good question. for some reason, and i heard this somewhere, apple rates it 12 and up, but it is intended for a teen audience. sen. lee: why is there a disparity between the age rating and the content available on the
platform? ms. stout: the content that appears on snapchat is appropriate for an age group of 13 and above. sen. lee: let's talk about that for a minute, because i beg to differ. in the investigation for the discussion in this hearing, i had my staff create a snapchat account for a 13-year-old. they did not select any content preferences for the account. they entered a name, a birth year, and an email address. when they opened the discover page on snapchat, with the default settings, they were bombarded with content that i can most politely describe as wildly inappropriate for a child, including recommendations
for, among other things, an invite to play an online sexualized video game that marketed itself to people who are 18 and up, tips on why you should not go to bars alone, notices for video games rated for ages 17 and up, and articles about porn stars. let me remind you that this inappropriate content that has by default been recommended to a 13-year-old child is something that was sent to them by an app, using the default settings. i respectfully but strongly beg to differ with your characterization that the content is in fact suitable for children 13 and up, as you say.
according to your website, discover is a list of recommended stories. how and why does snapchat choose these inappropriate stories to recommend to children? how does that happen, and why would that happen? ms. stout: senator, allow me to explain a little bit about discover. it is a closed content platform, and yes, we hand-select the partners we work with. the kind of content that is designed to appear on discover is meant to resonate with an audience 13 and above. i have taken notes about what you have said your account surfaced. i want to make clear that our content and community guidelines state that any online sexualized video games should be restricted to 18 and above, so i am unclear why that would have shown up in an
account that was registered for a 13-year-old. these community guidelines, and the publisher guidelines on top of those guidelines, are intended to provide an age-appropriate experience for a 13-year-old. sen. lee: you have community guidelines, and advertisers and media partners agree to them. what are these additional guidelines? i can only guess that they permit these age-inappropriate articles that were shared with children. how would that not be the case? ms. stout: these additional guidelines say, for instance, that content may not glorify violence, that any news articles must be fact-checked, that there is no -- sen. lee: i am sure the articles about the porn stars were accurate, and i am sure the tips on why not to go to bars alone
were fact-checked, but that is not my question. my question is whether it is appropriate for children 13 and up. ms. stout: i think this is an area where we are constantly evolving, and if there are instances where these publishers are surfacing content to an age cohort that is inappropriate, they will be removed from our platform. sen. lee: what kind of oversight do you conduct? ms. stout: we use human review and automated review. i would be interested in talking to you and your staff about what kind of content this was, because if it violates our guidelines, it would come down. one last thing. i would agree with you when it comes to the kind of content that is promoted on discover. there is no content that is illegal. none is hurtful. it really is intended to be a place where we have control over the content that surfaces.
sen. lee: i have a follow-up question. snapchat has assured us it does not collect identifying data for advertising. how does snapchat decide what content is pushed to the top of the discover page? ms. stout: users have the ability to select preferences, and there are several interest categories a user can select or unselect: they like to watch movies, or they enjoy sports, or they are fans of country music. at any point, it is completely transparent, and a user has the ability to select what they like, and that determines the content that is surfaced. if there is content they do not like, they can uncheck or check a category, and that determines the kind of content a user would see in discover. sen. lee: thank you so much.
i think we should get to the bottom of this. we know there is content on snapchat and youtube, among other places, that is not appropriate for children ages 12 or 13 and up. chair: i would say, to follow my line of questioning, it is not appropriate to tell advertisers that their ads are not located next to content that is inappropriate when that is not the case. sen. lujan: ms. stout, in your testimony, you mentioned that all content on snapchat's spotlight page is human reviewed before it can be viewed by more than 25 people. yes or no: does human review help snapchat reduce the spread of potentially harmful content? ms. stout: we believe it does. sen. lujan: i appreciate snapchat's approach to the problem. more platforms should work to stop
harmful content from going viral. far too often, we find companies wanting; once attention is diverted and the public is distracted, they do the very thing they were warning us against. can i hold you to that? will snapchat continue to keep a human in the loop before content is algorithmically promoted to large audiences? ms. stout: this is the first time i have testified before congress, so please hold me to it, but we have taken a human-moderation-first approach, not just on spotlight but across the platform. human moderation will continue to play a huge part in how we moderate content on our platform and keep users safe. sen. lujan: you see the importance of platforms taking responsibility before they amplify content, especially publishing it to a mass audience. that is something many of us share.
we must protect americans from dangerous algorithms. online platforms must be responsible when they are actively promoting hateful content. ms. miller, i am grateful youtube is making an effort to be transparent regarding the number of users that view content in violation of community guidelines, but i am concerned with the trend. i wrote a letter to youtube with 25 colleagues on the crisis of non-english misinformation on the platform. we need to make sure all communities, no matter their language, have the same access to good, reliable information. will youtube publish its violative view rates broken down by language? ms. miller: senator, thank you for your question. what you are referring to is the latest data point from this year, in which for every 10,000 views on youtube, 19 to 21 of those views
are of content that is violative. we apply our content policies at a global scale, across languages. we do not preference any one language over another. this includes the violative view rate. sen. lujan: i do not know that that is good enough. when we do not break algorithms down across groups of people, we make existing gaps and biases worse. you see that with facial recognition technology that unfairly targeted communities of color, according to reports. we see this happening right now on youtube. will youtube publish its violative view rate broken down by language? ms. miller: senator, i would be happy to follow up with you to talk through these details. for all of our content policies
and the enforcement within them, the transparency we provide is global in scope and across languages. sen. lujan: i look forward to working with you in that space. before launching tiktok for younger users, did tiktok do any internal research to understand the impact it would have on young children? mr. beckerman: the content on tiktok for younger users is curated and age-appropriate, but i am not aware of the research. sen. lujan: i would like to follow up. platforms like tiktok can lead to addictive behavior and body image issues. it is critical to understand these effects before they take place. this is a serious issue finally getting the attention it deserves with the revelations of whistleblowers that have come
forward. i urge you to take this opportunity to get an evaluation of the impact your product is having on young children. in the end, i want to follow up on something many of us have commented on leading up to these important hearings. i appreciate the chair's attention to this. both the chairman and the ranking member of the subcommittee have authored legislation and partnered on legislative initiatives. it is critically important we continue moving forward, that we mark up legislation and get something adopted, and i am hopeful that here in the u.s., we pay attention to what is happening elsewhere. europe is outpacing the united states in being responsible with legislative initiatives surrounding protecting consumers. there is no reason we cannot do that here, and i want to thank the chair for the work she has done in this space. i look forward to working with
everyone to make sure we are able to get this done in the u.s. sen. cantwell: very much appreciate that and your leadership. we are awaiting the return of senator blumenthal, so i can go slow, or we can take a short recess, because we are way past the time to get over to vote. i want to thank all of the members who have participated thus far, because we have had a robust discussion today. you can see this is a topic the members of the committee feel passionately about. they obviously believe there is more we need to be doing in this particular area. i appreciate everybody's attendance and focus. again, i want to thank senator blumenthal and senator blackburn for their leadership in both of these hearings. for the larger committee, we had
planned to move forward on many of these agenda items anyway but we appreciate the subcommittee doing some of the work and having members have a chance to have very detailed interactions on these policies that we need to take action on. i very much appreciate that. i see senator blumenthal has returned. thank you so much and i will turn it over to you. sen. blumenthal: thank you, chairman cantwell and thank you for your work on these issues. i would like to ask some additional questions on legislative proposals. one of the suggestions that senator klobuchar raised was legal responsibilities and
liabilities, which are now precluded by section 230. let me ask each of you: would you support responsible measures like the earn it act i proposed, which would impose some legal responsibility and liability by cutting back on the immunity that section 230 affords? ms. stout? ms. stout: we agree there should be an update to the intermediary platform liability law. in fact, the last time this body addressed a reform, snap was a company that participated and helped draft the legislation, so we would welcome another opportunity to work with you on that. sen. blumenthal: would you
support the earn it act, which senator graham and i proposed, which imposes liability and affords victims the opportunity to take action against platforms that engage in child pornography related abuses? ms. stout: of course, we completely prohibit that kind of activity, and we actively look for it and remove it when we find it. if you would allow me to get back to you; it has been a while since i worked on the earn it act. i recall you said senator graham introduced it, but we support your legislation. sen. blumenthal: you had the opportunity to say whether you supported it, and you have not. will you commit to supporting it? ms. stout: senator, again, my memory is failing me a little bit, but i do believe we supported many of the provisions of the earn it act. i would be happy to come back
to you with a more specific answer. mr. beckerman: we agree there needs to be a higher degree of accountability and responsibility, particularly as it relates to content moderation, and that needs to be done in a way that allows platforms to moderate appropriately and aggressively, to make sure the kind of content none of us want to see on the internet or on any of our platforms is able to be removed. sen. blumenthal: do you support changes to section 230? mr. beckerman: there can and should be changes, but again, in a way that would allow companies like ours that are good actors, aggressively moderating our platform in a responsible way, to continue to do so. sen. blumenthal: do you support the earn it act? mr. beckerman: we agree with the spirit and would be happy to work with you on the bill. sen. blumenthal: it was reported unanimously in the last session.
it has not changed significantly. did you support it then? mr. beckerman: the concern would be unintended consequences that hamper a company's ability to remove and police content on its platforms. sen. blumenthal: is that a yes or a no? mr. beckerman: maybe. sen. blumenthal: we have two maybes so far. ms. miller: i am aware of a number of proposals regarding potential updates to section 230, and my team, as well as other teams across google, have been involved in the conversations regarding these proposals. we see section 230 as the backbone of the internet. it is what allows us to moderate content and make sure we are taking down content that potentially leads to eating disorders, for example,
or, as we talked about earlier, to self-harm. we want to make sure we continue to have protections so we can moderate our platforms so that they are safe and healthy for users. i am aware of the earn it act, and i know our staffs have been speaking, but i understand there are still ongoing discussions regarding some portions of the proposal. we also very much appreciate and understand the rationale for why it was introduced, particularly around child safety. sen. blumenthal: is that a yes or a no? ms. miller: we support the goals of the earn it act, but there are some details still being discussed. sen. blumenthal: as senator markey has said, this is the talk we have seen again and again and again. we support the goal, but that is meaningless unless you support the legislation.
it took a fight, literally a bare-knuckle fight, to get through legislation that made an exception to that immunity for human trafficking. just one small piece of reform. i join in the frustration of many of my colleagues: good intentions and endorsements of purposes are no substitute for actual endorsement. i would ask that each and every one of you support the earn it act, but also other specific measures that will provide for legal responsibility. i know the claim is that section 230
provides a backbone, but it is a backbone without a real spine, because all it does is provide virtually limitless immunity to the internet and to the companies that are here. i am going to interrupt my second round and call on senator cruz. sen. cruz: mr. beckerman, thank you for being here today. i understand this is the first time tiktok has testified before congress, and i appreciate you making the company available to answer questions. in your testimony, you talked about the things tiktok is doing to protect kids online, and that is correct. i wanted to discuss the broader issue: the control the chinese communist party has over tiktok, its parent company, and its sister companies in beijing.
sen. cruz: i want to be clear, because you are under oath. your answer is that it is not an "other affiliate of our corporate group"? this is a legal question with consequences. mr. beckerman: i understand the question. as i pointed out, tiktok is not available in china, and that is an entity for purposes of licenses to do business in china. sen. cruz: you are refusing to answer the question, for the record. mr. beckerman: i believe i answered your question. sen. cruz: you are not willing to say yes or no. mr. beckerman: it is not a yes or no question. i want to be precise. sen. cruz: is this company an "other affiliate" as defined in your policy? mr. beckerman: the way i answered it, i am not aware that it is. sen. cruz: so you refuse to
answer the question. that does not give any confidence that tiktok is doing anything other than participating in chinese propaganda and espionage. mr. beckerman: that is not accurate. sen. cruz: if it were not accurate, you would have answered the questions, and you have dodged the questions more than any witness i have seen in my nine years in the senate. witnesses often try to dodge questions. you refused to answer simple questions, and in my experience, when a witness does that, it is because they are hiding something. >> senator moran? sen. moran: mr. chairman, thank you very much. let me turn to ms. stout and ask a question about data privacy. senator blumenthal and others have been working on a consumer data privacy bill for several years. i have introduced a bill that includes an appropriately scaled
right for consumers to correct and erase data that is collected or processed by entities, including social media companies. ms. stout, i understand that snap allows users to correct their user data. would you please explain the decision to proactively provide this service? ms. stout: thank you, senator, for the question. we applaud the committee and your leadership on this issue. we fully support a federal comprehensive privacy bill, and
we look forward to working with you and your staff on that. yes, senator, snap is designed with a privacy-centric focus from the outset. we do not collect a lot of data; we believe in data minimization and short data retention periods. as you pointed out, we do not store content forever, and we believe in giving users transparency and control. that includes the ability to delete data, if they wish. we also have a tool in the app that allows users to download their data; it gives them any information they may have agreed to share, post, or put onto their snapchat account. sen. moran: compared to other platforms who may not take the same position that snap has, what do you give up in your ability to earn revenue?
what is it that you lose by doing that, if anything? ms. stout: we make trade-offs every day that sometimes disadvantage our bottom line. there are no rules or regulations that require companies like snap to have short retention periods or to voluntarily have a data minimization practice. it means advertisers may find other platforms more enticing; those platforms keep a history of anything that has been shared or searched, or location data that has ever been provided, and that is not the case on snapchat. that is a trade-off, senator, because we believe in being more private and more safe. sen. moran: if you did otherwise, what would you gain by doing so? ms. stout: we just believe that we have a moral responsibility to limit the data we collect on people. sen. moran: your answer is fine, but what would you generate by keeping the data and using it in some other way? ms. stout: we limit ourselves in our ability to optimize those advertisements to make more money. we are a company that has not yet turned a profit.
we have reinvested every dollar into the company, and we are here for the long term. our desire is to make a platform that is safe for our community. sen. moran: ms. stout and mr. beckerman, what responsibility do platforms have to prevent harmful social media trends from spreading? and how can tiktok improve its algorithm to better comply with its own terms of service to prevent criminal activity? mr. beckerman: thank you, senator. we do have a responsibility to moderate our platform along with community guidelines, and to do that in a transparent way. sen. moran: where does that come from? mr. beckerman: it comes from doing the right thing. we want to be a transparent platform where people enjoy their experience. that starts with community guidelines and certain content that we would not allow on the platform. we work really hard; i think the safety teams are often the unsung heroes of the company.
they work every day, 24/7, to make sure community guidelines are met and the platform stays positive and joyful. sen. moran: the nature of my second question, the add-on question, was perhaps leading, because it suggests that you are not complying with your own terms of service in ways that enable criminal activity. my question is, how can you improve your algorithm to better accomplish that? or maybe you would discount the premise of the question. mr. beckerman: no, senator, it is an important area where we want to get to 100%. we release transparency reports, and 94% of our removals of violative content are done proactively. much of it is done within 24 hours, before there are views. we strive for 100%, and that is something we fight for and work on every day. sen. moran: thank you. thank you, chairman, and ranking member senator blackburn. this subcommittee is important
to the nature of our country and to its future. thank you. >> thank you, senator moran. sen. blackburn: thank you, mr. chairman. ms. miller, i would like to come to you. you talk about moderation as a combination of machine learning and human review. it seems that youtube has no problem pulling down videos that question abortion or global warming or vaccine mandates, but child abuse videos remain on your site. so i am interested to know more about the specific inputs that you use in these reviews. who establishes these inputs? who oversees them to make sure that you get them right? ms. miller: senator, thank you for your question. we heavily invest in making sure that all of our users, and particularly kids on the
platform, have a safe experience. the way we do this is with a number of levers. for example, we have content policies as they relate to child safety on the platform, such as not allowing videos that put minors in risky situations. sen. blackburn: let me jump in, then. if you do not want to put children into risky videos, there is a world of self-harm content on your site. a few searches come up with videos, and i am quoting from searches that we have done: songs to slit your wrists by. vertical slit wrists. how to slit your wrist. and painless ways to commit suicide. now, that last video, painless ways, was age-gated.
but do the self-harm and suicide videos violate youtube's content guidelines, if you are saying you have these guidelines? ms. miller: senator, i would certainly like to follow up with you on the videos you may be referencing. we absolutely prohibit content regarding suicide or self-harm. sen. blackburn: i have to tell you, we have pulled these down in our office. our team has worked on this, because i think it is imperative that we take the steps that are necessary to prevent children and teens from seeing this content. i just cannot imagine that you all are continuing to allow children to figure out how to do this on your site, how to carry out self-harm. so, why don't you follow up with me with more detail; i would like the response in writing. i also talked to a film producer friend this morning about the film
trailer for i am not ashamed, which was based on the story of rachel scott, who was the first victim of the columbine attack. the film focused on her faith and how it helped her in her life. so why would you remove this film trailer and block its distributor from your site? and you did this for 11 months. you did not put the trailer back up until the hollywood reporter called and asked, why have you done this? do you have an answer on that one? ms. miller: senator, i am sorry, but i am not familiar with that specific case. sen. blackburn: ok, then let's review this, and we will submit
the documentation that was sent over to me. ms. stout, over to you. we had an issue in memphis with a 48-year-old man who was on your site. he reached out to a 16-year-old memphis teenager, claimed to be a music producer, and lured her into a relationship. one recent news article called snapchat the app of choice for sexual predators. this is something that is of tremendous concern. much of it, from what i understand from talking to moms and grandmoms, involves the snap map location service. i know you are probably going to say that only your friends can follow you, but somehow people are getting around that, and these sexual predators, who are following young people, are using this map to get to their location. so, we had this in memphis with the rape. we have another child in the middle part of the state
whom a predator followed, and she tried to commit suicide because she knew her family was going to find out. so are you taking steps? do you want to give me a written answer as to the steps you all are taking? how are you going to get a handle on this? this is endangering young women. ms. stout: senator, i am more than happy to give you a detailed written answer. i appreciate your leadership in following up on this issue. i want to make crystal clear that the exploitation of minors is deplorable, and we work to prevent it from happening on our platform. with respect to the map, yes, indeed, appearing on the map is off by default, for everyone, not only for minors. so in order to appear to someone and share your location, you must be bidirectional friends with that person.
i would also say, with respect to grooming, this is an area where we spend time and resources on prevention. snapchat makes it intentionally difficult for strangers to find people they do not know. we do not have open profiles or browsable pictures. strangers do not have the ability to see who people's friends are or where they go to school. i would be more than happy to give you more detail. sen. blackburn: let's get some more detail. and one question for all three of you, which you can answer in writing, if you choose. you have all talked about the research work that you do. do you get parental consent when doing research on children? and can you provide us a copy of the parental consent form? we asked facebook for this, and they punted the question repeatedly. and ms. miller, i think you need to provide the committee clarity. you said you had never spoken
out against online privacy. i think the chairman and i may question that a little bit. my question to you, and you can come back to us with more depth on this: did you fight it as part of the internet association, as they were fighting privacy legislation on your behalf? and just one thing to wrap up, mr. chairman, going back to mr. beckerman. there is confusion around the ownership of tiktok, with the parent company being bytedance; they have a financial stake in bytedance.
douyin has an interest. i checked, and i know that tiktok data is stored in the u.s. and singapore. until recently, the singapore data centers were run by alibaba, which is another chinese company. what we need from you, mr. beckerman, is clarity on ownership and on the sharing processes around u.s. user data, especially the information of our children. with that, i will yield back. thank you, mr. chairman. >> thank you, senator blackburn. we are going to finish the first round with senator sullivan. i am going to go vote and will return as the senator finishes. in the meantime, senator
blackburn will preside. thank you. sen. sullivan: thank you, mr. chairman. mr. beckerman, i know there are concerns about the sharing of data with the chinese communist party, given the ownership, or at least board influence. senator blackburn was just talking about that, and i know senator cruz raised the issue. i want to raise a related subject. it is what i refer to as kowtow capitalism, and i think that you are exhibit a of kowtow capitalism. what is that? it is american executives of american companies censoring americans' first amendment rights in america so as not to offend the chinese government and/or to gain access to the chinese market. we see it on wall street and in
the nba. we see it in the movie industry. let me ask a couple of questions regarding that. a tiktok user could put up a video that criticizes the chairman of this committee, the ranking member, any senator, president biden, or former president trump, couldn't they? not some horrible violent suggestion, but just a criticism of an elected official. is that correct? mr. beckerman: actually, tiktok doesn't allow political ads, but political content like that would not be a violation of our community guidelines, as long as it is not disinformation or something hateful. sen. sullivan: good, that's free speech. that is free speech.
could a tiktok user put up a video criticizing xi jinping? he has been compared to winnie the pooh. could a tiktok user reference him as winnie the pooh? i don't know why he doesn't like winnie the pooh. could a tiktok user do that? mr. beckerman: yes, senator. community guidelines are written by our team in california, and the moderation is done here. sen. sullivan: ok. in 2019, tiktok admitted to censoring videos regarding tibetan independence. can a tiktok user mention tibetan independence? mr. beckerman: yes, senator.
sen. sullivan: what happened in 2019? mr. beckerman: i can assure you it is not a violation of the community guidelines and would be allowed on the platform. sen. sullivan: i believe a tiktok public policy director in 2020 admitted that tiktok had previously censored information about forced labor. is that true? mr. beckerman: that would not be in violation of community guidelines, and it would be permitted. sen. sullivan: so you are saying that your videos have never been censored by the chinese communist party on any matter? i am just reading these, and maybe they are all wrong. mr. beckerman: i can assure you that our content moderation teams are led by americans. our community guidelines are public and transparent, and content that is critical of any government, frankly, as long as it is not disinformation or hateful speech, would be allowed. i would encourage
you to search for a number of the examples you mentioned today on tiktok, and i am sure you could find them. sen. sullivan: so the tibetan independence and the muslim forced labor issues are not censored? mr. beckerman: not currently. sen. sullivan: were they previously? i am reading here that in 2019 and 2020, somebody admitted doing that. mr. beckerman: i am not aware of that, and that is not how we moderate content. sen. sullivan: listen, i do think this issue of kowtow capitalism, where american companies are censoring americans, which we see all the time, is an issue we should be looking at, because the chinese communist party, of course, can crush freedom of speech in their own country, but they should not be able to crush it in this country.
and mr. beckerman, i am glad that you are denying any of it. i look forward to seeing videos that are negative against the chinese communist party. i am not really holding my breath, but maybe it is true. maybe they are fine with videos criticizing xi jinping and other members of the communist party. so you are saying that is fine and acceptable policy for tiktok users? mr. beckerman: yes, and just to be clear, there is no involvement from the chinese communist party in tiktok moderation; it is all done in the united states. sen. sullivan: thank you. sen. blackburn: thank you, senator sullivan. we are going to continue looking at these issues of the chinese communist party's influence in u.s. technology and in
the silencing and sense urining of free speech of u.s. citizens online. >> great, it's very important, i appreciate that. senator, you are recognizeeds. >> thank you. i will start directly with questions and if i have time left, i would like to read a statement into the record. i will start with a question for ms. miller. youtube has implemented several features, such as auto play, that make the platform difficult to stop using. what mechanisms has you to employee to make sure that children, specifically, can counteract these design decisions? and do you believe those controls are sufficient? >> senator, thank you for your question. autoplay's default off on youtube supervised experiences on youtube.
we have set those to default off. we do allow the default to be changed to allow for autoplay, for example if a family is in the car and the parents have decided they want autoplay to continue. but we have set it to a default of off. >> do you believe that's sufficient? >> i think it's one of a number of tools that are important to make sure that kids have a healthy and safe experience on the platform. another is that we do not deliver targeted advertising on youtube kids. another is that we age-gate content to make sure that minors do not see age-inappropriate material. it's only with a number of tools and protocols in place that we think we are meeting the bar we have set for ourselves and that parents expect of us. experts in the fields of child development advise us to make sure that kids have a safe experience on the platform.
the keystroke feature isn't collecting what people are typing. it is an anti-fraud and anti-spam measure that measures the cadence of typing, because a bot behaves differently than a human. sen. lummis: which companies that you are aware of collect keystroke information? mr. beckerman: i would refer you to facebook and instagram, for example. >> well, i'll ask the same questions of them, thank you. are your platforms designed to keep users engaged as long as possible? do you want to start, mr. beckerman?
mr. beckerman: senator, we want to make sure people are having an entertaining experience, like tv or movies. tiktok is meant to be entertaining. we have take-a-break videos and time management tools. and family pairing is another tool that parents can use to limit the time spent on the app. >> but is the length of engagement a metric that your company uses to define success? mr. beckerman: there are multiple definitions of success, senator, and it's not just how much time someone is spending on the app. >> is length of engagement one of the metrics? >> overall engagement is more important. >> is it one of the metrics? >> it's a metric that many platforms check on. >> thank you. ms. stout, same question. are your platforms designed to keep users engaged as long as possible? ms. stout: when you open snapchat, you don't open to a feed designed to pull you into more and more content. you start with a blank camera or canvas where users come to talk to their friends in videos and pictures. >> but is it a metric that the company incorporates into your definition of success? >> i believe we see success when the platform is facilitating conversations. snapchat is a place where people come to talk to their friends. >> but is it a metric? my question is, do you measure success in any way, shape, or form by how long people stay on your site? is that one of multiple driving measures of success? >> i think the way i can answer that question is, yes, it is one of many metrics. >> ok, thanks. ms. miller, same question. are your platforms designed to keep users engaged as long as possible? ms. miller: senator, our platforms are designed to allow users to search and discover all types of content. it is intended for them to have an enjoyable experience. >> i get it, but i'm asking is this one of the metrics by which you define success?
>> senator, we have a number of digital well-being tools designed -- >> is this one of the metrics? there could be numerous metrics, but is this one of them? >> yes, to the specific question you're asking, we do look, for example, at whether a video is watched through its entirety. that helps us determine if it's a quality video relative to the search the user made. we look at those data points to inform us as it relates to the experience the user has had on the platform. >> thank you. madame chairman, do i have time to enter an opening statement? thank you. this era of children will grow up under a level of surveillance well beyond any previous generation. the wall street journal reports focused on the problematic practices of facebook, but we know that the problem is that children are impressionable. children are manipulated by advertising targeted to them and readily influenced by highly sophisticated algorithms. these invisible algorithms continually nudge our children in different directions, which can impact their development without their knowledge and without their parents' knowledge. these algorithms on these platforms were designed with adults in mind, not children. only a tiny fraction of children understand the harm that can come from sharing sensitive information, pictures or opinions that become a part of their permanent digital record. but what is most alarming is that none of them can fully understand how the content fed to them by algorithms will shape their worldview during these formative years.
so more must be done to promote responsible social media. we must educate parents on how to teach their children to avoid the pitfalls of using these platforms, and more importantly, we must hold these platforms accountable for the effects their design decisions have on our children. mr. chairman, thank you for the opportunity to add that opening statement to the record and thank you for your indulgence. i yield back. >> thank you, and thank you for your leadership on these issues and your insightful questions. we do have to protect kids in our country. you just put your finger on it. let me ask this of everyone on the panel here today. kids are constantly posting content and their data is being tracked, stored and monetized, but we know young users lack the cognitive ability to grasp that their posts are going to live online forever. to each of our witnesses, do you agree that congress should give children and teenagers, and more importantly, their parents the ability, the right to erase their online data? mr. beckerman? >> yes, senator. >> ms. stout? ms. stout: yes, we do, senator, but i must say that content on snapchat does not appear permanently. >> i appreciate that. to you, ms. miller. >> yes, senator, and children have the ability to delete their information as well as having auto-delete tools.
>> so you agree that they should have the right to delete it? >> yes, senator. >> today apps collect troves of information about kids that have nothing to do with the app's service. one gaming app that allows children to race cartoon cars with animal drivers has amassed huge amounts of kids' data unrelated to the game, including location and browsing history. why do apps gobble up as much information as they can about kids? it is to make money. congress, in my opinion, has to step in and correct this harmful misuse of data. ms. miller, do you agree that platforms should stop data collection that has nothing to do with fulfilling the app's service? ms. miller: senator, we do limit the data that we collect, and this is particularly true for the youtube kids app. it only relies on what is necessary to make sure the platform works. >> so do you agree that that should become a law, that all platforms have to do the same thing? >> i don't want to speak to whether or not it should become a law and/or the details of any proposed legislation, but at youtube we have not waited for a law to make sure we have these protections. >> i appreciate that. but it's time to make up your minds. yes or no on legislation. mr. beckerman. mr. beckerman: yes, senator, we do need legislation. i think we are overdue for strong national privacy laws. >> great, ms. stout? ms. stout: yes, senator, we absolutely agree, and collection of data that is irrelevant to the performance of the app does not appear to be within scope. >> today popular influencers peddle products online while
they flaunt their lavish lifestyles to young users. online child celebrities open new toys, getting millions of views, and they are inherently manipulative to young kids, who often cannot tell that the products their heroes are pushing are paid advertisements, that their hero is getting a monetary kickback. my bill with senator blumenthal, the kids act, would ban this kind of influencer marketing to kids. to each of the witnesses, do you agree that congress should pass legislation to stop influencer marketing to children in our country? yes or no, ms. miller. ms. miller: senator, again, we've actually moved in this direction, whereby we have a set of quality principles regarding the type of content that is made available on the youtube kids app. in so doing we make sure that we are not providing content that would have a significant prevalence of that type of material, and we limit the types of ads that can be delivered. i don't have the details. >> should we make it illegal? so that people out there who might be trying to influence children know there is an enforceable penalty? ms. miller: i absolutely think it's worth a discussion. we need to see the details of such a bill. >> it's been around for a long time. mr. beckerman? >> yes, senator, we already limit the kinds of advertisements, but we agree there should be additional transparency and privacy laws. >> ms. stout? >> senator, i would agree that for young people, i think there could be additional protections placed. >> by the way, we ban it on television because we know we can't have a hero holding a product. thank you. and finally, push alerts. studies show that 70% of teenagers report checking social media multiple times a day. excessive use of social media has been linked to depression, anxiety, and feelings of isolation. the last thing apps should be doing is using methods like push notifications, automated messages that nudge users to open an app, to make young users spend even more time online. to each of the witnesses, do you agree congress should pass, again, the law that the senator and i are trying to move, which would ban push alerts for children?
ms. miller? ms. miller: i agree additional protections should be in place. >> thank you. mr. beckerman? >> we already limit push notifications. >> should we ban them? >> i think that would be appropriate. we've already done that. >> ms. stout? >> snapchat does not use those kinds of notifications, consistent with the uk age-appropriate design code. we would be in agreement with that legislation. thank you. thank you, mr. chairman. >> thank you, senator markey. i understand senator klobuchar has a few more questions, and while we're waiting for her, i have a few too. so i appreciate your patience. let me begin by acknowledging -- and i think you would acknowledge as well -- the reason that you've made many of the changes you have described is the u.k.'s child safety law, the age-appropriate design code. i think we need an american version of the british child safety law, and i want to ask about some of its provisions. will you commit to supporting a child safety law that obligates companies to act in the best interests of children? and establishes, in effect, a duty of care that could be legally enforceable? ms. stout? ms. stout: senator, we were very
privileged to be able to work with the information commissioner's office in the uk on their design of the age-appropriate design code. we, of course, complied with the code as it came into force this year, and i mentioned we are looking actively at that code to see how we can apply it outside the u.k. market to many of our other markets. so with respect to a child safety law that obligates companies to think about the protection and safety of children, that is something snap has done without regulation, and we would be happy to work with the committee. >> but the point is, we would love to be in a world where we could rely on voluntary actions, but in effect, you have sacrificed that claim to voluntary action. whether it's facebook or your companies in various ways, i think you have shown that we can't trust big tech to police itself. so when you say we already do it, well, you may think so, but there's no legal obligation to do it and no way to hold you accountable under current law. that's what we need, and that's why i'm asking you about a law. i'm hoping that you would support it, that your answer is yes, that you would support a duty of care as a matter of law. >> yes, senator, and that was very much a part of my opening statement. because of the time it takes regulation to be implemented, we don't believe we should have to wait for that regulation. we will take the voluntary steps to best protect our community. mr. beckerman: we have voluntarily implemented much of the age-appropriate design code here in the united states. i agree that companies can do more, and i was struck by your comments in your opening statement about a race to the top. that is the approach we are trying to take, to do more and go above and beyond, and to be a place where we are putting the wellness and safety of teenagers ahead of other platforms. >> let me see if i can answer the question as i would if i were in your position: yes, we strongly and enthusiastically support that kind of child safety law. we are already doing more than we would need to do under the law. mr. beckerman: yes, senator, and i do think provisions relating to age are something that should be included in measures like that
>> ms. miller? >> senator, i respectfully disagree that we only wait until we have legal obligations to put systems and practices in place. for example, we rolled out youtube kids in 2015 to make sure that as kids were trying to be on the main platform, we created a space that was particularly for them and their safety. we've rolled out a number of -- >> i'm going to interrupt you, ms. miller, because i think you misinterpret me. i'm not suggesting you wait, i'm suggesting you do it now. i think mr. beckerman and ms. stout perfectly understood my question. i think both of them would support that law. would you, yes or no? >> i would support looking at any details as it relates to additional legal protections for kids in the united states. as you may note, the age-appropriate design code went into effect in the uk over a month ago, so it's still early days, but a number of the protections required in the u.k., we have rolled them out globally. >> is that a yes or a no? ms. miller: yes, i would be happy to support a -- >> would you support a version of the u.k. child safety law? ms. miller: i would need to see the details of any specific bill, but i certainly support expansions of child safety protections. >> i will yield to the senator. sen. klobuchar: thank you. thank you for taking me remotely for the second round. one of the things i've tried to do with all of these hearings,
including in the judiciary antitrust subcommittee that i chair, is taking the veil off this idea that this is just the web, everyone has fun and it's just a cool thing. some of that's true, but it's also a huge profit-making venture, and when you look at it that way, these are some of the most successful and biggest companies the world has ever known, the big tech platforms, in terms of money. i just start with this question, which i have asked many platforms, the larger platforms. ms. stout, snap's reported revenue per user in north america for the second quarter of 2021 was $7.37. how much of your revenue came from users under the age of 18? ms. stout: i don't have that information for you, but i'm happy to take that back. sen. klobuchar: i appreciate that. i have been trying to get that from a number of platforms. just to give you a sense, facebook's revenue per user, from their own documents, is $51 per user in the u.s. per quarter. just to put it in some perspective. mr. beckerman, tiktok is a privately held company, so we don't have public documents on your advertising revenue per
user. what is your best estimate for the last quarter? >> i don't have those numbers, but i'd be happy to go back and check on those. >> do you think you can provide us with that? >> again, we're not a public company, but i'll go back and see what we can find for you. >> ok. again, i'm trying to figure out the users under age 18, how much of the business and future growth of the business is in kids. ms. miller, youtube reported that its advertising revenue overall in the second quarter of 2021 was $7 billion. how much of youtube's revenue came from users under the age of 18? ms. miller: senator, i don't know the answer
to that, and i'm not sure if we look at data internally that way. i would be happy to follow up with you. i would like to note that as a company, we have long shared our revenue with our creators; over the last three years we have paid out more than $30 billion to our creators. sen. klobuchar: ok. my work with the antitrust subcommittee takes me down related paths, and i recently introduced the bipartisan american innovation and choice online act with senator grassley and others, including senator blumenthal, who are co-sponsors of that legislation. it's focused on a gnarly problem, which is that you have platforms that are self-preferencing their own stuff at the top and taking, in some cases, data that they uniquely have on other products and making knockoff products and underpricing the competitors. ms. miller, they say youtube has made unfair demands in negotiations for carrying the app on roku, including preferencing youtube in the search results, which is exactly what this legislation addresses, and giving youtube access to nonpublic data from roku users. did youtube make these demands? >> senator, i am not involved in the negotiations with roku. i know we've been having discussions with them for several months and we are trying to come to a resolution, but i'm not involved in the negotiations. sen. klobuchar: ok, i'll put this on the record for others, because i would also like to know more generally if youtube has ever demanded nonpublic data or preferences in search results in negotiations with other providers. it gives you a sense of the
dominant platform, by far, in the area of search, and being able to use that power over people who are simply trying to be on that platform. i also had a follow-up on youtube banning all vaccine misinformation, which i commended you on at the time. how much content have you removed since you banned all vaccine misinformation? have you seen a change in the viewership rate? >> senator, i feel bad, this is a question i would absolutely love to answer, but i know that we have removed -- reviewed so much video as it relates to covid-19. sen. klobuchar: we'll put it in writing. my last question, for ms. stout and mr. beckerman: i just mentioned this bill that we have introduced with 10 other cosponsors aimed at ensuring dominant digital platforms don't use their market power -- do you support some of the competition reforms in the bill, and have you faced any challenges when it comes to competing with the largest digital platforms? that will be my last question. >> senator, we are aware of the bill you introduced and, as a non-dominant platform, very much appreciate your legislation and the work that you are doing. as a smaller platform, it is an incredibly competitive arena. we compete every day with companies that collect more information on users and store that information to monetize. any effort that this body has undertaken, and especially the legislation that you have undertaken, to create an equal playing field so it is a competitive atmosphere, we would very much welcome. sen. klobuchar: thanks. mr. beckerman? mr. beckerman: likewise, we appreciate your efforts and the work you have done to promote and to spur competition. that is something that we all benefit from. as it relates to specific challenges we face, i would be happy to discuss those issues in detail. sen. klobuchar: thank you very much. thank you, everybody. >> senator, i found the answer, if you don't mind. we've removed over a million videos since we started rolling out our covid misinformation policies, and over 130,000 videos as it relates to vaccine misinformation. it is an area we put a lot of effort behind. thank you. sen. klobuchar: ok, thank you.
thanks, everybody. >> thank you, senator klobuchar. i seem to be the last person standing, or sitting, and i do have some questions. ms. stout and mr. beckerman, over 10 million teens use your apps. these are extremely impressionable young people, and they are a highly lucrative market. and you've made tens of, probably hundreds of millions from them. there is a term, snapchat dysmorphia. kids use the filters that are offered and create destructive, harmful expectations. have you studied the impact of these filters on teens' mental health? i assume you study the impact of these filters before you put them in front of kids. ms. stout? ms. stout: senator, the technology that you're referring to is what we call lenses, and these lenses are augmented reality filters that we allow users who choose to use them to apply over top of selfies. for those who use them, they have the opportunity to put on a dog face or -- >> i've seen them, but they also potentially change one's appearance to make them thinner, different colors, different skin tones. ms. stout: so these filters are created both by snapchat and our
creator community. there are over 5 million of these augmented reality filters, and a very small percentage of those filters are what you would call beautification filters. most of them are silly, fun, entertaining filters that people use to lower the barrier of conversation. because, again, when you're using snapchat, you're not posting anything permanently; you are using those filters in a private way to exchange a text or a video message with a friend. so it really kind of creates this fun, authentic ability to communicate with your friends, and those filters are one of the many ways in which friends love to communicate with each other. >> do you study the impact on kids before you offer them? have you studied them? ms. stout: we do a considerable amount of research on our products, and one of the -- i think it was one of the competitive pieces of research that was revealed at your earlier hearing shows that filters or lenses are intended to be fun and silly. it's not about -- >> we all know as parents, something intended to be fun and silly can easily become something that is dangerous and depressing. that is the simple fact about these filters. not all of them, maybe not the majority of them, but some of them. and i'd like you to provide the research that you've done, and i will ask for other research, but i'm gathering from what you said that you don't do the research before you provide them? ms. stout: no, senator, that's not what i said, and particularly with respect to this question, i'm not able to speak to the kind of research because i'm not aware of it. i will go back and look and try to get you an answer to your question. >> you know, for eight years,
snapchat had a speed filter. it allowed users to add their speeds to videos, as you know. the result was that it encouraged teens to race their cars at reckless speeds. there were several fatal crashes associated with teens using the speed filter. it took years and multiple deaths for snapchat to finally remove that dangerous speed filter. it was silly, it was maybe fun for some people. it was catastrophic for others. and i want to raise what happened to carson bride.
carson was relentlessly bullied through anonymous apps. after trying desperately to stop the abuse, and pleas for help, he took his own life. silly? fun? how can parents protect kids from relentless bullying that, as they said so movingly, no longer stops at the schoolhouse door? 24/7, it comes into their homes, just before they go to sleep. what are you doing to stop bullying on snapchat? ms. stout: senator, this is an incredibly moving issue for me as a parent as well. bullying is unfortunately something we see more and more happening to our kids, and this is not just something they face at school. they face it at home. we have zero tolerance for bullying or harassment of any kind, and we see this as a responsibility to get in front of this issue and do everything we can to stop it.
so again, because snapchat is designed differently, you don't have the same opportunities for public abuse on snapchat. you don't have public permanent posts where people can like or comment, thereby introducing additional opportunities for public bullying or public shaming. that's not to say that bullying doesn't happen, both online and offline. so we have reporting tools where users can anonymously and quickly report bullying or any other harmful activity. and i would say the trust and safety teams that work around the clock work to remove the content on average in less than two hours; usually it is far more quickly than that. i want to assure you, and thank you for raising the bullying issue: in addition to combating this practice, we do a lot to prevent it, and we think there should be more opportunities to raise awareness of the effects of bullying and the effects of words on other people, and we will continue to make that commitment. >> we've heard from tiktok that it is a safe environment. at the same time we see challenges. the blackout challenge, just to take one example, where, in effect, teens and children have actually died emulating and recording themselves following blackout and choking challenges, including a 9-year-old in tennessee. despite apparent efforts to discourage certain dangerous trends, the content is still being posted and being viewed widely by kids online, and followed and
emulated, whether it was destruction of school property or other kinds of challenges. the mother who lost her child from choking shared questions with me, and they are questions that deserve answers from you and others who are here today: how can parents be confident that tiktok, snapchat and youtube will not continue to host and push dangerous and deadly challenges to our kids? >> thank you, senator. as it relates to this specific challenge, it is particularly sad and tragic. i remember when i was in grade school, i had a classmate that we lost to something very similar. i know it is something that touches many of our lives. it is awful. as it relates to tiktok, this is not content we have been able to find on our platform. it is not content we would allow on our platform. it is important that we all remain vigilant to ensure that things that are dangerous or even deadly for teenagers don't find their way onto platforms. it is important we all have conversations with our teenagers to make sure they stay safe offline and online. it is a responsibility we take very seriously. we want to distinguish what content is actually on platforms. we have not been able to find any evidence of a blackout challenge on tiktok at all. it would violate our guidelines. we have found absolutely no evidence of it. it is something we have had conversations with parents about. >> other challenges, you're saying none of them? >> anything that is illegal or dangerous violates our guidelines, and as things pop up on the platform, over 94% of violative content is removed proactively. dangerous challenges have no place on tiktok. >> you react by taking them down, but they exist for the time that they are there. what can you do to prevent those challenges from being there in the first place? >> we are often able to be proactive in blocking --
we block content and remove content. something we have seen recently is press reports about alleged challenges where, when fact checkers like snopes and other independent organizations look into them, they find that these never existed on tiktok in the first place. they were hoaxes that originated on other platforms. we need to look at the facts and see what content actually exists rather than spreading rumors about alleged challenges. >> we found pass-out videos. we found them. let me ask you about another instance.
my staff were inundated with tiktok videos about eating disorders. my staff made a tiktok account and spent hours scrolling through the endless videos. tiktok began by showing us videos that we cannot show in this room because they were so explicit. another tiktok account was flooded with nothing but sexually obscene videos. how do you explain to parents why tiktok is inundating their kids with videos of self-injury and eating disorders? >> i can't speak to what the examples were from your staff, but i can assure you that is not the normal experience that teens or people that use tiktok would get. that kind of content is removed. >> would you support limits on features that lead to addictive use? >> we have already done a number of things proactively, in the form of take-a-break videos.
it is important that we look at these issues and foster these conversations with our teenagers, and that's one of the reasons we built family pairing. i know it can be daunting for parents of teenagers to have these different tools, and we built this in a way where parents can have additional control over the safety and privacy settings for teenagers. >> i have to go vote and i don't have anyone to preside here for me, so i suggest taking a five-minute recess. i will have some final questions if you would be willing to wait. then we can close here. thank you.