
tv   House Hearing on Internet Safety  CSPAN  December 10, 2021 12:03pm-3:25pm EST

12:03 pm
watch tv every sunday on c-span2 and find a full schedule on your program guide or watch online anytime. >> up next, a hearing on internet safety. a house energy and commerce subcommittee on consumer protection heard from experts about the dangers social media platforms pose to children and big tech companies' unwillingness to fix the problem. this is about three hours, 20 minutes. [inaudible conversations]
12:04 pm
[inaudible conversations] >> members, if you could take your seats, i want to begin. okay. to give our digital team some notice, i am going to count down from five before calling the hearing to order -- five, four, three, two, one. the subcommittee on consumer protection and commerce will now come to order. today we will be holding a hearing entitled "holding big
12:05 pm
tech accountable: legislation to build a safer internet." due to the covid-19 pandemic, members can participate in today's hearing either in person or remotely via online videoconference. members participating in person must wear masks. such members may remove their masks when they are under recognition and speaking from a microphone. staff and press who are present in the committee room must wear a mask at all times, and for
12:06 pm
members participating remotely, your microphones will be set on mute for the purpose of eliminating inadvertent background noise. you will need to unmute your microphones each time that you wish to speak. please note that once you are unmuted, anything that you may say will be audible on webex, could be heard over the loudspeaker in the committee room, and is subject to being heard on the livestream and c-span. since members are participating from different locations, we're going to order the members by seniority
12:07 pm
within the subcommittee. documents for the record can be sent to the e-mail address that we have provided to the staff, and all documents will be entered into the record at the conclusion of the meeting. we will begin at this point with opening statements of five minutes by the members, and the chair now recognizes herself for five minutes. bottom line, the internet is not living up to its promise. at its birth in the previous century, the internet promised more social connections, new communities and experiences, and more economic opportunities.
12:08 pm
but these benefits have come with very steep costs. today's internet is harming our children, our society, and our democracy. five years ago at age 13 anastasia vlasova joined instagram which quickly flooded her account with images of perfect bodies and perfect lives. she soon was spending three hours a day on the app and developed an eating disorder. despite public outcry, reporting yesterday confirmed that instagram is still promoting
12:09 pm
pro-anorexia accounts to teens. ms. vlasova eventually quit using instagram, but millions of children and teens remain powerless against its addictive and manipulative algorithms and ads. on january 6th, dc police officer michael fanone was grabbed, beaten, and tased, all while being called a traitor to his country. the deadly insurrection was coordinated on platforms like facebook and exacerbated by
12:10 pm
algorithms that amplified election disinformation. for too long, big tech has acted without any real accountability. instead, they give us apologies and excuses, but the time for self-regulation is over. today we will be discussing a number of pieces of legislation that will build a safer internet. last week i introduced the ftc whistleblower act with my colleague representative trahan. this bill protects from retaliation current and former employees who blow the whistle to the federal trade commission, and it
12:11 pm
incentivizes the disclosure of unlawful activity. it's a critical step toward a safer internet. the algorithmic justice and online platform transparency act from representative matsui prohibits algorithms that discriminate against certain consumers. the kids act from representatives castor, clarke, trahan, and wexton bans online practices that exploit young people. the social media data act from representatives trahan and castor provides transparency into how digital ads target
12:12 pm
consumers. the bipartisan detour act from representatives blunt-rochester and gonzalez prohibits large online platforms from using dark patterns to trick consumers. this subcommittee can create an internet that is better and safer, that protects consumers and our children, that is transparent, and that holds bad actors accountable. and with that i want to give a hearty welcome and a thank you to this wonderful panel that is here, including one who i guess is with us remotely. thank you very much. the chair now recognizes the ranking
12:13 pm
member, my friend, mr. bilirakis, the ranking member of the subcommittee, for his five minutes of opening statement. >> thank you, madam chair. i appreciate it so very much. good morning to everyone. i want to thank my colleagues for their interest in improving transparency and increasing protections online. there are a lot of initiatives under consideration today, and all of them raise issues that deserve our attention. legislation brought forth by my friends in the majority would require the ftc to issue new rules and regulations and would grant the ftc additional enforcement tools to reduce dark patterns, discriminatory algorithms, as you said, madam chair, and harmful content directed at children, and would also grant new rights for consumers to take control of their data. i
12:14 pm
hope that means this is a precursor and not a substitute, and we have discussed this with the chairperson, for passing a national privacy and data security law. that is the best and most comprehensive way colleagues can protect our constituents through these means. that's my opinion. i think many of the issues we will be discussing today can and should be part of that larger privacy and data security discussion, and i sincerely hope my colleagues will join me in that effort. i will say to my fellow colleagues that my door is always open, and we have a great relationship with the chairperson. please don't hesitate to come and talk to me and give us some input on this particular issue. last month republican leader rodgers released draft legislative language for the control our data act, or
12:15 pm
coda, which would create one national standard for privacy and data security, establish clear rules of the road for businesses to be able to comply, and give every american equal data protections regardless of the location of their home. i for one sorely want to see rules that are clear and easy to understand for my constituents, and i'm sure you do, too. i also want to ensure that the ftc bureau of privacy that was included in our proposal has the appropriate staff and resources to enforce the national law. i hope the panel agrees today that there are elements of all these bills that can be incorporated in some fashion in this framework to ensure we leave behind a legacy that will benefit every american. that's the goal. we must also take seriously the threat from china, and moving
12:16 pm
forward on these two bills today is an important step towards holding them accountable. the legislation before us will provide americans with greater transparency into the applications and websites they use online. h.r. 3991, the tell act, led by representative duncan, would inform users if their information is stored in china and whether the information is accessible by the ccp or a chinese state-owned entity. h.r. 4000, the internet application id act, led by representative kinzinger, would require websites and distributors of mobile applications that are located in china or owned by the ccp to disclose that location or ownership to users. both bills are very reasonable
12:17 pm
as far as i'm concerned. for those asking whether we invited a company with ties to china to share its views in today's hearing, you should know we absolutely did. we used one of our witness slots to invite tiktok to testify, but, unfortunately, they declined the invitation. madam chair, i hope we can work together to invite them before the subcommittee in the near future, just as senators blumenthal and blackburn did in the senate. there were many questions left unanswered in that senate hearing last month on the stewardship of the platform, and i'm confident the panel today can shed light on our shared concerns. thank you so very much for being here. there are very important matters our subcommittee is examining today, so i thank the chair for holding this hearing again, and i thank the
12:18 pm
full committee ranking member, and the witnesses again for being here today. we really appreciate it. i look forward to your testimony on these bills and other proposals we have publicly circulated for this committee's review, and i yield back. thank you. >> thank you, mr. bilirakis. before i invite the chairman and ranking member of the full committee for their opening statements, let me just say i'm very excited and optimistic. we have had a real good history of working together in this subcommittee to get legislation not only introduced but passed, and in the last week we also sent you a proffer on a privacy bill. again, i'm very confident that we are going to work together and get that done. and i agree with the urgency
12:19 pm
that you're projecting today and share it with you and look forward to moving ahead rapidly. now let me recognize the great chair of this full committee, frank pallone, for his opening statement. >> thank you, chairwoman schakowsky. today's hearing is the second of two hearings on legislative reforms to hold social media companies accountable. following last week's hearing examining possible reforms to section 230 of the communications decency act, today's panel will discuss consumer protection-focused legislation that aims to hold these companies accountable by enhancing transparency and promoting online safety. these legislative hearings come after years of repeated, bipartisan calls for online platforms to change their ways. unfortunately, instead of meaningfully addressing the serious harms that these platforms can inflict on the american people and our children, social media companies continue to make minor changes only after negative press coverage or in preparation for
12:20 pm
an executive testifying before congress. they also refuse to become more transparent. in fact, we only actually learn what is really going on inside these massive corporations when a whistleblower steps forward, and those courageous actions are becoming exceedingly difficult. even more disturbing, we are now seeing instances where these platforms are publicly shutting down efforts at transparency. since these companies are clearly not going to change on their own, congress must act. today, we will discuss seven bills that target different parts of the social media ecosystem to make platforms safer for users. one of the best ways to make these companies more accountable is to make them more transparent. we will discuss legislation that grants academic researchers and the federal trade commission access to ad libraries which will help us get the data we need on how these companies are targeting users. another bill will prohibit the use of algorithms that discriminate based on race, age, gender, ability, and other protected characteristics or
12:21 pm
methods that manipulate users into providing consent when they wouldn't otherwise. this legislation will help prevent people using social media from losing rights protected under the law. we're considering a bill that will protect whistleblowers, like former facebook employee frances haugen who testified at last week's legislative hearing. whistleblowers help bring truth to light and are another way of helping ensure that companies are held accountable. finally, we'll examine how to better protect our children online by banning certain design features directed at children and prohibiting the amplification of harmful content that is targeted at them. legislative measures that protect our children are critically important and have bipartisan support on this committee. republicans and democrats also agree that we do not want to see our data or our children's data surveilled or used in a manner that could risk our safety. that is why we are also discussing bills that attempt to force websites and apps to be transparent about their
12:22 pm
interactions with china. we all understand the danger the chinese government poses to the u.s. economy and national security, and we must take meaningful steps to address that danger. after multiple hearings, letters, and discussions with stakeholders, the members of this committee have developed legislation to address the harms caused by big tech. there is no silver bullet to fix the internet. the proposals that we are discussing today are important steps to improving the online ecosystem. another part of tech accountability is protecting people's privacy. the chairwoman already mentioned that, as she is so much involved with it. i think every member of this committee agrees that more must be done on privacy, and that's why we have been working since last congress on a bipartisan staff discussion draft. updates to that draft were made last week to address stakeholder feedback and have been shared with the minority. i continue to believe that there is a bipartisan path forward on privacy and our work continues
12:23 pm
to get there, but today we are focused on proposals to make these platforms more transparent and safer. i thank the witnesses for their testimony and look forward to the discussion. and i thank chairwoman schakowsky for being out front on so many of these issues, particularly the privacy issue, because i know you're determined, and i yield back. >> the gentleman yields back. now the chair recognizes ms. rodgers, the ranking member of the full committee, for five minutes for her opening statement. >> thank you, madam chair. and to our witnesses, thank you for being here. last week we discussed many examples of big tech companies failing to be good stewards of their platforms. they are using their power to censor americans, control what you see, and manipulate us through harmful algorithms. big tech must be held accountable, and that is why from day one this congress
12:24 pm
republicans have been exploring legislative solutions through our big tech accountability platform. as part of our platform we released a number of proposals focused on content moderation, transparency, and protecting our kids online, all issues that are relevant to today's hearing. my proposal, which i am leading alongside my good friend congressman jim jordan, narrowly amends section 230 to protect free speech. under our proposal, big tech will be held accountable for censoring constitutionally protected speech. big tech will no longer be able to exploit the ambiguity and the discretion we see in the current law. big tech will be more responsible for content they choose to amplify, promote, or suggest. big tech will be forced to be transparent about their content decisions, and conservatives will be empowered to challenge big tech censorship decisions. republican policies would hold big tech accountable for their moderation practices and
12:25 pm
encourage transparency on enforcement decisions, especially when it comes to illegal drugs, counterfeit and stolen products, terrorism, child pornography, trafficking, cyberbullying, and revenge pornography. we're also looking for new ways to improve cooperation with law enforcement while upholding our civil liberties. i am pleased to see some of these ideas presented today in the package that the democrats are leading on. it's unfortunate the majority decided not to use this hearing to discuss privacy, given that many of these bills include provisions directly related to the collection and use of data and would best be addressed in the context of a comprehensive privacy and data security framework. the proposals also include language related to protecting data from wrongful purposes, references to the children's online privacy protection act, or coppa, and data portability provisions. despite our interest in
12:26 pm
continuing our work from last congress on a bipartisan privacy framework, we have yet to have a hearing, let alone a markup. americans are desperate for a privacy and data security bill. it's difficult to address the goals discussed without that national privacy and data security framework. we can continue to talk, but we need a national privacy and data security bill. worse yet, the democrats' tax-and-spending-spree reconciliation package before the senate right now includes dramatic increases in funding and authority for the federal trade commission, the ftc, that never saw bipartisan consensus. the majority suggested this is a way to protect americans' personal data. that couldn't be further from the truth. it includes no privacy and data security framework to implement or enforce.
12:27 pm
these bills will add to the confusion in the marketplace by creating conflicting rules on how data is used, collected, and shared. this confusion only allows big tech to become more powerful, and it harms small businesses. the question i have today is, how do these bills fit into a comprehensive privacy and data security framework, like some of the proposals republicans have released publicly? let me also share another reason that i am concerned, which i think we all agree on, and that's the need for a national standard: big tech's troubling relationship with the chinese communist party is becoming more exposed. big tech has not been responsible with the data they have collected or who they share it with. i am pleased and grateful that the majority included two related bills today to help address that threat, one by mr. duncan and one by mr. kinzinger. big tech companies like tiktok have an incredible amount of
12:28 pm
access and control over our data and information supply chain. americans deserve to know if their personal information is safe and to what extent it is being accessed by the ccp. it is our duty to uphold american values like free speech and ensure that the united states of america continues to lead in cutting-edge technology to beat china. that starts by establishing a national privacy and data security framework and holding big tech accountable. i look forward to hearing from the witnesses today. i yield back, madam chair. >> the gentlelady yields back. i want to remind all members of the subcommittee that, pursuant to committee rules, all members' written opening statements shall be included and made part of the record. and now i would like to introduce our witnesses for
12:29 pm
today's hearing. jonathan greenblatt is the ceo and national director of the anti-defamation league. nathalie marechal -- let's see, i'm going to get it right -- is the senior policy and partnerships manager at ranking digital rights. rick lane is the ceo of iggy ventures. josh golin is the executive director of fairplay, and jessica rich is of counsel at
12:30 pm
kelley drye. and imran ahmed is the ceo of the center for countering digital hate. i will recognize each of you for five minutes, but first i want to explain the lights that are in front of you. just to make sure that you know, when your time begins, the light will be green. when there's one minute left there will be a yellow light, and hopefully at that point you will start wrapping up so that we can keep as close as we can to five minutes. and we will begin now with mr. greenblatt. you are now recognized for five
12:31 pm
minutes. >> thank you, madam chair schakowsky, ranking member bilirakis, and members of the subcommittee. good morning. it's a privilege for me to be here today. adl is the oldest anti-hate group in america. we have been fighting anti-semitism and all forms of bigotry for more than 100 years, and we've been tracking online hate since the days of dial-up. this work includes partnering with law enforcement to prevent online threats from mutating into off-line incidents. we work with authorities at all levels, and in the past 11 months we have provided the fbi with more than 1,000 actionable tips. our 25 offices across the country engage directly with individuals and institutions affected by hate. in 2017 adl launched the center for technology and society to double down on our efforts to fight online hate. we were the first civil rights
12:32 pm
group with an operation right in the heart of silicon valley, and it is staffed not by longtime nonprofit professionals but by software engineers, product managers, data scientists, and computer experts all hired from industry. we conduct analysis, publish research, build technology, and provide recommendations to policymakers like yourselves and industry leaders. today, there is no distinction between online and off-line lives. when we say that facebook is a front line in fighting hate, we mean that literally. we have seen over and over again the way that hateful content online leads to violence in our communities off-line. poway, california. el paso. pittsburgh. these shootings were motivated by extremist conspiracy theories
12:33 pm
that were spawned and spread on social media. in addition to these tragedies, online hate affects the everyday lives of millions of americans. our research has found that 41% of users report experiencing online hate and harassment. according to our most recent analysis, 75% of those harassed report that it happened to them on facebook. that's nearly three times the percentage on any other platform. and make no mistake, all of them are highly profitable companies. so this isn't a resource problem. it's a responsibility problem. just today adl released new research demonstrating how easy it is to find white supremacist, accelerationist content on instagram, less than 24 hours after the ceo
12:34 pm
sat at another table just like this and said they were cleaning up their mess. these platforms slack and neglect safety first and foremost because they are exempt from liability due to the loophole of section 230. now, i know that isn't the topic of today's hearing, but make no mistake, section 230 must be changed to force the companies to play by the same rules that every other media company on the landscape operates by today. this is not a matter of free speech. it is simply about being held accountable in courts of law when platforms aid and abet unlawful, even lethal conduct in service of their growth and revenue. tech companies are complicit in hate and violence on their platforms, because if it bleeds, it leads, and it feeds their business model. and the bottom line: hate speech, conspiracy theories, they are amplified by the algorithms, nudged to the
12:35 pm
top of the newsfeeds, and they addict users like a narcotic, driving engagement, which in turn increases profits. with no oversight and no incentives beyond increasing revenue, tech companies will continue to do whatever they can, whatever it takes, to optimize engagement regardless of the consequences. this just can't continue. if not for courageous whistleblowers like frances haugen, we wouldn't have hard evidence to prove that facebook knowingly, knowingly is mainstreaming extremism, inciting violence through its algorithms, and fracturing societies around the world. what if other tech employees felt empowered and protected to expose wrongdoing when they saw it? that's why the protections in ms. schakowsky's ftc whistleblower act are so
12:36 pm
crucial. until platforms have meaningful motivation to fix their harmful algorithms and identify hate, they won't do it. that's why the algorithmic justice and online platform transparency act, which would protect consumers from harmful and discriminatory ai systems, is really long overdue. so we applaud that legislation as well. finally, to stay ahead of the curve we've got to prioritize research. in august, adl fellow and nyu professor laura edelson was deplatformed by facebook hours after the company realized that she and her team were studying the role that facebook may have played in the lead-up to the january 6th insurrection. platforms should not be able to thwart important third-party research at their whim. bills like the social media data act would ensure that academics can study platforms to better inform the public. look, there are no silver
12:37 pm
bullets. there is no one-size-fits-all solution to repairing our internet, but there's a lot you can do right now to take action. i have highlighted three bills, and i'm happy to talk about them and others in the q&a, but members of the committee, let me conclude by urging you to remember that what happens online has a real impact on our lives. the status quo directly threatens our kids, our communities, and our country. now is the time for you to legislate and act. thank you, and i look forward to your questions. >> i thank the gentleman. and now we have remotely with us today dr. nathalie marechal, and you are recognized now for five minutes. >> thank you, congresswoman. good morning, and thank you to all of you for inviting me to testify today. i am nathalie marechal, policy and partnerships manager at ranking
12:38 pm
digital rights. as congress works to hold big tech accountable for its negative impacts on society, i urge you to focus on upstream structural reforms: regulating online advertising, mandating transparency and researcher access to data, and encouraging the securities and exchange commission to use its existing regulatory authority to do what shareholders have been unable to do -- get big tech to comply with the same laws as all other public companies and to improve their corporate governance. the substance of congressional hearings on the tech industry has come a long way in the past few years, thanks to a growing recognition that the harms users experience through social media platforms are connected to business models centered on maximizing revenue from targeted advertising. this business model incentivizes rapid growth; anticompetitive behavior like predatory acquisitions of would-be competitors; vertical
12:39 pm
integration across the ad tech value chain; mass commercial surveillance and data collection without our knowledge or consent; reliance on automation to perform tasks that actually require human nuance and contextual judgment to be done correctly; and consolidation of corporate power that thwarts any meaningful attempt at reform. the company now known as meta is the most brazen example of these dynamics, but the basic point, that how a company makes money plays a determining role in its products and its behavior, is true across the tech sector and beyond. a business model that relies on the violation of rights will necessarily lead to products that create and amplify harm. so what should congress do about it? first, regulate the online advertising industry. transfer the basic principles that govern off-line advertising to the online world and pursue antitrust enforcement in the ad tech sector. these measures will directly address consumer and civil
12:40 pm
rights harms related to privacy, discrimination, and fraud in online advertising. they will also shift the incentive structures that contribute to product design and corporate decisions that harm consumers and destabilize democracies around the world. further, increased competition in the ad tech market will undercut meta and alphabet and enable greater accountability for these megacorporations that often behave as though they are above the law. second, create the conditions for evidence-based policymaking by mandating specific types of transparency for information that can safely be made public and by creating mechanisms for qualified, trustworthy, industry-independent researchers to verify companies' claims about users' experiences and expand knowledge and understanding about how these platforms impact societies and democracy around the world. the ranking digital rights methodology and the santa clara principles on transparency and accountability in content moderation both
12:41 pm
provide granular recommendations for the data that companies should disclose publicly. and third, congress should encourage the sec to use its authority to do what shareholders have been trying to do and have been unable to do, for reasons i will explain: get big tech to comply with the same laws as all other publicly traded companies. numerous whistleblower disclosures to the sec indicate that several big tech companies are violating securities laws, but because of their dual-class share structures, shareholders are unable to hold corporate management accountable. when the ceo is also the chair of the board of directors, that person is accountable to no one. i am talking about mark zuckerberg here. no one should have this much power. the sec must address the private market exemptions that allowed big tech companies to become so large and with such concentrated governance. because meta was able to obtain significant private market funding before going public, the
12:42 pm
company was able to impose this dual-class share structure, a governance structure that allows mark zuckerberg to unilaterally make decisions that impact billions of people without any accountability. this loophole must be closed so that shareholder democracy at the facebooks of the future can take hold. to address the excesses of today's big tech, the sec should issue an enforcement policy declaring that it will not grant bad actor waivers and will seek increased enforcement penalties for companies with dual-class shares or those in which a single person is both the ceo and the chair of the company's board of directors. the bills under consideration today all seek to shine a light on big tech's secretive business practices and hold them accountable when they harm their users, their competitors, or society more broadly, whether through deliberate action or through failure to proactively identify and mitigate potential harms ahead of time. the republican big tech accountability platform also contains many provisions that ranking digital rights has long called
12:43 pm
for: transparency into how big tech develops policies, and regular, periodic disclosures about content policy enforcement, including the types of content taken down and why, and easily understood appeals processes. big tech accountability is not a partisan issue. americans may disagree about how social media companies should govern content on their platforms, but there is strong bipartisan agreement that big tech is not above the law and that, whatever companies do, they should be transparent about it and they should be accountable to their users, their shareholders, and the american people. legislation should start there. thank you again for the opportunity to testify today, and i look forward to your questions. >> thank you so much. and now let me recognize mr. lane. you are recognized for five minutes. >> chair schakowsky, ranking member bilirakis, chairman
12:44 pm
pallone, ranking member mcmorris rodgers, and members of the subcommittee, thank you for inviting me to testify. my name is rick lane and i am the ceo of iggy ventures. i also volunteer my time to child safety organizations combating sex trafficking. over the years i have had the opportunity to work on almost every major piece of technology-related consumer protection, privacy, and cybersecurity legislation that has moved through congress. i testify today in my personal capacity. building a safer and more secure internet will require congress to focus on four main issues: one, reforming section 230; two, creating more transparency in the way internet platforms operate while protecting internet users' privacy; three, restoring access to whois data; and four, updating the children's online privacy protection act. these issues do not necessarily need to be addressed in a
12:45 pm
single comprehensive piece of legislation, but they should be discussed in a comprehensive fashion because all the pieces must fit together. i recognize section 230 was the focus of last week's hearing. i would be remiss, however, if i didn't take this opportunity to make a few observations on the topic. i believe we need to restore to the platforms the ordinary duty of care that would apply but for the courts' current and overbroad application of section 230. social media companies are rife with offers to sell illegal drugs, yet the former ceo of tiktok stated at a 2020 technology event that he'd never been told of illicit drug transactions on the platform and doubted their very existence. that was a surprising statement since others knew, including the drug dealers that were using tiktok. tiktok has also increased the threat of espionage and cyber attacks in light of the influence the chinese government
12:46 pm
has over both it and bytedance, the chinese company that owns tiktok. indeed, we are confronted with a social networking site that is susceptible to manipulation by a communist regime with a record of human rights abuses, growing rapidly in the u.s., and collecting massive amounts of data on our youngest and most easily influenced demographic as it endeavors to develop more sophisticated artificial intelligence. it is for these reasons that both h.r. 3991, the tell act, and h.r. 4000, the internet application id act, are so important. these bills together will provide the american people with the information they need to know exactly where these types of companies are headquartered, where their data is being stored, and to fully understand the risk they and their children are taking when using these apps. such abuse undermines our
12:47 pm
democracy. another transparency issue that congress needs to address is access to accurate whois registration data, which contains the contact details of the operators of internet domains and is fundamental to protecting consumer privacy, promoting lawful commerce, ensuring public safety and protecting our national security. indeed, the department of justice's cyber report states that the first step in online reconnaissance often involves use of the whois database. in 2018, registries and registrars like godaddy and verisign increasingly began restricting access to the whois database based on an overbroad application of the european union's gdpr. yet after almost five years of, quote, trying to fix the whois problem, they have failed. the time has therefore come for this committee and congress to pass legislation requiring domain name registries and registrars to once again make whois information available.
12:48 pm
in no other area of consumer protection is it more important to establish policies than in protecting children in the marketplace. this is especially true in the area of online privacy and the market-dominant digital payment apps and debit cards that target children and collect and exploit a shocking amount of their data. coppa creates an opt-in parental consent privacy regime for websites directed at children under 13. by contrast, gramm-leach-bliley, enacted in 1999, created an opt-out privacy regime for financial institutions. that privacy space between coppa and gramm-leach-bliley creates a fintech child protection gap in existing law. this gap is especially harmful as we move toward a cashless society, accelerated by the pandemic. the good news is that one fintech digital wallet
12:49 pm
company, which i'm involved with, is the only coppa-compliant digital wallet. thank you again for giving me this opportunity to participate today. i look forward to your questions and to continuing to work with you and your staff. we must all work together to fix these important problems because, at the end of the day, it is the right thing to do. thank you. >> thank you. and now mr. golin, the floor is yours for five minutes. >> thank you chair schakowsky, ranking member bilirakis and distinguished members of the subcommittee for holding this important hearing. my name is josh golin and i'm executive director of fairplay, a leading independent watchdog of the children's media and marketing industries. through corporate campaigns and strategic regulatory filings we have changed the marketing and data collection practices of some of the world's biggest companies. currently we are leading a campaign to stop facebook from launching a children's version
12:50 pm
of instagram, and last week with other leading advocates we launched designed with kids in mind, a campaign to demand regulations that require online operators to put kids' interests first when designing their platforms. frances haugen has shone a critical spotlight on instagram's harmful impacts on teens and facebook's callous disregard for children's well-being. but it would be a mistake to view her revelations as problems limited to facebook and instagram. compulsive overuse, exposure to harmful content, cyberbullying, harms to mental health and the sexual exploitation of children are industrywide issues that demand systemic solutions from congress. to put it plainly, the unregulated business model for digital media is fundamentally at odds with children's well-being. digital platforms are designed to maximize revenue and, therefore, engagement. the longer they can capture users' attention, the more money they make by collecting data and
12:51 pm
serving ads. as a result, children are subject to relentless pressure and manipulative design that pushes them to use and check platforms as often as possible. this harms young people in several ways, including encouraging the overuse of social media at the expense of activities like sleep, exercise and face-to-face interactions. overuse can also lead to isolation from secure family relationships and reduced interest in academic achievement and extracurricular activities, allowing for-profit tech companies to shape children's character, habits and future. design choices used to maximize engagement are also harmful because they exploit young people's desire for social approval and a natural tendency toward risk-taking. displays of likes and follower counts provide an instant snapshot of whose profiles and posts are popular. children quickly learn that the way to improve these metrics is to post risqué and provocative
12:52 pm
content, creating a permanent record of their youthful indiscretions and increasing the risk of cyberbullying and sexual exploitation. platforms also harm young people by personalizing and recommending content most likely to keep them engaged. one former youtube engineer observed that recommendation algorithms are designed to optimize for engagement, not to show content that is actually good for kids. this means on platforms like instagram and tiktok, teens interested in dieting will be barraged with content promoting eating disorders, and a depressed user will be shown content promoting self-harm. nearly every concern that parents, public health professionals and children themselves have about digital media platforms can be traced to deliberate design choices. it doesn't have to be this way. online platforms could instead be built to reduce risks and increase safeguards for children. but that won't happen without significant action from congress. the only federal law that protects children online was
12:53 pm
passed 23 years ago, long before smartphones, instagram and youtube even existed. congress's continued inaction, combined with the lack of enforcement at the ftc, has emboldened big tech to develop an exploitative business model without considering or mitigating its harmful effects on children and teens. it's no wonder that polls consistently show that parents want congress to do more to protect children online. we know the key legislative solutions. the kids act, which we will discuss today, would prohibit companies from deploying design techniques like autoplay, displays of quantified popularity and algorithmic recommendations that put children and teens at risk. the privacy act would expand privacy protections to teens, ban harmful uses of data like surveillance advertising and require platforms to make the best interests of children a primary design consideration. together these bills would create the safeguards children need and transform the online
12:54 pm
experience for young people. over the last year i've watched several hearings like this one and was heartened to hear members of congress speak first and foremost not as republicans and democrats but as parents and grandparents with firsthand knowledge of what's at stake. but the american people need more than your understanding and justifiable anger at companies like facebook. big tech is banking on the fact that partisan divisions will keep you from taking action. i hope you will prove them wrong by advancing legislative solutions that protect children while they are online and make it easy for them to disconnect and engage in the off-line activities they need to thrive. there is simply too much at stake for children and their futures to allow the status quo to continue. thank you for having me here today, and i look forward to your questions. >> well, thank you. and now ms. rich, you are recognized for five minutes. >> chair schakowsky, ranking member bilirakis, and members of
12:55 pm
this subcommittee, i'm jessica rich, of counsel at kelley drye and a distinguished fellow at georgetown university. i'm pleased to be here today to testify on holding big tech accountable and building a safer internet. my remarks are my own, based on my years of government service. my background is as a law enforcement attorney and official. i worked for over 26 years at the federal trade commission, the last four as director of its bureau of consumer protection. before becoming director i was the first and longtime manager of the ftc's privacy program. i've supported stronger data privacy and security laws for over 20 years. the focus of my testimony today is on that very issue, privacy. while privacy is not the chief focus of this hearing, i'm highlighting it today because the need for federal privacy legislation has never been stronger. this hearing is addressing many important issues, some of which are closely related to privacy,
12:56 pm
but passing a strong and comprehensive federal privacy law is one of the most important things congress can do to hold big tech accountable and build a safer internet. consumers, businesses, regulators and the marketplace as a whole, we all need a federal privacy law. first, survey upon survey shows that consumers are concerned about their privacy and believe they have lost control over how companies collect, use and share their personal information. they continue to be hit with massive data breaches and data collection abuses regularly, and companies make decisions affecting them every day using algorithms and profiles with built-in assumptions and biases. you can't educate consumers about their rights because it all depends on the market sector, the state they're in, and the type of company and data involved. often consumers have no rights at all, and consumers can't be expected to read hundreds of privacy policies a day from
12:57 pm
companies they have never heard of. consumers need a clear and consistent privacy law that they can understand and rely on every day, no matter where they are or what they are doing. businesses are similarly confused about privacy laws in this country. at the federal level we have the ftc act as well as dozens of sector-specific laws like coppa, hipaa and the fair credit reporting act. we have three comprehensive state laws with more on the way. honest companies spend enormous time and money to navigate all these laws, while the unscrupulous exploit the gaps and the loopholes. ironically, large companies have benefited, and that includes the platforms, because they can afford the cost of compliance and because many existing laws favor large entities that can keep their operations in-house and not share data with third parties. in sum, businesses need a clear and consistent federal privacy law to help them navigate a difficult regulatory environment and create a more level playing field.
12:58 pm
but there's more. for over 20 years the ftc, my former agency, has overseen privacy using a law that is just not up to the task. while the ftc has accomplished a lot under the ftc act, this law does not establish clear standards for everyone to follow before a problem occurs, and there are gaps in its protections, creating uncertainty for the marketplace. many in congress on both sides of the aisle have criticized the ftc for these problems. too strong, too weak, too much, too little. but with respect, it is congress that needs to fix these problems by passing a law with clear standards for the ftc and the public. finally, we now all understand that concerns surrounding the use of personal data reach well beyond traditional issues of privacy to issues like discrimination, algorithmic fairness, accountability, whistleblower protections, dark patterns, protecting our kids,
12:59 pm
data portability and even, with respect to data security, our critical infrastructure. a privacy law could address many of these issues, at least in part, achieving far more than could be achieved by adding yet more requirements to the confusing mix of laws we now have in the united states. thank you so much for inviting me here today. i stand ready to assist the subcommittee and its members and staff with ongoing work related to consumer protection and privacy. >> thank you very much. last but certainly not least, mr. ahmed, you are recognized now for five minutes. >> chair schakowsky, ranking member bilirakis, members of the committee, thank you for this opportunity to appear before you today. the center for countering digital hate is a nonprofit researching the dynamics of misinformation and hate on social media. we show how it undermines democracy, child safety and our ability to deal
1:00 pm
with life-threatening crises such as covid. why is this happening? why are we here? the truth is social media companies discovered that prioritizing hate, misinformation, conflict and anger is highly profitable. it keeps users addicted so they can serve them ads. ccdh's research has documented not just the bad actors causing harm but also the platforms encouraging, amplifying and profiting from that harm.
1:01 pm
on march 24 we released a report showing that 65 percent of anti-vaccination content circulating on facebook and twitter originates with sites and accounts run by just 12 anti-vaxxers. this committee asked mark zuckerberg about it at a hearing the next day, on march 25. he promised to do something about it. he did not. six months later, after the surgeon general and the president weighed in citing our report, facebook responded claiming our report had a faulty narrative. however, thanks to the frances haugen revelations, we know that on the same day facebook produced an internal study confirming that a tiny number of accounts were responsible for more than half of anti-vaccine content on the platform. they were lying while the american public was suffering under covid.
1:02 pm
members of the committee have seen the same tactics from social media executives time and time again. you have correctly determined, as have legislators in the uk, australia, germany and other allied nations, that social media companies cannot self-regulate and we need new legislation. there is no silver bullet. that's right, section 230 shows the limitations of a single solution based on one core principle. it did not predict nor deal with the harms we are seeing emanating from social media. there will need to be a range of approaches to transparency and accountability to nudge social media into a place that balances dialogue, privacy, safety and prosperity. the bills being considered today would collectively represent a big step forward
1:03 pm
in protecting children, families, society and our democracy. the act would put protections in place for our children. transparency is an essential tool in countering online hate and lies. the social media data act would give independent researchers the access needed to detect dangerous threats. whistleblowers and internal documents are illuminating wrongdoing by big tech, providing new urgency to the reform debate. but whistleblowing is still profoundly risky for the whistleblower, which is why the incentives and protections provided by the ftc whistleblower act are critical. social media apps trick users into giving up personal data, their thoughts, their fears, their likes, their dislikes, which they sell to advertisers. big tech's big data is designed to exploit people, not to serve them better. the act puts a stop to that destructive spiral.
1:04 pm
there are also two much-needed bills to address the growing threat of hostile foreign actors who revel in the divisions that social media creates and exacerbates in democratic society. in approving these bills the committee would take a huge step forward towards better regulation and give us hope that an internet that brings out the best in people is possible. thank you very much. >> thank you very much. we have now concluded the witness testimony, and i'm so grateful for it; it was incredible. the opening statements are finished, and at this time we will move to member questions. each member will have five minutes to question our witnesses. i will start by recognizing myself for five minutes. let me begin by saying the federal trade commission is the top regulatory agency
1:05 pm
tasked with keeping americans safe online by preventing unfair and deceptive practices. but the ftc stands out from many other regulatory agencies because its whistleblowers are not protected by federal law. recent events, as we've seen with frances haugen, have made it clear how important whistleblower protection really is, and that is why i introduced the ftc whistleblower act along with my colleague, representative lori trahan. this legislation protects whistleblowers from retaliation for
1:06 pm
coming forward. and i wanted to get the opinion of some of our witnesses. mister ahmed, you mentioned incentivization, and this bill also incentivizes whistleblowers to make sure that these harms are not kept hidden. i wonder if you could comment a little bit more on whether, and why, you believe that the ftc whistleblower act would actually help to deter social media companies from making business decisions that could be harmful for consumers. >> yes. frances haugen turned on the floodlights, but what she did can't be replicated easily. for one thing, it's incredibly expensive. she had lawyers. she faced the loss of income. and her real value, the reason it's so important, is that she really exposed
1:07 pm
active deception by social media companies, something that can't easily be replicated with any other mechanism. so the only way to cast light on that deception is to shed light on it from within, and the window a single whistleblower provides is limited. think: since she took all those documents, facebook has pivoted to the metaverse, and most of the anti-vaccination crisis has happened since then. we need disclosure of deceit not once a decade but every time there is active deceit on something of great public interest. so this bill is incredibly important in bringing forward more moral characters when you need them. >> thank you. mister greenblatt, in your view, would this legislation work in favor of protecting consumers and ending some of this spreading of the harm that is done?
1:08 pm
>> yes, madam chairman, there's no question that the whistleblower act is necessary. to build upon what was just said, what we know is i've had direct conversations with mark zuckerberg and other facebook executives. they have lied to my face. they have lied to you. they have lied to their advertisers. they have lied to the public. but let's be clear, silicon valley is a tricky place. it is not easy, so we need to give these people the protections that they need so they don't risk being in violation of their ndas, so they don't risk future opportunities for employment. but i think, again, if we're playing a long game here, we need to recognize the moral leadership and courage displayed by people like frances haugen. think about it: we learned because of her bravery that facebook is only tackling 3 to 5 percent of
1:09 pm
the hate speech on their platform despite their protestations. we learned that their ai catches less than, wait for it, one percent of the incitements to violence on their platforms. the reason why this has prevailed for so long is that they are exempt from liability and lack the incentives. so madam chairman, unless we have the means to protect the people who have access to this information, it is clear the companies will not volunteer it to us. so i think it's vital that your act, the whistleblower act, is passed. >> i wanted to ask doctor marechal how this legislation would help regulators and law enforcement to better understand the economic incentives behind the decisions that internet platforms make.
1:10 pm
>> i agree wholeheartedly with the points that my colleagues on the panel have made. again, federal whistleblower protections make it easier for big tech workers who want to do the right thing to do that. miss haugen benefited from the sec whistleblower statute, which is why so many of her disclosures don't directly relate to matters within the ftc's jurisdiction. i am confident that if there were an equivalent for the ftc, we would have seen from her, and from others, additional whistleblower complaints related to matters under the ftc's jurisdiction, which includes economic decision-making and the economic factors that cloud the companies' decision-making.
1:11 pm
>> thank you so much. my time has expired, and now i welcome the questioning by my ranking member, mister bilirakis, for five minutes. >> i appreciate it, and i want to thank all of you for your testimony. very important. there are reasonable proposals on, and off, the list of bills being considered today and in the future. however, i'm concerned by the unintended consequences of colleagues deciding to legislate on privacy and data security in multiple bills without establishing a comprehensive framework. miss rich, a question for you. can you elaborate on any potential consequences businesses and our constituents may face as a result of enacting several individual one-off bills on
1:12 pm
privacy as opposed to one comprehensive bill? i know you touched on it, but if you can elaborate i would really appreciate it very much. >> right now it's a highly confusing environment for both businesses and consumers. there are so many laws that pertain to privacy, to technology, to many related issues, and no one really knows what the rules are. so one of the chief benefits of enacting a comprehensive privacy law, which could include many of the issues we talked about today, is to bring it all together. even if it's not going to repeal all the sectoral laws, it's not going to roll back everything that people are dealing with now, it could bring it together and create a comprehensive enforcement scheme. so that's one of the reasons, getting rid of that confusion
1:13 pm
and bringing greater clarity to the marketplace, that it is so vital that we pass that kind of law. >> next question. ultimately it will be for mister lane, but i do have some comments first. in addition to privacy and data security, one central theme of today's conversation is the accountability of big tech platforms; the big tech accountability platform is sponsored by leader rodgers, and we released it earlier this year. one issue that is very near to my constituents is the growing rise of illegal activity, like the sale of deadly fentanyl products, that is plaguing social media platforms. in fact, i was able to question the dea on this issue last week, and i'm holding a roundtable in my district in florida, the 12th
1:14 pm
congressional district, to discuss the fentanyl crisis with local leaders and law enforcement. we're doing that on monday at noon. to turn the tide of this activity, i also offered draft legislation that would direct the gao to conduct a study on how online platforms can better work with law enforcement to address illegal content and crimes on their platforms. the question is for mister lane. mister lane, what do you believe is important for us to consider as part of this particular discussion? >> as you know, i've been working with families who have had children die from fentanyl poisoning, and it's a very sad situation that we're facing. i do believe that working with the fda and others, you are taking important steps. there are a lot of groups out here focusing on this, but two things have to occur. one, i know the groups have asked
1:15 pm
especially to have an open and accessible and accurate whois database, because that's how they're finding websites that are engaged in selling these drugs. right now it's dark, and the fda itself has asked for an open, accessible and accurate database, so that's a very important step in moving forward. the other important step is that everyone talks about how these social networking sites are rabbit holes. rabbit holes were 1996, when you had bulletin board services and you had to find the rabbit hole. these networking sites are more like black holes. they have a gravitational force sucking people into the darkness, and it's hard for them to see the light again, and those are the issues we have to address. what are the algorithms? how are these social networking sites sucking these young people in and exposing them to drugs that they would never have access to, and how do we stop that?
1:16 pm
>> i appreciate it, and i want to discuss that further with you, so i appreciate your response. during the senate commerce committee nomination hearing of alan davidson, both nominees discussed the harms occurring from the misuse of consumer personal information and ultimately expressed support for passing a comprehensive privacy bill. i think this highlights how important it is for congress to pass a national law on privacy and data security. to the entire panel, a yes or no answer would be fine: would you support this committee passing a comprehensive national privacy and data security bill that sets one national standard, provides new rights to consumers and sets clear guidelines for businesses to comply? again, a yes or no. miss rich, please; i know what your answer is going to be. mister golin, please.
1:17 pm
>> yes. >> mister lane, please. >> yes. >> mister marechal, doctor marechal, excuse me. >> yes, but it must be a strong standard and must come with appropriate enforcement. >> mister ahmed? and mister greenblatt? >> yes, but i want more information. >> thank you, and i yield back, madam chair. thank you for the extra time. >> i would say yes, definitely. and now i recognize the chairman of the full committee, mister pallone, for five minutes for questioning. >> thank you, madam chair. i mentioned in my opening statement that we held several hearings in this committee examining the real harms from social media companies, and obviously we're here today to discuss meaningful solutions, but i
1:18 pm
wanted to start out with mister greenblatt. the anti-defamation league has done important work showing the role social media companies play in amplifying racist, extreme and divisive content. you've also shown how those actions disproportionately affect marginalized communities, so can you talk about the real harms you've seen social media companies cause through the use of their algorithms in that respect? >> thank you for the question, mister chairman. i would say right off the bat that the companies often use the smokescreen of protected speech to explain why they shouldn't be regulated. the founding fathers wrote the constitution for americans, not algorithms. products are not people, and they don't deserve to be protected, but citizens do. and we indeed have a situation where hate crimes are on the rise in this country. the fbi reported a 13 percent increase in 2020, the largest total since 2001.
1:19 pm
adl indeed has been studying online hate and harassment, and we find that one out of three users who report being harassed online say it related back to characteristics like race, religion, gender or sexual orientation. we see real examples. i think about taylor dumpson, a young woman who was the first african-american female president of the student government at american university. she testified before you a year or two ago. after she was elected president, she was mercilessly attacked with a campaign that was conducted online. it originated on a disgusting neo-nazi blog and was perpetrated through facebook and other platforms. it started online, mister chairman, and then you had nooses being placed all over campus. adl worked closely with miss dumpson, and she's in a much better place today. i think about a woman named tanya, a jewish woman from whitefish,
1:20 pm
montana, who had the misfortune of being from the same town that richard spencer, the notorious leader of the alt-right, was from. when she was identified and then doxxed, she was so mercilessly attacked that she and her family had to not only change all their information, like their phone numbers; they had to move to a different home. they had to get 24/7 protection. literally, death threats happened off-line because of what started online. so we need much more transparency around algorithms to ensure that they don't discriminate against marginalized communities. we need to realize, as we were saying earlier, that facebook's ai, their vaunted machine learning, literally misses 95 to 97 percent of the hate speech.
1:21 pm
i used to be an executive at starbucks, mister pallone. i didn't get to say to my customers, 95 to 97 percent of our coffees don't have poison, so we think they are pretty good. you have to have a success rate of 100 percent, and i don't think it's too much to ask of literally one of the most well-capitalized and profitable companies in america to ensure that their products simply work and don't harm their customers or the public. >> i wanted to ask another question about transparency, because in the case of holding big tech accountable, increased transparency would go a long way towards making the internet a safer place. so how would the bills before us bring greater transparency, and with it greater accountability, to the big tech platforms? >> first and foremost, by making the companies simply share their data about how the algorithms perform for the
1:22 pm
benefit of researchers and watchdogs. think about it: these are public companies who have the privilege of raising resources from the public, like selling shares, but they don't disclose this information. forget the risk to the companies; it's a risk to the general public. the right analogy here is really big tobacco, or big oil. we learned later that big tobacco knew the damage that their products were doing to their consumers but suppressed the research, and we didn't have insight until it was revealed. and we learned big oil knew the damage that fossil fuels were doing to the environment, yet they denied it and lied until it was revealed. now we know the damage big tech is doing to our children and to our communities. so we are asking them to simply be transparent, to simply make the information available. the last thing i'll say is to keep in mind what
1:23 pm
information we are asking for. it's user data. you know, there's an expression: if the product is free, you're the product. the information that we want is information about us. that shouldn't be too much to ask. >> thank you, madam chair. >> you are recognized for five minutes. >> i thank the chair and my good friend for holding this hearing today. very good information, and i want to thank the witnesses for being with us. miss rich, if i can start my questions with you: my good friend the ranking member of the subcommittee was getting into some privacy questions, and that's one of the issues that we struggle with today. looking at the testimony you submitted, survey after survey shows consumers are confused about their privacy.
1:24 pm
then it says consumers need a clear and consistent privacy law. businesses are confused. then we look at the enforcers, and the lack of clear privacy standards has undermined the ftc too. you state that, among other things, the law does not establish clear standards for everyone to follow before problems occur, and that enforcement is largely reactive. the ftc has been doing what it can with enforcement, but what are some of the standards they need to have right now to go forward and be clearer to the public? >> some of the basic building blocks we see in every privacy law are not required by the ftc act: basic transparency, choices, accountability. there isn't a data security law that applies across the country.
1:25 pm
and you may not like this one: access, correction, deletion, all those rights you see in law after law. antidiscrimination provisions. without all that, the ftc has to examine a specific company and decide after the fact, using its authority to police unfair or deceptive practices, whether a practice is unfair or deceptive, but there are no clear requirements. all those elements would be clearly required in a nationwide law that applies across different situations. so as i think i said in my testimony, the ftc has been able to do a lot with its authority under the ftc act, but it would be so much better for the public, for consumers, for the marketplace
1:26 pm
to have rules where everyone knows what they are and they know what the consequences are if they violate them. >> thank you very much. mister lane, i'm glad you're at this hearing today where we can consider legislative proposals like the big tech discussion draft that would require companies to disclose their content enforcement decisions. this is intended to cover illegal activity and harms happening online such as fraud, illegal drug sales and human trafficking. i think complementary to this goal is the ability to have access to whois data. this would go a long way in helping solve these problems. as you mentioned in your testimony, this information can play a vital role in combating fraud and facilitating better cyber security. in 2020 i sent letters to several executive branch agencies to ask them about the importance of whois data in conducting their investigative and prosecutorial obligations.
1:27 pm
their responses from the fda, ftc and dhs emphasized the importance of this information in identifying bad actors, connecting criminal networks and protecting consumers and our cyber assets. would restored access make the internet safer? >> first of all i want to thank you and your staff for taking a leading role on this issue. your letters have been important to show and highlight the real concerns and cyber security threats our nation is facing based on the decisions in the european union and the broad interpretation of the gdpr that caused whois data to go dark. i also want to add one thing, and it's not just me saying it. in 2021 a survey by the two leading cyber security working groups found that restricted access to whois data impeded investigations of attacks. two thirds of the 207 respondents said that their
1:28 pm
ability to detect malicious domains had decreased, 70 percent indicated they can no longer address threats in a timely manner, and more than 80 percent said the time it takes to address abuse has increased, so that the harms cyber attacks cause victims last longer. the group basically said changes to whois access following implementation of the gdpr continue to significantly impede applications of forensic investigation and thus cause harm to victims of phishing, malware and other cyber attacks. the federal trade commission as well as icann can strive to fix this problem, and it is what you're pushing in your legislation and your letters, and hopefully this congress will enact legislation; it is critical. we can no longer put the multi-stakeholder process ahead of the american people, and their safety and security and our national security
1:29 pm
need to be protected by this congress, and we should not be kowtowing to a law or regulation from another country. i want to end on this: the chairman and ceo of vyvanse has said that they are limited in their actions because of the gdpr. not because of us law, not because of the california privacy laws, but because of the gdpr. so we are at risk of having our own security put at risk because of a foreign entity's legislation and regulation. thank you so much for everything you're doing in this space. >> before i yield back i would like to ask unanimous consent to enter the documents from dhs, ftc and fda and a report from the gdpr whois user survey into the record. thank you for your indulgence and i yield back. >> i recognize mister rush
1:30 pm
for five minutes for his questions. >> i want to thank you for this important hearing, and my colleagues also for their work on comprehensive federal privacy legislation. in fact, when i served as chair of this subcommittee we passed a strong bipartisan bill that unfortunately stalled in the senate. but i'm continuing to campaign for privacy legislation. madam chair, i am also cognizant of the fact that privacy is not a panacea that would solve all of our internet concerns.
1:31 pm
in addition to privacy issues, we're also faced with very real and very threatening harms like misinformation and biases. and with that in mind, we're working on comprehensive privacy legislation. i am pleased that we are addressing these important issues as well. that said, [inaudible] in your testimony you say [inaudible] than children
1:32 pm
from higher income households, and significantly more time on screens than their white peers. increased exposure to screen time has led to more health issues, no question. and it's true that when white america catches a cold, black america catches pneumonia. [inaudible] to that point, what type of impact is this increased screen time having on children in low income households, in terms of these outcomes, compared to
1:33 pm
children in higher income households? >> thank you for that question. as you referenced, the data shows that low income and black and hispanic children have more screen time and spend more time playing games online than their higher income and white peers. the data also shows that the problems linked to screen time, like childhood obesity, occur at much higher rates for low income children and black and hispanic children. so i think that given what we know about the severity of the problems linked to excessive screen time, and that children from these communities are experiencing even higher rates, it is absolutely essential that we pass policies to protect them. like all issues, this affects all children, but like every issue, children from
1:34 pm
marginalized communities, children from more vulnerable communities are getting the worst of it, so that's why it's so important we create a new set of rules and build a better internet, because we need to protect the most vulnerable among us. >> [inaudible] is there any data that supports other ramifications of this particular phenomenon? hello? >> i'm sorry, i don't think i heard the question. i'm not sure if i heard it correctly. >> this is the second question. is there any data that links this particular phenomenon to our education systems? is there an effect on what
1:35 pm
increases [inaudible] >> there is data that shows the more time kids are spending online for entertainment, the more it is correlated with lower academic achievement. there has also been a rush to use ed tech in our schools and to see it as a panacea for fixing educational inequality, when in fact what the data is showing is that the more hands-on learning kids get, the better it is for their academic achievement. so what's really worrisome is this idea that if schools invest heavily in ed tech platforms it will fix educational inequality, when in fact i think there's a real danger it's going to worsen it, because what kids need is quality teachers, smaller class sizes. they need to interact with each other, and the more time kids
1:36 pm
are spending on screens takes away from those things. >> thanks for your indulgence. >> the gentleman yields back and now mrs. rodgers is recognized for five minutes. >> thank you, madam chair. ms. rich, your experience was under a democratic chair, yet i appreciate your dedication to bipartisan consensus when possible. it has been the commission's tradition. yesterday mister bilirakis and i sent a letter to chairwoman khan regarding the ftc's current direction. the explicit concern was the commission's use of zombie votes to pass rules and the recent decision to delete "legitimate business activities" from the ftc mission statement. given the number of bills before us i think it's essential that we find a realistic enforcement balance. we need an operation that would manage all these competing priorities without hurting legitimate business
1:37 pm
activities. so this alarming mission statement change happened while build back better is pending in the senate. that legislation includes amendments to the ftc act which would give the commission broad first offense penalty authority. how expansive is this proposed authority? is there any commercial activity or sector of the economy it wouldn't apply to? >> the civil penalty provision in the build back better act, as i read it, would apply to anything covered by the ftc act: unfair or deceptive practices under the ftc act. the ftc does lack jurisdiction over certain sectors of the marketplace, banks, nonprofits, certain functions of common carriers, but otherwise, as i understand the provision, if it were to pass it would apply across wide swaths of the marketplace.
1:38 pm
>> regarding the proposed new authority, am i correct this only deals with civil penalties and not remedies like disgorgement or restitution? >> that's right, civil penalties only. >> during your service, was the commission able to predict how many violations it would see each year? >> no. >> that's in line with our experience. the ftc cannot predict who is going to break the law. we supported and enacted such civil penalty authority targeting covid scams, and such revenues were scored as insignificant over the 2021 to 2030 period. this might be a basic question, but if all companies are following the law, there is no violation of the ftc act, and thus revenue is not generated via enforcement action. is that correct?
1:39 pm
>> yes, although i've never seen a situation where no companies are violating the law. >> given how broad the authority is, and what we know of recent ftc changes and actions, i worry about the lack of regulatory certainty for small businesses. they, after all, are not experts like you on what obligations they may have under the ftc act. is it fair to say that they may not have the resources or the sophistication to manage a review by the ftc of their operations? >> yes, but not to be a broken record, i think congress can fix this problem by passing a privacy law that does provide standards. >> i appreciate your answering those questions and providing the insight. i do thank all the witnesses for being here. i want to note that we have
1:40 pm
incorporated first offense penalty authority in our comprehensive privacy and data security legislation, using control of data as a means of policy enforcement, and i urge this committee to take action. i yield back. >> the gentlewoman yields back and i now recognize congresswoman castor for her five minutes of questions. >> thank you, chair, for holding this very important hearing and for including my kids internet design and safety act, which i'm leading with representatives clark, trahan and wexton and of course senators markey and blumenthal, and the social media data act that representative trahan and i are leading as well. we really do come to this hearing as parents and as grandparents. we know, as mister greenblatt said, these big tech companies are complicit in the harm that's being caused by online
1:41 pm
operations and, as mister ahmed pointed out, profiting from the harm. we clearly have to take action on section 230, on children's privacy, on everyone's privacy, and especially on the design of these platforms. i want to focus in on the kids act. mister golin, thanks for your years of work on this. your testimony describes how big tech platforms like instagram and youtube and others intentionally design the way children interact online to keep them addicted. would you go into a little more detail on that? >> sure, and first of all, representative castor, thank you for your work to make sure children get the online protections they deserve. the business model for all social media is to maximize
1:42 pm
engagement, because the more time a kid is on a platform, the more money they make, so they design their platforms in ways to keep kids on those platforms and keep them checking those platforms as often as possible. they use things like rewards and nudges and push notifications. take snapstreaks: on snapchat, kids are incentivized to communicate through snapchat every day with a friend to keep the streak going, and that becomes a powerful motivation. it gamifies the relationship, and kids want to keep that going. they use things like autoplay and infinite scroll on tiktok to make it easy to keep using the platform and hard to disconnect. they use things like likes and follower counts so everybody can see who's popular at any given moment, and this is a powerful incentive for kids to create content, and not only create content but to create provocative content and risky
1:43 pm
content, because they know that's what's most likely to get them attention. then of course there's the algorithmic recommendations, which personalize everything to show kids the content that is most likely to keep them engaged and keep them going on a platform, regardless of whether the content is good for them, and as we've been talking a lot about, often that content is terrible for them. >> when i'm out i see very young children now on tablets and iphones, we're talking toddlers. what does the latest research tell us about how young children are when they're first interacting with online platforms? >> one of the things that's really disturbing is that we all know the age you're supposed to be to go on social media is 13, yet 40 percent of 9 to 12-year-olds report using
1:44 pm
tiktok every day, and the numbers are about identical for instagram and snapchat. >> do they have the ability to kind of self regulate at that age? >> absolutely not. their executive functioning is still developing. these are platforms adults get lost in, these are platforms we're all struggling with as adults, and to think that developing children whose habits are being formed are using these platforms. >> how will the kids act help parents and help address these harms that these online platforms are profiting off of? >> the kids act does a number of important things. first of all, it prohibits those design choices that are there to maximize engagement for children: things like autoplay, things like rewards, things like quantified popularity. it prohibits platforms from using algorithms to amplify harmful content to children,
1:45 pm
something we've all been talking about a lot lately, and it also addresses influencer marketing, which is one of the most insidious forms of advertising. so it would do a huge amount to start creating a better online environment. >> then we have to pair it with privacy protections. i worked with you on the kids online privacy act. do these bills need to be passed as well? >> if we could pass those, we would go far towards creating the internet kids deserve. >> thank you, i yield back. >> mister don, you are recognized for five minutes. >> i appreciate the opportunity to discuss these important issues. the chinese communist party is probably the single greatest threat to the free
1:46 pm
world since the cold war. they seek to sabotage freedom and democracy everywhere it exists, and their malign influence permeates all their corporations, including those that operate in the united states. they have ccp members in key board positions in many of those organizations, with direct control over decision-making. despite that, american tech companies still continue to operate within china, and we allow chinese companies with those ties to operate quite freely in the united states as well. just this year microsoft was the victim of a chinese state-sponsored cyber attack, yet if you look at the number of job postings for microsoft in china, you get the feeling they are expanding rapidly in china. so i think it's the concern of this committee what these us tech companies are doing within china and what those chinese companies are doing here. for purposes of this hearing i want to focus on what the ccp affiliated companies might be doing here in the united states. the ccp doesn't respect the rights of their own citizens, why should they respect ours? congress has a responsibility
1:47 pm
to ensure that american consumers are protected from these evolving threats, and i think this can be accomplished in a number of ways, as you said today. i believe we can get a comprehensive data security bill through that protects citizens without sacrificing innovation, competitiveness or our nation's technological edge. mister lane, i, like many of my constituents, am very concerned about the amount of personal information that is currently collected without any basic level of protection. a specific example is bgi, the chinese genomics giant, and the activities they instituted during the pandemic. they sold millions of test kits to us labs and offered their own sequencing services to governments and individual states. the lack of privacy standards contributed to that. i'd like to know what concerns you most when it
1:48 pm
comes to protecting american consumers' data from foreign adversaries. what keeps you awake at night? >> thank you for the question, congressman. what keeps me awake at night is that most people don't realize that the driver in this artificial intelligence and machine learning race is human interaction and data. those who collect the most will win that fight. i do have a strong concern that we don't know how data is being collected and used. there is great legislation, and the bills before this committee are great examples of how we can try to know that, but we have to be concerned, because the head of government affairs for tiktok, in the senate, basically talked about how the data is stored in singapore. my pictures are stored, i don't know where, somewhere in the cloud.
1:49 pm
i can access them and even manipulate them. so we need to make sure that we know not just where the data is stored but how they're getting access to it. one of the things that has always bothered me about tiktok's statements is they say they will never hand over us citizens' information to china. maybe they believe that. but if someone with a family member still living in china gets a knock on the door from the chinese communist party saying, we would like you to hand over the data, i know what i would do. just as a person, if it was my family being threatened, would i hand that data over? probably. so those assurances cannot be taken seriously. >> so physical location, which is real even in the cloud, is something that's important, and of course the jurisdiction over that data is important. ms. rich, in the remaining
1:50 pm
seconds we have, i'd like you to address what you would like congress to give the ftc to improve the security of our data. >> specific data security requirements, which do not apply across the market right now. there is no general data security law that applies to the us marketplace. that would include process requirements such as doing a risk assessment, accountability among officers in the company, oversight of service providers, contracts with service providers. there are many elements. >> a reliable audit on these companies perhaps as well. all of you have been excellent witnesses. madam chair, i yield back. >> thank you, mister don, and i now recognize congresswoman
1:51 pm
trahan for five minutes. >> thank you, chairwoman and ranking member, for convening this important hearing, and thank you to the witnesses. many of you have offered invaluable expertise to my team as we introduced the social media data act, and now as we draft legislation to create a new division at the ftc focused on platform transparency and safety. mister golin, fairplay, formerly the campaign for a commercial-free childhood, has been studying the impact of advertising on children for decades. can you explain why surveillance advertising as used by instagram and youtube is particularly harmful for our teenagers? >> sure, there's a couple reasons it's so harmful, and thank you so much for all your work to protect children online. it's harmful because it allows companies to target teenagers' vulnerabilities. in fact, facebook boasted a couple of years ago, right to their advertisers, that they were able to target a teenager at
1:52 pm
the exact moment they were feeling bad about themselves, including when they feel bad about their bodies. this leads to things like girls who expressed interest in dieting getting targeted with ads for flat tummy teas and dangerous exercise routines. so they are able to target the things people are most vulnerable about, to encourage consumption of products that will make those things worse. the other thing is there's a complete asymmetry of information. it's completely unfair. the only thing teenagers may know about surveillance advertising is there's some creepy ad that keeps following them around, and they do use the word creepy to describe the advertising, but the advertisers know everything about that child. they know every website they've ever visited, everything they've ever liked, how much money their parents make, where they live, all the places they go, so it's completely unfair. the advertiser knows everything about the child, and the child knows little about how the advertising works. and the last thing
1:53 pm
i'll say is it leads to a tremendous amount of data collection, and that data can be misused in all sorts of ways. >> i thank you for that. like congresswoman castor, many of us, i'm the mother of two young girls, are very concerned that our children watching videos online can be targeted with ads for a dangerous weight loss supplement, and we need more transparency. can you explain why it's important for researchers to be able to study all digital advertisements as opposed to just political ads? >> first, it's very difficult to draw a clear line around which ads are political or not. when an oil company runs ads advertising its commitment to green energy, is that political? how about when facebook runs ads claiming to support updated internet regulation while lobbying against it behind closed doors? what about the diet ads that we were just talking about, is that political?
1:54 pm
moreover, even if we agree where to draw the line, can we trust the platforms to enforce it? clearly the answer there is no. more importantly, ads can be dangerous or discriminatory even if they're not political. these diet ads are a great example, but more importantly, many people would say that a housing ad is not political, but if it's targeted in such a way that black users can't see it, that's discriminatory and harmful. that's exactly what targeted advertising enables. >> i'm hoping you can explain why researchers need to have details regarding not just the aggregated description of the audience an ad was targeted to, but also a description of the aggregate users who saw or engaged with it. >> the targeting parameters only tell you who the advertiser was trying to reach. they don't tell you who saw the ads. many times those are the same, but if not, there's one of two things happening:
1:55 pm
either the platform is charging for a service it didn't deliver, or it's optimizing targeting beyond what the advertiser asked for, in ways that can be discriminatory. either way, it's something we should know so we can put an end to it. >> thank you. i want to emphasize that i think political ad transparency is important. on the recent redesign of my website i started a digital ad library where i'm posting all my political ads, and i've included all the data outlined in the social media data act. i'd ask my fellow members if they'd like to join me. i do have a few more questions i'll submit for the record, but facebook told us last week that researchers have begged and begged for very basic data, data they will never get unless congress acts. the social media data act aims to address this issue, and i look forward to continuing to work with all of you on the transparency issue. that will pave the way for legislation.
1:56 pm
>> the gentlewoman yields back and i recognize mister pence for his five minutes of questions. >> thank you, chairwoman schakowsky and ranking member bilirakis, and thank you to the witnesses for appearing here today. this hearing is imperative to exploring the practices of big tech that could be negatively impacting the social fabric of our country and harming the well-being of hoosiers and all americans. i'm increasingly concerned with the growth at any cost mindset of silicon valley, which has been around for a long time, as we heard last week. social media platforms prioritize inflammatory content using algorithms and tactics intended to manipulate and hold the attention of their users. this information allows tech
1:57 pm
platforms to sell highly valued advertising space for precisely placed ads at the most optimal times. profit is the ultimate goal, and there's nothing wrong with making money, but one way to get there is to engage users by promoting content that elicits the strongest response. this creates a feedback loop of more clicks that lead to more data, which leads to smarter algorithms that can collect even more data. these efforts seem to work in conjunction with the expansive shield of section 230 to evade accountability. for big tobacco, warning labels plastered on the side of cigarettes served as a longtime immunity defense. for big tech it is section 230. much like big tobacco, tech companies use the same tactics on kids to bring in lifelong customers, and some of you remember joe camel. unfortunately for my constituents, there's little insight into the algorithms big tech employs to take advantage
1:58 pm
of their sweeping access into our everyday lives. nor do hoosiers have adequate control over the amount of information collected or how it is used to tailor personalized and curated content. we had truth in lending; we took care of that many years ago. building off the communications and technology subcommittee hearing which many of my colleagues attended, it's clear this committee needs to get serious with our efforts to rein in big tech. mister greenblatt, i think you would agree there are positive aspects of social media. whether it's checking in with family or friends or for small businesses to expand their reach, there are healthy uses of social media. but it seems to me these tech companies realized early on they sat on top of a gold mine of user information with virtually no guardrails to protect consumers. and as you detailed in your
1:59 pm
testimony, incendiary and controversial content is good for business. throughout this hearing we have acknowledged the harmful aspects of overexposure to hateful content. this has very much become a bipartisan issue. we ought, in my opinion, to consider proposals that stop a platform's ability to generate revenue off content that has been adjudicated to have harmed the well-being of its users. mister greenblatt, if platforms were limited, eliminating their ability to use algorithms to curate content for users, what would happen to social media companies? would they still be profitable enough to stay in business? >> first i would say, representative pence, i agree with the analogy you drew to big tobacco.
2:00 pm
speech may be different than cigarettes, but when companies sell addictive products they fail to manage, about which they obfuscate and lie to elected officials and to watchdogs, there's clearly a problem that requires government intervention. i wish it were different, but unfortunately it is not the case. i agree that, like tobacco, social media can be used in moderation for fun. >> ..
2:01 pm
i believe they could fix this problem today if they wanted to. sure, it might hurt their margins a little bit as they make capital investments, but they have the resources. think about facebook: it's 16 years old and yet it has three billion users across the planet. it has the most sophisticated advertising -- >> you think they could be profitable, they wouldn't necessarily go out -- thank you. i yield back. >> i thank the gentleman, and now -- you are recognized for five minutes. >> i thank the chair. the observations in your testimony are very stark and important. i want to say i appreciate your observation that big tech is counting on partisan divisions to prevent meaningful reform, so we must take it upon ourselves to make sure that is not the case.
2:02 pm
a.i. and machine learning make it more efficient to target specific consumers and to moderate content; they can amplify and shape content in a way that ties in entirely with what we are hearing about this morning. how does the use of a.i. and machine learning accelerate harmful content on platforms prioritizing engagement? >> thank you for that question. i want to be clear that we are talking about two different types of algorithms here. on one hand, we have algorithms that boost content: algorithms that tell you which groups to join and what people to add as friends, and that order the content on your timeline. that's based primarily on correlation and engagement: what are you most likely to watch, comment on, like, and so on? on the other hand, we have
2:03 pm
algorithms meant to police content: identify content that is illegal or against platform rules because it's harmful to users and society. a.i. is not good at the latter part. this is one of the areas where the tech industry has been insisting we are just around the corner from a big achievement in a.i., one that would suddenly make it possible to have profitable platforms whose goal is to push as much human economic and social life as possible through these platforms to make money off of it, while still moderating them safely. they want us to believe they are just around the corner from being able to identify and moderate away drug sales, incitement of
2:04 pm
violence and hate speech and all the content we are rightfully concerned about today. again, that's not true; only human judgment can do that. >> thank you for the clarification. could increased transparency around artificial intelligence by internet platforms improve online safety? >> absolutely. on content moderation, we need to know more about the state of the art as it is today and what technology can and cannot do. we've learned from whistleblowers that facebook in particular basically does not moderate content. i'm exaggerating a bit here, but if you look at, again, the testimony before congress and
2:05 pm
other places, it is clear that moderation is falling short in the u.s., and it is worse than that elsewhere around the world. when it comes to content recommendation systems, we need to understand what recommendations we are getting and what other people are getting. i have a sense of what is being recommended to me; i have no idea what's being recommended to you or other people in society. policymaking in this area requires evidence, and the first step to getting evidence is transparency. >> thank you for the clarification. i want to thank you for your recommendation regarding ceos and majority shareholders; hopefully we can work with other committees to get that done.
2:06 pm
you also recommend we should create conditions to help us produce evidence-based policy. would you expound on that a little bit? >> absolutely. that is what i was referring to in speaking of transparency and research access to platform data. so much of what we believe or think we know about platforms is based on our own individual experience, anecdotes, investigative journalism and one-off research studies, but it is not comprehensive. we have a little information on a huge problem, but it is not enough to fully understand the extent of the problems, because only the platforms have access to the information. i believe in order to legislate effectively, we need a much more detailed understanding of the facts on
2:07 pm
the ground. >> the gentleman yields back. mr. armstrong, you are recognized for five minutes. >> thank you, madam chair. i appreciate everybody being here today. i've sat through a lot of hearings in this committee and a former committee, and i think we come down to the simple truth that as a platform gets larger, more data is collected, more sophisticated algorithms are developed, which further entrenches its place in the marketplace, stifles competition, and continues to incentivize collection and use of the data to maximize profit. several of you have basically said this, and you've heard it: the problem is with the business model, one designed to attract attention, collect and analyze data, and keep users engaged. whether content is detrimental to the individual, minor or adult, or society in general isn't a concern.
2:08 pm
tech companies recently announced they will eliminate targeted advertising on certain topics, while cleaner contextual advertising still occurs in other media. my question basically is, should we restrict targeted advertising? should we just restrict it, or should we ban targeted advertising to children? i understand there would be significant consequences, but if the societal cost is as high as some of the witnesses here have said and as we have heard today, it becomes a cost-benefit analysis, and the business model as a feature continues. republicans talk about increasing competition in the marketplace, but do targeted ads incentivize
2:09 pm
platforms to engage in that harmful conduct? there are claims that targeted ads are not only much more effective but sell at a substantial premium. i know at least one study from 2019 concludes that after accounting for other factors like a user's device information or geolocation data, revenue only increases by about 4% when user data is available. that corresponds to an average of $0.00008 per advertisement. as we continue to do this, and as we move around and talk about how we do all of this, i think the question has to become, how do we stop incentivizing companies to profit financially from conduct that is harmful to adults and children? i think we do this -- i've learned just enough about all of this to be dangerous, and we
2:10 pm
continue to move through this as a legislative body and as people who interact in this space, i think it's about time we have a real conversation about that. i've got one minute and 56 seconds left -- yes? >> on the question: the industry is moving away from targeted advertising because of the gdpr and other privacy-related rules. the question isn't targeted advertising, it is what are they watching? the algorithms -- i worked for fox, so the goal was to spend money to get people to watch the super bowl, because advertisers wanted the people watching it. the ads were relevant, so people will pay for the ads -- they pay for super bowl ads that are not targeted, because of the viewership. so how are the
2:11 pm
algorithms designed, this black hole where they try to keep people stuck in a system, the edge players -- how do we deal with that? i don't think getting rid of targeted advertising will help as much with the issues jonathan is talking about as addressing the manipulation of people and bringing them down this black hole. >> i would reinforce what rick said: it is surveillance advertising that is the problem. i don't have a problem with advertising to children per se; it has happened on saturday morning cartoons since the dawn of television and in other media. the challenge is we don't know what information they are collecting, and they refuse to be transparent about it. someone used the term one big black hole, so what we need is for them to submit to a great degree of transparency and elucidate how the targeting works and prevent
2:12 pm
children and others from being manipulated. >> if i may add one thing on parental controls: if the controls were set to on by default instead of off, it would go a long way toward protecting kids. parents don't know how to turn on parental controls; setting them on at the device level as well as the network level would be very helpful. >> can i just agree with you that getting rid of data-driven advertising is one of the most important things we can do? >> i'm 26 seconds over, but the one thing to add to that: whatever the new financial incentive is, we will have to deal with that too. i bring it up because of the financial benefit. >> congresswoman clarke, you are recognized for five minutes. >> thank you. thank you for holding this important hearing.
2:13 pm
thank you for your insightful testimony today. technology will always be a double-edged sword. while it's often a source of progress in the world, we must take care to limit the harms and abuses that inevitably occur. as i mentioned during our hearing last week in the communications and technology subcommittee, the widespread use of algorithms by social media platforms to curate the content that users view has far too often resulted in discriminatory practices and harmful misinformation. it's quite clear the platforms knowingly amplify the most dangerous, divisive content; it is central to their business model. this is a major concern when it comes to safeguarding our democracy and stopping the
2:14 pm
spread of malign disinformation and misinformation. after the 2016 election, the senate intelligence committee found black americans in urban areas were disproportionately targeted on social media with false reports and conspiracy theories meant to propagate mistrust in our democratic institutions. reports specifically note russian operatives took advantage of the facebook recommendation algorithm, an assessment facebook officials corroborated. how would legislation like the online transparency act address the targeted flow of disinformation like we saw in the 2016 election and now see again with the covid vaccines?
2:15 pm
>> thank you for the question. i think there are two ways in which it would help with civil rights concerns. it would help us deal with the algorithms that feed material to people who weren't already following it. an algorithm showed people following wellness influencers first anti-vax content, then anti-semitic content, because it knew it could deepen their extremism. the second thing: disinformation is something that's been around for a long time, but social media is like retrofitting a guidance package onto the misinformation, turning a dumb weapon into a smart weapon aimed at the communities it's most effective with. we have seen the ability of content produced
2:16 pm
by bad actors -- anti-vaccine figures like robert f. kennedy jr. spreading misinformation about vaccines -- and the algorithms drive it to the audience most vulnerable to it, which has led to deaths. forty-nine of the last 50 covid deaths in d.c. were of african-american people. it's a direct reflection of misinformation in our community. >> thank you. i have been sounding the alarm for years on the lack of accountability and transparency in how companies use these systems. it's important we recognize the use of these algorithms is not limited to social media platforms; algorithms are used by large companies to
2:17 pm
determine everything from who is eligible for healthcare coverage to whether or not a home buyer receives a mortgage. while this may have benefits, the reality is current safeguards are insufficient to protect americans from the harmful biases and design flaws inherent in these algorithms. this is why i will introduce an updated version of my act, along with senators wyden and booker, that requires large companies to audit algorithms for bias and discrimination and report their findings to the ftc for review. from a general perspective, why is it so important to address algorithmic biases in decisions that affect people's lives? >> thank you for that question,
2:18 pm
i think you described it very well and clearly yourself. algorithms make decisions based on data, and the data, even when accurate, reflects information that is not supposed to be taken into account when making certain decisions -- things like race or gender or age or other key markers of identity. humans are prohibited from making many decisions with those factors in mind, but algorithms can only make decisions based on data, so right now this kind of discrimination is, in many cases, effectively legal.
2:19 pm
>> i'm so sorry, i am over time and didn't realize it. thank you for your response. i yield back. please pardon me. >> thank you. you are recognized for five minutes. >> thank you, madam chair. in recent years, concerns have grown about internet platforms and the services they offer to children. i know some of us have had covid; i apologize for missing part of the hearing. our society has seen terrible impacts from cyber bullying, with far too many injured or even losing their lives as a result of malicious actors online. i applaud you for your work as a child safety advocate. one proposal i put forward would require publication
2:20 pm
and regular updating of content moderation practices related to cyber bullying for internet platforms. this transparency would be a powerful tool for users to know what kinds of content would not be tolerated on a platform, and parents could use the information to allow or restrict a child's access. would you agree providing clear, consistent rules in this space would reduce cyber bullying? >> i worked for myspace -- people may remember myspace, the largest social network at the time. it's an area we focused on because of the concern our ceo and others had when we purchased it about the harm that could occur through cyber bullying. it was the first time anyone did this, and we instilled practices to try to stop it and to monitor and report users who cyber bully one another. clear processes in place would be very helpful, but i also think what i said earlier about having
2:21 pm
parental controls on -- controlling which kids can talk to which kids and making sure kids are safe -- is critically important. >> i've got four kids; it's a tough nut to crack. sometimes you don't even know your kids are on certain sites. they have dual accounts, one they share with their parents and one they are actually communicating on. i do think parent engagement is extremely important in this situation, because we as parents said: we have access to your phone information and computer information, and the first time you don't give it to us, you lose your phone and access to the computer. >> we are active in this space because of the harms when kids go down a bad rabbit hole in this area; it can be detrimental to the health
2:22 pm
and safety and their education, something that really needs to be addressed. >> we could have everything in place, and if the parents or guardians are not -- i've got four kids -- daily engaged in what their kids are doing, we could do all we want here and we may still not be able to stop it, but it is still important to do it. do you think the current policies in this space today have helped allow cyber bullying in many cases? >> i don't know. the hard part, even with myspace, is free speech and the first amendment: what is cyber bullying and what is bullying? it's difficult to address. in our work with different state laws, we tried to figure out ourselves how you draft a law that stops cyber bullying.
2:23 pm
>> i am curious, did you have childhood and teen consultants on this? i hope i don't sound crazy, but all of us who are older understand that what we think as parents might be one thing; kids have quite a bit of insight. i talk to my kids and i'm like, okay, i don't get this at all. it would be interesting to know if you think it would be helpful for companies, and maybe even congress, to hear from the kids about what's happening out there? >> we didn't have any teens with us, but one of the leaders among child safety advocates in the early days had a group called teen angels, and she would talk to them, and we would talk to her and get ideas. the other thing we did is we had a direct line for children to tell us what we could do to fix it and make it better, and we
2:24 pm
basically took every recommendation they made. some say it's all about facebook and no one knows about myspace, but we thought it was the right thing to do, and we took steps; we would not implement certain functions if we couldn't figure out how to protect children in a way that made sense. our chief safety officer and i would talk every day about what we could do to make myspace safer. it's tough, but you can do it. >> it not only needs to make sense to us but to the people potentially facing cyber bullying. i suggest we consider that in the future when talking about this subject; we might have a few young people in the arena, so to speak, to give us some advice. i think it's not a bad idea. i yield back. >> the gentleman yields back. now mr. -- you are recognized
2:25 pm
for five minutes. >> thank you very much for holding this critical hearing, and i want to thank all the witnesses for educating us; hopefully we can make good policy to guide what is going on under our noses every day. every day, americans are forced to accept complex, opaque and one-sided terms of service to enjoy popular platforms that often market themselves as free. i'm holding here 27 pages of an agreement that anybody who uses snapchat has agreed to -- these 27 pages. there are roughly 106 million active american users on snapchat. how many of those users do you think have the time or formal legal education to understand and agree to a contract such as
2:26 pm
this? written by a team of lawyers, by the way. the average american does not have a team of lawyers to work with. >> i predict right around none is the number of americans who have actually read every single one of these pages. this goes for many, many platforms. some of the platforms reduced their agreement to two pages, probably with much finer print and a lot more legalese, and once again, at the end of the day, the same difficult terms. snapchat prides itself on protecting user privacy, and those who use the platform believe their snaps exist temporarily before being automatically deleted, but when you read the terms of service, you realize that's not the case. snapchat employees can access private user data including photos and/or videos. to go even further, hidden in snapchat's terms of service, you grant snapchat and its affiliates an
2:27 pm
unrestricted, worldwide, royalty-free, irrevocable and perpetual right and license to use the name, likeness and voice of anyone featured in your public content for commercial and noncommercial purposes. that is one of the clauses buried in these 27 pages. folks, i said anyone -- anyone featured in your content. that's what that means: anyone featured in your content. suppose it's my content and my colleague ms. kelly is next to me; all of a sudden i've brought her in, and she has not agreed to anything, but it applies to what i've done, and i may have injured or aggrieved somebody i care about. that means people who don't even sign up are subject to this agreement, and even if that person disagrees, do they have a team of lawyers to go ahead and fight for their rights?
2:28 pm
those who read the terms will notice platforms often include an arbitration clause, removing the ability of users to take the companies to court; instead they force users to resolve issues in house, on the company's own turf, with their team of lawyers against you. free services? these platforms seem to take their users for granted and take a lot from us. can platforms use terms of service to include a provision that harms users and puts them outside the reach of the law? >> thank you for the question. i'll preface my response by saying i am not a lawyer, nor a consumer protection lawyer at that. that being said, it seems to me what you've said is incredibly valid. pages and pages and pages of legalese, and expecting my 15-year-old or 12-year-old to
2:29 pm
understand that is laughable at best and malicious at worst. the reality is, this is why we need transparency: transparency in how the algorithms work, transparency in the data they are collecting. we need not just truth in advertising but truth in terms. what you just laid out is indefensible when directed at a minor. >> not just a minor -- the average american. it's not an even playing field. not at all. >> briefly, this is why we need section 230 reform. if there is a violation of terms, we need civil litigation to find out whether there is a violation, so we can get teams of lawyers engaged in this process. without the section 230 reform we are talking about, and a duty of care, we are waiting for a
2:30 pm
whistleblower, which we hope comes but may never. >> can i jump in? i realize time is up, but section 230 has nothing to do with this; this is about privacy. >> okay, thank you. i'd like to ask a yes or no. >> can i just say, any value we care about shouldn't be subject to notice and choice in terms of service. >> thank you very much, and this issue is obviously important not only to the average american but especially to those of you deeply involved in this every day, as i can see by your answers. very quickly -- >> the gentleman's time has expired, am i right? yes, you will have to put that in writing. >> the same generosity as my colleagues, i love you -- >> i yield back, i yield back. >> i would, but -- >> that's my position as well -- thank
2:31 pm
you. >> now congresswoman dingell, you are recognized for five minutes. >> thank you, madam chair. thank you for holding this hearing, and thank you to all of you testifying here today. in our march hearing with major tech ceos, i raised the fact that provocative and divisive content often gets more engagement on social media platforms, which many of you have noted in your testimony. audits, investigations and reports have continued to substantiate claims that companies are aware of this, and i believe this shows they are prioritizing profits and engagement over the safety and health of their users. i would like to move to some questions focused on these protections and prioritizing engagement. to the panel, if you could answer this with a simple yes or no: are these companies actively
2:32 pm
making the choice to prioritize profits and engagement over combating disinformation, violent content and negative health outcomes for individuals and children, yes or no? >> yes. >> yes. >> mr. ahmed. >> yes. >> yes. >> mr. lane. >> yes. >> okay, so we've got that. my next question: is there significant evidence that the changes we are proposing today for algorithms would have an outsized impact on user engagement on the platforms, and what is the cost for consumers and companies of not making these
2:33 pm
changes? >> that is a great question. the single most impactful thing we could do to change these incentives, which, as you say, push companies to prioritize engagement above all else, is to ban surveillance advertising. this would most effectively be done through privacy reform. >> thank you for that. i firmly believe independent researchers and the ftc should have access to data from these companies to ensure user data is not exploited in ways that push individuals and children toward disinformation, extremism and negative health outcomes. that's why i support the social media data act introduced by my colleagues, to ensure researchers have access to information on targeted online advertisements to study potential harms to consumers and to create working groups to establish guidance and
2:34 pm
best practices for data. in march, i asked mr. zuckerberg if he was opposed to a law to enable regulators to access social media data. in his response, he said giving transparency into these systems was important, but we still haven't seen progress on the issue. so far companies have resisted transparency on advertising data with independent regulators and researchers, despite repeated commitments to do so and repeated revelations that they are aware of the impact. >> in short, as bad as they are at moderating and governing user content on their platforms, they are even worse at moderating advertising. facebook and other platforms
2:35 pm
have run ads that are illegal in the country in which they are served and that violate the platforms' own stated rules, and they don't want to be caught doing that. they know, in the case of facebook, that 99% of revenue comes from targeted advertising; for google it is in the 90% range or something like that, and it is very high for other platforms as well. once you start tugging at that string, the whole house of cards is likely to come down. this is a completely ungoverned and anticompetitive sector of the economy that needs to be regulated as soon as possible. >> i have more questions which i will submit for the record, but i will give you my last one. how do platforms create additional barriers or, in some cases, completely block independent researchers from obtaining data? how would the social media data act
2:36 pm
alleviate some of these obstacles? >> the nyu case from this summer is the prime example of that. companies constantly change their code to make it harder for researchers to automatically collect information that is public on the internet -- information you don't need to log in to access. they also shut down the platform accounts of individual researchers when they do research the companies find threatening. >> you will have to wind up your answer right now. >> thank you. they also sue individual researchers, which has a chilling effect on research. >> thank you, madam chair. i will say one thing: the consequences of these decisions are apparent and can be deadly. thank you for holding these
2:37 pm
hearings, and i hope our committee acts soon. >> the gentle lady yields back, and my colleague from illinois, congresswoman kelly, is recognized for five minutes. >> thank you for holding this hearing today on these technologies. i want to thank the witnesses for testifying today and helping us craft legislation to hold big tech accountable. i want to say to you: 20 years ago, maybe more now, i got engaged with the anti-defamation league, and it changed my life, because i got involved in a campus of difference program, and you helped me see things through a different lens that i still carry with me. one of the fastest-growing methods for reaching customers
2:38 pm
online is influencer marketing. influencers are people who have a lot of followers on social media and use that influence to sell products. influencer marketing is a multimillion-dollar industry in the u.s. what i find concerning is that so many of today's top influencers are children -- so-called kid influencers with massive followings on social media. it's not clear what content is organic and what is sponsored advertising, and studies show the confusion is significantly worse with children, because children do not yet have the cognitive abilities to make these distinctions. can you talk about the harms of kid influencers for children online, and why do you believe such advertising has become so prevalent? >> the reason it is so prevalent is that it is allowed on the internet and not on children's television. on children's television we have the children's television act prohibiting
2:39 pm
product placement, so as for children's understanding -- they already understand advertising less than adults do, but the way we can get children to understand advertising better is having it clearly separated from content. research shows the more advertising is embedded, the less children understand what's going on. so you have situations like on youtube: unboxing videos -- ryan's toy reviews has literally billions of views from kids -- where ryan talks about a toy he's been paid to talk about for ten or 15 minutes. kids are watching infomercials. kids who watch the videos are more likely to nag their
2:40 pm
parents for what's advertised and more likely to throw a temper tantrum if they say no. influencer marketing is linked to higher levels of materialism, and if you look at frances haugen's documents, one thing teens are saying is influencer culture is toxic and makes them feel bad about themselves. >> we also know social media platforms often facilitate and make a lot of money from influencer marketing. what responsibility do you think the platforms have to protect children from this kind of marketing, and in your mind, are they fulfilling these responsibilities? >> they absolutely are not fulfilling these responsibilities. youtube makes so much money on influencer content, and so does instagram, but i don't think we can wait for the platforms to do the right thing, and that's why we need legislation like the kids act that would ban the platforms from recommending influencer marketing to kids. >> how do you think the kids act would help protect children in
2:41 pm
these instances where it is hard to distinguish between authentic and sponsored content? >> it would prohibit platforms from amplifying that content to children, so there would be a mechanism by which platforms could be held responsible, and if they faced fines for doing that, they would clean up their act. >> i have a little bit more time; does anybody else want to make a comment about this? no? okay, i will yield back. thank you, madam chair. >> the gentle lady yields back. you have five minutes. >> thank you, madam chair. transparency, privacy, information, protecting our kids -- all critical ideals our committee is charged with helping uphold on social media. these are a challenge in english; it is pure chaos right now in spanish and other
2:42 pm
languages. so i applaud the chair and ranking member for the bipartisan group of bills before us today that we are starting to review. we've seen lies about the vaccines, january 6 and the 2020 election. we've seen lies that breed hate and division in our nation, so this committee takes this very seriously. spanish-language content is often less moderated for misinformation than english content, and it is often allowed to remain on social media for longer durations than english content. how does having unregulated misinformation hurt hispanic communities and people of color?
2:43 pm
>> facebook spends upwards of 90% of its misinformation resources on english-language content, despite the fact that only about 10% of its users are using the platform in english. that misallocation means it does a poor job, as stated already. adl participates in a spanish-language disinformation coalition, and we work a great deal on these issues, and i can tell you we've found examples. we did an analysis last november of spanish-language anti-semitism on facebook. with a few keystrokes we found about 2,000 spanish-language accounts in violation of facebook's terms of service that it failed to take down, with hundreds of thousands
2:44 pm
of followers -- upwards of 55,000 for some -- so it is a big problem. >> we have seen this in local newspapers and local television in our state, so we are concerned about it, and it's repeated. how have algorithms failed to address foreign-language misinformation in communities of color? >> bad actors are good at targeting the right misinformation at the most vulnerable audiences. bad actors understand the spanish-speaking market is an easier one to sell disinformation into; moderation of the content is lower, so there is less potential of the content being removed.
2:45 pm
if you take vaccine misinformation, the content targeted to spanish-speaking audiences came from non-spanish originators -- the key purveyors of disinformation having their content translated into spanish and putting it out to spanish-speaking audiences at the same time. we saw that taken up, with people debating it and not getting vaccinated initially, and what does that mean? literally, latinx communities in america were dying because they were more exposed, with a higher prevalence of covid, and they were persuaded not to take the vaccine -- the one thing that would protect them. >> as a comparison, we saw vaccination rates high in
2:46 pm
central florida, where it wasn't politicized on social media, but there are areas in south texas where misinformation campaigns were deliberate, and look where that led. i have heard crazy things said about the vaccines, when the only crazy thing is not taking them to stop this deadly virus. thank you for your input, and i yield back. >> the gentleman yields back. we now welcome a member waived on to the committee, also the chairman of the communications and technology subcommittee, mr. doyle, for
2:47 pm
his five minutes of questions. >> thank you very much, to both of you, for continuing this series of legislative hearings looking at ways to protect consumers online and to hold online platforms accountable for their actions. last week in the communications and technology subcommittee, we heard about the harms of online platforms as well as from experts on legislative solutions to address them. as we have heard today, providing victims access to the courts alone is not enough to address the range of issues surrounding these tech platforms. i agree transparency and other accountability measures are necessary as well, so the witnesses' testimony is very important as we move forward. you made comments to this effect in your testimony; you note hate
2:48 pm
speech and potentially disinformation and other dangerous content is protected by the first amendment, and you go on to say we need to do more than section 230 reform to hold platforms accountable. can you talk about how some platforms are a conduit for disinformation? i would like detail on how platforms could discourage disinformation, hate speech and harmful content. >> thank you for the question. first of all, let's acknowledge hate speech is part of living in a free society; the first amendment protects ideas, even those we don't like. but speech that causes direct harm is different. freedom of speech is not the freedom to slander people.
2:49 pm
freedom of expression is not the freedom to incite violence. platforms like facebook or twitter often will use to not take down most threatening people or express lies or misinformation, directly damaging to the public good. the reality is, there's a reason why newspapers, magazines, movies and television, radio and all other media do not allow such content on their services because they would be liable to mitigation and lawsuits if they did only social media companies enjoy the privilege of non- accountability because of section 230 referenced earlier. >> research has shown with little information about a user, facebook's algorithms can show conspiracy theory and other disinformation of the user. it's a good policy to protect
2:50 pm
facebook from harm that comes to the user as a result of that information? >> absolutely it is bad policy. it's unambiguously bad public policy and a loophole extremists exploit to great effect. we've seen in the open extremists use facebook groups to organize actions against other individuals. this would be inexcusable in any other context. people are allowed to say hateful things; the question is whether facebook and similar services should privilege them, amplify them, elevate them. i say the answer is no. >> how do we pair transparency and reporting requirements with other reforms like those discussed last week to protect online users and maintain a healthy online ecosystem? how do we make transparency requirements meaningful and not abused by those promoting odious forms of speech, even if
2:51 pm
protected by the first amendment? >> one thing that could be done right away is to allow researchers access to the information. you don't necessarily have to make it available to the entire public, but accredited researchers who apply could be given access. you need real criteria so facebook and other companies couldn't deny credible requests, just as you, as public servants in the government, have to be compliant with such requests. there's no reason not to create this requirement for these companies, because the data is public data -- citizens' data -- and they should be transparent in sharing it. >> we know through your research, and facebook's own research, that a small number of users are responsible for much of the disinformation we see online. the incentives are not aligned for these platforms to take this content seriously, even
2:52 pm
when we know it leads to real-world harms -- can you tell us -- >> they still have around 52% of the audience they had before we wrote that report. some action has been taken, but for the main part, it's still up there. why is that true? what these bills would collectively do is create transparency and therefore accountability for those failures. >> thank you, madam chair, for holding this hearing, and i yield
2:53 pm
back. >> thank you, and we are honored to have your presence today. i now want to recognize representative lesko for her five minutes. >> thank you very much, and thank you to all the panel members for testifying today. this is such an important issue. as was said, false information spreads so much faster on social media than accurate information. i have found that to be true, and a lot of it is because of the titles and things we click on. my first question is for jessica rich. the ftc recently released its draft strategic plan for 2022 to 2026. i understand the language from
2:54 pm
the mission statement specifically saying they will accomplish the mission without unduly burdening legitimate business activity was removed. how concerned are you that this could lead to increased regulatory burdens on businesses? >> the deletion of that language sends a bad message. i would like to think my former agency made a mistake and that they plan to put it back in. one important thing to remember is that regardless of whether the language is in the mission statement, that concept runs throughout so much law and policy. mission statement or no mission statement, it's going to be very
2:55 pm
hard to ignore undue burdens on legitimate activity. it's still in unfairness, substantiation, and so many other doctrines, but it was ill advised to take it out, and it sends a terrible message. >> thank you for the answer. also to you, jessica rich: as the former director of the ftc's bureau of consumer protection, what is your reaction to granting the ftc civil penalty authority under the build back better act? >> the ftc badly needs a stronger remedy, especially with the rollback of its authority, but it would be far better for the ftc and the public if this
2:56 pm
authority came with more direction from congress regarding the situations where it would apply. one thing to note that is not talked about very much is that even with this new authority, the ftc would still need to prove that a company had knowledge it was violating the law before it would pay civil penalties, so that would be an important safeguard that would still be in the law. >> all right, thank you very much. my next question is for mr. rick lane. you said in your testimony that among the areas of vulnerability putting personal data at risk are situations where personal information is stored in foreign countries, one of which is china. how important is it for section
2:57 pm
230 reform to include transparency around content moderation and the storing of personal information? >> i think it is very important. we have signed treaties that say we can't require data localization, so we can't dictate where people store data; that should be looked at as well. but in terms of what's happening with tiktok and others, i do believe we need to take a closer look at how the data is being accessed and who is accessing it. the concern i have, as shown in the documentary the social dilemma, is someone turning the dial to influence our behaviors just a little bit. elections are sometimes won or lost by two percentage points, and i would hate to see, based on information derived behind the scenes, someone turning the dial who may be
2:58 pm
hostile to u.s. interests. >> i agree with you. i did watch the social dilemma, and it kind of opens your eyes to how we are being influenced behind the scenes. thank you, and i yield back. >> the gentlewoman yields back. i recognize congresswoman blunt rochester for five minutes of questions. >> thank you, madam chairwoman, for the recognition and for allowing me to join this timely hearing. the internet's potential has been used to create and innovate; unfortunately, it is also misused by bad actors to misinform, divide, and distract, preying on unsuspecting americans. this hearing today represents a bipartisan consensus that large tech companies must reform their practices to ensure the internet remains a
2:59 pm
place of potential. the common denominator underlying the things we've heard about is the ability of tech companies to use design practices that undermine user choice for the sake of profit. i introduced the bipartisan, bicameral detour act because tech companies use decades' worth of research on compulsion and manipulation to design products that trick people into giving up data or consenting to potentially harmful content. today, dark patterns exist in virtually every area, and the data collection they enable skews algorithms and targeted ads. if we allow them to keep americans from making choices in their own self interest, the internet will never live up to its potential. i would like to begin with you,
3:00 pm
can you provide an example of dark patterns that undermine user choice on the internet today, and what makes these effective in influencing user behavior? >> since the ccpa, internet users have gotten used to seeing data collection consent banners. the reason for them is to give us choice over whether or not to let companies collect our data, but this is undermined by deceptive designs. you notice many of them make it much easier to allow the website to collect whatever data it wants than to get into the details of what data we do or do not want collected. someone like me, often pressed for time, will just click accept rather than going through
3:01 pm
half a dozen clicks to limit data collection to what is needed for the site to work properly. sites should only be able to collect the data they actually need to do what they do, and it should be just as easy to protect your data as it is to give it away. >> we are considering regulation to target dark patterns aimed at children, especially those that drive compulsive behaviors. >> we should regulate dark patterns aimed at children for three reasons. first, they are extremely prevalent. most of the apps and games children use employ manipulative techniques, honed by a/b testing, to get kids to stay on platforms longer, to get them to watch more ads, and to get them to make in-game purchases. the second reason is because it is unfair. when the idea is to undermine user autonomy and manipulate children, it is unfair. to give a couple of examples, there are
3:02 pm
apps aimed at young children where the characters in the games mock children if they try to stop playing and taunt them into playing longer. and many of the games children play use virtual currencies with no fixed exchange rate, and they manipulate those currencies so kids don't understand, when they buy things with real money, how much money they are spending. third, we should regulate them because they cause harm to children, from the financial harm i mentioned, where kids rack up hundreds and thousands of dollars in in-game purchases, to being used to drive compulsive use and get kids to have more screen time, which displaces things they could be doing that would have more benefit to them. >> and contribute to healthy child development; you are correct. a lot of times we discuss patterns companies shouldn't use; you mentioned a social pattern library that considers important
3:03 pm
things. can you describe the findings and recommendations made as part of the library? >> thank you for the question. number one, nudges are useful. we've seen services like youtube and twitter implement them based on recommendations to decrease heat on their platforms. another is turning off features such as the automatic autoplay you see on services like youtube, where videos keep playing over and over and children are fed content without actively choosing it. another design principle is that you don't have to promote controversial videos. i think you can have controversial videos, but for videos that violate the policies, there's no reason to promote them; they should be taken down. while they are being reviewed,
3:04 pm
there are a lot of techniques product managers can use to adjust results slightly in a way consistent with preserving freedom of speech. >> thank you. my time has run out, but i will follow up with the question in writing. thank you for this important hearing; i yield back. >> thank you. you are recognized for five minutes. >> thank you. i appreciate being waived on today. this is a hearing i think is important, one of multiple hearings on big tech and its impact. i know members of the committee on both sides have supported a comprehensive national privacy and data security framework, and we have a record of working in a bipartisan manner to achieve that, for which i'm grateful. as for many of the proposals considered today, i fear that without a bipartisan, cohesive framework we
3:05 pm
continue down a path of patchwork laws for consumers that place undue compliance burdens on businesses. we may have significant differences on issues such as section 230 reform, but privacy, particularly when it comes to children, should be a no-brainer. maybe that is the wrong term to use, but it should be an easy one. that's why i introduced, with my good friend congressman rush, a bipartisan bill to update and modernize the children's online privacy protection act. i wish it were part of the hearing today. it isn't, but it can still be in the future, and i hope it is. mr. lane, as you know, this is not the only legislation aimed at enhancing child privacy laws; there are proposals in the house and senate, which reemphasizes my point that this should be a
3:06 pm
bipartisan issue. i have concerns with some of the coppa legislation introduced, including language to grant new authorities to the ftc that may burden legitimate business activity, such as good actors that follow self-regulatory guidelines. mr. lane, could you speak to why eliminating self-regulatory guidelines is harmful and what might be the unintended consequences of doing that? >> thank you for the question. i am a big supporter of reforming coppa. i think it should start at 17 and go younger, not 16. things have changed since the bill was written in 1998, but one of the pieces of the bill that is important, and that may be left out of reform bills, is the self-regulatory environment
3:07 pm
with ftc-certified compliance entities. the reason we supported that in the past, and why we liked it, was to help parents know, for kids 12 and under, that there was a mechanism like a good housekeeping seal of approval. we were concerned that, for lack of resources, the ftc can't investigate everybody, so we thought we could help put together a mechanism: a certification program that sites go through, with that program itself certified by the federal trade commission, to help provide parents with information that the sites they were going to have their kids on would be compliant. there have been bad actors, and recently one was booted from the program. i would support stronger enforcement where a program isn't doing a great job,
3:08 pm
but it may do a disservice to parents if they have to guess and hope and pray that the thousands of websites targeting kids 12 and under are compliant. i think that would be a mistake. >> my legislation, as you may know, raises the age for consent online from under 13 to under 16 years of age. it seems big tech in this space has a race to the bottom going on. >> jessica was one of the first people i reached out to on this, what i call the child privacy protection gap, because what has happened is, as kids migrate into this digital e-commerce world with digital wallets and debit cards, those operate under an opt-out regime, and you have to help parents figure out how to opt out.
3:09 pm
but no one reads the opt-out notices, even when they are targeted at children 12 and under, so the concern is that this combination creates a gap i think should be filled by legislation. >> i appreciate that, and i have more questions, which i will submit in writing. i appreciate you adding that. >> can i say one thing? >> i'm afraid it's going to have to be a response in writing; we have to move on. i now recognize mr. carter for five minutes.
3:10 pm
>> thank you for allowing me to waive on. i appreciate it very much. i want to go back to the exchange you had with ranking member rodgers. we've got a lot of supply chain issues going on, and they go beyond local retail. say i'm the owner of a car dealership or even a grocer in a small town in west virginia. i am paying more now than i was before to get access to products that are not as available as they were before. i may have to charge more than i did a month ago simply because of the increase in costs, obviously. i don't know the ins and outs of it, so are the process changes and authorities discussed today and other actions going to cause a lot of confusion
3:11 pm
for me as a retailer trying to responsibly run my own business? >> i have not done that analysis, but i do know there is a lot of confusion right now when the ftc chooses to pursue something through its unfair practices authority, so it's always better when it has direction from congress as to what the standards are for particular concerns like content moderation, privacy, and so on. so at least in many circumstances, direction from congress increases confusion -- >> decreases confusion. >> decreases confusion. i think what maybe you are asking about is the issue of having multiple laws instead of one
3:12 pm
law altogether, which i have been advocating for privacy at least, so companies can look in one place for direction about important issues like data use. i do think having one comprehensive privacy law, which could include many of these elements, would be better than having multiple sectoral laws. >> i was in business over 32 years, and first of all, i wouldn't have had time to do this research. second, we are talking inside baseball here, but many of these business people don't know how to navigate all of this. >> i agree. multiple laws in the area i'm an expert in, privacy, have not been good for small companies or even big companies, but it is worse for small companies, who can't figure out
3:13 pm
what laws apply to them. >> let me move on. earlier this year, several senate democrats sent a letter encouraging the ftc chair to begin a rulemaking process on privacy. i'm hopeful my colleagues in the senate will second-guess this approach once they know how complicated it is, because it is truly complicated. we need to simplify. i'm also concerned with the time it will take to complete a rulemaking process. can you shed light on how long the process might take and what it might mean for consumers and companies looking to understand this patchwork of state laws? >> there's a tremendous overselling of the potential of the ftc to issue a rule on its own using its existing authority. that is a cumbersome process that requires, for each mandate
3:14 pm
in a rule, that the ftc prove unfairness -- all sorts of procedural hurdles. many rules pursued under this authority have taken years to complete, and given the controversy and debate surrounding privacy over the course of 20 years, the public would be best served if congress makes the tough choices in this area. >> understood, but again, it will take years of work to get this. >> and litigation. >> absolutely. most business owners get so frustrated, they throw their arms up, and a lot of them quit. i've got a lot more, but i will submit it in writing. thank you, and i yield back. >> the gentleman yields back. mr. duncan, you are recognized for five minutes.
3:15 pm
>> sometimes they save the best for last, but i'm not sure that's the case here. i want to thank you for today's hearing including my bill, the tell act, which would require disclosure of whether china and its state-owned enterprises are storing, accessing, and transferring the personal data of american citizens without being transparent about it. tiktok, one of the most popular social media platforms for our children, is a beijing-based company. given the concerns about american companies doing business in china and the accommodations they make to the people's republic of china, it is astonishing what access and control the chinese communist party has over this conglomerate and similar entities. it is great to see you again, and thank you for being here. on holding big tech accountable, are you concerned about data collected by tiktok and
3:16 pm
companies with similar relationships with china, and what it might mean for the national security of our country? >> i am concerned about that; we should all be concerned. >> thank you. what other provisions on vulnerabilities do you think we should incorporate in legislation to protect our economic and national security interests? >> i think the legislation starts in the right place. it's a teachable moment; people will know where the information is housed and where companies are based, and hopefully companies take the self-correcting action necessary. but i also worry about websites and other apps that will not disclose, and about how to find those. russia, iran, and china are well-known cyber warriors, and there's a lot of mischief underneath what we see. my concern is we have this issue where we could find out, by combining information, whether they are where they say they are and
3:17 pm
headquartered where they say they are. we could find information like that out; that's what forensics does. but unfortunately, for the past five years, they have stonewalled congress from taking action in this space. the congressman was talking about letters he sent to homeland security, and you have companies that will be on the hill talking about how we don't need to upset the process. that has been going on five years now, and after five years of darkness, if they did develop something tomorrow, it would take three more years to implement. congress could act on this now and has the opportunity to fix this piece of cybersecurity at no cost to the u.s. taxpayer. it is in our hands, and you can ask any expert -- i have reports, with top people talking about
3:18 pm
this, who say: put in your legislation where companies are and where the data is stored, on top of strong legislation to fix this gdpr problem. it's not a domestic law; it's foreign. imagine if the law that shut down whois and is threatening national security were an iranian law. would we stand here as the u.s. congress and say we shouldn't upset the multistakeholder process to address these laws? the answer would be no, and i think it's time for the u.s. congress to step up and try to fix this problem before more people get hurt. >> big tech is not just facebook or twitter; it's companies like microsoft and apple and google. i just want to say this because i thought about it while you were speaking. i don't know that we truly care
3:19 pm
about all the data collected from our children through platforms like tiktok and others. i raise awareness because for the past two congresses i have tried to get this committee, congress, and even one democrat to cosponsor legislation to stop the importation of childlike sex dolls. dolls used by pedophiles, with images and likenesses stolen from social media platforms, the doll created and crafted to look like the child of one of our constituents so someone can play out sex fantasies with a childlike sex toy, a doll. very humanlike, robotic even, with voices taken from a child's tiktok and digitally put into the
3:20 pm
childlike sex toy so it can talk like that child to the pervert enjoying themselves with it. madam chair, democrats, let's get that done and stop this. i have talked to your colleagues, showing pictures of the dolls, and i'd be glad to share them with you. we need to do something about that. if nothing is done, we will continue to import sex dolls made to look like the children of people in our communities. it is just wrong, and i yield back. >> the gentleman yields back, and that in fact concludes the questioning. i want to thank, from the bottom of my heart, this wonderful panel, and i thank all of you for
3:21 pm
the work you have done. it will lead, i believe, to real action in the congress. before we adjourn, let me also thank my ranking member. i don't know if you want to make any final comments for our witnesses. okay. i request unanimous consent to enter the following documents into the record, including an online tracking study. without objection, so ordered. stay for one more second, because i want to remind members that under committee rules they have ten business days to submit additional questions for the record. i know unfinished
3:22 pm
questions need to be answered by the witnesses who have appeared today, and i ask the witnesses to respond as promptly as possible to any questions that may come to you. thank you. thank you for the participation; there were five waive-ons to the committee, which is a lot, showing the interest in this hearing. at this time, the subcommittee is adjourned. [inaudible conversations] [inaudible conversations] [inaudible conversations]
3:23 pm
[inaudible conversations] >> a senate finance subcommittee hearing examining competition. watch tonight at 8:00 eastern on c-span2. you can also watch anytime online or on c-span now, our new video app. ♪♪ c-span is your unfiltered view of government, funded by these television companies and more, including comcast. >> you think it's just a community center? it's way more than that. >> comcast is partnering with 1,000 community centers to create wi-fi enabled lift zones so students from low-income families can get the internet they need to be ready for anything. comcast supports c-span as a public service, along with these other television providers, giving you a front row seat to democracy.
3:24 pm
>> jim byron began working at the nixon foundation as a 14-year-old marketing intern. at 28, he is the foundation's president and ceo. sunday on q&a, he talks about the life and career of president nixon and the work of the foundation. >> we are looking ahead to the 50th anniversaries of his trips to china and russia, the ending of the vietnam war, the signing of the peace accords and bringing home the pows, the yom kippur war, and watergate. we as a foundation take these experiences and build conferences around these anniversaries, i should say, and make them into programs across social media, and we are connecting. it is working, and we do hear from young people who say, i didn't know about that, or
3:25 pm
the same holds for watergate, or, i didn't know president nixon was the first to negotiate an arms control agreement with the soviet union. there are real learnings being had, and it's in support of our mission. >> jim byron, sunday night at 8:00 p.m. eastern on c-span's q&a. listen to q&a and all of our podcasts on our new c-span now app. ♪♪ >> next, a hearing on amtrak, where the president of amtrak and other rail officials from around the country testified. this runs about two and a half hours.

