
Social Media Company CEOs Testify on Online Child Sexual Exploitation - CSPAN, March 21, 2024, 4:44am-5:50am EDT

4:44 am
and at best the biden administration is taking a pass on, and at worst may be in collaboration with. thank you. >> we are going to take a break now. members can take advantage as they wish. the break will last 10 minutes. please do your best to return. [indiscernible chatter]
4:45 am
4:46 am
>> the senate judiciary committee will resume. we have nine senators who have not asked questions yet, and we will turn first to senator padilla. >> thank you. once again, i am one of the few senators with younger children,
4:47 am
and i do lead with that because, as we have this conversation today, my children are now all in the teenager or pre-teen category, and with their friends i see this issue very up close and personal. in that spirit i want to take a second to acknowledge and thank the parents in the audience today, many of whom have shared their stories with our offices. i credit them for finding strength through their suffering and their struggle and channeling that into the advocacy that is making a difference, and i do thank all of you. i appreciate the challenges that parents and caretakers and school personnel and others are taking on to help them navigate
4:48 am
this world of social media and technology in general. the services that children are growing up with provide them unrivaled access to information beyond what previous generations have experienced, including learning opportunities, socialization and much more. and we clearly have a lot of work to do to better protect our children from the predators and predatory behavior that these technologies have enabled. and yes, mr. zuckerberg, that includes exacerbating the mental health crisis in america. nearly all teenagers, we know, have access to smartphones and the internet and use the internet daily, and while guardians have primary responsibility for caring for children, it does take a
4:49 am
village, so society as a whole, including leaders in the tech industry, must prioritize the health and safety of our children. and here, specifically, platform by platform and witness by witness: on the parental tools, how many minors have caretakers that have adopted them? if you don't have that number, say so quickly and provide it for the record. >> we can follow up with you on that. >> how do you make sure that young people and guardians are aware of the tools you offer? >> we make it clear for teens on our platform what teens -- >> what may be clear to you may not be clear to the public. >> this feature, teen safety
4:50 am
assist, which helps teenagers keep themselves safe. in addition, it is on by default for teens and can't be turned off. we also market it to users directly on our platform; when we relaunched the family center, we created a promotional video that appeared when everybody opened the app. >> that is the data we are requesting. across all of these services, from instagram on, how many minors use these applications, and how many of those have a caretaker that has adopted the parental supervision tools you offer? >> sorry, i can follow up with you on that. >> it would be helpful for you to know as the leader of your company. same question: how are you ensuring that young people
4:51 am
and guardians are aware of what you offer? >> we run pretty extensive advertising campaigns on our platforms and outside, and we work with creators and organizations like the girl scouts to make sure there is broad awareness. >> how many minors use this, and of those, how many have caretakers that register with your -- >> about 200,000 parents use family center, and about 400,000 teens have linked their accounts to their parents. >> so this sounds like a big number but a smaller percentage. how do you make sure they are aware of the tools you offer? >> we created a banner for family center on the user's profile so the accounts we believe to be parents can see the entry point easily. >> how many minors are on
4:52 am
tiktok, and how many of those have a caretaker that uses those tools? >> i will have to get back to you on the numbers. we were one of the first to do what we call family pairing with parents: you use the teenager's qr code, and it allows you to set time limits, filter out keywords and turn on a more restricted mode. we're always talking to parents; i met a group of parents and teachers last week to talk about this more, and we can provide that. >> how many minors use x, and what safety measures or guidance do you offer for caretakers? >> thank you, senator. less than 1% of all u.s. users are between the ages of 13 and 17. this is out of 90 million u.s. users.
4:53 am
>> so hundreds of thousands? >> they are all important, and being a 14-month-old company, we have prioritized safety measures, and we have just begun to discuss what we can do to enhance those with parental controls. >> let me continue with a follow-up question. this is about keeping parents informed of the nature of the services. there is a lot more we need to do, but for today: while many companies offer a broad range of user empowerment tools, it is helpful to understand whether users find them helpful. i appreciate your sharing how you are advertising these, but have you conducted any assessments of how they are impacting use? >> our intention is to give teenagers tools and capabilities to keep them safe, and we
4:54 am
launched teen safety assist last year. i don't have the study off the top of my head, but we will follow up. >> my time is up. i will have these questions for each of you, but we will save them for the record on some of the assessments. >> thank you all for being here. mr. spiegel, i see you hiding down there. what does yada yada yada mean? >> i am not familiar with the term. >> very uncool. can we agree that it is what you do, not what you say, and everything else is just cottage cheese? >> yes, senator. >> do you agree with that?
4:55 am
speak up. don't be shy. >> i listened to you today, and i heard a lot. i have heard you talk about the reforms you have made, and i appreciate that, and i have heard you talk about the reforms you are going to make. but i don't think you will solve the problem. i think congress will have to help you, and i think the reforms, to some extent, will be like putting paint on rotten wood. and i am not sure you will support this legislation. i am not. the fact is that you and some of your internet colleagues who are not here are not just companies. you are very, very powerful.
4:56 am
and you and some of your colleagues who are not here have blocked everything we have tried to do in terms of reasonable regulation, from privacy to child exploitation. and, in fact, we have a new definition of recession: we know we are in a recession when google has to lay off 25 members of congress. that is what we are down to. we're also down to this fact, that your platforms are hurting children. i'm not saying they're not doing some good things, but they're hurting children. and i know how to count votes, and if this bill comes to the floor of the united states senate, it will pass. what we're going to have to do, and i say this with all the
4:57 am
respect i can muster, is convince my good friend senator schumer to go to amazon, buy a spine online and bring this bill to the senate floor. and the house will then pass it. that's one person's opinion, but i doubt it. mr. zuckerberg, let me ask you a question, a little philosophical here. i have to hand it to you. you have -- you have convinced over 2 billion people to give up all of their personal information, every bit of it, in exchange for getting to see what their high school friends had for
4:58 am
dinner saturday night. that's pretty much your business model, isn't it? >> that's not how i would characterize it. we give people the ability to connect with the people they care about and to engage with the topics that they care about. >> and you take this information, this abundance of personal information, and then you develop algorithms to punch people's hot buttons and steer information to them that punches their hot buttons again and again and again, to keep them coming back and to keep them staying longer. and as a result, your users see only one side of an issue. so to some extent your platform has become the killing field for the truth, hasn't it? >> senator, i disagree with
4:59 am
that characterization. we build ranking and recommendations because people have a lot of friends and a lot of interests, and they want to make sure that they see the content that's relevant to them. we're trying to make a product that's useful to people and make our services as useful as possible for people to connect with the people they care about and the interests they care about. >> but you don't show them both sides. you don't give them balanced information. you just keep punching their hot buttons, punching their hot buttons. you don't send them balanced information so people can discern the truth for themselves, and you rev them up so much that, so often, your platform and others become just cesspools of snark, where nobody learns anything, don't they? >> well, senator, i disagree with that. i think people can engage in the things that they're interested in and learn quite a bit about those. we have done a handful of different experiments and things in the past around news and trying to show content on,
5:00 am
you know, a diverse set of perspectives. i feel there's more that needs to be explored there, but i don't think that we can solve that by ourselves. >> i'm sorry to cut you off, mr. president, but i'm going to run out of time. do you think your users really understand what they're giving to you, all of their personal information, and how you process it and how you monetize it? do you think people really understand? >> senator, i think people understand the basic terms. i actually think -- >> let me put it another way. we spent a couple years, we talked about this. does your user agreement still suck? >> i'm not sure how to answer that, senator. >> you can still hide a dead body in all that legalese where nobody can find it? >> senator, i'm not quite sure what you're referring to. but i think people get the
5:01 am
basic deal of using these services. it's a free service. you're using it to connect with the people you care about. if you share something with people, other people will be able to see your information. it's inherently -- if you're putting something out there to be shared publicly or with a private set of people, you know, you're inherently putting it out there. so i think people get that basic part of how -- >> mr. zuckerberg, you're in the foothills of creepy. you track people who aren't even facebook users. you track your own people, your own users, who are your product, even when they're not on facebook. i mean, i'm going to land this plane pretty quickly, mr. chairman. i mean, it's creepy. and i understand you make a lot of money doing it, but i just wonder if our technology is greater than our humanity. i mean, let me ask you this
5:02 am
final question. instagram is harmful to young people, isn't it? >> senator, i disagree with that. that's not what the research shows on balance. that doesn't mean that individual people don't have issues and that there aren't things that we need to do to help provide the right tools for people. but across all the research that we've done internally, the survey that the senator previously cited, you know, there are 12 or 15 different categories of harm that we asked teens whether instagram makes worse or better. and across all of them, except for the one that senator hawley cited, more people said using instagram -- >> i gotta land this plane, mr. zuckerberg. we just have to agree to disagree. if you believe that instagram -- i know -- i'm not saying it's intentional. but if you agree that instagram -- if you think that instagram
5:03 am
is not hurting millions of our young people, particularly young teens, particularly young women, you shouldn't be driving. it is. thanks. >> senator butler. >> thank you, mr. chair, and thank you to our panelists who have come to have an important conversation with us. most importantly, i want to appreciate the families who have shown up to continue to be remarkable champions of your children and your loved ones. and in particular, the california families that i was able to just talk to on the break, the families of sammy chapman from los angeles and daniel puerta from santa clarita. they are here today and doing some incredible work not just to protect the memory and legacy of their boys, but the work that they are doing is
5:04 am
going to protect my 9-year-old. and that is indeed why we're here. there are a couple questions that i want to ask some individuals. let me start with a question for each of you. mr. citron, have you ever sat with a family and talked about their experience and what they need from your product? yes or no. >> yes, i have spoken with parents about how we can build tools to help them. >> mr. spiegel, have you sat with families and young people to talk about your products and what they need from your products? >> yes, senator. >> mr. chew. >> yes. i just did it two weeks ago, for example -- >> i don't want to know what you did for the hearing prep, mr. chew. i just wanted to know -- >> as an example. senator, it's an example. >> -- in terms of designing the product that you are creating. mr. zuckerberg, have you sat with parents and young people
5:05 am
to talk about how you design products for your consumers? >> yes. over the years, i've had a lot of conversations with parents. >> you know, that's interesting, mr. zuckerberg, because we talked about this last night, and you gave me a very different answer. i asked you this very question. >> well, i told you that i wasn't -- that i didn't know what specific processes our company has. >> no, mr. zuckerberg, you said to me that you had not. >> i must have misspoken. >> i want to give you the room to misspeak, mr. zuckerberg, but i asked you this very question. i asked all of you this question, and you told me a very different answer when we spoke. but i won't belabor it. a number of you have talked about -- i'm sorry. x, ms. yaccarino, have you talked directly to parents and young people about designing your product? >> as a new leader of x, the
5:06 am
answer is, yes, i've spoken to them about the behavioral patterns because less than 1% of our users are in that age group, but, yes, i have spoken to them. >> thank you, ma'am. mr. spiegel, there are a number of parents whose children have been able to access illegal drugs on your platform. what do you say to those parents? >> well, senator, we are devastated that we cannot -- >> to the parents. what do you say to those parents, mr. spiegel? >> i am so sorry we have not been able to prevent these tragedies. we work very hard to block all search terms related to drugs from our platform. we proactively look for and detect drug-related content. we remove it from our platform, preserve it as evidence, and then we refer it to law enforcement for action. we've worked together with nonprofits and with families on education campaigns because the scale of the epidemic is
5:07 am
extraordinary. over 100,000 people lost their lives last year, and we need people to know that one pill can kill. that message has reached users more than 260 million times on snapchat. >> mr. spiegel, there are two fathers in this room who lost their sons. they are 16 years old. their children were able to get those pills from snapchat. i know that there are statistics, and i know that there are good efforts. none of those efforts are keeping our kids from getting access to those drugs on your platform. as california companies, all of you, i've talked with you about what it means to be a good neighbor and what californian families and all families should be expecting from you. you owe them more than just a set of statistics. and i look forward to all of you showing up on all pieces of this legislation to keep our children safe. mr. zuckerberg, i want to come
5:08 am
back to you. i talked with you about being a parent to a young child who doesn't have a phone, who -- you know, is not on social media at all. and one of the things i am deeply concerned with, as a parent to a young black girl, is the utilization of filters on your platform that would suggest to young girls utilizing your platform that they are not good enough as they are. i want to ask more specifically, and refer to some unredacted court documents, that reveal that your own researchers concluded that these face filters that mimic plastic
5:09 am
surgery negatively impact youth mental health and well-being. why should we believe -- why should we believe that you are going to do more to protect young women and young girls when you give them the tools to affirm the self-hate that is spewed across your platforms? why is it that we should believe that you are committed to doing anything more to keep our children safe? >> there's a lot to unpack there. >> there is a lot. >> we give people tools to express themselves in different ways. people use face filters and different tools to make media and photos and videos that are fun or interesting across a lot of the different products that -- >> plastic surgery filters are good tools to express creativity? >> senator, i'm not speaking to that. >> skin lightening tools are tools that express creativity? this is the direct thing i'm
5:10 am
asking about. >> yeah. this is something that -- on any specific one of those, i think the ability to filter and edit images is generally a useful tool for expression. for that specifically, i'm not familiar with the study you're referring to, but we did make it so that we're not recommending this kind of content to -- >> i made no reference to a study. i referred to court documents that revealed your knowledge of the impact of these types of filters on young people, generally young girls. >> i generally disagree with that characterization. >> with court documents? >> i haven't seen any documents. >> okay. mr. zuckerberg, my time is up. i hope that you hear what is being offered to you and are prepared to step up and do better. i know this senate committee is going to do our work to hold you to greater account.
5:11 am
thank you, mr. chair. >> senator tillis. >> thank you, mr. chair. thank you all for being here. i don't feel like i'm going to have an opportunity to ask a lot of questions, so i'm going to reserve the right to submit some for the record. but i have heard -- we've had hearings like this before. i've been in the senate for nine years. i've heard hearings like this before. i've heard horrible stories about people who have died, committed suicide, been embarrassed. every year we have an annual flogging, every year. but what has materially occurred over the last nine years? do any of you all -- just a yes or no question, do any of you all participate in an industry consortium trying to make this fundamentally safe across platforms? yes or no, mr. zuckerberg? >> yes. >> there's a variety of organizations.
5:12 am
>> do you participate? >> which organizations? >> does anyone here not participate in an industry -- i actually think it would be immoral for you all to consider it a strategic advantage to keep private something that would secure all these platforms. to avoid this -- do you all agree with that, that anybody that would be saying, you want ours because ours is the safest -- that you as an industry realize this is an existential threat if we don't get it right, right? i mean, you've got to secure your platforms. you've got to deal with this. do you not have an inherent mandate to do this? because it would seem to me if you don't, you're going to cease to exist. i mean, we can regulate you out of business if we wanted to. the reason i'm saying -- it may sound like a criticism. it's not a criticism. i think we have to understand that there should be an inherent motivation for you to get this right, or congress will make a decision that could potentially put you out of business. here's the reason i have a concern with that, though. i just went on the internet
5:13 am
while i was listening intently to all the other members speaking, and i found a dozen different platforms outside of the united states, 10 of which are in china, two of which are in russia. their daily average active membership numbers in the billions. people say you can't get on china's version of tiktok. it took me one quick search on my favorite search engine to find out exactly how i could get on that platform today. and so the other thing that we have to keep in mind -- i come from technology. i could figure out -- ladies and gentlemen, i could figure out how to influence your kid without them ever being on a social media platform. i can randomly send texts and get a bite and then find out an email address and get compromising information. it is horrible to hear some of
5:14 am
these stories, and i've had these stories occur in my hometown down in north carolina. but if we only come here and make a point today and don't start focusing on making a difference, which requires people to stop shouting and start listening and start passing language here, the bad actors are just going to be off our shores. i have another question for you all. how many people, roughly -- if you don't know the exact number, it's okay -- how many people, roughly, do you have looking 24 hours a day at these horrible images and filtering them out? just go real quick with an answer down the line. >> it's most of the 40,000 of our people who work on -- >> and, again? >> we have 2,300 all over the world. >> we have 40,000 trust and safety professionals around the world. >> we have approximately 2,000 people dedicated to trust and safety and content moderation. >> our platform is much smaller than these folks. it's hundreds of people, and
5:15 am
looking at content is 80% of our work. >> as i mentioned, these people have a horrible job. many of them experience -- they have to get counseling for all of the things they see. we have evil people out there, and we're not going to fix this by shouting at -- talking past each other. we're going to fix this with all of y'all being at the table and coming closer to what i heard one person say, supporting a lot of the good bills, like the one i hope senator blackburn mentions when she gets a chance to talk. guys, if you're not at the table and securing these platforms, you're going to be on it. and the reason why i'm not okay with that is that if we ultimately destroy your ability to create value and drive you out of business, the evil people will find another way to get to these children. and i do have to admit -- i don't think my mom is watching this one. but there is -- we can't look past the good that is occurring. my mom, who lives in nashville, tennessee, and i talked yesterday, and we talked about
5:16 am
a facebook post that she made a couple of days ago. we don't let her talk to anybody else. that platform connects my 92-year-old mother with her grandchildren and great grandchildren. it lets a kid feeling awkward at school get together with people and relate to people. let's not throw out the good because we haven't all together focused on rooting out the bad. now, i guarantee you, i could go through some of your governance documents and find a reason to flog every single one of you, because you didn't place the emphasis on it that i think you should. but at the end of the day, i find it hard to believe that any of you people started this business, some of you in your college dorm rooms, for the purposes of creating the evil that is being perpetrated on your platforms. but i hope that every single waking hour you're doing everything you can to reduce it. you're not going to be able to eliminate it. i hope there's some
5:17 am
enterprising young tech people out there today that are going to go to parents and say, ladies and gentlemen, your children have a deadly weapon. they have a potentially deadly weapon, whether it's a phone or a tablet. you have to secure it. you can't assume that they're going to be honest and say that they are 16 when they're 12. we all have to recognize that we have a role to play, and you guys are at the tip of the spear. so i hope that we can get to a point where we are moving these bills. if you've got a problem with them, state your problem. let's fix it. no is not an answer. and know that i want the united states to be the beacon for innovation, to be the beacon for safety, and to prevent people from using other options that have existed since the internet has existed to exploit people. and count me in as somebody that will try and help out. thank you, mr. chair.
5:18 am
>> thank you, senator tillis. next is senator ossoff. >> thank you, mr. chairman, and thank you to our witnesses today. mr. zuckerberg, i want to begin by just asking a simple question which is do you want kids to use your platform more or less? >> well, we don't want people under the age of 13 using -- >> do you want teenagers 13 and up to use your platform more or less? >> well, we would like to build a product that is useful and that people want to use. >> my time is going to be limited. do you want them to use it more or less, teenagers, 13 to 17 years old, do you want them using meta products more or less? >> i'd like them to be useful enough that they want to use them more. >> you want them to use it more. i think herein we have one of the fundamental challenges. in fact, you have a fiduciary obligation, do you not, to try to get kids to use your
5:19 am
platform more? >> it depends on how you define that. we obviously are a business, but -- >> mr. zuckerberg, our time is -- is it not self-evident that you have a fiduciary obligation to get your users, including users under 18, to use and engage with your platform more rather than less, correct? >> over the long term. but in the near term, we often take a lot of steps, including a change we made to show fewer videos on the platform that reduced the amount of time spent by more than 50 million hours. >> okay. but if your shareholders asked you, mark -- i wouldn't call you that, mr. zuckerberg, here, but your shareholders might be on a first-name basis with you -- mark, are you trying to get kids to use meta products more or less, you would say more, right? >> i think over the long term we're trying to create -- >> the 10-k you filed with the s.e.c. -- a few things you want to note. here are some quotes, and this is a filing you signed, correct? >> yeah. >> our financial performance has been and will continue to
5:20 am
be significantly determined by our success in adding, retaining and engaging active users. here's another quote: if our users decrease their level of engagement with our products, our revenue, financial results and business may be significantly harmed. here's another quote: we believe that some users, particularly younger users, are aware of and actively engaging with other products and services similar to, or as a substitute for, ours. it continues: in the event that users increasingly engage with other products and services, we may experience a decline in use, in which case our business would likely be harmed. you have an obligation as the chief executive to encourage your teen -- your team to get kids to use your platform more. >> senator -- >> is that not self-evident? you have a fiduciary obligation to your shareholders to get kids to use your platform more? >> i think the thing that's not intuitive is that the direction is to make the products more useful so that way people want
5:21 am
to use them more. we don't give the teams running the instagram feed or the facebook feed a goal to increase the amount of time people spend. >> but -- you want your users engaging more and using the platform more. and i think this gets to the root of the challenge, because the overwhelming view of the public, certainly in my home state of georgia -- and we've had some discussions about the underlying science -- is that this platform is harmful for children. i mean, you are familiar with -- not just your platform, by the way, social media in general -- the 2023 report from the surgeon general about the impact of social media on kids' mental health, which cited evidence that kids who spend more than three hours a day on social media have double the risk of poor mental health outcomes, including depression and anxiety. are you familiar with that surgeon general report and the underlying study? >> i read the report, yes. >> do you dispute it? >> no. but i think it's important to characterize it correctly.
5:22 am
i think what he was flagging in the report is there seems to be a correlation, and obviously the mental health issue is very important. so it's something that needs -- >> the thing is, everyone knows there's a correlation, everyone knows that kids who spend a lot of time, too much time on your platforms are at risk. and it's not just the mental health issue. let me ask you another question. is your platform safe for kids? >> i believe it is. but there's -- >> hold on a second. >> there's a difference between correlation and causation. >> because we're not going to be able to get anywhere. we want to work in a productive, open, honest and collaborative way with the private sector to pass legislation that will protect americans, that will protect american children above all, and that will allow businesses to thrive in this country. if we don't start with an open, honest, candid, realistic assessment of the issues, we can't do that. the first point is you want kids to use the platform more. in fact, you have an obligation to. but if you're not willing to acknowledge it's a dangerous place for children, the
5:23 am
internet is a dangerous place for children, not just your platform, isn't it? isn't the internet a dangerous place for children? >> i think it can be. yeah. there's both great things that people can do, and there's harms we need to work on. >> it's a dangerous place for children. there are families here who have lost their children. there are families across the country whose children have engaged in self-harm, who have experienced low self-esteem, who have been sold deadly pills on the internet. the internet is a dangerous place for children, and your platforms are dangerous places for children. do you agree? >> i think that there are harms we need to work on. >> okay. >> i'm not going to -- i think -- >> why not? why not just acknowledge it? why do we have to do the very careful -- >> i disagree with the characterization. >> that the internet is a dangerous place for children? >> i think you're trying to characterize our product as inherently dangerous -- >> inherent or not, your products are places where children can experience harm. they can experience harm to their mental health, they can be sold drugs, they can be
5:24 am
preyed upon by predators. you know, they are dangerous places. and yet you have an obligation to promote the use of these platforms by children. all i'm trying to suggest to you, mr. zuckerberg -- and my time is running short -- is that in order for you to succeed, you and your colleagues here, we have to acknowledge these basic truths. we have to be able to come before the american people, the american public, the people in my state of georgia, and acknowledge the internet is dangerous, including your platforms. there are predators lurking. there are drugs being sold. there are harms to mental health that are taking a huge toll on kids' quality of life. and yet you have this incentive -- not just you, mr. zuckerberg, all of you have an incentive to maximize use, utilization and engagement, and that is where public policy has to step in to make sure that these platforms are safe for kids, so kids are not dying, so kids are not overdosing, so kids are not
5:25 am
cutting themselves or killing themselves because they're spending all day scrolling instead of playing outside. and i appreciate all of you for your testimony. we will continue to engage as we develop this legislation. thank you. >> the senator from tennessee. >> thank you, mr. chairman. and thank you to all four of you for coming. i know some of you had to be subpoenaed to get here, but we do appreciate that you all are here. mr. chew, i want to come to you first. we've heard that you're looking at putting a headquarters in nashville, and likewise in silicon valley and seattle. and what you're going to find probably is the welcome mat is not going to be rolled out for you in nashville like it would be in california. there are a lot of people in tennessee that are very concerned about the way tiktok is basically building dossiers on our kids, the way they are
5:26 am
building those on their virtual you. and also that this information is held in china, in beijing, as you responded to senator blumenthal and me last year in reference to that question. and we also know that a major music label yesterday said they were pulling all of their content off your site because of your issues on payment, on artificial intelligence, and because of the negative impact on our kids' mental health. so we will see how that progresses. mr. zuckerberg, i want to come to you. senator blumenthal and i, of course, have had some internal documents and emails that have come our way. one of the things that really concerned me is that you referred to your young users in
5:27 am
terms of their lifetime value as being roughly $270 per teenager. and each of you should be looking at these kids. the t-shirts they're wearing today say i'm worth more than $270. we've got some standing up in those t-shirts. [ applause ] >> now, some of the children from our state, some of the children of the parents that we have worked with -- just to think, whether it is becca schmidt, david mollick, sarah glad, emily schote -- would you say that a life is only worth $270? what could possibly lead you -- i mean, i listen to that -- i
5:28 am
know you're a dad, i'm a mom, i'm a grandmom, and how could you possibly even have that thought? it is astounding to me. and i think this is one of the reasons that 42 states are now suing you, because of features that they consider to be addictive that you are pushing forward. in the emails that we've got from 2021, from august to november, there is the staff plan that is being discussed. antigone davis, alex schultz and adam mosseri are all on this chain of emails on the well-being plan. and then we get to one: nick clegg did email mark to emphasize his support for the package, but it sounds like it lost out to various other
5:29 am
pressures and priorities. see, this is what bothers us. children are not your priority. children are your product. children you see as a way to make money. and when it comes to protecting children in this virtual space, you made a conscious decision, even though nick clegg and others were going through the process of saying this is what we should do. these documents are really illuminating. and it just shows me that growing this business, expanding your revenue, what you are going to put on those quarterly filings -- that was the
5:30 am
priority. the children were not. it's very clear. i want to talk with you about the pedophile ring, because that came up earlier, and "the wall street journal" reported on that. and one of the things that we found out was, after that became evident, you didn't take that content down, and it was content that showed that teens were for sale and were offering themselves to older men. and you didn't take it down because it didn't violate your community standards. do you know how often a child is bought or sold for sex in this country? every two minutes. every two minutes a child is bought or sold for sex. that's not my stat. that is a tbi stat. now, finally, this content was taken down after a congressional staffer went to
5:31 am
meta's global head of safety. so would you please explain to me, and to all these parents, why explicit predatory content does not violate your platform's terms of service or your community standards? >> sure, senator. let me try to address all of the things you just said. it does violate our standards. we work very hard to take it down. >> you didn't take it down. >> well, we've reported, i think, more than 26 million examples of -- >> you didn't take it down until a congressional staffer brought it up. >> it may be that in this case we made a mistake and missed something. >> i think you make a lot of mistakes, but let's move on. i want to talk with you about your instagram creators program, and about the push -- we found out through these documents -- that you actually are pushing forward because you want to bring kids in early. you see these younger
5:32 am
tweenagers as a valuable but untapped audience, quoting from the emails, and suggesting teens are actually household influencers to bring their younger siblings onto your platform, onto instagram. now, how can you ensure that instagram creators, your product, your program, does not facilitate illegal activities when you fail to remove content pertaining to the sale of minors? and it is happening once every two minutes in this country. >> senator, our tools for identifying that kind of content are industry leading. that doesn't mean we're perfect. there are definitely issues that we have. but we continue -- >> mr. zuckerberg, yes, there is a lot that is slipping through. it appears that you're trying to be the premier sex trafficking site. >> of course not, senator. senator, that's ridiculous. >> no, it is not ridiculous.
5:33 am
>> we don't want that content on our platforms. >> why don't you take it down? >> we do take it down. >> you say you all want to work with us. no, you're not. you are not. and the problem is, we've been working on this -- senator welch is over there. we've been working on this stuff for a decade. you have an army of lawyers and lobbyists that have fought us on this every step of the way. you worked with netchoice, the cato institute, the taxpayers protection alliance and the chamber of progress to actually fight our bipartisan legislation to keep kids safe online. so are you going to stop funding these groups? are you going to stop lobbying against this and come to the table and work with us? yes or no. >> senator, we have -- [ applause ] >> senator, we have -- >> yes or no. >> of course we'll work with you on the legislation. >> okay. the door is open. we've got all these bills.
5:34 am
you need to come to the table, each and every one of you need to come to the table. and you need to work with us. kids are dying. [ applause ] >> senator welch. >> i want to thank my colleague, senator blackburn, for her decade of work on this. i actually have some optimism. there is a consensus today that didn't exist, say, 10 years ago that there is a profound threat to children, to mental health, to safety. there's not a dispute. that was in debate before. that's a starting point. secondly, we're identifying concrete things that can be done in four different areas.
5:35 am
one is industry standards. two is legislation. three is the courts. and four is a proposal that senator bennet, senator graham, myself and senator warren have to establish an agency, a governmental agency whose responsibility would be to engage in this on a systemic, regular basis with proper resources. and i just want to go through those. i appreciate the industry standard decisions and steps that you've taken in your companies, but it's not enough, and that's what i think you're hearing from my colleagues. like, for instance, where there are layoffs is in the trust and safety programs. that's alarming, because it looks like there is a reduction in emphasis on protecting things. like, you just added, ms. yaccarino, 100 employees in
5:36 am
texas in this category. and how many did you have before? >> the company is just coming through a significant restructuring, so we've increased the number of trust and safety employees and agents all over the world by at least 10% so far in the last 14 months. and we will continue to do so, specifically in austin, texas. >> mr. zuckerberg, my understanding is that there have been layoffs in that area as well -- there's added jobs there at twitter, but at meta, have there been reductions in that? >> there have been across the board, not really focused on that area. i think our investment is relatively consistent over the last couple of years. we invested almost $5 billion in this work last year, and i think this year will be on the same order of magnitude. >> another question that's come up is when -- it's the horror of any user of your platforms: somebody has an image on there that's very compromising, often of a sexual nature. is there any reason in the
5:37 am
world why a person who wants to take that down can't have a very simple same-day response to have it taken down? i'll start with twitter on that. >> i'm sorry, senator, i was taking notes. could you repeat the question? >> well, there's a lot of examples of a young person finding out about an image that is of them and really compromises them, and actually can create suicidal thoughts, and they want to call up or they want to send an email and say, take it down. why is it not possible for that to be responded to immediately? >> we all strive to take down any type of violative content or disturbing content immediately. at x, we have increased our capabilities and staff for the reporting process. >> if i'm a parent or i'm a kid, and i want this down,
5:38 am
shouldn't there be methods in place where it comes down? you can see what the image is -- >> yeah. an ecosystem-wide standard would actually improve and enhance the experience for users at all our platforms. >> there is actually an organization that i think a number of the companies that are up here are a part of, called take it down. >> so you all are in favor of that. that's going to give some peace of mind to people, all right? it really, really matters. i don't have that much time, so we've talked about the legislation, and senator whitehouse had asked you to get back with your position on section 230, which i'll go to in a minute. i would welcome each of you responding as to your company's position on the bills that are under consideration in this hearing, all right? i'm just asking you to do that. third, the courts.
5:39 am
this big question of section 230. and today i'm pretty inspired by the presence of the parents, who have turned their extraordinary grief into action in hopes that other parents may not have to suffer what is for them -- for everyone -- a devastating loss. senator whitehouse asked you all to get back very concretely about section 230 and your position on that. it's an astonishing benefit that your industry has that no other industry has: you just don't have to worry about being held accountable in court if you're negligent. so you've got some explaining to do, and i'm just reinforcing senator whitehouse's request that you get back specifically about that. and then finally, i want to ask about this notion, this idea of
5:40 am
a federal agency that is resourced and whose job is to deal with public interest matters that are really affected by big tech. it's extraordinary what has happened in our economy with technology, and your companies represent innovation and success. but just as when the railroads were ascendant and in charge and ripping off farmers because of practices they were able to get away with, just as when wall street was flying high but there was no one regulating blue sky laws, we now have a whole new world in the economy. and, mr. zuckerberg, i remember you testifying in the energy and commerce committee, and i asked you your position on the concept of a federal regulatory agency. my recollection is that you were positive about that. is that still the case? >> i think it could be a reasonable solution. there are obviously pros and cons to doing that versus through the normal -- the
5:41 am
current structure of having different regulatory agencies focused on specific issues. but because a lot of the things trade off against each other -- like, one of the topics we talked about today is encryption, and that's really important for privacy and security. >> right. >> can we just go down the line? i'm at the end. >> senator, i think the industry initiative to keep those conversations going would be something that x would be very, very proactive about. if you talk about the shield act, the stop csam act, the project safe childhood act, i think our intentions are clear to participate on the need here. >> senator, we support national privacy legislation, for example. so that sounds like a good idea. we just need to understand what it means. >> all right. mr. spiegel. >> senator, we'll continue to work with your team, and we'd certainly be open to exploring the right regulatory body for the technology. >> the regulatory body is something you can see has
5:42 am
merit. >> yes, senator. >> and mr. citron? >> we're very open to working with you, our peers and anybody for making the internet a safer place. as you mentioned, this is not a one-platform problem. >> right. >> we do look to collaborate with other properties. >> thank you, all. mr. chairman, i yield back. >> thank you, senator welch. well, we're going to conclude this hearing. thank you all for coming today. you probably have your scorecard out there. you've met at least 20 members of this committee and have your own impressions of their questioning, their approach and the like. but the one thing i want to make clear, as chairman of this committee for the last three years, is this: there was an extraordinary vote on an extraordinary issue a year ago. we passed five bills unanimously in this committee. you heard all the senators. every spot on the political spectrum was covered; every single senator voted unanimously in favor of the five pieces of legislation we've discussed today. it ought to tell everyone who
5:43 am
follows capitol hill and washington a pretty stark message: we get it, and we live it. as parents and grandparents, we know what our daughters and sons are going through. they cannot cope, they cannot handle this issue on their own. they're counting on us as much as they're counting on the industry to do the responsible thing. and some of you will leave with impressions of the companies they represent. that's your right as an american citizen. but you must also keep the spotlight on us to do something -- not just to hold a hearing and bring out a good, strong crowd of supporters for change, but to get something done. no excuses. no excuses. we've got to bring this to a vote. what i found in my time in the house and senate is that that's the moment of reckoning. speeches notwithstanding, press releases and the like, the moment of reckoning is when you call a vote on these measures.
5:44 am
it's time to do that. i don't believe there's ever been a moment in america's wonderful history where a business or industry has stepped up and said, regulate us, put some legal limits on us. businesses exist, by and large, to be profitable, and i think that we've got to get behind that and ask: profitability at what cost? senator kennedy, my republican colleague, asked, is our technology greater than our humanity? i think that is a fundamental question that he asked. what i would add to it: are our politics greater than technology? we're going to find out. i want to thank a few people before we close up here. i've got several staffers who worked so hard on this: alexandria gelbert, thank you very much to alexandria, and jeff hanson. [ applause ] >> the last point i'll make, mr. zuckerberg, is just a little advice to you. i think your opening statement about mental health needs to be explained. i don't think it makes any
5:45 am
sense. there isn't a parent in this room who's had a child go through an emotional experience like this who wouldn't tell you and me: they changed right in front of my eyes. they holed themselves up in their room, they no longer reach out to their friends, they've lost all interest in school. these are mental health consequences that i think come with the abuse of this right to have access to this kind of technology. so -- i see my colleague. you want to say a word? >> i think it was a good hearing. i hope something positive comes from it. thank you all for coming. >> the hearing record is going to remain open for a week for statements and questions submitted by senators by 5:00 p.m. on wednesday. once again, thanks to the witnesses for coming. the hearing stands adjourned.
