
Fmr. Facebook Exec on Social Media's Harmful Impact on Children | C-SPAN | March 7, 2024, 8:01am-10:15am EST

8:01 am
8:02 am
this hearing of the judiciary subcommittee on privacy, technology, and the law will come to order. thank you, everyone, for attending. my thanks to ranking member hawley and particularly to the
8:03 am
chairman of the judiciary committee, dick durbin, for giving us this opportunity. he is interested in this topic, and i'm going to call on him after senator hawley for his remarks. we are gathered today to hear testimony from a whistleblower, an engineer widely respected and admired in the industry. and not just any expert: an engineer hired specifically by facebook to help protect against harms to children and make recommendations for making facebook safer. we have known for more than a decade
8:04 am
that suicides, hospitalizations for self-harm, and depression among teens have skyrocketed. as he knows, these numbers are more than statistics; they are real people, and his daughter is one of them. arturo béjar is the former director of engineering for protect and care at facebook. he will tell us about the evidence he brought directly to the attention of the top management of facebook and meta, mark zuckerberg, sheryl sandberg, and others, in meetings and memos. he resoundingly raised an alarm about statistics showing the prevalence of facebook's harms to
8:05 am
teens, telling mark zuckerberg, for example, in a memo that more than half of facebook users had a bad or harmful experience just in the last week. instead of real reform, he will testify that facebook engaged in a purposeful public strategy of distraction, denial, and deception. they hid from this committee and all of congress evidence of the harms that they knew was credible. they ignored and disregarded recommendations for making the site safer, and they even rolled back some of the existing protections.
8:06 am
he is not the first and only whistleblower to come forward. we heard from frances haugen, who showed facebook's own researchers describing instagram itself as, quote, a perfect storm that exacerbates downward spirals of addiction, eating disorders, and depression. the documents we are going to put into the record will show, for example, that a quarter of young teens, 13 and 14 years old, report receiving sexual advances on instagram.
8:07 am
nearly a third of young teens have seen discrimination based on race and sexual orientation. a quarter of young teens report having been bullied or threatened. nearly a quarter of young teens report feeling worse about themselves, about their bodies and their social relationships. these are the types of experiences that lead to depression and eating disorders, and they are remedied only 2% of the time. there is a history here. in august of 2021, senator blackburn and i wrote to facebook about the impact of their products on kids. we asked, quote, has facebook's research ever found that its
8:08 am
platforms and products can have a negative effect on children's and teens' mental health or well-being? facebook refused to answer. in october of 2021, senator blackburn and i held a hearing. we heard from frances haugen about instagram, and on that same day mr. béjar sent an e-mail to mark zuckerberg, sheryl sandberg, and other executives validating that testimony. that e-mail demonstrated even greater harms than were then publicly known, a searing indictment of instagram and facebook, and i'm going to ask that it be made part of the record without objection. in december 2021, the head of instagram testified to the committee, to our subcommittee, after he had met with mr. béjar discussing these numbers and statistics relating to suicide.
8:09 am
and during that hearing, a number of us asked him about facebook promoting suicide. he knew, but he did not disclose, that on a weekly basis around 7% of facebook users overall encounter content promoting suicide and self-harm, with 13- to 15-year-olds seeing it more often than others. there is a pattern here, with facebook hiding risk by saying things like bullying and harassment is only .08% of content, when in reality meta executives know 11% of 13- to 15-year-olds face bullying every single week. every single week on instagram.
8:10 am
and just to be absolutely clear, that is millions of children and teenagers. it's not just a number. behind every one of those numbers is a real person, a teenager, a child whose life is changed, maybe forever, by that searing experience of a bullying comment, eating disorder content, or suicide promotion. we can no longer rely on social media's mantra, trust us. we can no longer rely on parents alone. what is needed now is legislative reform, the kids online safety act. senator blackburn and i have enlisted more than 45 of our colleagues, almost half the united states senate, in favor of
8:11 am
the kids online safety act. and the final point i would make is that they have still failed to take these harms seriously. this june the wall street journal found instagram even recommending pedophiles to each other; it was complicit. you came forward with specific recommendations to prevent teens from experiencing unwanted sexual advances, and they were never adopted. you put your career on the line to come forward, an experienced and trusted
8:12 am
industry expert whose job was to make facebook safer. your recommendations were purposefully ignored or disregarded or rejected. i am just going to remind my colleagues that we heard from young people as well as parents about these harms. one of them told me, how many more children have to die before congress will do something? that is why we are here today. i want to thank all of my colleagues who are present, truly a bipartisan group, on behalf of this cause, and turn to the ranking member, senator hawley. >> thank you very much, mr. chairman. thank you for convening this hearing. this is such a vital hearing on a vital topic, and to be honest with you, this hearing concerns every parent's nightmare. you are nodding; you are a father, and that
8:13 am
subject really composes some of your testimony. i am also a father of three, and what you have brought to this committee today is something every parent in america needs to hear. the numbers are stunning: one in four teenagers, minor children, will experience sexual solicitation on meta's platforms at some point, and one in eight say they have experienced unwanted sexual advances, we are talking about children who have experienced unwanted sexual advances, just in the last week. we know from meta's own internal research that they knew the extent of this problem even as they were ignoring you, and i want to turn to some of that research that senator blumenthal just referenced. these are meta's own words, in particular about young women: quote, we make body image issues worse for one in three teen girls. teens blame instagram for
8:14 am
increases in anxiety and depression; this reaction was unprompted and consistent across all groups. teens told them they do not like the amount of time they spend on the app but feel they have to be present; they often feel it's bad for their mental health but feel unable to stop themselves. they knew these things were happening; those are their own words that i just read. you pointed this out; you brought these concerns to them. and when you exposed this, rather than respond, they cooked the books, if i understand your testimony correctly. they say, we catch 90% of unwanted sexual material, child abuse material, pornography, terrorism threats, and we take it down; our ai systems
8:15 am
find it and take it down. what you exposed is that in fact the ai systems are catching only a small, small percentage of that kind of abusive material online. when facebook is out there promoting to the world that we are taking down the vast majority, it simply is not true. in fact, they know it's not true; the statistic is designed to mislead. they are deliberately misleading parents about what is on their platform; they are deliberately misleading parents about the safety of their children online. i want to echo something senator blumenthal said: it's time for congress to take action. it was time years ago for congress to take action. it is an indictment of this body, to be honest with you, that we have not acted, and we all know the reason why. if i could just start with a little plain talk here this morning: big tech is the biggest, most powerful lobby in the united states congress. they spend millions upon millions upon millions of dollars every year to lobby this body.
8:16 am
and the truth is, as every reporter in this room knows, and i hope you will report it after this hearing, they do it successfully. they successfully shut down every meaningful piece of legislation every year. i have been here for four years and i've seen it repeatedly in the short time i have been here. we will get all kinds of speeches in committee, all kinds of speeches on the floor, about how we have to act, and this body will do nothing. why? money, that is why. gobs of it, gobs of it. influencing votes, a hammerlock on this process. it is time for it to be broken, and the only way i know to break it is to bring the truth forward, and that is why i am so glad you're here today to do it. thank you, mr. chairman. >> thank you, senator hawley. the only footnote i would add is this time must be different. they have armies of lawyers and lobbyists, they spend tons of money, but this time must be different. senator durbin? >> thank you, chairman blumenthal and senator hawley, and let me follow up on senator hawley's comments: i could not agree more, i could not
8:17 am
agree more. in the senate judiciary committee, after graphic hearings where parents and victims came forward and told us what had happened to them online, we decided to take action. we passed six bills related to this issue, child sexual abuse and similar issues. six bills, and something miraculous happened: all six passed unanimously. every democrat, every republican. take a look at the folks up at this end of the table, across the political spectrum; we all agreed on this. what has happened since? nothing. six bills waiting for a day on the calendar, six bills waiting while the nation debates. six bills passed unanimously on a bipartisan basis. they put real teeth in enforcement too, and i think that is why they've gone nowhere. big tech is the big kid on the block when it comes to this issue and many other issues before us. that is the reality. thank
8:18 am
you, chairman blumenthal and senator hawley, for bringing together so many members for this hearing. our philosophy in putting together the subcommittees was to say to each of the senators in charge of them: do your best, take an issue that means something to you, and do your best to bring it to the american people and bring legislation to the floor of the united states senate. this subcommittee is one i'm counting on to be successful in this regard. thank you for the courage of stepping up and speaking up. the only amendment i would make to the chairman's remarks and senator hawley's is that this is not only a parents' issue, it's a grandparents' issue too. it scares us. thank you for what you brought us, and i am particularly intrigued by your idea of a survey so we find out from the source what is really happening. my experience on capitol hill goes back several years. i took on the tobacco issue. we were hitting our heads against the wall trying to penetrate that vast lobby at the time, and the one way we managed to penetrate
8:19 am
it was to make it a children's issue, protecting kids from an addiction to tobacco, and then a lot of good things started happening. why is it that this issue, which relates to our kids so much more and is so much more dangerous than even tobacco in my estimation, is so difficult? senator hawley is correct: we are fighting the biggest kid on the block when it comes to this issue. thank you, mr. chairman. >> thank you, senator durbin, and thank you for your leadership on this issue. i'm going to turn to senator graham if he has opening remarks. >> maybe number seven is the magic number of bills. the next bill, i hope, and i want to thank senator blumenthal and senator hawley for doing this, is section 230 reform. the other bills are going nowhere until they believe they can be sued in court. the day they know the courtroom is open to their business practices, they will flood us with all kinds of good ideas.
8:20 am
until that day comes, nothing is going to happen. and as i said, as we passed them, they go to the floor to die. ask senator schumer, senator mcconnell. and what's the house doing? not that much. a society that cannot take care of its children, or refuses to, has a bleak future, so thank you for doing this. >> thank you, senator graham. senator blackburn? >> thank you, mr. chairman. thank you so much. thank you for the time you have given to senator blumenthal's staff and my staff, and for being so open when you met with senator blumenthal and me last week. i really appreciate this. as senator blumenthal said, we have worked on this for years. he built the timeline going back to 2021. but the work we were doing looking at big tech and looking at some of the problems, the
8:21 am
lack of privacy, the frustration of people not being able to control who had access to their virtual you, is what led us to this point, to begin to look at what's happening to our children. and as i told you in our meeting, the day we had that first hearing looking at what was happening online with children, it was like the floodgates opened. we started hearing from moms and dads, not only in tennessee and not only in connecticut but across the country, who were saying, can i please tell you my story? the reason they did this is because their hearts were breaking. their children were committing suicide; they'd met a drug dealer; their children had met a
8:22 am
pedophile. their child had met a trafficker. they'd been exposed to cyberbullying and had committed suicide. they were looking up ways to commit suicide. there are laws in the physical world that protect children from all of this, but online it has been the wild west, and as my colleagues have said, we have fought this army of lobbyists for years. big tech has proven they are completely incapable of governing themselves, of setting up rules, of having guidelines, of designing for safety, and it is so important that we move forward with this. one thing i will add, and it is so important for you being here and for our colleagues who were not a part of what we were doing:
8:23 am
in 2021, when the ceo of instagram came before us, he indicated they were taking steps. but we find out they were not. we find out from the advice and awareness that you provided to mark zuckerberg. what did they do with that? they made a conscious decision to ignore your advice and guidance and to use our kids as the product. the longer they are online, the richer that data is. the richer the data is, the more money they make. so they have monetized what
8:24 am
comes from our children being addicted to social media. thank you so much for being here today. >> thanks, senator blackburn. we will now formally introduce the witness, a former security engineer with very significant experience working on user safety, who led a team at facebook until 2015, reporting to the cto. he then came back as a consultant to help with instagram's well-being work from 2019 to 2021. he is also the parent of a courageous young girl, a young woman, who spoke up about her experiences
8:25 am
online. as is our custom, i am going to administer the oath. if you would please stand. do you swear the testimony you will give to this committee is the truth, the whole truth, and nothing but the truth, so help you god? thank you. please go ahead. >> chairman durbin, ranking member graham, chairman blumenthal, ranking member hawley, and members of the subcommittee, thank you for the opportunity to appear before you and for your interest in addressing one of the most urgent threats to our children today, american children and children everywhere.
8:26 am
i appear before you today as a dad with firsthand experience of a child who received unwanted sexual advances on instagram, and as an expert with over 20 years of experience working as a senior leader, including leading online security, safety, and protection efforts at facebook. it is unacceptable that a 13-year-old girl gets propositioned on social media. unfortunately, it happens all too frequently today. in a careful survey run by instagram in 2021, we found one in eight kids aged 13 to 15 years old experienced unwanted sexual advances in the last seven days. this is unacceptable. my work has shown that it does not need to be this way. starting in 2009, i was the product
8:27 am
leader for facebook's efforts to reduce online threats. executives, including mark zuckerberg, were supportive of this work. as a parent, i took the work personally; i worked hard to help create a safer environment. by the time i left in 2015, i thought the work was going in the right direction. when my 14-year-old daughter joined instagram, she began having awful experiences, including repeated unwanted sexual advances and harassment. she reported these incidents to the company, and it did nothing. in large part because of what i learned as a father, in october 2019 i returned to facebook, this time as a
8:28 am
consultant working with instagram's well-being team. we tried to set goals based on the experiences of teens themselves; instead, they wanted to focus on narrowly defined policies, regardless of whether that approach reduced the harm teens were experiencing. i discovered that most of what had been put in place had been removed. i found new features being developed in response to public outcry which were in reality a kind of placebo, safety features in name only, to placate the regulators. i say this because rather than being based on experience data, they were based on deliberately narrow definitions of harm; the company was grading its own homework. for example, instagram knows when a kid spends a significant
8:29 am
amount of time looking at harmful content, content they are recommending. meta must be held accountable for that, and for the unwanted sexual advances that instagram enables. as soon as i understood this gap, i did what i had always done: i researched the problem, vetted the numbers, and informed mark zuckerberg, sheryl sandberg, and other executives. i did this because for six years that was my job, to let them know of critical issues that affected the company. it has been two years since i left, and these are the conclusions i have come to. one, meta knows the harm kids experience on their platform, and the executives know that their measures fail to address it. two, there are actionable steps meta could take to address the problem. and three,
8:30 am
they are deciding time and time again not to tackle these issues. instagram is the largest public directory of teenagers, with pictures, in the history of the world. meta, which owns instagram, is a company where all work is driven by data, yet they have been unwilling to be transparent about data regarding the harms kids experience, and unwilling to reduce those harms. social media companies must be required to become more transparent so that parents and the public can hold them accountable. many have come to accept the false proposition that unwanted sexual advances, bullying, misogyny, and other harms are an unavoidable evil. this is just not true. we do not tolerate unwanted sexual advances against children
8:31 am
in any other public context, and they can similarly be prevented on facebook, instagram, and other social media products. what is the acceptable frequency for kids to receive unwanted sexual advances? this is an urgent crisis. when asked, has anyone threatened you, damaged your reputation, insulted you, disrespected you, or excluded you or left you out, 11% of kids said yes, in the last week. one in four witnessed it happening, and the company does nothing about it. when asked if they saw a post that made them feel bad about themselves, one in five kids said yes, in the last week. meta executives know this. the public now knows this.
8:32 am
when i left facebook in 2021, i thought the company would take my concerns and recommendations seriously, take them to heart, and act. yet years have gone by, and millions of teens are having their mental health compromised and are still being traumatized by unwanted sexual advances and harmful content on instagram and other social media platforms. there was a time when, at home on the weekend, at least the kids could escape these things, these harms. but today just about every parent and grandparent has seen their kids' faces change from happiness to grief to distress the moment they check social media. where can a child seek refuge? it is time the public and parents understand the true
8:33 am
level of harm enabled by these products, and it is time for congress to act. thank you for your time. >> thank you. we are going to begin with questions. each of us will ask five minutes of questions, and we will have a second round if folks want to do that. we have put in the record your e-mail to mark zuckerberg of october 5, where you recommend, in effect, not only a change in the business practices of the company but a culture shift. then you wrote separately on september 14, and i'm going to ask that those documents be made part of the record as well, where you presented more of these statistics and very powerful
8:34 am
evidence of harm. and it seems to me the reaction was to pat you on the head and, in effect, tell you to go away, be a good boy, and pull the curtain. senator hawley referred to cooking the books; i think what they did was bury this evidence, conceal it, hide it, and deny it, in effect, to congress and to the public. and then in the past year they have cut around 21,000 jobs, about a quarter of the global workforce, in what mark zuckerberg has called the year of efficiency, including hundreds of jobs involving content moderators and safety
8:35 am
jobs, including from instagram's well-being team. what is the impact of cutting those resources devoted to online safety? >> thank you for the question. you start from the point that the work was already heavily under-resourced when i was there. we were dealing with 20%, 10% of people experiencing these harms, and there was a small fraction of people dedicated to addressing that harm. and then they take more resources away from that, including the people who were doing the work to understand the harm kids are experiencing. it seems to me the company culture is one of see no evil, hear no evil: we do not want to understand what people are experiencing, and we are not willing to invest in that.
8:36 am
>> thank you. we spoke in advance of the hearing, and you told me a story about meeting with another senior executive, chris cox, facebook's chief product officer. it was so striking to me that he already knew a lot of the numbers and statistics and evidence of harm that you were bringing to mark zuckerberg's attention. why was this meeting so memorable to you? >> when i returned in 2019, i thought they didn't know. i began seeing a culture that was consistently ignoring what teens were experiencing, and i thought executives did not know. i spent a year researching, vetting, and validating with people across the organization, and i would ask, do
8:37 am
you know what percentage of people are experiencing this? and no one was able to answer off the top of their head. the first person to do that was chris, and i found it heartbreaking, because it meant they knew and that they were not acting on it. >> in effect, their expressed caring about teens and safety and protecting children was all a charade, a mockery. they already had the evidence that you were bringing to their attention. they knew about it and they disregarded it, correct? >> yes, that is correct. >> and they rejected your recommendations for making facebook and instagram safer, correct? >> that is correct. >> let me ask you, before we go to our next witness, do you
8:38 am
think that we, the congress of the united states, should now act? don't you think action is long overdue in this area, given the total lack of credibility of the social media companies? >> my experience after sending that e-mail, and seeing what happened afterwards, is that they knew. there were things they could do about it, and they chose not to do them. we cannot trust them with our children. it's time for congress to act; the evidence, i believe, is overwhelming. >> i am very hopeful that your testimony, added to the lawsuits that have been brought by state attorneys general across the country, and i am a former state attorney general who believes strongly in enforcement by them, added to the interest i think is evidenced by the turnout at our subcommittee today, will enable us to get the kids online safety act across the finish line along
8:39 am
with measures like senator durbin's proposal and others that can finally break the straitjacket that big tech has imposed on us. big tech is the next big tobacco. i fought big tobacco in the 1990s; i sued big tobacco. i urge congress to act. the same kind of addictive product that big tobacco peddled to kids is now advanced to them and promoted and pitched by big tech, and we need to break the straitjacket they have imposed with their armies of lobbyists. thank you. >> thank you, mr. chairman. thank you again for being here. i want to first establish a fact or two just to make sure everybody understands. on october the fifth, 2021, you composed an e-mail, which is now i think in the record, to mark zuckerberg,
8:40 am
sheryl sandberg, and a group of other executives at meta, am i right so far? >> that is correct. >> in that memo you disclosed to them that, according to your own research, one in eight children had experienced unwanted sexual advances within the last seven days, is that correct? >> it is correct. >> and one in three, 27%, have experienced unwanted sexual advances outside of that seven-day window, so it's more than seven days, is that correct? >> that is correct. >> so the numbers are astounding: one in eight within seven days, a third of children outside of that window. mark zuckerberg, did he reply to you? >> he did not reply. >> did he meet with you? >> he did not meet with me. >> sheryl sandberg, did she meet with you? >> she did not meet with me. >> in other words, the people who had recruited you to come back to facebook, meta, whatever, it's hard to keep up,
8:41 am
they ignored your findings when you presented them. they did not act; they turned a blind eye. let me ask you about something else. this is from the wall street journal's report earlier this year, in june, and they found the following, and i'm going to quote: instagram helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage sex content. pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people with an interest in illicit content, instagram does not merely host these activities; instagram's algorithms promote them. instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the journal and academic researchers found. this is a stunning, stunning report that more than buttresses, bears out, what you were telling,
8:42 am
trying to tell, the executives that ignored you. give us a sense, in your own view: why do you think this is happening? why has instagram become, in the words of the wall street journal, a promoter of a vast pedophile network? why are people like your daughter, every time they are on instagram, being bombarded with unwanted sexual advances and sexual content? why is this happening? >> my experience of that is that most of the resources they invest in this, close to all, go towards a very narrow definition of harm. and so i would encourage anybody here, when you are looking at this issue: if you find an account, a pedophile account selling things, try to act on it. see what the company does with that. but also see what happens if you like it or follow it, what you start getting recommended. and of all the things that are surfaced by the system, how many of them are they acting on?
8:43 am
it is a fraction of a percent. >> one of the things you said changed from when you left facebook in 2015 to when you returned in 2019 is that facebook had shifted to an automation-driven process for safety standards, safety inspection, monitoring, things like that. they boast about this: the ai is great, it is doing great work. that does not appear, however, to be the actual fact; it appears the harms have proliferated. tell us about the shift toward automated safety monitoring and what that has meant, in your experience. >> i was not there for the shift, but what i can say is algorithms are only as good as their inputs. if you do not allow a teen to flag, that's gross, that makes me uncomfortable, which is something you can do for an ad today, you can take an ad and say that it is sexually inappropriate, but there is no way for a child to do that when they get a message or in other
8:44 am
areas, how do these systems have a hope of addressing these issues? how can they as a company have a hope of addressing these issues if they are not willing to listen when a teen is trying to tell them they are experiencing gross content, unwanted sexual advances, predators, the bad things? >> what your research found, and what you elevated to leadership, was that at least in part these automated systems were not catching the vast majority of this unwanted content out there, the sexual advances, the pedophile material; it simply does not begin to capture it. yet facebook did not shift more resources, did not change their process. and here's the thing that gets me, and with this i will ask a question. i've been reading over and over and over again this case filed by my home state, missouri versus biden, a landmark first amendment case. two federal courts, a federal
8:45 am
district court and a federal court of appeals, have found that facebook, among others, actively coordinated with the present administration to censor first amendment protected speech. not this garbage, which is not protected by anything in our constitution, but first amendment protected speech. here's what gets me: the courts found, and this is in the record as a factual finding, that facebook devoted all kinds of resources and people, actual human people, to doing things like monitoring posts on covid-19 vaccine efficacy. there is one example of a parent in my home state of missouri who went to post something about a school board meeting; facebook used a human moderator to take down that post. that was important, that had to come down; we can't have them posting about school board meetings, for heaven's sake. but the things your daughter experienced, this ring of pedophiles, rings plural, facebook cannot find those. they don't have the resources for that; we have to let the market have its
8:46 am
effect, let ai do its job; we don't have the resources for it. they had plenty of resources to censor first amendment speech, no resources to protect our children. absolutely unconscionable. >> thank you, senator hawley. senator durbin? >> thanks for being here. you said earlier in your opening statement that when you worked for these companies they were data-driven. what do you mean by that? >> everything at meta, there were goals based on numbers. there is an understanding of what's happening, and people set their goals on that: in the next six months i'm going to make this number go from this to that. >> the ultimate answer is they were dollar-driven too, correct? >> what i can speak to very ...
8:47 am
... content? >> that is a wonderful question to ask mark and sheryl, who is no longer there, and adam; they can speak to why they made these choices. what i can speak to is the fact that they keep making the choices over and over again. >> i want to back up to what senator graham said: if this becomes expensive for them to continue this outrageous conduct, they may pay closer attention, that is for sure. but you have suggested here as
8:48 am
well that we need a survey of young people about their experience. do you want to explain that? >> the way harm should be tracked in this space is to go to teens and ask them: did you receive an unwanted sexual advance in the last seven days? they are going to know; it does not matter what the message was. what you can do to help that teen is give them a chance to tell you, and make sure they can talk about it. it's not expensive to implement. >> we are also briefed by the dea in terms of narcotics transactions and the use of platforms for that purpose. did you ever look into that issue? >> i did not directly, but what you can do is, if you look at the numbers i provided, there is a category for that class of issues, and you should ask the company how much of the content which teens experience as that they take down. >> it's interesting to me that if one of my kids, when they were kids, or our grandkids
8:49 am
now, came home and said there was someone lurking outside the playground at school who made the kids feel uncomfortable, we would know what to do, we would move on it quickly, and we would find it unacceptable. and yet we know from reality there is danger lurking in the iphones they are opening up every single day, and yet we seem to feel we are unable to respond to this. i hope we can change that. on senator graham's suggestion of section 230, any thoughts on that, section 230? >> no, i'm not qualified to talk about section 230. i can say these companies should be held accountable for the content they recommend. >> i certainly agree with that. i think that is the bottom line. thank you for your testimony. >> thank you, senator durbin. senator graham? >> thank you. you are doing the country a great service here. did they contest your memo? did anyone call you up and say, you don't know what you're
8:50 am
talking about? >> no, i must have spoken to 20 or 30 people, including adam, asking for any feedback, and nobody did. >> to sum up your testimony, is it fair to say that in its current form what you are describing is a dangerous product? >> correct. >> millions of families are affected by this dangerous product. >> correct. >> as a father who had a 13-year-old affected by this product, did you feel helpless? >> i did. and if anyone could help, it would have been me. >> could you have sued them, would you? >> i apologize, could you repeat? >> if you could sue on behalf of your daughter, would you? >> i believe they have to be held to account. >> one way of doing that is to sue them. you know you can't sue them under the current law? >> i did not know that.
8:51 am
>> okay. your daughter felt harmed, and your testimony is that millions of people are in the same situation as your daughter. they know what they are doing and they keep doing it anyway. is that all correct? >> it is correct. >> i cannot think of a company in the world that can do this and dodge a lawsuit except these people. they enjoy a form of sovereign immunity, which is what we have done here, and that is at the bottom of my list, not the top of my list. i asked my office to find out how much money i have received from facebook, instagram, and other companies, and i am going to give it back. i think we ought to all boycott their giving, because senator hawley is right: their leverage is power over the political system. i'm calling on every member of congress today: do not take their money until they change.
8:52 am
don't accept what they are offering you until they change, because the money you are receiving is coming from people who created a dangerous product for children and who seem not to be willing to change. they would be on the bottom of my list, now that i know what you have told me, of people i want to associate myself with. have you ever heard them talk about being afraid of anything or anybody? >> i have not. >> it is amazing, isn't it? companies this big, and he is telling them what you're doing is hurting people, and they are indifferent to it. i feel like they are immune from action because they pretty much are. bottom line: if we did create a system where parents like you could sue and hold them liable in court, do you think that may change their behavior?
8:53 am
>> that is not for me to say. i just want my daughter, our kids, to have the tools they need when they are experiencing these things. >> what i will tell you is i believe it would. we will never know until we try. i think we should dedicate ourselves on this committee, which it has been a pleasure to serve on under senator durbin, you have all been great on this issue, not just to pass bills but to insist on change. the ultimate change comes when they can be held liable in a court of law. until you open up the courthouse, nothing is going to change, but the day you do, you will be amazed how many good ideas they knew about but did not tell us. i'm going to dedicate what time i have left in this business to opening up the courtroom. i don't think anything else will do, and until that day comes, i'm not going to take any of their money, and every member of congress should say, we're not your buddies and you're not welcome until you change. that might be the first step to
8:54 am
change. thank you for your bravery. i'm sorry for what happened to your daughter. we owe everyone in your situation better. thank you. >> thank you, senator graham. that is why many of us have joined you, including the ranking member and myself. senator klobuchar? >> thank you very much, senator blumenthal, and thank you for those words, senator graham; i'm a strong believer in it. we can talk about this and have hearings and keep reminding people that we need to get things done, but until we change the law, nothing changes. these are no longer companies that started with guys in a garage or in their college dorm room, okay? these are real lives getting lost. i really appreciate that you are willing to come forward and testify. i'm going to focus on one area that i don't think we have talked about enough, and that is the platforms' ability, and
8:55 am
refusal, to take down sites that are selling dangerous drugs. recently the dea found one-third of drug cases had direct ties to social media. i was just in minnesota with a mom, bridget, who lost her kid, and the kid literally ordered one pill, as we say, one pill kills, on the internet, thinking it was something else; it was laced with fentanyl. as bridget said, all of the hopes and dreams we as parents had were erased in the blink of an eye, and no mom should have to bury their kid. that is why senator durbin and i and others on this committee have worked with senators shaheen and marshall as well as senator grassley on this bill, which has come through this committee already. it needs to go to the floor along with a number of other bills we have talked about. it would require social media companies to report
8:56 am
fentanyl and other dangerous drug sales on their platforms. in the words of our dea administrator, the cartels basically don't care; people die and no one knows it. and mexico and china have basically harnessed these platforms. do social media companies have the correct incentives to identify and eliminate drug sales to kids? >> thank you for the question and the concern on all these important things. until there is disclosure of what kids experience as drug content, sexual content, until there is transparency about these things, i don't know what the incentive is, which is why transparency is so essential. as parents and grandparents, we see it, we understand it.
8:57 am
we know how frequent it is. that's what the numbers show. and there's one thing for everyone to know: when we talk about the categories you care about, for example drugs, and the company talks about that category, they are likely talking about a fraction of what we as a society would count. >> exactly. we all know there's a lot of other things to do about fentanyl, including at the border, but this would be a major game changer for the ability to take these cases on. prosecutors have also reported an emerging trend: photos of children that may fall just shy of the definition of child pornography are shared on websites with the intent to harass or abuse the child victims. there was a major story on this in the "washington post." senator cornyn and i have a bill, the shield act, to fill in gaps in federal law so prosecutors can hold those who abuse kids in
8:58 am
this way accountable. in your role as a person at facebook who was responsible for efforts to keep users safe, can you talk about the deficiencies in current policies? >> thank you for the question. if you look at content involving minors, again the question is: is that something that violates company policy and would be removed, something the company is acting on? or does it end up being something that, because the company does not act on it, ends up being recommended and distributed? and as parents we see this. if you were to open the app and look for it, you can find it. if you like it, you get recommended more of it. these are all things the company is aware of in terms of reach and can do things about, but they have chosen not to. >> very good. i think i was
8:59 am
listening to senator graham, and he is right: there's one big thing we can do, which is to allow these cases to go forward in court. i think some of these things i'm discussing make it easier for people to proceed with these cases and create incentives. one thing i will add and ask on the record: senator coons, senator cassidy, and i think senator blumenthal have a bill to allow independent researchers to look at the algorithms, which you know are designed in a way that manipulates these kids and can lead to their deaths, by requiring the digital platforms to give independent researchers access to data. yes or no, i figure you think this would be helpful, yes? very good. just again, thank you. we can talk about this all we want, and we will remember you and your story, but until we get these things passed by both sides and can maybe put them together into one package,
9:00 am
we are not going to get the solutions we need, because just getting mad at these platforms has not changed their conduct. >> thank you, senator klobuchar. senator kennedy? >> social media can make people less lonely, can it not? >> it can do that. >> social media can deliver insights, can it not? >> it can. >> social media, when used properly, can give voice to the timid, can it not? >> it can. >> social media can also spread hate, can't it? >> it can. >> and isn't it a fact that much of social media, not all, but much of social media has become a
9:01 am
cesspool of snark? >> what i can speak to, senator, is that it happens all too often, and it does not need to be that way. >> but isn't it that way? for much, not all, but much of social media? >> the numbers i talk about: 20% of kids witnessed bullying in the last seven days, content that does not get taken down, that people comment on, and that gets promoted. >> i am trying to sum this up for us. isn't it a fact that social media has lowered the cost of being --?
9:02 am
>> yes. >> isn't it true that social media removes any geographical border to the harassment of others? and isn't it true that some forms of social media optimize for engagement? >> yes. they reward being --. >> do they use social media to identify kids' hot buttons? >> i don't know if that's true. >> do some forms of social media use algorithms to show us and our kids things that push those hot buttons? >> recently, my daughter had
9:03 am
someone go on to one of her posts about cars and say that you only like to drive cars because you saw a man doing it. and she said, i have been doing this for years, and i know a lot about cars, and i am more than qualified. and the person said, women belong in the passenger seat. when i asked her about that post, whether she would delete it, because she knows that reporting it will do nothing, she said, i will not delete it; i'm worried that will mean that fewer people will see my post. >> i will not ask you this question; i will make a statement. you are probably not familiar with louisiana. in my state, social media has impacted the news media, particularly print media.
9:04 am
thank god for tv news and radio news. when it comes to print media here in louisiana, print media on paper and print media that is on the internet, we are a news desert. we only have about two real non-sports print journalists left. most of our print media members are now sports journalists. that's fine, i love sports journalists. but there's a lot going on in the
9:05 am
world. what you do is what you believe, isn't it? everything else is just cottage cheese, isn't it? i look forward to the day when members of the united states senate will come together and establish a maneuver, not used every day or even every week, but a rule that would say: when there is a consensus, and when you as a senator can demonstrate that you have 60 votes to pass a bill, you have the right to bring the bill to the floor, no matter who does not want it. thank you, mr. chairman.
9:06 am
>> thank you, mr. chairman, and thank you for testifying. i was curious about the fact that so many of the young people on these platforms are exposed to cyberbullying. your daughter experienced that because of some of the things that she posted online. there is also an addictive quality to these platforms; keeping these kids online means profits for them. is there anything we can do to address the addictive aspect of what is happening to our young people, who continue to go onto these platforms and expose themselves to harmful content? >> thank you for the question. it is essential to have good data about the impact that this
9:07 am
product has. it's not that difficult. you could ask a teenager after half an hour, how are you doing? are you feeling better or worse? >> let's say that we have this kind of data. what do you think we should do? >> i think part of the measures that are appropriate is to compel them to figure out a good way to help teens have a use of the product that serves them. i think what happens right now is that it distresses them, and what i experience as a parent, and what every parent here has experienced, is the sense of urgency they feel about the need to be on there and the impact it has on their emotions. >> do young people understand the harmful impact themselves? part of your testimony is about how all of us should understand
9:08 am
what is happening. would it help if the kids themselves understood the harmful impacts? this is the education aspect for young people. would that help? >> it helps to educate young people, but what helps the most is changes to the product so it is less harmful. >> right now, there is not much incentive for these platforms to change their product; they face no consequences for the product. dozens of states, including the state of hawaii, have sued these companies alleging that they design their products to harm users. most of these suits have been consolidated in california, and the companies are arguing that they are shielded because of section 230.
9:09 am
if these companies were exposed to legal liability in these lawsuits, which are still pending by the way, if the companies found liable had to pay money as a result of these lawsuits, would that change their behavior as far as paying attention to the harmful content on their platforms? >> i am not a lawyer, and i am not qualified to testify about that. >> it is all about money; that is why they do what they do. if they had to pay money as a result of their content, would that make them change their behavior? >> my hope is that what will change their behavior is for them to set goals for these harms the way they set goals for revenue, the way mark zuckerberg says, last quarter we made so many billions of dollars. if they set that kind of goal, the number would go down very quickly. >> how would it go down?
9:10 am
>> there is no goal to reduce unwanted sexual advances, as far as i am aware. >> if there were a law against these kinds of content, they would be responsible. people know that they have exposed these kids. so what? there is no liability; they are protected from liability for content. and they do try, but they have a limited definition of what is harmful content. on the other hand, i am all for doing more than what we are currently doing. one of the things that can also happen: i had someone ask me to look into the advertising of
9:11 am
menstruation products. i call that a very narrow definition of harm: these companies, left to their own devices, choose what is harmful, and in the example i cited, with their very limited definition, they decided that women's health products are harmful and they will censor those kinds of products. this is a lot more complicated than it first appears. i know we will try to do something. thank you, mr. chairman. >> thank you, mr. chairman, and thank you for your testimony. i know that we all appreciate it. in my state of tennessee, we have attorneys here from the attorney general's office
9:12 am
today, and they are pushing to also get something done about the overreach of facebook. we are grateful that so many states have stepped up to hold facebook and meta to the task. we appreciate this. i want to return us to december of 2021. senator blumenthal and i, at the commerce committee's consumer protection subcommittee that we lead, had the head of instagram in front of us. you were consulting for instagram at that time, correct? >> december of 2021? i had left at that point. >> a few months earlier, you had sent him two emails that talked about youth harms on the platform, correct? >> correct.
9:13 am
>> i will read you some things from his testimony. he said, we care about the teens on instagram; when we research bullying and social comparison, we try to make changes. >> i agree that they do the research; they don't make the changes. >> they do the research and they take no action. here is another: we do not allow people to bully and harass other people on instagram. we have built tools that prevent bullying from happening in the first place. we empower people to manage their accounts so that they never have to see it. do you agree with that? >> i believe that is profoundly misleading. at a time when the public statistic was a fraction of a percent, one in five teens had
9:14 am
experienced it. the company is standing right there while it happens. if this happened in a school, it would be completely unacceptable. >> let me give you one more. talking about the executives, i am interested in how they reacted to the information that came out in 2021 about the disregard for harm to minors. were they motivated to do more to address the problem, or were they more interested in covering up what was going on at meta at the time? >> you will have to ask them about their intentions. actions speak louder than words. >> did any of the members of the meta team, zuckerberg, cox, did any of them respond to your
9:15 am
email in a way that suggested that they were going to take action to correct the wrongs? >> no. for six years, when i sent that kind of message, i would get a meeting within 24 hours where we would spend a meaningful amount of time talking about what needed to be dealt with. in this case, the meeting came sometime later, and then the lack of action speaks to that fact. >> money was more important than protecting children? >> you should ask them that question. >> i would be interested to know who took responsibility for making policy determinations about youth safety. in one conversation you had with my staff, you suggested that mark zuckerberg had a part in
9:16 am
those decisions. when you returned, were you told by employees not to raise youth safety issues to him, is that accurate? >> in my first stint, i would raise these issues and he would engage very practically. having done that for six years, he is probably one of the most qualified people in the world to have these issues brought to his attention. i was not aware, when i sent my email, that it had become hard to talk to mark about this. my experience with how the entire company was behaving when it came to the harm of teens is that it is a cultural issue, one which in my experience prioritizes prevalence over harm, and that is something that mark sets the direction for, along with the
9:17 am
whole executive team, and that is when i realized it was necessary to appeal directly to them. >> they were aware? they knew that harm was taking place; they had the research that pointed this out, their own research, and they made a conscious decision to do nothing about it? >> correct. >> did they talk about the profits behind these actions? >> not in my presence. >> other than mark zuckerberg, who would deal with youth safety and youth harm? >> for instagram. >> thank you, my time has
9:18 am
expired. >> i want to acknowledge my gratitude for the work that my colleagues are doing on a daily basis. we started working together when we were both in the house, introducing the first privacy bill. i was with you in this effort. i guess, in our phrase, i would like to associate myself with your remarks. i would like to, on my own behalf, express my shock that this is happening to our kids because there is a lot of money to be made. your questions reveal the disregard for the mental health of our kids, which is truly
9:19 am
shocking. i am all in with you on your efforts here. in vermont, the attorney general has joined the lawsuit. i want to thank you for providing such clarity, embedded in the concern that you have for your daughter and for all of our kids. a couple of issues have come up from the letters and comments that i have received, and i know you are getting the same questions as well. i want to make sure that if we do this legislation, it does not do any harm. i've been seeing letters from those in the lgbt community saying that this would compromise their ability to get together online and be supportive. i want to talk a little bit about how, if we proceed with
9:20 am
the legislation, we are not going to interfere with the capacity of kids who are getting together and being mutually supportive, and can we accomplish that? >> thank you for the question. i cannot speak to the legislation; you are extraordinarily qualified for that part. my job here is to bring light to the harms that teens are experiencing that no one talks about, and the way they get talked about, in my experience, is misleading. >> and that is based on your experience at facebook. >> what i want you to know, for you and for any kid, is that it does not need to be this way. instagram is standing right next to them as this is
9:21 am
happening. i know, because i built these kinds of things for six years, that a kid should be able to say, can you please help me with this? and then get help with whatever is happening to them. today, this is not happening for them. >> it is the exploitative content that you are focusing on. >> thank you. it is actually this: you go to school and you are in the hallway and somebody says, i will make sure you do not get invited to any party ever again. only the people around you hear that. if that happens online, and that post implies a person and does not name them, it is incredibly distressing to the teen. that is the kind of stuff i am talking about, because i deeply care about every child in every
9:22 am
context. that child gets left out and insulted, for the reasons that are outlined, and children should be able to get help depending on what the content is. that is important for all children, no matter what their gender is. >> i share that. one question that came up is about encryption. i would hope that any legislation we have would not compromise the privacy rights of individuals. >> i deeply believe in privacy, and in everything i am talking about. if a child gets a direct message that makes them uncomfortable and hurts them, it does not matter what the content is; it is like my house, my
9:23 am
rules: it only matters that the child feels uncomfortable. can we add a button so that when a child receives this message they can say, please help me, what's going on, someone is being mean to me? it does not matter what the content is. if someone is initiating those messages and telling them those things, step number one, they should know that is not appropriate. >> kids first. >> thank you for being here and for your courageous testimony. i think we have met the enemy, and the enemy is us. we have the six bills that were referred to earlier, from this committee.
9:24 am
in the senate, the only person who can actually schedule those bills for a vote is the majority leader. i would suggest that we focus the attention on getting senator schumer to bring those six pieces of legislation to a vote. that would be a good start. without that happening, nothing will happen in the senate. when trying to figure out a complex topic, follow the money. you mentioned the data a number of times. do social media applications collect huge volumes of data about their users? >> they do. >> that data is then used for advertising products. it is amazing: when i go to a
9:25 am
website and i look at something, a piece of hunting gear, the next thing i know, on instagram, an advertisement for that same company shows up. the way that happens is that instagram, facebook, x, formerly known as twitter, they sell that data to companies that use that information to promote their products, is that correct? >> i'm not an expert in that field. >> that is how they make money, right? >> they make money through advertising. >> i was shocked to read an article that talks about how easy it is to buy sensitive data about military personnel.
9:26 am
the study, done at the request of west point and others, determined that for 12 cents per record, data holders would sell sensitive information about u.s. military members and veterans. would that surprise you? >> this is not an area where i have expertise. what i do know, from being a security professional, is about ensuring that systems do what they say they will do. i do not know how the data gets brokered. >> it is common knowledge that that is the case. this data accumulated by social media companies is sold, and that is why when you go on instagram or facebook you do not have to pay a subscription or fee. if they could not recover that revenue from selling that data about me and
9:27 am
others, they would have to charge a fee to make this economical. they do not do that because they can sell your data. what you have discovered and shared with us today is about this one social media company, but the truth is that this is not unique to instagram or facebook, correct? >> it is all of social media. >> we have talked a lot about our concern about china's buildup of its economy and its military threatening peace in asia and elsewhere. we also have talked a lot about apps like tiktok, for example, that are chinese applications that do much as instagram does: they vacuum up all this data and addict our children by using the algorithms to figure
9:28 am
out what to recommend to them. this is all about the data and all about the money. we mentioned the use of social media applications when it comes to selling drugs. fentanyl is the single leading cause of death in america today. much of it is transacted through the use of social media. there are things like deep fakes. do you know what a deep fake is? >> it is when you use technology to create an image but it is not a real photograph or image of that person. >> deep fakes are being used to portray young girls for sexual
9:29 am
gratification, using false images made possible by this incredible technology. this technology can be used for a lot of good. it can also be used for ill as well. i want to thank you for answering some of these questions. we have a lot of work to do in the senate and in the congress, as parents and grandparents, to try and protect our children. thank goodness my daughters are adults now and they aren't the age of senator hawley's kids or others. the first thing we need to do is ask the one person who can actually schedule a floor vote on some of the bills that have passed unanimously to schedule that vote. we could do that next week.
9:30 am
he has to make that happen. >> thank you. i cannot speak for senator schumer. i know that he is dedicated to reform in this area. i'm sure you will make that interest real on the floor of the senate. >> as a mom, this is a topic that i could not not show up to engage in. i appreciate your leadership in fighting for and leading on behalf of my daughter and america's children -- not just your own, i know -- specifically when you are talking about taking the all-children approach. i want to engage in a space
9:31 am
where the all-children approach has not been taken. i want to get your thoughts on some gaps that we can try and fill. we know that the internet can be a hateful place. in your research on the meta user experience, you looked into identity-based hostility on the platform, including instagram users who witnessed hostility based on someone's race or religion within the last week. one study published in the journal of child psychiatry looked at black youth who experienced increases in racial discrimination, unlike their white counterparts, who did not.
9:32 am
those instances predicted worsened mental health amongst black youth. can you talk to us a little bit about what more you think the company should be doing to protect against these kinds of racial and ethnic harassment online? >> sorry. the fact is that a child today, black, any identity, gets called out in front of the entire shared audience. think about the difference between when this happens in school and when it happens online.
9:33 am
go home and ask your child: what would you do? there is no way for a child to say, this is happening to me. that person is being really mean to me. 10 years ago, facebook knew this. we knew that in order to help a child dealing with an issue, we had to hear the words that they use. a 13-year-old does not like to report things because they are worried that they will get in trouble or get other people in trouble, so instead you ask, would you like some help? when you look at the work that they submitted 10 years ago, you should be able to say, this is awful for me because of my identity. the company should be able to take that into account to help that child be protected, get them resources, and make sure that is not acceptable behavior in the community. the most tragic thing about the 20% number is that, through the lack of
9:34 am
action on the part of the company, even on content that they would take down, they are normalizing the behavior. children watch and learn from the way that other children are behaving. >> just to follow up a little bit, what would it look like to create a good experience? is it the ability to exercise agency through the button that you are making reference to? >> if it is in direct messages, you have a button, and you can record that someone initiated that. how many hateful or harassing messages could someone send before you tap them on the shoulder and tell them it is not appropriate behavior? if someone keeps doing it, you know that they are up to no
9:35 am
good and then you can take further measures. without this data, no system has a hope of making a safer environment for youth. >> what do you think has been the barrier for companies? the company that you have the most experience with, what do you think is the barrier to change? what do you think would help to create that change? what can help to overcome that barrier? >> they are not incentivized to make this change. it has been two years, and our kids do not have this option in direct messaging to say, this makes me uncomfortable. you can say it about an ad, for example. you can say, that is inappropriate. the thing about this is that nothing changes until the information is transparent. we strongly recommended that the reporting include identity-
9:36 am
based harms to youth. for the overall number, that is 10%; 90%, 80% of youth experience these things because of the identity issue. the real issue is whether the company treats it as a priority. >> thank you for your leadership on behalf of america's children. >> thank you, senator butler. a number of colleagues are joining us and returning in the next few minutes. we want to begin a second round of questions now. speaking of which, the senator is arriving and we will give you a couple of minutes to get comfortable. >> thank you for convening this important and timely hearing.
9:37 am
thank you so much for speaking of your own personal experience as the senior engineer responsible for the well-being section within this unbelievable platform. a quick survey suggests that something like two thirds of all american teens are currently on meta's platforms, in particular instagram. i am very concerned about the impact on our children and our future. i want to make sure that i have a chance to question you for just a few moments about a possible path forward. as you testified to the committee today, your own research was hidden and ignored and marginalized by the very team that had recruited you to return to a leadership role at meta. it highlights the danger
9:38 am
of this lack of transparency and the consequences of this ongoing global experiment with our children, and it documents ways in which they are on the receiving end of images that make them feel worse about themselves and of unwanted sexual advances. the surgeon general has issued a call for congress to act, to recognize we are experiencing a crisis in mental health amongst our children, and to find ways to restrain these platforms and their impact. a bipartisan bill that was referred to before is the accountability and transparency act. it would make critical advances in transparency and require platforms to disclose the public safety information that they currently hide. can you give two or three examples of things about the algorithms and
9:39 am
how they work that the public should know but that companies like meta refuse to report, and do you expect companies will ever voluntarily fully disclose what it is about their algorithms that makes these platforms addictive or even dangerous for our children? >> apologies, thank you for the question. for as long as the companies make up the definitions for what is harmful or what is addiction -- i looked into that issue, and what i found was that the definition was so narrow that it does not capture what we all see.
9:40 am
without transparency about what teens are experiencing and the role that social media plays in their life, and without ensuring that when they need help, the companies actually help them -- this is something that we proposed, and it was not adopted -- without these things, nothing is going to change. >> can you explain for us how empowering independent researchers will provide an understanding of how safe or dangerous social media platforms really are, and talk about what kind of research can be done in order to facilitate better safety outcomes for teenagers? >> i can speak to that because that is what i did and what my team did for six years. we brought in experts from
9:41 am
different universities. a 13-year-old is more liable to take risks, and they knew that it was important. what is important is to make sure that they feel supported in the moment. that is what is important for us as engineers and designers, and this is why independent research is absolutely necessary to help our understanding of what people are experiencing. >> thank you. senator hawley said earlier that the instagram algorithm does not just promote but accelerates the connections between pedophiles and our kids. for anyone who cares about our community, that should be a chilling sentence. the fact is that you dedicated years to conducting research on
9:42 am
safety, and that you are here before us as a last attempt to motivate all of us to advance legislation to unlink the corrosive connection between algorithms and self-harm and assaults on our children. >> we are going to have a second round of questions, limited in length. thank you for your questions and your perseverance here today. let me just begin by saying that the lawsuit filed by the commonwealth of massachusetts yesterday, which is one of nine individual lawsuits filed by states, is complementary to the
9:43 am
federal lawsuit filed by 33 states in district court. connecticut joined that lawsuit. i will ask that the complaint be made part of the record. 90% of young people in the united states -- 90% of young people -- use instagram. we are talking about millions of young people, are we not? it cites mark zuckerberg saying, in october 2021, in response to the whistleblower testimony before our committee: at the heart of these
9:44 am
accusations is this idea that we prioritize profit over safety and well-being. that is just not true. he said further that it is important to me that everything that we build is safe and good for kids. taking your admonition that actions speak louder than words, his actions demonstrate the falsehood of those claims, do they not? >> they do. there is something in the same note i would like to bring to the committee's attention. >> sure. >> in the same note, mark zuckerberg wrote, when it comes to young people's health or well-being, every negative experience matters. it's sad to think of a young person in distress who, instead of
9:45 am
being comforted, has had their experience made worse. i believe that is what instagram does today. >> the reference was made earlier to the policies of facebook and social media being data driven. in fact, they are dollar driven, correct? >> my experience with data driven -- >> in this case, facebook and meta doctored the data. >> this is data that should be public. we should not have to be here to talk about it; it should be public. >> i was struck, in the memo that you wrote, that you made
9:46 am
the point that everyone in the industry has the same problems right now. >> correct. >> and in effect, you urged meta to be a leader. and you said, there is a great product opportunity in figuring out the features that make a community feel safe and supportive. a great product opportunity. in fact, you are inviting them to design a better product that consumers would prefer because it is safer, correct? >> correct. >> the history of capitalism -- i
9:47 am
don't want to be philosophical here. consumers go to products that are more efficient and more effective and also safer. safer cars, safer ovens, safer walking, safer everything. you are appealing to the better instincts of zuckerberg and the whole team. >> that is correct. instagram is a product like ice cream or a toy or a car, and i ask you, how many kids need to get sick from a batch of ice cream before there are all manner of investigations? standing right next to the teen, they are delivering the
9:48 am
unwanted sexual advance and they are delivering the content that is upsetting, and they are standing right there. there is an opportunity for them to be told, there is something awful happening here, can you help me? yes, i can. and to use that to make the community one that is safer. >> the kids online safety act is about the product, about product design. it would give consumers choices about what they want to see and hear, to disconnect the algorithms that drive something that people don't want to see or hear. it is not censorship or content blocking. do you favor that approach to protecting young people and others on the internet? >> completely. as i wrote in the third paragraph of my email to mark, in my experience from 10 years
9:49 am
earlier, 90% of the content that teens experience as harassment might not be discernible under the policies. the only way to address this is through the measures you are describing. the product needs to be different. it needs to change. >> the safety act is also about holding social media and tech accountable when they harm people. right now, as you have heard, they feel no sense of accountability unless it affects the bottom line when mark zuckerberg gets his quarterly report. would you favor that kind of accountability? >> absolutely.
9:50 am
i want to take a moment to say, in my experience, the integrity and well-being professionals that are working on these issues firsthand are incredibly good people with wonderful ideas, and management could not be letting them down more. >> go ahead. >> during that time, one of the issues with the materials was, as we talk about the content that we know is bad for body positivity, we know it is being recommended and that teens are spending a lot of time looking at it. they are unwilling to address that. i cannot imagine that ever changing. >> another part of the kids online safety act provides for transparency about the algorithms, so that there can be more public knowledge and expert
9:51 am
knowledge. would you favor that approach? >> yes, transparency is essential. algorithms are only as good as their inputs, and they can be measured by their outputs. if the algorithm does not know that content is harmful, why wouldn't it recommend it? when you look at what it is recommending, it should be held accountable. that is the only way that there is transparency about these aspects. >> before i go to senator hawley for his second round of questions, you mentioned that the people who worked on your team and the people who work in these companies are generally good people that want to do the right thing. i noticed in your memo, you said
9:52 am
a point which might be good for you to know, which i did not put in the document reviewed by the team, is that many employees i have spoken with who are doing this work, at different levels, are distraught. they are distraught about how the last few weeks have unfolded. these people, that love facebook and instagram i assume and are mission driven in the work, they are distraught by the public exposure of facebook knowing that it was profiting on toxic content driven to kids, and the company concealing it and rejecting recommendations for improvement, rolling back safety measures,
9:53 am
correct? >> they were distraught, and they were afraid because the company was disavowing the body image issues when there is study and data that says otherwise. they were afraid that the work would be dissolved and they would not get the support that they needed. the amount of investment that the company ought to make in these people should be proportional to the amount of harms that there are. >> thank you, you have been extraordinarily patient and forthcoming in your responses. it has been tremendously helpful. thank you so much. i want to come back to something that you have been asked over and over.
9:54 am
you said changes to the product -- instagram is a product like ice cream -- changes to the product would be most helpful, but there is no incentive. by no incentive, that means that there is no money in it for the company, is that what it comes down to? if they could make money on it, would they want to? >> i am very excited for the day that mark or adam are sitting here and you can ask them, why did you not invest? one of the things that is in each recommendation you see here is, do you understand what data is causing these things? here is the button that you can build into the systems. those are not a significant investment. it is a matter of how much they prioritize the work and if they are willing to set their goals based on what teens are experiencing. >> i want to end with this. you said it would be great for
9:55 am
mark zuckerberg to say, we are making 35 million this quarter, and here is the harm that teenagers suffered, and we have 34 billion in jury judgments pending against us. i just have to say that at the end of the day, if you want to incentivize changes to these companies, you have got to allow people to sue them. you have to open up the courtroom doors. facebook was fined $1 billion and it made no discernible difference to its business practices. they change nothing, they don't care. they fear parents going into court and holding them accountable. that is what happened with big tobacco and opioids. that is the hammer. that is what we have got to do. we talked about the bills that have passed in this committee.
9:56 am
our bill on child sexual exploitation. for my money, the best part about that bill is that it has a private right of action. i will make you a pledge. we will vote before the end of the year. before the end of this calendar year, i will go before the united states senate and i will call for a vote on the bills that we have passed in this committee. we will find out. we will put people on record. i'm tired of waiting. many folks in this committee have waited far longer. any senator can go to the floor and call up a piece of legislation and ask for a vote on it. before the end of the year, i will do it. we have talked about how we need to do stuff. on the money, the money that is flowing into this capitol from
9:57 am
big tech is obscene. it is totally obscene. if we want to change something, we will get corporate money out of politics and stop corporations from making political contributions. we will vote before the end of 2023. we will put people on record and we will see where we go from there. i hope your testimony today will motivate people. every parent will say, you know what, that has been my experience. to have someone who is an engineer, as you are, say this -- parents feel isolated and they feel like, maybe i don't understand this technology. listening to you today, parents will say, i'm not the only one. >> if i may say something about that, parents know, because they see this every day. the other thing that has been
9:58 am
my experience doing this is that parents know how to parent. sometimes, when i have had the parent of a child who has been groomed, they will say, i don't understand this technology. the best way for people to think about these things is to take social media out of the conversation. as the parent of a young kid, you know where and with whom your kids are spending time; keep an eye on that. you want to make it safe for your kid to come up to you and say, this thing is happening, this is what is happening with me at home. you want to make it safe for your kid to bring up an issue to you. when you see that these things are happening on these devices -- if they were happening at a school and you knew that one in five kids were witnessing or experiencing unwanted sexual advances, and the kid turns to somebody in the school for help
9:59 am
and they say, i'm sorry, i cannot help you with that. as a parent, what would you do? you would hold them accountable. that is one of the reasons i'm here today. >> thanks, senator. i would again make the point that the kids online safety bill imposes accountability. i want to join the pledge to seek a vote before the end of the year. i'm hopeful we will have a vote, and a positive bipartisan vote, in favor of the kids online safety bill. i challenge social media and big tech to come forward and put your money where your mouth is. for years, they have said, we want regulation. that has been their mantra:
10:00 am
trust us. no longer will kids or parents trust social media to impose the right safeguards. we want to give them the tools their products need so the kids can take back their lives online today. >> i wish my colleague from vermont was still here. it was 2012 when he and i started on privacy and filed the first privacy bill in the house, and as senator welch was saying, we've been at this for a long time, and we've been fought by big tech every single step of the way, every step of the way. and it's been really quite amazing to see, because sometimes people will say, well, how did tech companies grow this big this fast? they didn't have the
10:01 am
guidelines, the rules and restraints that the physical world has, and it's kind of been the wild west. and we've seen that in how they choose to gather data and data mine and use that to make the dollar -- the eyeballs, they've got to keep these eyeballs on the page. the longer they keep them, the more money they make. now, i want to go back to the hearing we had with mr. mosseri in december '21. for the record, i want us to build out a little bit more of this framework, because i think it's important to the states that have joined the lawsuit, and i think it's important to us as we work to get the kids online safety act passed. now, when you were with facebook, you built a structure
10:02 am
that would allow for some online governance, and you put in place what you thought was a pretty good process for keeping people safe online, correct? >> that's correct. >> and basically you had embarked on safety by design, is that correct? >> that is correct. >> okay. and you were putting in place a duty of care for the social media company to be responsive to the users that were on those platforms? >> that is correct. as i was going through one of these materials, i remember talking about bullying and teenagers and how we as a company had a responsibility not only to the teens on the platform but also to improve the world's understanding of these issues so that the field could be moved forward. and that is the spirit in which we engaged in the work.
10:03 am
>> and then in 2013, facebook decided they were going to change the rules and allow kids ages 13 to 17 to post content on instagram, correct? >> i don't know exactly when that change happened. >> i think that is accurate. and allowing them -- what do you think changed? what motivated them to drop that age and allow 13-year-olds? >> i cannot speak to the motivation, but what i can say is that if you look at those 2013 and 2012 presentations, one of the things that is written about there is that a 13-year-old will do riskier behavior and feels things more intensely because that's where they are developmentally. and so making a change that potentially increases the audience, i think, would be
10:04 am
inconsistent with that understanding. >> i find it so interesting that whether it was zuckerberg or sandberg or cox, when you highlighted with them how users were responding to the survey, you kept trying to direct this toward the experience, not the perception but the experience, and that is noted several times in your emails to them. even though 51% of users said they had a negative experience, they chose not to address that issue. and in most corporations, allowing issues like that to just slide would never be tolerated. so i am left -- when i ask
10:05 am
myself why did they do that, it has to be because they were motivated by profit over the motivation to protect their users in the virtual space. i do want to ask you just a couple of things, to go back to your memo to adam mosseri, the october 14th email. you laid out an agenda and an opportunity for items for discussion so that you would make good use of your time. and you explicitly and specifically went through the numbers on kids that had received different negative interactions, and then you broke out the data by age. and you created a chart so that
10:06 am
he could look at it in a google doc. >> correct. >> how did he respond when you broke it out by age, or did he take the time to look at it? >> it is my experience from all the years at meta that when an executive gets that email, they read it thoroughly and look at all the attachments, and so it would be my expectation that he had read it. in my conversation with him, he demonstrated an understanding of everything we spoke about, and we specifically talked about the button for a teen girl who received unwanted advances. >> okay. thank you for that. i think that what troubles me is knowing that harm was being done to kids, and then to tell us -- and i quoted back to you some of his comments from his testimony that he gave to us -- for him to allude to the
10:07 am
fact -- to give the impression that they've built tools that prevent these adverse activities. but then, you know, it's that old thing of the truth, the whole truth and nothing but the truth. it was true. you built tools, you built them. that was true. but they chose to remove that. and in doing that -- there are hundreds of children whose parents we have met with, and we have heard about the suicides, the attempted suicides and the adverse impact on these children. thank you. >> thank you, senator blackburn. thank you so much for being here today. as you can tell from the
10:08 am
turnout, there is very strong bipartisan support for reform because actions do speak louder than words, and my hope is that colleagues will join senator hawley and me and senator blackburn and senator durbin and others in seeking action on a very doable, practical, politically achievable bill that targets the design of this product, much as we would for a safer car or for stopping addiction to cigarettes and tobacco and nicotine. big tech is very much in danger -- i would say it is the next big tobacco -- and i'm hoping that it will join in this effort to make its products safer. in some ways what we face here
10:09 am
is the garden variety challenge to improve the reliability and safety of a product that uses a black box that very few people understand which makes it more complex and mysterious but no less urgent and ultimately understandable by everyday americans. everyday americans understand the harm that's being done. we have seen and heard it from moms and dads, from teenagers who have come to us and pleaded, absolutely implored us to act now, not at some distant point in the future. and so by the end of the year, i'm very hopeful that we will have a vote, and that it will be an overwhelmingly bipartisan vote, in part thanks to testimony that you have offered
10:10 am
today. it has been tremendously impactful and moving and very powerful in its science-based persuasion. you're an engineer. as you have stated, you're not a lawyer. but ultimately engineering is what may save facebook from the perils and dangers that it's creating, along with other social media -- it's not alone. and my hope is that we will move forward so that in effect we can make big tech the next big tobacco in terms of a concerted effort to reduce its harm and inform the public about how they can do it as well. so thank you for your testimony today, and this hearing will be adjourned now. but the record will remain open for a week in case colleagues have questions they want to submit in writing.
10:11 am
in the meantime, again, my thanks to you for your very impactful and important testimony today. the meeting is adjourned. [background noises] [inaudible conversations] [background noises]
tonight president joe biden delivers his state of the union address to a joint session of congress. join us for the president's speech, the gop response and viewer reaction. that starts live at 8:00 p.m. eastern on cspan, cspan now, our free mobile video app, and online at cspan.org. friday nights, watch cspan's 2024 campaign trail, a weekly roundup of cspan's campaign coverage, providing a one-stop shop to discover what the candidates across the country are saying to voters, along with firsthand accounts from political reporters, updated poll numbers, fundraising data and campaign ads. watch cspan's 2024 campaign trail friday night at 7:00 eastern on cspan, online at cspan.org, or download as a podcast on cspan now, our free mobile video app.
