
Digital Information Disinformation - CSPAN - April 5, 2024, 11:13am-12:15pm EDT

11:13 am
barbara mcquade is.
11:14 am
right. barb, you can leave now. my work here is done. it all goes downhill after this. barb is a professor from practice at the university of michigan law school.
11:15 am
she's a big michigan sports fan, so i'm sure she's aware that the last time arizona played michigan in men's basketball, arizona won by 18 points. she's also a legal analyst for nbc news and msnbc and a co-host of the podcast sisters in law. for seven years, barb served as u.s. attorney for the eastern district of michigan, appointed by barack obama, and she was the first woman to serve in that position. while you may know barbara best from her tv legal commentary, you may not know that earlier in her career she worked as a sportswriter and copy editor. that may explain why she's told me her dream job would be to play shortstop for the detroit tigers. barbara has authored attack from
11:16 am
within: how disinformation is sabotaging america, released just a few weeks ago and already number three on the new york times bestseller list. it's a road map of how we got into our disinformation mess and how, just maybe, we might get ourselves out of it. kashmir hill is a journalist at the new york times, where she writes about the looming tech dystopia and how we can try to avoid it. kashmir got her start in journalism in 2008 as a writer for the legal blog above the law. the next year, while getting her master's in magazine journalism at nyu, she created her own blog called the not so private parts. it was supposed to be a one year project, but now, over a decade later, kashmir is still chronicling the state of privacy in the modern age.
11:17 am
she joined the times in 2019 after working at gizmodo media group and writing at forbes. she's also written for the new yorker about poker, the card game she loves to play. i wonder how good kashmir's poker face is. and she wrote an eye opening book last year about facial recognition called your face belongs to us: a secretive startup's quest to end privacy as we know it. kashmir has degrees from duke and nyu, and she's currently training for a 50 mile trail relay. but she tells me not to be too impressed, because her portion of this relay race is only a half marathon. kashmir, we're still impressed. still impressed. and last but not least, josh horwitz. jeff horwitz. sorry about that. so, jeff. jeff is a technology reporter
11:18 am
for the wall street journal. his reporting series called the facebook files won the george polk award for business reporting and the gerald loeb award for beat reporting. previously an investigative reporter for the associated press in washington, he now lives in the san francisco bay area. his book, published last year, is broken code: inside facebook and the fight to expose its harmful secrets. it lays out in sobering detail not just the architecture of facebook's failures, but what facebook knew and ignored about its damaging impact on all of us. if you're wondering what jeff has been doing recently in his spare time, he tells me he's in the middle of attempting to develop immunity from poison oak and poison ivy via a method described in largely forgotten, century-old medical research. jeff, you'll forgive me if i don't shake your hand, but the poison oak and the poison ivy, is it working?
11:19 am
i want to know. yes. and the wall street journal will supposedly be publishing a magazine story about this at some point. i will read it with great interest, having had both of those in my lifetime. all right. let's begin our questions with barbara mcquade and the book attack from within. barb, it was a few years ago that you mentioned to me that you were thinking about writing a book but seeking just the right topic. so why this topic? is there a connection between disinformation and your legal background? and help us first define disinformation and the premise of your book. yes. well, thanks, frank. and first, let me say thank you. i couldn't be more thrilled to be here at the tucson festival. what an amazing event and an amazing group of people. thank you so much for inviting me to be here. i'm really thrilled to be here. it's much warmer here than it is in michigan, i've noticed. it's pretty nice. but the book, my background is as a
11:20 am
national security prosecutor. when i was an assistant u.s. attorney and then the u.s. attorney in the eastern district of michigan, i focused on threats to national security. and i noticed, frank, how it evolved over the time i was working in that space: from al qaeda after 9/11, it evolved to isis, and then to cyber intrusions, and then to russia, and now what i see as the greatest threat to national security is disinformation from within our own country. and that's why the book is called attack from within: how, you know, political operatives or profit seekers or people just pushing an agenda are engaging in disinformation. and in my work as a professor at michigan law school, i teach a course on national security and civil liberties, and how people are taking advantage of our right to free speech to push false information. and i worry that we are at a place now where we are so polarized, and technology allows us to reach a large audience so quickly, that what we're finding
11:21 am
is not only are people fooled by false information, but many are just going along with the con because they prefer a political position or political agenda. you know, so for example, people like elise stefanik, a representative from new york, referring to the january 6 defendants as hostages. she absolutely knows better. what we're seeing, i think, are people willing to go along with lies to advance a political agenda, which i think is a very serious danger to our national security. so my hope with this book is to open people's eyes to the tactics of disinformation, to explain how it's harming our democracy, our public safety, the rule of law, and to offer some suggestions so that people can become aware of these tactics so that we can defeat them together, because i don't think we can choose tribe over truth and remain a democracy. so that's the premise of the book. so we obviously have authors here on the panel, and we have would-be authors and perhaps other authors in the auditorium today. what was the dialog with your
11:22 am
publisher on your initial pitch, and how, if at all, did it differ from how the book actually ended up? yeah, actually i got a call from an editor who had seen a piece i wrote in the new york times about the dangers of secretaries of state who were election deniers, those who were seeking office in 2022. so this piece was written in the spring of 2022, and on the ballot were a number, you know, hundreds of election deniers. some of them got elected, and some even got elected secretary of state. and, you know, this was once a sort of sleepy office. the point of my essay was, you know, they take care of the records in the state, they oversee elections, maybe driver's license records and other kinds of things, but this is now becoming a really critical position in our states. if they're in the hands of election deniers, what does that do to our democracy? so that was the piece. and he said, would you like to write a book about that? and i said, well, the piece is about a thousand words, and that's about all i have to say about it.
11:23 am
but he said, well, could you expand the idea? i said, well, what i'm really interested in is this concept of disinformation. it's something i have been really studying for the past several years, and i've been teaching it in my national security class. you know, the mueller report from 2018 had some amazing insight about russian disinformation. and i said, that's what i'd like to write about. and they said, then you should write about that. so that's how it began. so it did take a different shape, i suppose, from what the initial pitch was from this editor. but in terms of, you know, what i wanted to write about, i spent about a year just reading and researching, and frankly had been reading and researching in this space since 2018, and wanted to talk about the history of disinformation, the tactics, the cognitive reasons it works on us, the technology aspects, why we in america can be particularly vulnerable and how it's harming us, and then maybe a way out. so it was really fun to learn about all those things. it was fun to write about it, and i'm hopeful that it can be useful to people. thank you. yeah.
11:24 am
so you also write about something that i've wrestled with myself, which is, i'll quote you, you write that, quote, some of america's best virtues are also our worst vulnerabilities, end quote. so what is it about our democracy that makes it so susceptible to disinformation and so hard for us to fight it? yeah. so, you know, we have some really great strengths in our country. we have, for example, a cherished first amendment right to free speech, and it is beloved on the right and on the left, and i don't think anybody ever wants to compromise that in any way. but as a result, sometimes it is used as a cudgel: any time you want to do anything that might in any way regulate social media or these things, there are accusations of censorship, because that's a bad word in our society, censorship, and we don't want to do anything that might smack of censorship. that doesn't mean we can't do anything. i'd love to hear thoughts from our other panelists about regulating algorithms, for example, you know, non-content-based things.
11:25 am
but i think it's very easy to throw up this idea of censorship because of our first amendment values. it also makes it difficult to regulate things like campaign finance reform. we've tried many, many times, you know, senator john mccain tried many times to rein in campaign spending. but the citizens united case that the supreme court decided, you know, allows corporations and organizations to engage in unlimited political spending, often under shadowy names, you know, names that sound like something very patriotic, the red, white and blue grandmothers of america, which, you know, in fact, is some sort of special interest or maybe one individual. and so that makes it hard for us to fight back against these things. and then also, you know, our diversity: one of our greatest strengths is our diversity, but it is sometimes something that people use to divide us, to try to push an us versus them and blame the other for all of society's woes. we've seen an increase in hate crimes in the past six years, i
11:26 am
think because of this idea that i'm going to demonize everybody else who's different and tell you that they're to blame for all of society's ills. you know, it's not corporate tax cuts, it's not huge ceo salaries, what's really causing problems are immigrants or others or the lgbtq community. those are the people that you should be blaming for all of society's ills. and so i think that those things i consider society's greatest strengths, our first amendment right to free speech, the diversity of our country, are being weaponized against us, and disinformation is the fuel that makes that go. you mentioned you're a national security lawyer. what has been the impact of disinformation on national security, public safety? so, you know, i'll start with public safety. i think that when we hear lies about how the fbi is
11:27 am
a disgrace, accusations that they planted evidence at donald trump's home at mar-a-lago, although you'll note the inconsistency, right, because he's also said he had every right to take those documents. when you have a former president telling you that the courts are a disgrace, that the department of justice is the department of injustice, it gets people riled up, and i think it becomes so reckless that it doesn't just give license to some people, i think it suggests a duty for people to take the law into their own hands, and it sparks vigilante violence. and so things like the fbi breach by a man in cincinnati at their office with an assault weapon right after the mar-a-lago search are not surprising. the man who sent the pipe bombs to members of the media and the democratic party, who was a huge supporter of donald trump, is not a surprise. a hammer-wielding intruder at nancy pelosi's home in san francisco
11:28 am
should not be a surprise. a plot to kidnap michigan governor gretchen whitmer should not be a surprise. the swatting that we're seeing around the country of, you know, the maine secretary of state and the judge who's presiding over the federal election interference case, and special counsel jack smith, all of that, i think, is a predictable result. when people become so outraged that they don't trust the fbi, they don't trust the courts, they don't trust law enforcement, they feel the need to take the law into their own hands, and it leaves us all in a very dangerous place. that's public safety. but in terms of national security, of course, the biggest vigilante violence that we've seen in recent years was the attack on january 6th, where thousands of people heeded the call to take back our country and stop the steal and to try to use force to stop congress from doing its work to certify an election. and other countries around the world saw that and said, see, democracy doesn't work. and it is responsible for democratic
11:29 am
backsliding around the world. it has harmed our stature around the world. we are no longer the model and the shining city on the hill, because they can point to us and say democracy doesn't work, you're better off in a system where prosperity is your goal and you have a strong leader who will take care of you, instead of the messy business of democracy. there was a russian leader who said after january 6, look, the united states is limping on both legs. this is what democracy looks like. and democracy, of course, helps our country in terms of our public safety, and it has been part of our foreign policy since world war two to lift up democracy around the world. it's why we invest in nato, because we believe that when other countries are run through democracies, we will have fewer wars and military conflicts, we will have more trading partners that are beneficial to us, we will have fewer refugee crises around the world. and look what's happening in our world today. there are military conflicts,
11:30 am
there are refugee crises. and that is because of the destabilization of democracy around the world. and so disinformation is really damaging our country internally and around the world. so you spend almost the last half of your book, you know, the first half is how we got into this, the second half, maybe how we can get out of it. give us your top three suggestions that you offer that someday could help us climb out of this rabbit hole of disinformation. yeah, so, you know, it's a dark place to talk about how disinformation is damaging democracy, public safety and the rule of law, but i think that there are some solutions. and, you know, every problem that's created by humans can be solved by humans. we have the ability to do this through our power in a democracy. so some require legislative fixes. you know, at the moment, i'm not super optimistic that we can get anything done in congress, but again, we have the power to change that by electing representatives who want to solve problems. and so i think some of them are things that you'll probably hear echoed in josh's.
11:31 am
yes, there i go. all right. that's fine. that's my fault. jeff, just change it. it'll be easier. yeah. no, for the purposes of this panel, jeff's book and kashmir's book talk about regulating algorithms, right. i think it's very difficult to regulate content on social media, but as jeff reveals in his book, it's, you know, it's not the content, it's the algorithms. according to whistleblower frances haugen, you know, facebook was driving people toward content that generates outrage. and so, you know, perhaps we could require certain rules about the algorithms that social media companies can create, or at least disclosure of the algorithms, so that we know when we're being manipulated. campaign finance laws: citizens united tells us that we cannot limit the expenditures of corporations and organizations, but we could require disclosure from those groups, like just who is the red, white and blue grandmothers of america, right? maybe we could require
11:32 am
disclosure. that would be a way we could fix campaign finance laws to make it better. and then the third one i'll mention, frank, is something all of us can do, which is to improve our own media literacy so that we're not also spreading and sharing false claims. i'll just tell you a quick story about a time i myself shared disinformation online because i was misinformed. i read something on social media that said that patrick mahomes, the nfl quarterback, was refusing to play another down for the kansas city chiefs until they changed their name to something that was not offensive to native americans. and i thought, oh, good for him, retweet, send that out, get that fired up, way to go. later that day i was talking to my husband and son and said, did you see that story about patrick mahomes? they said, no, what are you talking about? i didn't see any story like that. are you sure that's real? like, of course it's real. i saw it on twitter. so i went back and i looked to see the source. i found it and it was on espn.
11:33 am
okay, that's legit. but then i looked at it more closely, and the account was called sports center. oh yeah, espn has sportscenter, but not that sports center. so it was a fake. and so i learned a valuable lesson about not spreading disinformation online, right: looking for a second source before you get excited about a story, making sure the source is credible, exploring the data within it. and so there are all things we can do to become more media literate so that we are not expanding the problem. you know, solving the problem begins with us. yeah, great advice. great advice. thank you, barbara. all right, let's move on to our second author, kashmir hill, and the book your face belongs to us. kashmir, your book, your face belongs to us, is a compelling, disturbing story of one company's quest to develop and monetize facial recognition. what is facial recognition? and if our faces no longer
11:34 am
belong to us, who do they belong to? so facial recognition technology, there's a couple of different types. okay. there's facial recognition technology that many of you likely use if you have a smartphone, to open and unlock your phone, right, and to sign into bank apps. and that's what's called one-to-one facial recognition, where it's just making sure you are you. that is not very controversial. what is controversial is what is called one-to-n, or one-to-many, facial recognition. and this is where somebody takes a photo of you, you know, it's making a face print of your face, and then they're searching a database to try to find you. it could be a database of a million or a billion, or in the case of clearview ai, the company i wrote about in this book, 40 billion photos that they have scraped from the internet. yeah. so the answer is, a lot of people may own our face at any given
11:35 am
time. how did you even find out about this company called clearview? what is it about their story that compelled you to write a book? yeah. so i got a tip about clearview in the fall of 2019. it showed up in a public records request. and at that point, you know, i'd been covering privacy for ten years. i had written about facial recognition before. and in 2019, the kind of common understanding was that the technology didn't work that well, that, you know, it wasn't reliable, it didn't work as well on some groups, that it was biased, and that police weren't really using it, that they had kind of used facial recognition technology to try to find somebody in a database of criminal mug shots or driver's license photos, but that it worked so poorly that they weren't relying on it much. so then in the fall of 2019, i
11:36 am
hear about clearview ai through a public records request. they showed up in a 26, i think, page pdf from the atlanta police department, and there was this brochure that said stop searching, start solving. it described them as a google for faces, and there was a legal memo that had been written by paul clement, a very famous attorney, solicitor general under george w. bush. he had been hired by clearview ai to write a legal memo that they could give to police who might want to try the app. and it described what clearview had done. they had scraped these 40 billion photos. it said that they could identify people with up to 99% accuracy. paul clement wrote that the attorneys at his firm had used it, that it returned fast and reliable results. and yeah, it just sounded incredible, and he had written this memo to reassure police who wanted to try the technology
11:37 am
that they wouldn't break state or federal privacy laws by using it. and i was astonished, because i thought, what is clearview ai? why have i never heard of this company before? why did google not do this, or facebook not do this? and so it set me off on this quest to figure out, who is this company? how did we get here? and they just turned out to be fascinating questions, and there were so many fascinating characters involved. so a lot of people in the auditorium, yeah. how many of us have had our faces captured by clearview? how many of our faces were captured in the parking garage, or on the mall walking across campus to get here? yeah. so clearview ai has scraped the public internet, i mean, millions and millions of sites, but i know a few of them. so i'm going to ask you to raise your hands if you have a photo of your face on one of these sites, and i'm going to ask you to just keep your hand up until the end, so we can get a sense of how many of you are in clearview's
11:38 am
database. who here uses the venmo app and has their profile photo there? okay, get hands up high. who here has used, or has family members who use, flickr and post photos there? who here has family members who use instagram? who here has a linkedin profile? who here has an employer that has their photo on a page identifying them? all right, so most of you are in clearview ai's database. and then you can all put your hands down. who here has no photos on the internet? so you guys are safe. congratulations. but beware, there are photographers at this festival, so make sure you don't end up in a photo. indeed. yeah. no one is safe. no one is safe. so obviously this kind of advanced technology takes money. where did the money come from to kick start this?
11:39 am
who are the top paying clients for facial recognition now? are they commercial? are they government? are they both? yeah. clearview ai was this small little kind of band of misfits from new york. the person i think of as kind of the mastermind is hoan ton-that, a young guy who grew up in australia and always loved computers. he moved to san francisco when he was 19 years old, around the time that the iphone came out, and was just kind of chasing the dream. and he moved to new york around the time that trump announced he was running for president, and he said that he was radicalized by the internet, and he was walking around brooklyn wearing a red maga cap and a big white fur coat. and this is pretty unusual in brooklyn, not a lot of trump supporters there. but he was running with this kind of, you know, far right radical crowd, trump supporters.
11:40 am
and he went to the republican national convention in 2016 to support trump. and he met peter thiel there, and peter thiel became the first investor in what would become clearview ai. he gave him $200,000, and they started building this big database of faces. they attracted kind of other investors, but it was all, you know, venture capitalists, and mostly right leaning. they actually didn't have a lot of success raising money in silicon valley. but the investors, they would always give out the app as kind of, you know, a sample, like a candy sample, and you got to use it if you were thinking about investing in them. and investors loved it. they would use it on dates. they would use it to identify attractive people that they saw in bars. they would use it at conferences, like if they ran into somebody whose face they should know. one billionaire in new york, john catsimatidis,
11:41 am
he had the app because he was thinking about putting it in his grocery stores. but he said one night he was at an italian restaurant and his daughter walked in, and she was with a man he didn't recognize. they were clearly on a date. so when they sat down in the restaurant, he sent a waiter over to take a photo of them, and then he ran it through clearview ai. he said, i wanted to make sure this guy wasn't a charlatan. he turned out to be a san francisco venture capitalist, and he said he was reassured. i said, are you sure? but yeah, i mean, what was interesting, when i first found out about clearview, i thought, wow, this hoan ton-that, he just must be a genius. how did he build this thing that google and facebook couldn't build? what i would come to find out through the reporting for the book and for my stories in the new york times is that google and facebook had both developed this technology internally and decided that it was too
11:42 am
dangerous to release, and so they had been sitting on it. but they had kind of contributed code, and this is what is happening right now with ai: a lot of it is open source, and clearview was able to build on that technology and create this powerful tool. and what they had done was not a technological breakthrough, it was an ethical breakthrough. they were willing to do what other companies hadn't been willing to do. speaking of ethics, when you started investigating clearview, some pretty creepy things started happening to you. tell us about that, and why clearview did not want a reporter investigating them. yeah. so when i first reached out to clearview, they were not interested in the big, splashy new york times front page story, which they did get eventually. but they had a website, clearview ai, and it didn't say anything about facial recognition technology. it just said artificial intelligence
11:43 am
for a better world. and it had an address, and it was in new york, and so i google mapped it, and it was two blocks away from the new york times office. so, you know, people from the company weren't responding to me, paul clement wouldn't respond, peter thiel wouldn't respond, no one was responding from the clearview addresses. so i thought, i'll just walk over to the building and try to knock on the front door. and i get there, and there's no building with that address. and i'm going back and forth between where it should be, and it just doesn't exist. and in the book, i compare it to harry potter, like i was looking for the platform that wasn't there. and yeah, they were not interested in talking to me originally, but, you know, my understanding was that they were selling this to police officers. so i tracked down police detectives who had used clearview ai, and they were really singing its praises. they said it's amazing. i talked to one financial crimes
11:44 am
detective in gainesville, florida, and he said, you know, i had a stack of unsolved crimes on my desk where we had a photo of somebody taken at a bank or an atm, and i'd run them through the state facial recognition system and hadn't gotten anything. and he said, i ran them through clearview ai and i just got hit after hit. he said, i'd be their spokesperson if they'd let me. i said, well, that sounds incredible, i'd like to see it for myself. and he said, well, send me your photo and i'll send you what a clearview search looks like for you. so i sent him a photo, and then he ghosted me and wouldn't respond to my messages. and after this happened a couple more times, i discovered that though clearview ai wasn't talking to me, they had put an alert on my face, so that when police officers searched for me, they got a notification. they called the officer and, you know, told them not to talk to me, that they could lose access to the app. for me, that was quite
11:45 am
chilling. if the intent was to get me not to write about the company, that absolutely backfired. but it really shows the power of this kind of technology, which is now getting more ubiquitous, getting more widespread. madison square garden, the events venue, uses this technology because they didn't want lawyers who worked at firms that had sued them to come to knicks games, rangers games, mariah carey concerts. there was one mom who was taking her daughter's girl scout troop to see the rockettes at radio city music hall, and when she tried to go in, a security guard stopped her and said, you're not welcome, because you work for a firm that has a lawsuit against us. and i've seen this happen. i took a personal injury attorney to a rangers game, and there were thousands of people streaming into madison square
11:46 am
garden. we walk through the door, put our bags down on the security belt, and by the time we pick them up, a security guard has come over to us, asked her for id, and told her that she needs to leave. so it just really shows the way that this technology, which can be so useful for crime solving, and i'm very much in favor of that, can be used in troubling ways to kind of punish critics and dissenters. yeah. so speaking of crime solving, i did that for a living. is facial recognition really solving serious crimes or catching violent criminals and terrorists? is the cost benefit analysis worth it, privacy versus catching bad guys? it can, you know, it can go wrong. as one computer science professor told me, we're not all unique snowflakes. sometimes we look alike, so it can lead you to the wrong person. and that has happened. i've written about people who have been arrested for the crime of looking like someone else, and they spent the night in a jail, or a week
11:47 am
in a jail. an eight months pregnant woman was arrested for a carjacking committed by a woman the month before who was not visibly pregnant. you know, it can go wrong, but it can also go right. one of the first stories i heard about where i just saw the power of it, saw the need for it, was a department of homeland security agent who was a child crimes investigator, and he'd gotten an image from yahoo that had been found in a foreign user's account that depicted the exploitation of a child. and they could tell that it was somewhere in the u.s. because of the electrical outlets, but where is this child, and who is this abuser? and they wound up running the abuser's face through clearview ai and found him in the background of someone else's instagram photo. and this investigator followed the breadcrumbs to find
11:48 am
this guy in las vegas. he did have access to this child. they arrested him. he's in prison now. that was the first time the department of homeland security had used the technology. his boss said, we have to get a subscription to that product. so yeah, i mean, that photo in the background thing really struck me from your book, because again, we had a couple of people raise their hands thinking, hey, my face isn't on the internet. i urge you to think again, because you're in the background at somebody's birthday party, you know. it's really astounding technology. but is facial recognition working or not working for people of color? did you find any bias within the technology itself? yeah. i mean, for a long time facial recognition technology was biased, and it was biased because, i mean, this is how artificial intelligence works, generative ai, facial recognition: it's basically somebody feeding a bunch of data to a computer, and
11:49 am
the computer is learning. and with facial recognition, which scientists have been working on since the 1950s, in the 1960s the cia was funding facial recognition work in silicon valley before it was even called silicon valley. they just need a lot of photos, and then a computer over time can recognize somebody. and in those early days, getting a lot of photos meant taking somebody into a photo studio and taking, you know, photos of their faces from lots of different angles. and the people working on this tended to be white, and the people they recruited to take photos of tended to be white. and so for many years they were feeding facial recognition systems photos of white people, and they got very good at recognizing white people, and not so much other people, or even women or children. and it's disturbing how long that was a problem while facial recognition was actively being used by the government. but now a lot of attention has been brought to that problem.
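(For readers who want to see the mechanics Hill is describing, here is a minimal, self-contained sketch of one-to-many face search and of how uneven training data shows up as uneven accuracy. It is only a toy, not Clearview's or any vendor's real system: the "face prints" are random vectors, the gallery is synthetic, and the noise levels standing in for well- and poorly-represented groups are invented purely for illustration.)

```python
import numpy as np

# Toy sketch of one-to-many ("1:N") face matching and of how uneven training
# data can show up as uneven accuracy. Nothing here is real vendor code:
# the "face prints" are random vectors, and the noise levels standing in for
# well- vs. poorly-represented groups are invented for illustration.

rng = np.random.default_rng(0)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def enroll(n, dim=32):
    """Fake gallery: one embedding ("face print") per identity."""
    return rng.normal(size=(n, dim))

def probe(identity_vec, noise):
    """A new photo of the same person: the same vector plus noise.
    A model trained on too few examples of a group behaves roughly as if
    its embeddings for that group are noisier."""
    return identity_vec + rng.normal(scale=noise, size=identity_vec.shape)

def search(p, gallery):
    """1:N search: return the gallery index with the highest similarity."""
    return int(np.argmax([cosine(p, g) for g in gallery]))

def accuracy(noise, trials=200, gallery_size=200):
    gallery = enroll(gallery_size)
    hits = sum(
        search(probe(gallery[true_id], noise), gallery) == true_id
        for true_id in rng.integers(gallery_size, size=trials)
    )
    return hits / trials

# group a: well represented in training (cleaner embeddings)
# group b: poorly represented (noisier embeddings) -> lower top-1 accuracy
print("group a top-1 accuracy:", accuracy(noise=0.5))
print("group b top-1 accuracy:", accuracy(noise=2.0))
```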
11:50 am
and the facial recognition vendors now train systems on more diverse faces, and so they are far more accurate across lots of different groups of people. and i think that what we need to reckon with now is not biased facial recognition, but facial recognition that is all too accurate and could erase privacy kind of as we currently know it. so lastly for now, for all of us that raised our hands in the auditorium about having photos out there somewhere, what are we to do if we don't want our entire life history linked to our face, then sold to a commercial or government entity? do we just not step outside anymore? what is the answer here? i think the answer is the same answer that barb could have given for how to get rid of disinformation, misinformation, which is get off the internet. you know, be more conscious about the fact that photos that end up on the public internet are going to be scraped, going to be
11:51 am
findable. and so i think making conscious decisions: i'm a parent, i have young daughters, and i tell other parents, do not post photos of your kids to the public internet. they are going to get scraped. they're going to get used in ways that your children might not want. the other thing is laws and regulations, which we've been very slow to develop. and, you know, i write in the book about one state that has a really strong law that applies to facial recognition technology. it's called the biometric information privacy act, in illinois. and it says if a company wants to use, you know, your face print, your voiceprint, your fingerprints, they need to get your explicit consent to do that, or they're going to owe a lot of money. and, you know, i talked about madison square garden, which is using facial recognition technology in new york city to keep lawyers out. they also have a theater in chicago, but they can't use their facial recognition technology there to ban lawyers, because they would need the lawyers' consent to use their
11:52 am
faces. well, now. all right. let's move on to jeff horwitz, also known as josh. so, jeff, your book really takes us behind the curtain of facebook like no other book i've seen. why this book, why you, and why this timing for the book? yes. so just to begin with, being between these two other authors, i think there is sort of a link here, which is trying to figure out how the technology we build shapes us, how there are ways it can be used in perhaps pro-social or perhaps more harmful senses, and sort of who's benefiting from this, right. the approach of clearview ai was, we can do this, and, you know, sort
11:53 am
of a benevolent authoritarian, you know, oversight, right, rather than thinking about, okay, what are the circumstances in which this technology is good. the answer is, let's just -- do it, right. and if there is any company that managed to take the let's just -- do it approach for a solid 20 years, it would be facebook, now meta. the way i got into this is i used to cover politics in washington. i was on the ap's investigative team. it became abundantly clear circa 2018 that the system of accountability via journalism was, like, wholly insufficient, you know. instead of driving the bus in terms of, you know, information and distribution, we were just honking on a little fisher-price steering wheel, you know? and i ended up, like, it's
11:54 am
very clear where the center of power had shifted. it was to social media. you know, twitter, perhaps more than anything else for journalism. but, you know, we journalists kind of like twitter, so we were less likely to focus on it than facebook. and i got a job with the wall street journal covering this company and got really interested in how the product works. and, you know, you'd look through everything facebook had published about their systems, or you would, you know, ask a friendly company representative, say, what determines the order of posts that you see, you know, things that you see from people that you don't follow. and the line would be, we connect you with who and what matters most to you. and you'd be like, okay, but how do you do that? we connect you with who and what matters most, and if that doesn't make your skin crawl at that point... that was kind of the beginning of this journey
11:55 am
was to figure out exactly how this worked. and, you know, a pretty crucial point for me, deciding that i would want to write a book about all of this, was a few years in, when i recruited frances haugen, who was a current, at the time, facebook employee, as a source over the course of six months. and over the course of her final three weeks at the company, she did a very deep dive, pulling from their internal systems basically what they knew about their own products. and it turns out that they really had spent a few years, i would say the years 2017 through 2019, very earnestly investigating their own product and reconsidering, like rethinking, sort of, okay, you know, how does this thing work?
11:56 am
what are the likely outcomes? and the outcomes were very frequently ugly, by their own acknowledgment internally. and you know, this isn't just people talking about whether social media is good or bad. this is people who have the ability to turn a lever and see whether misinformation goes up or down. so this is like a kind of repository of information about the mechanics of just how we talk to each other, what succeeds online, what goes viral and what dies a quiet death. and it was pretty --. so, you know, we did a big project on that, the facebook files, but the story of sort of how that kind of collection of work came to be, and also some of the more damning findings, like,
11:57 am
facebook, you know, knows it's exacerbating polarization. facebook, you know, knows its protection systems do not work in most languages and most countries overseas. you know, facebook knows it played an integral role in amplifying anti-vaccine misinformation. right. that was the wall street journal version. the book was supposed to be how the company came to understand what it had built, and how it came to address, most often not address, what it had wrought. jeff, you know of the new york times finding that a russian influence campaign reached about 126 million people through facebook alone. did facebook help put donald trump in office? yes, but not via the russians. i mean, i
11:58 am
think this is one of the things that, like, this is a new technology. it is probably, like, if i had to, you know, just off the cuff rank it in terms of importance, definitely more than tv and radio, probably not more important than the printing press, but, like, we're in that, you know, that hierarchy in terms of just reshaping who has access to what tools to pass information. and there's a lot of really good things about democratizing that. it is, however... the system that they built was built for engagement, right? internally, the definition of user value was actually held to be inseparable, by some of its lead engineers, from how much time people spent on the platform, which is, you know, i mean, anyone would sort of recoil at that idea. the idea that the more
11:59 am
of your time something sucks up, the more valuable it is to you, is not a clearly obvious proposition. and they built the thing in a way that was very easy to manipulate. it turns out that this is just sort of design that's, like, so basic and yet touches some of the stuff that barb is talking about. it turns out that if you leave 200 comments per hour, you are 200 times more effective at shifting the algorithm and what it recommends to others than someone who leaves one comment per hour. by the way, this is not hypothetical. the anti-vaccine advocates are extremely good at leaving 200 comments per hour on a regular basis. and so you have this ability of a small band, you know, a small band that is able to just sort of hammer on the platform's weaknesses to get massive, unbelievable distribution.
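(Horwitz's 200-comments-an-hour point is easy to see in a toy ranking function. This sketch is not Meta's actual code; it only shows that scoring posts by raw engagement counts lets one hyperactive account outweigh dozens of ordinary users, and that a simple per-user cap, one possible mitigation, flips the ranking. The names and numbers are made up.)

```python
from collections import Counter

# Toy sketch (not any platform's real ranking code): if a feed scores posts
# by raw engagement counts, one hyperactive account moves the ranking as
# much as hundreds of ordinary users.

def raw_score(events):
    """events: list of (user_id, post_id) engagement events, e.g. comments."""
    return Counter(post for _, post in events)

def capped_score(events, per_user_cap=3):
    """Same signal, but each user can contribute at most per_user_cap
    events per post -- one simple way to blunt single-account spam."""
    per_user = Counter(events)                 # (user, post) -> count
    score = Counter()
    for (user, post), n in per_user.items():
        score[post] += min(n, per_user_cap)
    return score

# 1 zealous account leaves 200 comments on post "fringe";
# 50 different people each leave 1 comment on post "popular".
events = [("zealot", "fringe")] * 200 + [(f"user{i}", "popular") for i in range(50)]

print("raw ranking:   ", raw_score(events).most_common())     # "fringe" wins
print("capped ranking:", capped_score(events).most_common())  # "popular" wins
```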
12:00 pm
and you know, meanwhile, you've got the company talking about how social media is sort of this, like, open community in which the best ideas somehow rise to the top, it's a, you know, 24/7, you know, impartial, all-to-all, you know, democratic discussion. the russia thing was, i think... it turns out there was someone who was way better than russia at playing with the 2016 election, and it was macedonian teenagers. so candidly, they put the russians to shame. the distribution of content that these guys got, solely in the interest of, like, selling really low end advertising, we're talking, like, you know, -- pill type advertising on, like, basically nothing sites via google adwords, these guys just destroyed the russians in terms of competency at this. and it's not hard. it is, like, to this day
12:01 pm
very easy to game the system. 2024 will be very much the same. it turns out that, like, look, anyone here, if you guys want to, you know, get into the mass scale manipulation game: take some cat videos and dirt bike crashes from any website, and, you know, as long as they've gone viral once, they'll go viral again. repost those over and over again, and then sometime around october, you know, start posting crazy -- to your audience of several thousand to a few million that you've gathered. it is really not that hard. and i think one of the things, just sort of trying to tie this stuff in, is there are very good uses for recommendation systems. you know, candidly, they make finding things and sort of connecting with people easier, they can be good. maxing it out for
12:02 pm
the goal of the entity, for the goal of achieving, usually, the financial benefit of whoever's building the thing, is, like, kind of invariably a -- disaster, and there is no reason why that should be the way this technology is run, obviously in the absence of any regulation. and, you know, hey, everyone's faces are here, like, i could snap a photo of all of you guys and then start running it through clearview, you know, one by one, including that one woman back there who's not on the internet. so, like, these are not unsolvable problems, and some of them are, like, extremely basic. i think one of the things that i really enjoyed about the book was just, they learn the same lessons over and over again, and they apply to other platforms. there is no reason why, you know, it's not like, oh, how are we going to deal with what's true and untrue? yes, you can get into impossible-to-answer questions in terms of
12:03 pm
sort of finding the lines, but you can also just build a system that's a lot less volatile. you can build one where, let's say, and this is not a hypothetical example, a single person can't invite 400,000 people into qanon groups in, say, six months. you know, that's not necessarily a tool that, like, you know, no one has 400,000 friends. you can put a cap on that. i mean, this is just another example from the book of just kind of how off the rails this thing went in terms of design: there was a team devoted to friending, to increasing the number of friends, and it turns out that the way that team hit its numbers and metrics, which were very important, was that facebook was boosting friend requests sent by people who sent more than 100 per day. and again, like, you know, you may not be an internet expert, but, like, the first thing that should come to mind when you hear of people
12:04 pm
sending 100 requests per day is that either that's their first day on the internet, you know, or that this is obviously bad news. it's spam, fraud, indiscriminate outreach. they simply would not lower the limit, because that was good for the numbers. anyhow, let's talk about privacy and our data. so facebook let a group called cambridge analytica have our personal data. is facebook sharing our data with commercial entities, and if so, what data? they don't share it. they rent it. they don't sell it either. they, you know, provide access to it for targeting purposes. this is a place where i do think that maybe some of the privacy concerns with cambridge analytica, like, there's a reason you haven't heard the phrase psychometric
12:05 pm
profiling since 2019, you know, since the cambridge analytica scandal, and that is because the concept of psychometric profiling is largely --. i think cambridge analytica was, like, a great reminder that this company had been doing whatever it wished with the data, sometimes in contravention of its users' direct wishes, for quite a long time. but the idea that somehow this could be directly used against us, in the sense that, like, your medical records could, is probably a little bit overblown. and i think we were trying to figure out, just kind of, like, something went wrong, right? like, clearly democratic discussions were not going well, something had gone wrong, and it obviously involved social media. and i think first we seized on, like, russians and outside
12:06 pm
agitators, and then we seized on, like, the nature of targeting, as opposed to, like, looking at what the choices were that the platform was making in terms of what kind of content would succeed, what signals were used to determine value or merit, and, you know, the financial infrastructure of it all. so lastly, before we go to q&a, real quick: given the fact that facebook, meta, is consistently putting their own interests over the interests of its users, why are so many people still using facebook? so this is a very depressing thing about frances haugen. and, you know, i continue to get documents out of the company, because that is all the skills i have as a reporter; my single biggest one is i'm a weasel to whom people wish to give documents. and there was
12:07 pm
a marketing presentation noting the company had long focused on a metric called cares about you, and it was just simply a one question survey, which is, does facebook care about you? and they'd long believed that if this number tanked, it was going to just destroy their business, that, you know, there was going to be a point at which they would be so unlikable that people would stop using their products. they actually found, in the wake of frances' testimony, that this was no longer true, and that, in fact, usage recovered within the week from the drop-off it had. and, you know, i think people still use it because it's a compelling product and because their friends are on there. did anyone ask for, you know, videos from unconnected sources to start getting placed prominently in their feed? absolutely not.
12:08 pm
you do watch it more than if you just see your friends, because, as facebook internally likes to joke, your friends are often boring. so, like, it's very well designed for what it does. it's designed somewhat adversarially. and i think this is one of those places where, as much as individual education is a good thing, and, sort of, absolutely, we should all strive for that, that is not how this story is going to end. like, the story ends with either some version of regulation, or the product getting so crappy and spam-ridden that it just becomes unusable and gets replaced, or things just continue as they are, and that last option does seem pretty ominous to me. but yeah, got it. we have a couple of minutes for a couple of questions only, and if you've got one, please
12:09 pm
make your way to the nearest microphone. since we're being broadcast by c-span, we'll need you to be near that microphone. first up, the man in red. go for it. my question concerns turning over national security information to candidate donald trump. i don't quite understand it. i don't understand why they're thinking of doing that. why not give it directly to putin? yeah, i just got a question for you. well, i mean, i think any of us can kind of handle that one. look, i think you're referring to the long-held protocol of giving a major party candidate some intel briefings, right, usually right after the nomination at the convention. the main goal there is to make sure they don't actually say something that's going to screw up ongoing efforts by the incumbent government: hey, stay away from talking about north korea, because we're kind of dealing with that right now.
12:10 pm
you know, trump really doesn't have any intention of staying away from any topic. so it is antithetical to think we would give classified information to a man who's charged with mishandling classified information. but i believe what the resolution will be is some very limited, probably not classified but sensitive, briefing to let him know, hey, things are going on in the middle east, have a nice day. i think that's what's going to happen. sir. you know, i just kind of want to say that i don't really have a question. i just have this feeling, from listening to you, that when i read your books, it's snail mail. you know, it's slow. it doesn't reach anybody. but because you're in a program like this, if you speak in buzzwords and small images, because of electronics, that's the power out there. it reaches more people faster. so that tiktok, everything that
12:11 pm
we're watching, it's all about buzzwords and images to win the day with. so it's almost like if you write a comprehensive book and share your minds with us on the topic, like kashmir hill's story, to me that's frightening. that's like you're targeted because you tried to be a certain way, and that you cannot feel paranoid about that amazes me. but we're in this world where, electronics, you know, benjamin franklin discovered electricity and now look where it went. if you can't reach more people with a buzzword and an image, tiktok or whatever it is, you're getting beaten. yeah. i mean, rest assured, disinformation is reaching people faster than we ever could. on that, briefly, i would say that this is something that, sort of, like, previously people said regarding the trump campaign: oh, these guys are just brilliant, absolutely brilliant at social media, you know, they're just so good at that part of it.
12:12 pm
i think it is probably worth reconsidering, though, which buzzwords are capable of succeeding. and i mean, there is a bias toward things that are not true; they are more interesting than things that are true. if i tell you what actually happened to me this morning, you're not going to be that excited. if i tell you about, like, some -- i made up, then, lo and behold, it has high potential. and so there is an element of that where, like, yes, you've got to adjust the packaging, but i think an important part of the social media landscape is recognizing that right now, producing high quality original content, you know, deeply reported stuff that is carefully worded, is a loser move and will remain such. all right, we have time for only one more question, so go ahead, ma'am. should we be worried about dna? a lot of us do family history.
12:13 pm
and we've been spitting in the tube. tell me that i'm okay. no, no, not here. yeah, i can answer quickly on that. well, i mean, i guess, what are you worried about? and so the thing that has happened is people use 23andme and ancestry. those are closed databases, and those companies have been resistant to police searches. so if you're worried about your cousin being exposed as a serial killer, you're okay if you stay inside those databases. but what has happened is that a lot of people who are on ancestry and who are on 23andme want to be able to find each other, so they have taken their dna out of those databases and put it onto the open internet, on these open source databases, so that they can find more of their relatives. and the police have said that's a really useful database, because
12:14 pm
we only had databases that were from people who had been arrested, you know, known criminals. all of a sudden they have this larger population of people who have shared their dna. and so police have been starting to look into those databases, using those to identify victims, identify suspects, and they'll build a family tree off that dna using those databases. it is happening. it's kind of in the same realm as facial recognition: should police be able to do this, does this violate fourth amendment rights, or is it okay because it's already on the internet? yeah. thank you. thank you. i've got a mandatory thing i must read. these are must-read books, and i have a mandatory paragraph i read to you. make sure to stop by the book sales and author signing area, the u of a bookstore tent on the mall, after this session. book sales, you know, help defray the costs and help literacy campaigns. final note, shamelessly...
