
Click - Short Edition | BBC News | April 24, 2021, 3:30am-3:46am BST

3:30 am
india's hospitals are reporting dangerously low oxygen levels and no empty beds as coronavirus cases reach record highs. there have been over 2,200 deaths in the past 24 hours. the prime minister, narendra modi, says the government is trying to source additional supplies of oxygen. president emmanuel macron has said france will never yield to islamist terrorism, after a female officer was stabbed to death inside a police station near paris. the attacker — who was a tunisian national — was shot dead. anti—terror prosecutors are carrying out an investigation. american health authorities say the one—jab johnson & johnson vaccine can be used again after it was paused while incidences of blood clots were investigated. authorities say the risk is
3:31 am
very low. coming up shortly, newswatch. now on bbc news, click. hey, welcome to click! we're going to start with a quiz this week! we're going to play a game of guess the famous face! using possibly the freakiest faces that you've seen in a while. they are pretty disturbing, aren't they, lara? they certainly are. although i have to say that the end result is not quite as scary as the process of actually making them. ok, so this may very well be something that you cannot
3:32 am
unsee, but here we go! ok. who is this? and who's this? and who is this? crikey! right, well, while you ponder, let me tell you that this is what happens when you ask an ai to generate fake faces based on other faces. are you ready? here come the answers. the first one is a blend of lara and me. so odd! i think it is more you than me, to be honest. i think it is more you than me! really? i don't know. ok, the next one is chris fox and omar mehtab. and this is our oz and our kitty. goodness. i think the really odd bit is actually seeing the progression from one person into another. now, this is a really weird, fun thing that has come out
3:33 am
of the really serious issue that we are going to be talking about for the rest of the programme, and that is the fact that computers have got much, much better at recognising faces — but not all faces. especially faces that aren't white. 2020 highlighted many inequalities in how we treat each other as humans. inequalities in who could afford to shelter from the virus and who had no choice but to physically go to work. and inequalities in how we are treated by the authorities. the killing of george floyd, the protests that followed and this week's conviction of derek chauvin have reminded us all that racism still exists in our societies and we all need to work together to truly root it out. and it is against this backdrop that we are going to be looking at biases in technology — an industry which has often been criticised for coding our
3:34 am
prejudices into its products. so it needs to get things right, right? well, craig langran from the bbc radio programme people fixing the world has been looking at how we can create facial recognition systems that work for everyone. facial recognition is slowly seeping into everything we do. while it can be a convenient way of interacting with tech for some, it has been creating problems for others. you know those passport gates at the airport? well, the thing is for me, they often don't work as well as they should. sometimes it can take me several goes before i finally get through, and sometimes they don't even seem to work at all. it's a little bit annoying and i'm never quite sure what is going on, or why. but at least the issue at the gates is not affecting my livelihood. i travelled to meet sahir — he wanted to remain anonymous, so we've given him a different name.
3:35 am
he lost his job overnight. uber introduced this face—recognition technology. they sent messages out, saying that "occasionally we are going to ask you to take a selfie of yourself". they compare against the profile picture you have got on there. so it is quite dark and i had a cap on and i took a selfie and i got an e—mail about 11 o'clock, saying "your account has been deactivated due to not being recognised. we have chosen to end our partnership with you. i hope you understand this." you know, i was shocked at the time — i did not know what to do. sahir has since sent dozens of messages to uber eats, and he has got a similar generic response each time. he asked for his case to be reviewed and whether it would be acceptable if the mcdonald's manager where he usually picks his deliveries up from could vouch for him. he got nowhere. so all of these messages were sent by you? mmm—hmm. and that's the first message that got back? yeah, but as you can see, it's a generic message.
3:36 am
uber says it believes the picture provided to its system... an assertion sahir denies. when he pleaded with the company to be able to appeal the decision and asked for a review of his file, uber told him the decision was final. five months on, sahir is still waiting for a response to his letter that he sent to holland, where uber's appeal team is based. since we have taken on his case, uber has agreed to share the image captured on the night of the incident. so 22 of our members have been dismissed by uber eats for substitution, and that is through a facial recognition software system that uber eats use. of that figure, 12 of them were bame and four of those were from a brazilian—portuguese heritage. we've since spoken to another union who have said that their members, too, have faced similar issues.
3:37 am
so basically, uber launched a new system to check the real—time id. they blocked my account at that time. it's very bad for me because i have no other source of income. in a statement, uber said: these stories show us just how crucial it is to get this right. these systems have got to be robust and they've got to work for everyone. a key part of the problem is often within the data sets these algorithms have been trained on. they are often built from images scraped from the web — images of celebrities in the media
3:38 am
or pictures on social networks. even our data sets predicting a wedding dress will show, you know, the western christian wedding as the prediction for that, because it's very heavily influenced by western media. so as a result of that — of course, western media, it is not very diverse, does not feature people of colour in a lot of tv shows. so now some companies are hoping that ai itself could just be the answer to solving the problem, using the might of something called generative adversarial
3:39 am
networks, or gans for short. to see how it works, we have embarked on a little experiment. these are the faces of the click team which we have fed directly into an off—the—shelf gan software from nvidia. on the right is the image of the person the software already knows and we are on the left. the algorithm starts by comparing the facial features it knows to the new image that it's looking at. after hundreds of iterations, it's able to work out what makes that face look the way it does — at least mathematically, anyway. once we have this digital replica, we can start playing around with different features. we can fiddle with age, ethnicity and mix faces, but most importantly, create people that don't exist at all. this technology is used to create large databases of fake faces, which are then used to train facial recognition systems. but creating faces in this way is not enough. if you want to create something that works and treats everyone more equally, the real images you feed into the gan have to be representative of life. after all, we would not be looking straight into a camera in the real world.
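[editor's note: the latent projection described above — iteratively comparing a generated face with a target photo until the network finds a code that reproduces it — can be sketched as a toy optimisation. this is an illustrative assumption, not nvidia's actual software: the "generator" here is a plain linear map standing in for a deep network such as stylegan.]

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "generator": a fixed linear map from a 16-dim latent code to
# 64 image features. A real GAN generator is a deep network; this
# stand-in keeps the projection loop readable.
G = rng.normal(size=(64, 16))

def generate(z):
    return G @ z

# The "photo" we want to reproduce: features made from a hidden
# latent code the algorithm never gets to see directly.
z_true = rng.normal(size=16)
target = generate(z_true)

# Start from a blank latent and iterate: compare the generated
# features with the target, then nudge the latent to shrink the
# difference -- the "hundreds of iterations" from the programme.
z = np.zeros(16)
lr = 0.005
for _ in range(2000):
    residual = generate(z) - target
    z -= lr * (G.T @ residual)  # gradient of 0.5 * ||G z - target||^2

print(np.allclose(generate(z), target, atol=1e-4))  # prints True
```

in practice this inversion is run against a deep generator with perceptual losses rather than a raw pixel difference, but the principle is the same: once a latent code reproduces the target, editing that code edits the face.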
3:40 am
this is a photo shoot by generated photos in the us, a company specialising in creating gan images. it spent several months taking pictures of thousands of people. these models were specifically chosen for their diversity, but they are also being captured doing all sorts of things. so do you think that gans can totally eliminate bias in an area like facial recognition? totally is probably a very strong term, but i think they can mitigate significantly, yes, i think so. i would say that if you do collect more real data, if you are able to do that,
3:41 am
then you should do that. this technique should optimise how these systems work, but what is more important are the people behind the code. the reality is that technology always reflects the biases that exist in society. and just changing facial recognition tools to make them more accurate isn't going to change that. how did it feel back in october when you realised that you were going to lose your livelihood? it was horrible, i can't even explain it. sleepless nights, things were going on in my head. what am i going to do now, how am i going to survive? stories like sahir's show us just how important it is to have people on the other side that you can easily talk to and reason with. ethical debates around how these technologies are used and deployed need to continue, and life—impacting decisions shouldn't be left
3:42 am
to machines alone. we have much more in the full-length version. you can keep up with the team on social media. thank you for watching and we will see you soon. bye—bye.
3:43 am
hello and welcome to newswatch with me, samira ahmed. why were there more than 100,000 complaints about bbc coverage of the duke of edinburgh's death? and was it appropriate to focus on the relationship between prince william and prince harry before and after the duke's funeral? in the past two weeks, much has been written and said, not just about prince philip but also about the way the bbc handled his death, its aftermath and his funeral. it all began like this. we have just received
3:44 am
a statement from buckingham palace, confirming that the duke of edinburgh has died. the statement says "it is with deep sorrow that her majesty the queen has announced the death of her beloved husband, his royal highness, the prince philip, duke of edinburgh". the bbc cleared its schedules for the rest of the day and some of the weekend, showing identical output, including tribute programmes, across bbc one, bbc two and the news channel. they also took bbc four off the air. the result was a significant drop in audience figures for the two main channels and the largest number of complaints made in broadcasting history. most of them submitted via a dedicated online form set up for the purpose. as well as those 110,000 people, hundreds contacted newswatch direct, too, and i will be speaking to two of those viewers in a moment. we also wanted to discuss the coverage with someone from bbc television,
3:45 am
scheduling or news, but no—one was available. instead, we were pointed towards the statement they put out last week. we decided to wait until after the funeral to have this discussion on newswatch. so, let's talk now to fiona gill and barbara norris. thank you both for coming on newswatch. barbara, first, why did you complain? i complained because i thought the coverage was absolutely
