13/08/2019 by Rose Eveleth
Today we travel to a future where your face can be held against you.
Guests:
Brandeis Marshall — data scientist & professor at Spelman College
Matt Cagle — attorney, ACLU Northern California
Keith Kaplan — councilman, Teaneck, NJ
Ankita Shukla — PhD candidate, IIIT Delhi
Actors:
Evan Johnson as Mr. Morton
David Romero as David
Ash Greenberg as Ash
Santos Flores as Santos
Charlie Chalmers as Charlie
Grace Nelligan as Grace
Ava Ausman as Ava
Sidney Perry-Thistle as Sidney
Arthur Benjamin Allison as Arthur

Further reading on today’s episode can be found here.
Season One episode about facial recognition
Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky.
Special thanks to the Women’s Audio Mission, where all the intro scenes were recorded this season. Special thanks also to Evan Johnson who played Mr. Morton and also coordinated the actors of the Junior Acting Troupe who play the students in the intros this season.
Get in touch: Twitter // Facebook // Reddit // email@example.com
Support the show: Patreon // Donorbox
Subscribe: iTunes // Soundcloud // Spotify
Sponsor information:
Skillshare is an online learning community with thousands of amazing classes covering dozens of creative and entrepreneurial skills. Flash Forward listeners get 2 months of unlimited access to thousands of classes for free by signing up at http://skillshare.com/flashforward
Shaker & Spoon is a subscription cocktail box that brings world-class cocktails into your kitchen once a month. Get $20 off your first box at http://shakerandspoon.com/flashforward
Tab for a Cause is a browser extension that lets you raise money for charity by opening tabs. Grab the extension and start earning today at tabforacause.com/flashforward!
Listen Date: 2019-10-24
- I didn’t make the connection while listening to it, but talking to Ashish today reminded me that this episode complements the EconTalk interview with Shoshana Zuboff.
- In fact, it leads to the despairing realisation that even if you had uploaded your photos to a good and light service, one that was clear about not selling your likeness or data on, some nonprofit university researcher might still scrape it and then resell it themselves. And, for that matter, every selfie app in the world might have sold your face to a data broker, which would then sell it on to anybody who wanted to do facial recognition. [Substitute facial recognition for any other identification algo of choice – is anybody doing voice recognition also?]
- The only safe means of sharing photos seems to be self-hosting or email.
- This quote stood out: “Now, when I first started reading about how bad facial recognition is, how wrong it is all the time, it sounded kind of weird to me. Because we’re not used to hearing about how bad a technology is. In fact, we usually hear about the opposite, when it comes to surveillance tech. We’re used to hearing about how GOOD this tech is, and that’s part of why it’s scary. We hear about how Facebook can identify us without even our faces, how cops can see through walls, or about how our phones are tracking our movements down to the footstep. We’ve sent a rover to Mars! Why… is facial recognition so hard to do? “
- On the San Francisco ban on facial recognition: “So the San Francisco surveillance technology ordinance applies to city departments. So that will be law enforcement agencies, the Recreation and Parks Department, any agency or actor working for the city or county government. But there is an entire additional industry of corporate players who want to be using face surveillance technology to scan your face when you walk into a store, when you enter a stadium.” It made me wonder, as a legal layperson: would an ordinance against using a San Francisco resident as a data point in a facial recognition training set not work? Because of legal reasons, or simply because of enforceability? It’s at times like this that I miss being on Twitter; at least I would know who to ask.
- “California’s considering a ban on the use of facial recognition technology with officer-worn body cameras.” made me think wryly about how the same technology that is used to prevent atrocities also has to be legally barred from attacking civil liberties.
- I need to read this: Nothing to Hide: The False Tradeoff between Privacy and Security
- And having read it, see if my opinion of the Neal Stephenson quote about “Terrorism is a recruiting station for statists.” changes.
- A glorious line: “Why would we want to use facial recognition on monkeys?”
- I’m wondering if that ninety percent accuracy on monkeys is only on monkeys in the database, or if it craters once you put it into the field and test it on all monkeys.
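The worry in that note is the difference between closed-set evaluation (every test subject is already enrolled in the database) and open-set deployment (strangers show up too). Here is a toy sketch of why the gap can be dramatic; the identities, the 90% figure, and the recognizer behaviour are all made up for illustration, not taken from the episode:

```python
import random

random.seed(0)

KNOWN = [f"monkey_{i}" for i in range(50)]  # identities enrolled in the database


def recognize(true_id):
    """Toy recognizer: 90% correct on enrolled monkeys, but it can only
    ever answer with an enrolled identity — a stranger is always
    force-matched to somebody in the database."""
    if true_id in KNOWN and random.random() < 0.9:
        return true_id
    return random.choice(KNOWN)  # a (wrong) forced match


# Closed-set test: every probe is an enrolled monkey.
closed = [recognize(m) == m for m in KNOWN * 20]
closed_acc = sum(closed) / len(closed)
print(f"closed-set accuracy: {closed_acc:.2f}")

# Open-set test: half the probes are strangers the system never saw.
strangers = [f"stranger_{i}" for i in range(50)]
open_ = [recognize(m) == m for m in (KNOWN + strangers) * 10]
open_acc = sum(open_) / len(open_)
print(f"open-set accuracy:   {open_acc:.2f}")
```

The closed-set score stays near the advertised 90%, while the open-set score roughly halves, because the toy system has no way to say “this monkey isn’t in the database at all.” Real systems mitigate this with a rejection threshold, but the same field-versus-database gap applies.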
- I love the classroom debate intro as much as Rose Eveleth does.
- I do agree that the despair those classroom kids had about giving up on privacy is disturbing, but I’m not sure it can be generalised to all kids the way this episode did.
- Rose Eveleth’s story of the gay giraffes on her Malawi safari and of how she realised that she was being a giraffe creeper was delightful.
- This quote: ‘In a March 2019 poll that the ACLU commissioned, they found that 76 percent of California voters supported strongly or somewhat a law that would “require public debate and a vote by lawmakers before any surveillance technology is obtained or used by government and law enforcement.” And 82 percent of Californians said they disagreed strongly or somewhat to this statement: “Government should be able to monitor and track who you are and where you go using your biometric information.”’ – because I am slightly cynical, I feel that the numbers might have dropped dramatically if “you / your” had been substituted with “Criminals / their”. But let’s see.