
Will Big Tech End Privacy As We Know It?
Clip: 9/25/2023 | 18m 9s
Kashmir Hill joins the show.
Clearview AI claims to be able to identify anyone with 99% accuracy, based on just one photo of their face. The controversial software first came to the attention of New York Times reporter Kashmir Hill back in 2019. She joins Hari Sreenivasan to discuss her new book, "Your Face Belongs to Us," a deep dive into her reporting of the company and the dangers of this new technology.
>>> NOW, AS AI, WE'VE JUST BEEN DISCUSSING, CONTINUES TO CHANGE EVERYDAY LIFE, IT'S ALSO REWRITING OUR RIGHT TO PRIVACY.
ONE SMALL AI COMPANY IN THE U.S. CLAIMS TO BE ABLE TO IDENTIFY ANYONE WITH JUST ABOUT 99% ACCURACY BASED ON JUST ONE PHOTO OF THEIR FACE.
"NEW YORK TIMES" TECH JOURNALIST KASHMIR HILL HAS BEEN REPORTING ABOUT THE CONTROVERSIAL SOFTWARE IN HER NEW BOOK, YOUR FACE BELONGS TO US, AND SHE JOINS HARI SREENIVASAN TO DISCUSS THE DANGERS OF THIS TECHNOLOGY.
>> CHRISTIANE, THANKS. KASHMIR HILL, THANKS FOR JOINING US.
LET'S START WITH THE TITLE "YOUR FACE BELONGS TO US."
WHO'S THE US?
HOW DID THEY GET MY FACE?
>> THE US IN THE BOOK IS PRIMARILY CLEARVIEW AI, WHICH IS THIS COMPANY I HEARD ABOUT A FEW YEARS AGO THAT SCRAPED BILLIONS OF PHOTOS FROM THE PUBLIC INTERNET WITHOUT PEOPLE'S PERMISSION TO CREATE THIS FACE RECOGNITION APP THAT THEY WERE SECRETLY SELLING TO THE POLICE.
>> AND HOW SUCCESSFUL IS CLEARVIEW?
>> CLEARVIEW WORKS WITH THOUSANDS OF POLICE DEPARTMENTS.
THEY HAVE $2 MILLION IN CONTRACTS WITH THE DEPARTMENT OF HOMELAND SECURITY.
THEY HAVE A CONTRACT WITH THE FBI.
THEY'VE RECEIVED FUNDING FROM THE AIR FORCE AND THE ARMY TO WORK ON FACIAL RECOGNITION GLASSES -- AUGMENTED REALITY GLASSES THAT YOU CAN WEAR TO IDENTIFY SOMEONE.
SO THEY HAVE HAD SUCCESS SELLING THEIR PRODUCT TO LAW ENFORCEMENT AGENCIES.
>> SO GIVE ME AN IDEA IN THE SORT OF GRAND SCHEME OF BIOMETRICS FROM FINGERPRINTS TO TAKING A PICTURE OF YOUR PHOTO -- OR YOUR FACE AND IDENTIFYING IT, WHAT GOES INTO FACIAL RECOGNITION TO MAKE IT WORK, AND HOW GOOD IS THE STUFF THAT YOU'RE TALKING ABOUT?
>> THE TECHNOLOGY IS SOMETHING THAT SCIENTISTS AND ENGINEERS HAVE BEEN WORKING ON FOR DECADES.
IT USED TO NOT WORK VERY WELL, AND IT WAS PARTICULARLY FLAWED WHEN IT CAME TO PEOPLE WHO WERE NOT WHITE MEN.
WHAT IS HAPPENING IS THAT A COMPUTER IS LOOKING AT ALL THE INFORMATION FROM A FACE FROM A DIGITAL IMAGE, AND YOU KNOW, IF IT'S TRAINED ON ENOUGH FACES WHICH THERE ARE A LOT OF FACES NOW IN THE INTERNET AGE, IT'S ABLE TO KIND OF FIGURE OUT WHAT IS UNIQUE FROM ONE FACE TO ANOTHER, AND SO THESE FACE RECOGNITION APPS GO OUT AND ESSENTIALLY LIKE LOOK FOR A FACE THAT MATCHES THE FACE IT'S GIVEN, AND SO IT -- THEY CAN WORK PRETTY WELL AT FINDING YOU.
BUT THEY MIGHT ALSO FIND DOPPELGANGERS, AND SO THAT'S BEEN A PROBLEM IN POLICE USE OF THE APP.
THERE HAVE BEEN SEVERAL PEOPLE WHO HAVE BEEN ARRESTED FOR THE CRIME OF LOOKING LIKE SOMEONE ELSE.
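To make the matching step she describes more concrete, here is a minimal, hypothetical Python sketch of similarity search over face embeddings: each photo is reduced to a numeric vector, a probe face is compared against every stored vector, and anything above a threshold is returned as a match, which is also how a close-enough doppelganger can surface as a false lead. The vectors, names, and threshold below are invented for illustration and are not Clearview AI's actual system.

# A minimal, hypothetical sketch of face-recognition matching as
# typically described: faces become embedding vectors, and a probe
# face is compared to every stored vector by similarity.
import math

def cosine_similarity(a, b):
    """Similarity of two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "database" of scraped face embeddings (entirely made up).
database = {
    "person_a": [0.90, 0.10, 0.40],
    "person_b": [0.20, 0.80, 0.55],
    "doppelganger_of_a": [0.88, 0.14, 0.42],  # a lookalike, not person_a
}

def best_matches(probe, db, threshold=0.99):
    """Return every stored identity whose similarity clears the threshold."""
    scored = [(name, cosine_similarity(probe, vec)) for name, vec in db.items()]
    scored.sort(key=lambda item: item[1], reverse=True)
    return [(name, round(score, 4)) for name, score in scored if score >= threshold]

# A probe photo of person_a can match both person_a AND the lookalike,
# which is how an investigative "lead" can point at the wrong person.
probe = [0.89, 0.11, 0.41]
print(best_matches(probe, database))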
>> THIS TECHNOLOGY THAT THE COMPANIES HAVE DOESN'T REQUIRE A NICE WELL-LIT HEAD SHOT OF ME LOOKING DIRECTLY INTO THE CAMERA, RIGHT?
>> YEAH, THIS IS WHAT POLICE OFFICERS TOLD ME WHEN I FIRST HEARD ABOUT CLEARVIEW AI. THEY SAID THE FACIAL RECOGNITION TECHNOLOGY THAT THEY HAD BEEN USING BEFORE, WHICH JUST WORKED ON, YOU KNOW, CRIMINAL MUG SHOTS AND STATE DRIVER'S LICENSES, DIDN'T WORK THAT WELL, BUT WHEN THEY STARTED USING THE CLEARVIEW AI APP, WHICH HAD NEWER TECHNOLOGY, A FRESHER ALGORITHM, IT WOULD WORK EVEN WHEN SOMEBODY WAS TURNED AWAY FROM THE CAMERA, WEARING A HAT, YOU KNOW, WEARING GLASSES.
I TALKED TO THIS ONE FINANCIAL CRIMES DETECTIVE IN GAINESVILLE WHO SAID IT WAS INCREDIBLE.
HE SAID, I HAD A STACK OF WANTED FRAUDSTERS ON MY DESK.
I RAN THEM THROUGH CLEARVIEW AI, AND I GOT HIT AFTER HIT AFTER HIT, AND HE SAID, I'D BE THEIR SPOKESPERSON IF THEY WANTED ME.
>> IS THIS LEAP FORWARD THAT WE'RE SEEING NOW, IS IT THAT THE TECHNOLOGY HAS GOTTEN BETTER OR IS IT THAT ESSENTIALLY THE ETHICS HAVE GOTTEN LOOSER?
>> THE TECHNOLOGY HAS GOTTEN BETTER, BUT ONE THING I DISCOVERED WHILE I WAS DOING THE RESEARCH FOR THE BOOK IS THAT BOTH GOOGLE AND FACEBOOK DEVELOPED THIS TECHNOLOGY INTERNALLY AS EARLY AS 2011. ERIC SCHMIDT, THE THEN-CHAIRMAN OF GOOGLE, SAID THAT IT WAS THE ONE TECHNOLOGY THAT GOOGLE HAD DEVELOPED AND DECIDED TO HOLD BACK BECAUSE IT WAS TOO DANGEROUS.
FACEBOOK ENGINEERS AT ONE POINT RIGGED UP THIS SMARTPHONE ON A BASEBALL CAP, AND WHEN YOU TURNED YOUR HEAD AND THE CAMERA ZOOMED IN ON SOMEBODY, IT COULD CALL OUT THEIR NAME, BUT FACEBOOK TOO DECIDED TO HOLD THE TECHNOLOGY BACK.
AND THESE ARE NOT COMPANIES KNOWN AS BEING, YOU KNOW, PRIVACY PROTECTIVE ORGANIZATIONS.
THEY'VE PIONEERED MANY TECHNOLOGIES THAT HAVE KIND OF CHANGED OUR NOTIONS OF PRIVACY.
THEY FELT IT WAS TOO DANGEROUS.
WHAT WAS DIFFERENT ABOUT CLEARVIEW AI WASN'T THAT THEY MADE A TECHNOLOGICAL BREAKTHROUGH; IT WAS AN ETHICAL ONE. THEY WERE WILLING TO DO WHAT OTHER COMPANIES HADN'T BEEN WILLING TO DO.
>> WOW.
I HAD A CHANCE TO INTERVIEW THE CEO OF CLEARVIEW A COUPLE YEARS AGO, AND AT THE TIME HE SAID THAT THEY HAD NOT HAD ANY SORT OF FALSEHOODS, ANY MISIDENTIFICATIONS, AND YET, WHAT YOU'RE TALKING ABOUT AND WRITING ABOUT IN "THE TIMES" THESE DAYS IS A SERIES OF INSTANCES WHERE PEOPLE HAVE BEEN MISIDENTIFIED FOR CRIMES THAT THEY DIDN'T COMMIT BY FACIAL RECOGNITION SOFTWARE AND HAD TO, WELL, SUFFER BECAUSE OF IT.
>> YEAH, I DISCOVERED ONE CASE THAT APPEARS TO INVOLVE CLEARVIEW AI, A MAN NAMED RANDAL REID.
HE LIVES IN GEORGIA.
HE GETS PULLED OVER ONE DAY BY THE POLICE AND THEY SAY THERE'S A WARRANT FOR HIS ARREST.
HE IS ARRESTED.
HE IS HELD IN JAIL FOR A WEEK.
THE CRIME WAS COMMITTED IN LOUISIANA, AND HE'D NEVER EVEN BEEN TO LOUISIANA, AND SO HE'S SITTING IN JAIL WAITING TO BE EXTRADITED.
HE HAS NO IDEA WHY HE'S TIED TO THIS CRIME, AND IT TURNS OUT THAT THE DETECTIVES HAD RUN CLEARVIEW AI ON SURVEILLANCE FOOTAGE, AND IT HAD MATCHED TO HIM, AND HE WAS ARRESTED EVEN THOUGH HE LIVED, YOU KNOW, HUNDREDS OF MILES AWAY FROM WHERE THE CRIME OCCURRED.
>> AND WHAT EVENTUALLY HAPPENED?
HOW WAS HE KIND OF EXONERATED?
HE WAS SORT OF GUILTY UNTIL PROVEN INNOCENT BY THIS TECHNOLOGY.
>> SO HE GOT A GOOD LAWYER, AND THAT'S WHAT HAPPENS TO THESE PEOPLE WHO ARE FALSELY ARRESTED.
THEY DO HAVE TO HIRE LAWYERS TO DEFEND THEM.
THE LAWYER ACTUALLY WENT TO THE CONSIGNMENT STORES WHERE HIS CLIENT WAS ACCUSED OF USING A STOLEN CREDIT CARD TO BUY DESIGNER PURSES, AND HE ASKED TO SEE THE SURVEILLANCE FOOTAGE. ONE OF THE STORE OWNERS SHOWED IT TO HIM, AND HE SAID, IT DOES LOOK A LOT LIKE MY CLIENT, BUT IT'S NOT -- IT'S NOT HIM. HE CALLED THE DETECTIVE, AND THE DETECTIVE REVEALED TO HIM THAT THEY HAD USED A FACIAL RECOGNITION APP IN THE CASE.
AND SO HE GOT A BUNCH OF PHOTOS OF HIS CLIENT, A VIDEO THAT HIS CLIENT HAD MADE OF HIS FACE, GAVE IT TO THE POLICE, AND THEY REALIZED THAT THEY HAD THE WRONG PERSON, AND THE CASE WAS DROPPED.
>> THERE WAS A RECENT CASE YOU WROTE ABOUT, A WOMAN WHO WAS 8 MONTHS PREGNANT WHO WAS TAKEN TO JAIL AFTER A MISIDENTIFICATION.
>> YES, A WOMAN NAMED PORCHA WOODRUFF. IT HAPPENED ON A THURSDAY MORNING IN FEBRUARY.
SHE WAS GETTING HER TWO YOUNG CHILDREN READY FOR SCHOOL.
POLICE TURNED UP AT HER DOOR SAYING SHE WAS UNDER ARREST FOR CAR -- FOR ROBBERY AND CARJACKING, AND SHE WAS JUST IN SHOCK.
SHE COULDN'T BELIEVE IT.
SHE SAID, WELL, IS THE PERSON WHO COMMITTED THIS CRIME PREGNANT?
YOU KNOW, LOOK AT ME.
AND SHE GOT TAKEN TO JAIL.
SPENT THE DAY IN JAIL.
YOU KNOW, WAS CHARGED.
AGAIN, HAD TO HIRE A LAWYER AND IT WAS ALL, AGAIN, A CASE OF MISTAKEN IDENTITY.
SHE WAS ARRESTED FOR THE CRIME OF LOOKING LIKE SOMEONE ELSE, AND AFTER SHE SPENT THE DAY IN JAIL, SHE WENT TO THE HOSPITAL BECAUSE SHE WAS SO DEHYDRATED AND STRESSED OUT FROM BEING ACCUSED OF THIS CRIME.
IT'S ACTUALLY THE THIRD TIME THAT THIS HAS HAPPENED IN DETROIT -- THAT'S WHERE PORCHA WOODRUFF LIVES -- AND IN ALL OF THE CASES THAT WE KNOW ABOUT WHERE SOMEONE'S BEEN FALSELY ARRESTED, THE PERSON HAS BEEN BLACK.
>> WE HAVE HEARD ABOUT KIND OF ALGORITHMIC BIAS AND BIAS IN THE STRUCTURE OF SYSTEMS.
HOW DOES THAT WORK WHEN IT COMES TO FACIAL RECOGNITION?
>> YEAH, I MEAN, FACIAL RECOGNITION TECHNOLOGY FOR A LONG TIME WAS REALLY FLAWED WHEN IT CAME TO HOW WELL IT WORKED ON DIFFERENT GROUPS OF PEOPLE, AND THE REASON WAS THAT WHEN IT WAS INITIALLY BEING DEVELOPED, THE PEOPLE WHO WERE WORKING ON IT TENDED TO BE WHITE MEN, AND THEY TENDED TO MAKE SURE THE TECHNOLOGY WORKED WELL ON THEM AND PEOPLE WHO LOOKED LIKE THEM, AND SO THEY WOULD TRAIN IT ON PHOTOS OF PEOPLE WHO LOOKED LIKE THEM. EVEN AFTER THIS WAS KIND OF REALIZED, PEOPLE IGNORED IT, AND FACIAL RECOGNITION TECHNOLOGY WAS DEPLOYED IN THE REAL WORLD WITH THIS BASIC FLAW IN IT. BUT, YOU KNOW, THE VENDORS HAVE TAKEN THE CRITICISM TO HEART, AND THEY NOW DO TRAIN THEIR ALGORITHMS WITH MORE DIVERSE SETS OF FACES, AND SO THE TECHNOLOGY HAS COME A LONG WAY.
BUT AS YOU SEE FROM THESE FALSE ARRESTS, THERE ARE STILL DISTURBING OUTCOMES, RACIST OUTCOMES THAT WE'RE SEEING IN THE WAY THE TECHNOLOGY IS BEING DEPLOYED AND MISUSED.
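Her point about unequal accuracy is ultimately a measurement question: the gap only shows up if error rates are broken out by demographic group rather than reported as one overall number. Below is a minimal, hypothetical Python sketch of that kind of per-group audit; the records are fabricated placeholders, not real benchmark data.

# A minimal, hypothetical sketch of a per-group accuracy audit:
# compare a matcher's error rate across groups instead of reporting
# a single overall figure. The records are fabricated placeholders.
from collections import defaultdict

# Each record: (group label, was the predicted match correct?)
results = [
    ("group_1", True), ("group_1", True), ("group_1", True), ("group_1", False),
    ("group_2", True), ("group_2", False), ("group_2", False), ("group_2", False),
]

def error_rate_by_group(records):
    """Fraction of incorrect matches per group."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# The overall error rate here is 0.5, which would hide that group_2
# fails three times as often as group_1.
print(error_rate_by_group(results))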
>> I'VE POINTED OUT A COUPLE OF CASES WHERE IT'S BEEN USED AND IT'S COME OUT WITH HORRIBLE OUTCOMES.
WHAT ARE SOME CASES WHERE FACIAL RECOGNITION HAS BEEN USED TO ACTUALLY CATCH THE CORRECT BAD GUY, SO TO SPEAK?
>> YEAH, I MEAN, FACIAL RECOGNITION TECHNOLOGY IS A POWERFUL INVESTIGATIVE TOOL, THAT'S WHAT POLICE OFFICERS TOLD ME.
THEY SAID IT CAN REALLY BE A GAME CHANGER IN AN INVESTIGATION WHEN ALL YOU HAVE IS SOMEBODY'S FACE.
PARTICULARLY IT'S BEEN POPULAR WITH CHILD CRIME INVESTIGATORS WHO ARE OFTEN WORKING WITH, YOU KNOW, BASICALLY PHOTOS OF ABUSE, AND THEY HAVE PHOTOS OF NOT JUST THE ABUSER BUT ALSO THE CHILD WHO'S BEING ABUSED, AND THEY HAVE BEEN USING CLEARVIEW AI TO TRY TO SOLVE THESE CASES AND, YOU KNOW, I HAVE HEARD OF MANY SUCCESS STORIES.
ONE OF THE CRAZIER STORIES I HEARD FROM A DEPARTMENT OF HOMELAND SECURITY AGENT WAS HE HAD A PHOTO.
HE WAS TRYING TO FIGURE OUT WHO THE ABUSER WAS.
AN AGENT FRIEND OF HIS RAN IT THROUGH CLEARVIEW AI AND FOUND THE GUY STANDING IN THE BACKGROUND OF SOMEONE ELSE'S INSTAGRAM PHOTO.
AND THAT WAS -- YOU KNOW, THAT WAS A CRUMB THAT LED HIM TO FIGURE OUT WHO THAT MAN WAS.
HE LIVED IN LAS VEGAS, AND THE AGENT WAS ABLE TO ARREST HIM AND REMOVE THE CHILD SO THAT HE NO LONGER HAD ACCESS TO HER.
>> THIS IS ONE CONVERSATION ABOUT HOW IT'S USED IN POLICING.
BUT THERE ARE MULTIPLE SITUATIONS WHERE IT'S USED BEYOND POLICING.
IT IS IN GROCERY STORES IN THE UK RIGHT NOW.
IT IS IN DEPARTMENT STORES IN AMERICA TODAY.
YOU KNOW, TELL US A LITTLE BIT ABOUT WHAT HAPPENED WHEN YOU TRIED TO GO INTO A RANGERS GAME WITH SOMEONE -- WELL, TELL ME THAT STORY AT MADISON SQUARE GARDEN.
>> YEAH, SO I WENT WITH A PERSONAL INJURY LAWYER TO -- IT WAS ACTUALLY A KNICKS GAME AT MADISON SQUARE GARDEN, AND YOU KNOW, WE PUT OUR BAGS ON THE SECURITY BELT TO GO THROUGH THE METAL DETECTOR.
AND AS WE WERE COLLECTING OUR BAGS, A SECURITY GUARD CAME OVER AND PULLED THIS PERSONAL INJURY ATTORNEY ASIDE, AND HE SAID, OH, YOU KNOW, YOU'VE BEEN FLAGGED.
WE USE A FACIAL RECOGNITION SYSTEM HERE, AND MY MANAGER'S GOING TO COME OVER, AND HE'S GOING TO NEED TO TALK TO YOU.
AND THIS ATTORNEY WAS ONE OF THOUSANDS OF ATTORNEYS WHO HAVE BEEN BANNED BY MADISON SQUARE GARDEN, A BAN ENFORCED WITH FACIAL RECOGNITION TECHNOLOGY, BECAUSE SHE WORKS AT A FIRM THAT HAS A CASE AGAINST THE COMPANY.
SHE'S NOT WORKING ON THAT CASE, BUT THIS IS SOMETHING THAT THE OWNER OF MADISON SQUARE GARDEN JAMES DOLAN HAS DECIDED TO DEPLOY TO KIND OF PUNISH HIS ENEMIES.
SO THE MANAGER CAME OVER AND SAID -- HE GAVE HER BASICALLY A NOTE KICKING HER OUT AND SAID YOU'RE NOT WELCOME HERE UNTIL YOUR FIRM RESOLVES THAT LITIGATION, DROPS THE CASE AGAINST US.
>> INTERESTINGLY, YOU POINT OUT THAT, FOR EXAMPLE, THE OWNER OF MSG COULD NOT USE THIS TOOL AT A FACILITY THAT HE OWNS IN CHICAGO.
HOW COME?
>> SO HE CAN DEPLOY THIS TECHNOLOGY AGAINST LAWYERS AT HIS NEW YORK VENUES, LIKE MADISON SQUARE GARDEN AND RADIO CITY MUSIC HALL, BUT NOT AT HIS CHICAGO THEATER, BECAUSE ILLINOIS HAS A LAW CALLED THE BIOMETRIC INFORMATION PRIVACY ACT, PRESCIENTLY PASSED IN 2008, THAT SAYS PEOPLE HAVE CONTROL OF THEIR BIOMETRIC INFORMATION, INCLUDING THEIR FACE PRINT, AND IF A COMPANY WANTS TO USE IT, THEY NEED TO GET CONSENT, AND IF THEY DON'T, THE COMPANY WOULD NEED TO PAY UP TO $5,000 PER, YOU KNOW, FACE OR PIECE OF BIOMETRIC INFORMATION IT USED.
AND SO YES, MADISON SQUARE GARDEN HAS A BAN LIST IN CHICAGO, BUT IT DOES NOT ENFORCE IT BY SCANNING PEOPLE'S FACES AS THEY ENTER THE VENUE.
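Since those per-face damages are what give the Illinois law its teeth, a quick, hypothetical back-of-the-envelope calculation shows how fast the liability compounds; the headcount below is invented, and the $5,000 figure is the upper amount cited in the interview.

# A back-of-the-envelope, hypothetical illustration of why the Illinois
# law changes a company's calculus: statutory damages are assessed per
# violation, so exposure scales with the number of people scanned.
people_scanned_without_consent = 10_000   # hypothetical venue headcount
damages_per_violation = 5_000             # dollars, upper figure cited above

potential_exposure = people_scanned_without_consent * damages_per_violation
print(f"Potential exposure: ${potential_exposure:,}")  # $50,000,000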
>> SO THIS IS HAPPENING IN ILLINOIS BECAUSE OF REGULATION, AND THERE ARE ALSO EUROPEAN COUNTRIES THAT ARE FOLLOWING SUIT, RIGHT?
WHEN IT COMES TO CLEARVIEW AI, IT'S BANNED FROM BEING USED IN SEVERAL COUNTRIES.
>> AFTER I EXPOSED THE EXISTENCE OF CLEARVIEW AI, A NUMBER OF PRIVACY REGULATORS ANNOUNCED INVESTIGATIONS.
THEY ALL SAID THAT THE COMPANY CAN'T OPERATE IN THEIR COUNTRIES ANYMORE AND NEEDED TO DELETE THEIR CITIZENS' INFORMATION FROM THE DATABASE.
THEY ALSO WERE ISSUED SOME FINES.
WHILE THEY HAVEN'T BEEN ABLE TO GET THEIR CITIZENS' INFORMATION OUT OF THE DATABASE, THEY HAVE EFFECTIVELY KEPT CLEARVIEW AI FROM OPERATING IN THEIR COUNTRIES, AND SO YEAH, WE DO LIVE IN A WORLD RIGHT NOW WHERE YOUR FACE IS BASICALLY BETTER PROTECTED IN SOME PLACES THAN OTHERS.
>> I MEAN, SO LET'S KIND OF FAST FORWARD FIVE YEARS OUT, I MEAN, WE SEEM TO BE AT AN INFLECTION POINT WHERE WE OUGHT TO BE THINKING ABOUT THE IMPACT AND THE RAMIFICATIONS THIS TECHNOLOGY HAS ON SOCIETY AND MAYBE, YOU KNOW, IN A BEST CASE WORLD CREATING POLICIES AROUND IT.
BUT AT THE PACE AT WHICH TECHNOLOGY IS CHANGING AND THE PACE AT WHICH LEGISLATORS ARE ACTUALLY RESPONDING, WHERE DO YOU SEE THIS GOING IN FIVE YEARS?
>> UNLESS PRIVACY LAWS ARE MORE UNIFORMLY PASSED AND ENFORCED, I DO THINK WE COULD HAVE A WORLD WHERE FACIAL RECOGNITION IS PRETTY UBIQUITOUS, WHERE PEOPLE COULD HAVE AN APP ON THEIR PHONE, AND IT WOULD MEAN THAT WHEN YOU'RE OUT IN PUBLIC, YOU COULD BE READILY IDENTIFIED, YOU KNOW, WHETHER YOU'RE BUYING HEMORRHOID CREAM AT THE PHARMACY OR WHEN YOU GO INTO A BAR AND YOU MEET SOMEBODY YOU NEVER WANT TO SEE AGAIN AND THEY JUST FIND OUT WHO YOU ARE.
OR YOU'RE JUST HAVING A SENSITIVE CONVERSATION OVER DINNER ASSUMING YOU HAVE THE ANONYMITY OF BEING SURROUNDED BY STRANGERS, AND IF YOU SAY SOMETHING THAT'S INTERESTING, MAYBE SOMEBODY TAKES A PICTURE OF YOUR FACE, AND NOW THEY UNDERSTAND WHAT YOU'RE TALKING ABOUT.
I THINK IF WE DON'T KIND OF REIN IT IN, IT COULD REALLY CHANGE WHAT IT IS TO BE ANONYMOUS.
>> DID YOU SPEAK WITH CLEARVIEW AI ABOUT IT?
BECAUSE IN THE BEGINNING OF THE BOOK WHAT WAS INTERESTING WAS JUST LITERALLY HOW THEY KNEW YOU WERE WORKING ON THE STORY, BUT DID THEY EVENTUALLY TALK TO YOU?
>> YEAH, ORIGINALLY CLEARVIEW AI DID NOT WANT TO TALK TO ME.
THEY WERE NOT HAPPY I WAS GOING TO BE WRITING ABOUT THEM.
THERE WERE SOME STRANGE, YOU KNOW, RED FLAGS ABOUT THE COMPANY.
THEY HAD AN ADDRESS ON THEIR WEBSITE FOR A BUILDING THAT DID NOT EXIST.
THEY HAD KIND OF ONE FAKE EMPLOYEE ON LINKEDIN.
THEY DIDN'T WANT TO TALK TO ME, AND I ENDED UP TALKING TO POLICE OFFICERS WHO WERE USING THE APP AND OFTENTIMES THE POLICE OFFICERS WOULD OFFER TO RUN MY PHOTO TO KIND OF SHOW ME HOW WELL THE APP WORKED, AND EVERY TIME THIS HAPPENED THE POLICE OFFICER WOULD EVENTUALLY STOP TALKING TO ME, AND FOR TWO OF THE POLICE OFFICERS, THEY SAID YOU DON'T HAVE ANY RESULTS.
THERE'S NO PHOTOS IN THE APP FOR YOU.
THAT'S REALLY STRANGE.
AND EVENTUALLY I FOUND OUT THAT CLEARVIEW AI WAS ACTUALLY -- EVEN THOUGH IT WASN'T TALKING TO ME, IT WAS TRACKING ME, AND IT HAD SOME KIND OF ALERT FOR WHEN MY PHOTO WAS UPLOADED.
IT HAD BLOCKED RESULTS FOR ME, AND ONE OF THE OFFICERS I TALKED TO MINUTES AFTER HE RAN MY FACE GOT A CALL FROM THE COMPANY TELLING THEM -- TELLING HIM THAT, YOU KNOW, THEY KNEW HE HAD DONE THIS AND HE WASN'T SUPPOSED TO, AND THEY DEACTIVATED HIS APP.
AND IT REALLY FREAKED HIM OUT.
HE SAID I DIDN'T REALIZE THAT THIS COMPANY WOULD KNOW WHO I WAS LOOKING FOR, THAT THEY KNOW WHO LAW ENFORCEMENT IS SEARCHING FOR AND THAT THEY CAN CONTROL WHETHER THEY COULD BE FOUND.
IT WAS REALLY A PRETTY CHILLING START TO THE INVESTIGATION.
>> SO WHEN YOU EVENTUALLY DID SPEAK TO THEM, WHAT DID THEY SAY ABOUT THESE CASES OF MISIDENTIFICATION OR THE POSSIBILITIES OF THAT?
>> SO AT THE TIME I STARTED TALKING TO CLEARVIEW AI, THEY DIDN'T KNOW OF ANY MISIDENTIFICATIONS YET.
THEY SAID IT'S A RISK, BUT OUR TECHNOLOGY IS NEVER MEANT TO BE USED TO ARREST SOMEBODY.
WE'RE JUST TRYING TO, YOU KNOW, GIVE POLICE A LEAD IN A CASE, AND THEN THEY HAVE TO DO MORE INVESTIGATING.
THEY NEED TO FIND EVIDENCE, AND SO HE KIND OF DISTANCED HIMSELF FROM THE RESPONSIBILITY FOR WHEN THIS GOES WRONG.
>> AND HAS THIS CHANGED HOW YOU DO YOUR REPORTING?
>> YEAH, I MEAN, THE FIRST TIME THAT HOAN TON-THAT RAN MY OWN PHOTO THROUGH CLEARVIEW AI, ONCE THE COMPANY STOPPED BLOCKING THE RESULTS,
I WAS REALLY SHOCKED BY THE PHOTOS THAT CAME UP, PHOTOS OF ME WALKING IN THE BACKGROUND OF OTHER PEOPLE'S PHOTOS.
A PHOTO OF ME ACTUALLY WITH A SOURCE, SOMEBODY I HAD BEEN INTERVIEWING AT THE TIME FOR A STORY THAT I DIDN'T REALIZE WAS ON THE INTERNET.
IT MADE ME THINK, WOW, I MIGHT NEED TO BE MORE CAREFUL IN PUBLIC, YOU KNOW, YOU CAN'T JUST LEAVE YOUR PHONE AT HOME AND MEET AT A DIVE BAR AND ASSUME THAT NO ONE WILL KNOW IT.
THIS IS SOMETHING THE FEDERAL GOVERNMENT HAS REALIZED AS WELL, WHILE I WAS WORKING ON THE BOOK, THE CIA SENT OUT A WARNING TO ALL OF ITS OUTPOSTS AND SAID OUR INFORMANTS ARE BEING -- ARE BEING IDENTIFIED.
YOU KNOW, THEIR IDENTITIES ARE BEING COMPROMISED BY NEW ARTIFICIAL INTELLIGENCE TOOLS INCLUDING FACIAL RECOGNITION TECHNOLOGY.
>> THE BOOK IS CALLED "YOUR FACE BELONGS TO US," KASHMIR HILL FROM "THE NEW YORK TIMES," THANK YOU SO MUCH FOR JOINING US.
>> THANK YOU SO MUCH, HARI.
