Photo storage app Ever’s branding is warm and fuzzy: it promises to share your best moments and free up space on your phone. But the photos people share are used to train the company’s facial recognition system, which Ever then sells to private companies, law enforcement and the military.
Millions of people are uploading and sharing photos and personal information online without realizing their images could be used to develop surveillance products. What started out six years ago as just another cloud storage app has pivoted to Ever AI, a far more lucrative business, without telling the app’s millions of users. Ever is taking images of people’s families, photos from a private photo app, and using them to build surveillance technology. The company makes Facebook look good.
These billions of images are used to teach an algorithm to identify faces. Every time Ever users enable facial recognition on their photos to group together images of the same people, Ever’s facial recognition technology learns from the matches and trains itself. That knowledge, in turn, powers the company’s commercial facial recognition products.
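To make that mechanism concrete, here is a minimal, hypothetical sketch of how user-confirmed face groupings could serve as labeled training data for a face-matching model, using a standard contrastive loss in PyTorch. Nothing here is Ever AI’s actual code or architecture; the function names, embedding size and training setup are assumptions for illustration only.

```python
# Illustrative sketch only: how user-confirmed face groupings could become
# labeled training data for a face-matching model. All names are hypothetical;
# this is not Ever AI's actual code or architecture.
import itertools
import torch
import torch.nn.functional as F

def pairs_from_groups(groups):
    """Turn user-confirmed groupings (one list of face embeddings per person)
    into positive pairs (same person) and negative pairs (different people)."""
    positives, negatives = [], []
    for photos in groups:
        positives += list(itertools.combinations(photos, 2))
    for a_photos, b_photos in itertools.combinations(groups, 2):
        negatives += [(a, b) for a in a_photos for b in b_photos]
    return positives, negatives

def contrastive_loss(pairs, labels, margin=1.0):
    """Standard contrastive loss: pull same-person embeddings together,
    push different-person embeddings at least `margin` apart."""
    a = torch.stack([p[0] for p in pairs])
    b = torch.stack([p[1] for p in pairs])
    y = torch.tensor(labels, dtype=torch.float32)  # 1 = same person, 0 = different
    d = F.pairwise_distance(a, b)
    return (y * d.pow(2) + (1 - y) * F.relu(margin - d).pow(2)).mean()

# Toy usage: each inner list is one user-confirmed "same person" group,
# represented here by random stand-in embeddings instead of real photos.
groups = [[torch.randn(128) for _ in range(3)] for _ in range(4)]
pos, neg = pairs_from_groups(groups)
loss = contrastive_loss(pos + neg, [1] * len(pos) + [0] * len(neg))
print(f"{len(pos)} positive pairs, {len(neg)} negative pairs, loss={loss:.3f}")
```

The point of the sketch is simply that every act of grouping photos by person hands the system a fresh batch of labeled same-face and different-face pairs, which is exactly the kind of data a face-matching model needs to improve.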
Many companies offer facial recognition products and services, including Amazon, Microsoft and FaceFirst. Those companies need access to huge databases of photos to improve the accuracy of their matching technology. But while most facial recognition algorithms are trained on well-established, publicly circulating datasets, Ever is unusual in that it uses its own customers’ photos to improve its commercial technology.
Ever AI says it possesses an “ever-expanding private global dataset of 13 billion photos and videos” drawn from what the company says are tens of millions of users in 95 countries. Its “best-in-class face recognition technology” can estimate emotion, ethnicity, gender and age. Ever AI promises prospective military clients that it can “enhance surveillance capabilities” and “identify and act on threats,” and it offers law enforcement the ability to identify faces in body-cam recordings or live video feeds.
The company decided on the facial recognition pivot about two and a half years ago, when it realized that a free photo app with some paid premium features wasn’t going to be a venture-scale business. After it announced its new focus, the company raised $16 million at the end of 2017, over half of its total investment to date. The collection of more than 13 billion images proved incredibly valuable in developing a facial recognition system; the company says its technology is 99.85 percent accurate at face matching.
Previously, Ever’s privacy policy explained that facial recognition technology was used to help “organize your files and enable you to share them with the right people.” It was recently updated to add: “Some of these technologies may be used in our separate products and services for enterprise customers, including our enterprise face recognition offerings, but your files and personal information will not be.”
Burying such language in a 2,500-word privacy policy that most users do not read appears grossly insufficient. Ever is commercially exploiting the likenesses of the people in those photos to train a product that is sold to the military and law enforcement. It would appear to be an invasion of privacy.
Ever has added a new pop-up to the app that gives users an easy way to opt out of having their images used in the app’s facial recognition tool. The pop-up, however, does not mention that the facial recognition technology is being used beyond the app and marketed to private companies and law enforcement.
Ever needs photos to improve face recognition to the point where it will work even when girls take their makeup off! (Sorry, sexist, I know, but it was the only facial recognition gag I could find. Please don’t email me.)