The Federal Trade Commission is urging companies that use facial tech to do more to protect security and privacy. It also wants tighter measures to avoid scanning children’s faces.
The call, couched as a “recommendation for best practice”, comes in a report published this week. It follows a lengthy probe that included a workshop with industry players last December.
According to the FTC, the guidelines are needed to deal with the rapidly improving technology. In the report’s introduction, it notes that some elements of 2002 movie Minority Report no longer seem so futuristic. For example, promotional displays in shopping malls can now scan faces and show advertising appropriate to a passer-by’s (perceived) age and gender.
The commission suggests a few key principles and then gives some specific examples. It says any firm using facial recognition must think about privacy when developing its technology and make sure it has adequate security measures to deal with the data it collects.
Other principles include getting fresh permission before using facial data in a way other than originally intended, and never identifying somebody in a photo to a third party without that person's permission.
In situations such as the shopping mall example, the FTC wants companies to put up clear signs warning that the scanning is in use. These signs need to be in a position that gives people enough time to avoid the location with the scanners. The commission also says companies simply shouldn’t use such tech if they know children are particularly likely to be in the area.
Social networks also get a warning. The FTC says sites should always give users the option not to have facial data collected from uploaded photos. Furthermore, users should be able to opt out retrospectively at any time and have all previously collected facial data deleted.
At the moment, the measures in the report are intended as guidance only, not as new regulations.
Only four of the five FTC commissioners backed the report. J. Thomas Rosch said it went “too far, too soon” and argued it was wrong to push new policies to prevent potential misuse rather than waiting for evidence that companies were actually abusing their facial data collection.