Instagram has a new idea for how to determine kids’ ages online: direct children to record videos of themselves and upload them, then let the company run facial scanning technology on the footage.
The Meta-owned social platform is partnering with tech company Yoti to test how the scanning technology can be used to verify children’s ages.
“After you take a video selfie, we share the image with Yoti, and nothing else,” Instagram said on its blog. “Yoti’s technology estimates your age based on your facial features and shares that estimate with us. Meta and Yoti then delete the image.”
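Neither Instagram nor Yoti has published technical details of that exchange. Purely as an illustration, the sketch below models the flow described in the quote (upload a selfie frame, receive an age estimate, delete the image) using a hypothetical endpoint, placeholder credentials, and an invented response field; none of it reflects Yoti’s or Meta’s actual interfaces.

```python
# Illustrative sketch only: models the described flow of sending a selfie
# frame for age estimation and then deleting the image. The endpoint URL,
# API key, and "estimated_age" field are hypothetical placeholders.
import os
import requests

ESTIMATE_URL = "https://age-estimation.example.com/v1/estimate"  # placeholder
API_KEY = os.environ.get("AGE_API_KEY", "demo-key")              # placeholder


def estimate_age_from_selfie(image_path: str) -> float:
    """Upload a selfie frame, return the estimated age, then delete the image."""
    try:
        with open(image_path, "rb") as f:
            response = requests.post(
                ESTIMATE_URL,
                headers={"Authorization": f"Bearer {API_KEY}"},
                files={"image": f},
                timeout=10,
            )
        response.raise_for_status()
        return float(response.json()["estimated_age"])  # invented field name
    finally:
        # Mirror the stated policy: the image is removed once the estimate is made.
        if os.path.exists(image_path):
            os.remove(image_path)


if __name__ == "__main__":
    print(estimate_age_from_selfie("selfie_frame.jpg"))
```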
Instagram said it wants to verify children’s ages to prevent unwanted contact from adult strangers and to limit some advertisers’ ability to reach the kids. Critics are concerned about the privacy risks associated with children sending videos of themselves to the big tech company.
The video selfie for facial scanning is one of two new options for determining kids’ ages that the company is testing. The other asks friends to vouch for the child’s age. Children will also be asked to provide an ID, which Instagram said will be encrypted, stored and then deleted from its servers within 30 days.
Lawmakers have concerns about Instagram’s interactions with children. Sen. Marsha Blackburn criticized Instagram for endangering kids’ health and authored the “Kids Online Safety Act” earlier this year with Sen. Richard Blumenthal, Connecticut Democrat, to enhance child safety through new requirements for social platforms.
Ms. Blackburn, Tennessee Republican, said Instagram’s latest tests are ripe for disaster.
“Instagram has a proven track record of knowingly putting children at risk, and their new facial recognition proposal undeniably intrudes on children’s privacy,” Ms. Blackburn said in a statement to The Washington Times. “Instead of using less risky solutions, like those in my proposal for device-level verification, Instagram settled on a privacy nightmare waiting to happen.”
The Blackburn-Blumenthal bill would order a federal study of age-verification systems at the device or operating system level instead of leaving it up to platforms and apps.
Instagram is open to this idea, and its blog post said having devices and app stores share people’s ages with individual apps would be an “effective way” to address the problem of verifying someone’s age.
Instagram insists its new tech will not do facial recognition. Instagram head Adam Mosseri said the tech will scan an image to predict an age rather than try to identify or recognize a child online.
“I want to be clear: there’s no facial recognition, there’s no way to tell what your identity is,” Mr. Mosseri said in a video on Twitter. “It’s just about predicting age.”
Meta spokesperson Stephanie Otway said in a statement that the technology “does not personally recognize anyone” and the images would be used for nothing other than the age estimate.
Other companies are adopting vastly different approaches to technology involving facial scans and recognition. Microsoft chief responsible AI officer Natasha Crampton said earlier this month the company would end certain artificial intelligence capabilities in its facial recognition and scanning tech designed to infer emotions and identify attributes such as age and gender.
“Taking emotional states as an example, we have decided we will not provide open-ended [application programming interface] access to technology that can scan people’s faces and purport to infer their emotional states based on their facial expressions or movements,” Ms. Crampton wrote on Microsoft’s blog. “Experts inside and outside the company have highlighted the lack of scientific consensus on the definition of ‘emotions,’ the challenges in how inferences generalize across use cases, regions, and demographics, and the heightened privacy concerns around this type of capability.”
Microsoft said that starting June 30, 2023, existing customers will lose access to other facial recognition capabilities unless their applications to use the company’s facial recognition tech are approved.
Whether Instagram’s facial scanning tools for kids outlast Microsoft’s tech remains to be seen.
Mr. Mosseri said Instagram will do its utmost to respect people’s privacy, but he has previously shelved proposals for the company’s children-focused products after lawmakers cast doubt on them.
Instagram planned to make an “Instagram Kids” experience aimed at children under age 13, but Mr. Mosseri paused its development amid mounting criticism before his appearance last year at a Senate subcommittee hearing led by Ms. Blackburn and Mr. Blumenthal.
Mr. Mosseri faced questions from the senators in December 2021, and the bipartisan duo then unveiled the Kids Online Safety Act in February. The bill would require platforms to provide choices about what kids see and to mitigate the risks of harm to children from digital content.
The bill has yet to receive a final vote in the Senate, and Instagram has made changes that appear aimed at showing it does not want to cause problems for kids. Earlier this month, Instagram announced it would roll out Amber Alerts to share notices of missing children on its platform.