Instagram asks suspected bots for verification with video selfies

The move may surprise some, given Meta's recent announcement that it would be shutting down one of its Face Recognition features

Washington: Instagram has many bot accounts, which can leave spam messages, harass people, or artificially inflate like and follower counts.

As per The Verge, Instagram is asking some users to provide a video selfie showing multiple angles of their face to verify that they're a real person.

The company started testing the feature last year but ran into technical issues. Multiple users have recently reported being asked to take a video selfie to verify their existing accounts.

As per reports, the check looks at "all angles of your face" to prove that you're a real person.

Instagram posted on Twitter that accounts exhibiting suspicious behaviour (such as quickly following a large number of accounts) could be asked to take a video selfie. The company also reiterated that the feature doesn't use facial recognition, and said that Instagram teams review the videos.

The request may surprise some users, given Meta's recent announcement that it would shut down one of its Face Recognition features.

As the company has since reiterated, though, it was only shutting down a specific Facebook feature, not Meta's use of facial recognition as a whole.

Nevertheless, Instagram's message is that the video selfie feature won't use facial recognition at all and that the videos will be deleted after 30 days.

Meta's promise not to store or post the data may not reassure users who are already distrustful of Meta/Facebook.