Teenagers are not safe on Instagram: 30 safety tools failed, study finds
A study has found that the safety tools released by Instagram, owned by Facebook's parent company Meta, to protect teenagers are largely ineffective. The research, conducted jointly by several child-protection organizations including Cybersecurity for Democracy, revealed that content encouraging suicide or self-harm can still easily be found on teenagers' accounts.

Instagram launched 'Teen Accounts' in 2024 with the aim of making teenagers safer through parental supervision. But of the 47 safety tools the researchers tested, 30 were found to be ineffective or no longer available.

The study also accused Instagram's algorithm of encouraging children under the age of 13 to engage in sexualized and risky behaviour in pursuit of 'likes' and 'views'. The Molly Rose Foundation, which works to protect children, has strongly criticized Meta, calling the findings a result of its corporat...