Facebook's Messenger Kids app, designed for children under 13 years of age, has been failing at the one thing it was built for - preventing strangers on the internet from interacting with children - according to a recent report by The Verge.

The report explains the simple premise the app was built on: it requires parental permission before a child can talk to another user. The assumption was that this would stop children from accidentally coming across users who could coax them for nefarious purposes. After all, the internet has no dearth of them.

However, the publication has pointed out a flaw in this design: the restriction only applies to one-on-one chats. When a child tries to start a one-on-one chat with an unapproved user, the parental filter will stop them from doing so; this doesn't happen, however, in the case of group chats.

According to The Verge's report, group chats on Messenger Kids are handled differently from one-on-one chats. In a group chat, the person who created the group can invite any number of users they approve of, without requiring the permission of the others in the group, thereby bypassing the need for approval from the children's parents.

This is especially tricky territory for Facebook since, as The Verge notes, the company is facing a possible $5 billion penalty over the Cambridge Analytica scandal, and a report such as this could complicate things and raise far more serious questions about the company's adherence to its privacy promises.

The publication notes that because the Facebook Messenger Kids app is targeted at children under 13 years of age, it falls under the jurisdiction of the Children's Online Privacy Protection Act (COPPA), which some child rights groups have already accused Facebook of violating.

The Facebook Messenger Kids app launched with group features in December 2017.