Microsoft's new technology scans online chat to catch a child predator

Good news for children and parents: Microsoft announced yesterday (January 9) that it will hunt down online sex offenders by using artificial intelligence to scan chats and flag potential child groomers.

Child grooming is a technique used to lure potential victims. Predators (sexual offenders) talk with targeted children over an extended period to make them feel safe and comfortable. Successful grooming can lead to online sexual abuse, with the child coerced into sending sexual videos or meeting the predator in person.

Project Artemis uses artificial intelligence to continuously monitor chats with children and detect conversations that could be interpreted as grooming.

According to Microsoft, the technology "evaluates and rates conversation features and assigns an overall probability rating."

"The ratings can be set by individual companies deploying the technology to help determine when flagged conversations should be sent to human moderators for review"

The technology "can also be used to identify conversations that have been flagged by human moderators

Human moderators can then evaluate the content and identify "imminent threats to refer to law enforcement or suspected incidents of child sexual exploitation to refer to the National Center for Missing and Exploited Children (NCMEC)." According to Microsoft, "NCMEC, along with ECPAT International, INHOPE, and the Internet Watch Foundation (IWF), provided valuable feedback through their collaborative work."

Of course, this human moderation component raises privacy concerns. It would not be the first time that tools supposedly deployed for our security have been abused. On the other hand, such sensitive decisions cannot be left entirely to AI algorithms.

According to Microsoft, the new tool, called Project Artemis, was launched at the November 2018 Microsoft "360 Cross-Industry Hackathon" and has been developed over the past 14 months in collaboration with The Meet Group, Roblox, Kik, and Thorn.

The software giant says it has been successfully using the technology underlying Project Artemis on Xbox Live for "years," and it is now looking to incorporate the Project Artemis toolset into its multi-platform chat system, Skype.

Even better, Project Artemis is now available to any company that wants to incorporate it into its own software. Developers interested in licensing the technology can contact Thorn starting today, January 10.

Microsoft warns, however, that Project Artemis will not eliminate online child abuse.

"Project Artemis is an important step forward, but by no means a panacea," the company said in its announcement The company said, "Exposing online child sexual exploitation and abuse, as well as online child grooming, is a serious problem But we are not deterred by the complexity and intrusiveness of these issues"

Earlier this week, Apple announced at the CES 2020 privacy roundtable that it would scan user accounts for known images of child pornography and child abuse. Jane Horvath, Apple's chief privacy officer, said that user accounts will be automatically flagged if Apple finds such images.

Apple has not stated exactly how it does this, but its own description of the process seems consistent with a technology called PhotoDNA, developed jointly by Microsoft and Dartmouth College. PhotoDNA compares new images to a database of known child abuse images that have already been detected and flagged by authorities, and it also has some support for audio and video. However, PhotoDNA does not prevent grooming or future abuse, as Project Artemis aims to do.
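PhotoDNA's actual algorithm is not public, but the general hash-and-compare approach described above can be sketched with a generic perceptual hash. The average-hash and Hamming-distance functions below are illustrative only and are not how PhotoDNA itself works:

```python
# Conceptual sketch of hash-based image matching in the spirit of what the
# article describes: a new image is reduced to a compact signature and
# compared against a database of signatures for known, already-flagged
# material. PhotoDNA's real algorithm is proprietary; this average-hash
# example only illustrates the idea.

def average_hash(pixels: list[list[int]]) -> int:
    """Compute a tiny perceptual hash from an 8x8 grayscale image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(image_hash: int, known_hashes: set[int], max_distance: int = 5) -> bool:
    """Flag the image if its hash is close to any hash in the known database."""
    return any(hamming(image_hash, h) <= max_distance for h in known_hashes)

# Toy usage: a near-duplicate of a "known" image still matches.
known = {average_hash([[10] * 8] * 4 + [[200] * 8] * 4)}
candidate = average_hash([[12] * 8] * 4 + [[198] * 8] * 4)
print(matches_known(candidate, known))  # True
```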
