“It evaluates and ‘rates’ conversation characteristics and assigns an overall probability rating,” explained Courtney Gregoire, Chief Digital Safety Officer at Microsoft, in a blog post. “This rating can then be used as a determiner, set by individual companies implementing the technique, as to when a flagged conversation should be sent to human moderators for review. Human moderators would then be capable of identifying imminent threats for referral to law enforcement.” The company has been working with popular children’s game Roblox, as well as Kik, The Meet Group, and Thorn to create the solution. Thorn, a child defense non-profit, will provide the tech to ‘qualified online service companies’.
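The flow Gregoire describes, a model rates a conversation, and a company-set threshold decides when it goes to human moderators, can be sketched roughly as follows. This is an illustrative mock-up, not Project Artemis itself: the names (`score_conversation`, `flag_for_review`) and the toy keyword heuristic are assumptions standing in for the real, unpublished rating model.

```python
from dataclasses import dataclass

@dataclass
class Conversation:
    id: str
    messages: list

def score_conversation(conv: Conversation) -> float:
    """Stand-in for the rating model: returns a probability in [0, 1]
    that the conversation exhibits grooming characteristics.
    A toy keyword heuristic only; the real system's model is not public."""
    risky_terms = {"secret", "meet", "alone"}
    text = " ".join(conv.messages).lower()
    hits = sum(term in text for term in risky_terms)
    return min(1.0, hits / len(risky_terms))

def flag_for_review(conv: Conversation, threshold: float = 0.66) -> bool:
    """Per Gregoire, each implementing company sets its own threshold
    for escalating a conversation to human moderators."""
    return score_conversation(conv) >= threshold

conversations = [
    Conversation("a1", ["want to play another round?"]),
    Conversation("b2", ["keep this secret", "can we meet alone?"]),
]
# Conversations above the threshold would be queued for human review.
flagged = [c.id for c in conversations if flag_for_review(c)]
```

The key design point from the quote is that the system itself only produces a rating; the escalation decision, and any eventual law-enforcement referral, stays with human moderators at each company.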
Hackathon Roots
Project Artemis has its roots in Microsoft’s 2018 cross-industry hackathon, an event aimed at engineering solutions that work not just on a technological level, but legally and policy-wise as well. Further collaboration with Dr. Hany Farid, who worked with the company on PhotoDNA, has brought the project to the point where it’s ready for adoption. However, Gregoire stressed that while Project Artemis is a definite aid, it shouldn’t be seen as a catch-all solution.

“Project Artemis is a significant step forward, but it is by no means a panacea. Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems,” she said. “But we are not deterred by the complexity and intricacy of such issues. On the contrary, we are making the tool available at this point in time to invite further contributions and engagement from other technology companies and organizations with the goal of continuous improvement and refinement.”

Interested parties can contact Thorn to obtain the solution. Reports of child sexual abuse material have increased 10,000% since 2004, and tools like Microsoft’s could become essential in fighting it.