4 Matching Annotations
  1. Last 7 days
    1. Companies will struggle with how their apps affect kids' mental health and safety

      ||StephanieBP|| Quite a few things here on kids and child safety.


    1. Recently, the issue of child safety in virtual reality was raised by the Center for Countering Digital Hate (CCDH). The campaign group looked at a popular third-party app called VRChat.

      ||StephanieBP|| This could be of interest for the UNICEF project.


  2. Aug 2021
    1. where our personal devices become a radical new tool for invasive surveillance

      Could this happen? Possibly, if we don't oversee the law-enforcement agencies (LEAs) that tag sexual-abuse content. But with global cooperation against sexual abuse, hash databases are commonly shared across countries, and it would be harder for the LEA of a single country to slip political content into them. So there is some risk, but I think it is overstated.

    2. Another notifies a child's parents if iMessage is used to send or receive photos that a machine learning algorithm considers to contain nudity

      I understood this is an option that parents can switch on. In any case, from Apple's description it is not really an AI that reads the photos, but rather a cryptographic signature that is matched against a known database of sexual-abuse photos. That is quite different.
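
      The matching mechanism described above can be sketched in a few lines. This is a simplified illustration, not Apple's actual system: the real proposal used a perceptual hash (NeuralHash) so that visually similar images match, whereas plain SHA-256 below only matches byte-identical files; the database contents here are hypothetical.

      ```python
      import hashlib

      # Hypothetical shared database of known-bad content hashes.
      # (The entry below is the SHA-256 digest of b"test", used as a stand-in.)
      known_hashes = {
          "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
      }

      def matches_known_database(data: bytes) -> bool:
          """Return True if the content's digest appears in the shared database.

          No ML model inspects the content; only its hash is compared.
          """
          digest = hashlib.sha256(data).hexdigest()
          return digest in known_hashes
      ```

      The key property the annotator points at: the device never "understands" the photo, it only checks whether the photo's fingerprint is already in a curated database.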
