Social media experts in the UK have called on Facebook to adjust its guidelines and remove any ‘grey areas’ around what content is allowed on the platform. The calls follow a report in The Guardian revealing the secret rules that Facebook moderators follow when dealing with reported or extreme content.

Under the current rules, Facebook will not delete certain images of child abuse, and will allow live streams of self-harm to continue, as the company does not want to punish people in distress. Experts in both the child protection sector and the social media industry have criticised this approach, calling for an independent regulator to handle extreme content on Facebook’s behalf.

Yvette Cooper, former chair of the UK’s Home Affairs Select Committee, has also commented on the controversy, saying the leaked documents “demonstrate why powerful social media companies, including Facebook, have to be more transparent”. Claire Lilley, head of child safety online at the NSPCC, added that social media companies “shouldn’t get to decide what’s in the best interests of children or the public”, and that the police should make that call.

Facebook recently announced that it will hire thousands more staff to help combat extreme content online.