The digital world is vibrant and fast-moving. The constantly evolving nature of the threats - from websites promoting suicide to social media channels celebrating restricted eating - has rightly heightened the need for safety measures to protect young people from harm.
For Kooth, creating a truly safe space online has always been central to our mission to bring mental health support to all. In relation to the Online Safety Act, our safeguarding processes - already stringent and in line with all existing standards - are being fully aligned with the Act, following Ofcom's recent guidance on the Protection of Children codes, which come into force in 2025.
Our moderation guidelines prioritise the safety of children and adults, and provide clear age ratings to prevent children, in particular, from being exposed to inappropriate or harmful content.
Key to remaining at the forefront of online safety are our in-depth and ever-evolving safeguarding governance and assurance policies, managed by our experienced clinical and safeguarding teams.
These ensure we are keeping our thousands-strong community of children and young people safe - and can be summarised in five Kooth principles:
1. We pre-moderate every single word on Kooth
What does this mean? It means that no user-generated content is published to our site until it has been “moderated” by one of our moderating team. This content may be an article, a poem, a story, a discussion post, a journal entry, a live forum comment or any subsequent interaction.
Our moderating team is composed of dedicated emotional wellbeing practitioners who are qualified to degree level and bring experience from a range of backgrounds, from youth work and social care to mental health nursing and teaching.
We draw on a number of specialised resources, NHS guidance and government frameworks to guide our moderation, and each practitioner completes bespoke, intensive Kooth training before they can begin moderating the platform. This activity is overseen by a highly experienced team, trained to level 4 or 5 in safeguarding (level 5 being the highest).
The team carefully reviews every comment, every submitted article or poem and every forum thread: nothing is published to the platform without first being reviewed.
The first stage of the review identifies any safeguarding concerns. In imminent risk situations where we already hold identifiable information, the moderator will contact emergency services.
This will be followed up with a same-day referral to children’s social care. In non-imminent risk situations, a same-day referral will be made to social care and/or other relevant services (e.g. CAMHS crisis services or the school). Additionally, we will message the young person to explain our concerns about their safety, advise them of the actions we’ve taken (if it’s deemed safe to do so) and provide immediate safety advice, including a list of local services.
Where we do not hold identifiable information, our moderators will ask the young person for it where this would be useful. If we don’t receive that information, safety planning advice will be shared with the young person and they will be invited to join a live chat with a counsellor. They will be flagged as high risk on our system and prioritised when entering any of our services.
In all of the scenarios above, the moderator will add a safeguarding case note to the system recording the concerns, rationale and actions taken. The case note is then added to a triage queue, where all safeguarding concerns are reviewed a second time.
The second stage is to ensure the post meets our safety guidelines for publishing. This is where we review the post in accordance with our moderation guidelines, which are informed by and developed from a number of specialised resources, NHS guidance and government frameworks. If something isn’t publishable - if it contains details of a sexual assault or mentions class A drugs, for example - the team will notify the young person, explaining why and linking them to our guidelines.
The final stage is to determine the age appropriateness of the post and which age bracket it is suitable for. As explained below, all content, whether user-generated or Kooth-generated, is age-rated based on our moderation guidelines. We also have an age rating framework, informed by a variety of evidence sources, including the Department for Education, the NHS, the NSPCC, Government Online Safety Guidance, and a number of specialist mental health sources for particular topics, e.g. BEAT, Mind and the Samaritans.
All content and resources are assessed against the framework by a qualified health, education or social care professional with appropriate subject matter expertise, trained in safeguarding to at least level 3.
2. Service users cannot privately interact with other service users
It’s simply not possible to do this on Kooth. Service users can post public replies to forum posts or on live forums, but as described above, all comments are moderated and only published if they meet our guidelines.
3. All service users are anonymous to one another
Kooth is purposefully anonymous at the point of initial sign-up, with individuals self-reporting age (year and month of birth) and locality (from a drop-down list of options), before creating a login and password to access the service.
This data, along with all subsequent data gathered, is then linked to a unique username for that user (which they choose themselves). However, as the user engages with the service, there are various points where they can share more identifiable information, with some choosing to fully identify themselves with us.
This approach supports effective safeguarding. Young people consistently tell us that anonymity enables them to disclose things that they wouldn't otherwise tell anyone. Giving young people this type of anonymity is often the key to them beginning to trust services and to feel able to open up and receive vital support.
If a young person tells us something that constitutes a concern around their safety, we ask for their personal identifiable information to help us safeguard them. When young people share this with us, they remain unidentifiable to others within our platform.
Sometimes, however, young people do not feel comfortable giving us this information. If this is the case, our team will work with them to develop an agreed safety plan and to understand and address the barriers to accessing other services.
We will also signpost them to other relevant services for young people who might be able to support them, such as SHOUT and Samaritans.
There are many reasons for favouring an anonymous approach:
- Suicide data shows that simply being known to services is not sufficient to prevent those at risk of suicide from taking their own lives.
- Psychological theory describes how anonymity can increase engagement - this is supported by service user feedback.
- Our practice-based evidence gathered over many years demonstrates that we effectively de-escalate risk, thus containing users rather than shifting risk elsewhere (e.g. to oversubscribed services).
- Our severity and presenting issues data shows that many of our users would not meet the threshold for onward referral and yet still need timely support to prevent escalation of difficulties.
4. All content on Kooth is age-gated
All of the therapeutic content that we create and all of the content and comments submitted by service users are age-gated to ensure service users only see what is appropriate and relevant for their age.