It’s no surprise that certain online platforms may leave parents and carers scratching their heads as young people talk about how they like to socialise online. As technology continues to move forward, we quickly realise that tech-savvy young people may be keeping up with it better than we are! Discord has been around for several years but has gained notable popularity, with millions of active users taking part.
How Do People Use Discord?
Discord uses many familiar features that users of other apps will recognise. Originally a chat app for gamers, the free service has expanded into a wider social network, allowing those with more general interests to come together and chat through discussion forums called ‘servers’. Users can chat by text, voice or video, depending on their preference. Users can also be direct messaged on a private, one-to-one basis.
Servers can be specific to certain groups, such as particular fan bases, allowing users to join in with those who share similar interests. Some servers require an invitation and others are publicly available for anyone to join. Alternatively, chats can be limited to friends and family alone.
As the internet is so vast, Discord houses communities that cover a wide array of topics, sometimes containing mature content such as offensive language, graphic imagery or other material inappropriate for certain age groups. As well as this, Discord’s more open chat features can result in unwanted contact from strangers when participating in public discussions.
As Discord offers a lot of freedom in how people socialise online, it is advisable that parents and carers are aware of what security measures are available for young people using the platform.
Discord Security Features
Discord requires users to be at least 13 years old to create an account; however, this relies on honesty, and it is up to users and family members to ensure correct information is given. Similarly, some servers require a minimum age of 18 to take part, but again, this relies on honesty.
The app provides a number of features that can help with limiting the exposure of harmful online content, unwanted contact from strangers and unwanted friend requests. These include:
- Direct Messaging – Images and videos sent via direct message can be scanned for explicit content and, where such content is found, deleted by Discord. Users can choose between scanning all messages (Keep me safe), scanning only those from strangers (My friends are nice) or not scanning at all (Do not scan). For users under 18, all messages are scanned by default. These settings can be found under Privacy and Safety in the Settings menu.
- Unwanted Direct Messaging – Users can adjust their privacy settings to restrict direct messages from server members. This can be toggled on or off at any time and applied retrospectively to servers they have already joined. Direct messages are permitted by default when an account is created, so this restriction needs to be applied manually in the Privacy and Safety section.
- Unwanted Friend Requests – Much like on other social media platforms, users can set who is allowed to send them a friend request. Users can decide whether their account is open to Everyone, Friends of Friends or limited to Server Members. Discord highlights Friends of Friends as an advisable setting for young people’s accounts. This option can also be found in Privacy and Safety.
- Blocking – As with other social media platforms, users can block accounts at any time. Blocking prevents an unwanted account from making contact through direct messaging and also restricts that person’s content in servers. This can be applied by opening the account’s profile and selecting the Block option.
- Reporting – Users can also report accounts by following Discord’s guidance, which initially recommends contacting the moderator of the server using a reporting form.
Considerations for Parents and Carers
While Discord operates in similar ways to other social media platforms, it is important to note that younger audiences could still find themselves exposed to harmful online content. The privacy features available, whilst helpful, are limited in how they filter explicit content: they restrict media-based content in messages but may still allow explicit language and harmful written content to be seen on various servers.
Children who want to use Discord may only intend to use it within their own friendship groups, but it is still important for parents and carers to highlight the risks of socialising with strangers on platforms such as these.
Having an open discussion with your child about potential risks can give them confidence in knowing what to do when situations arise, while encouraging them to confide in parents and carers for support. As children develop their critical thinking skills, they can learn to assess when unfamiliar friend requests should be ignored, when someone online is being inappropriate, when an account needs to be blocked, or when a server is making them feel uncomfortable and they need to leave the chat. Building this awareness can help prepare them to navigate the wider internet with peace of mind and confidence.
If you are using Discord and have reported harmful content to the platform that has not been actioned, you can make a report to Report Harmful Content. Any user over the age of 13 can use this service, which is provided by the UK Safer Internet Centre and operated by SWGfL. Find out more here: