Earlier this month, Meta unveiled the much-anticipated Threads app, a new platform linked to Instagram that offers users a distinct space for real-time updates and public conversations. Like Instagram, Threads enables users to connect with friends and creators who share their interests, but with a stronger focus on text-based updates.
Threads allows you to publish posts of up to 500 characters, which can include links, photos, and videos of up to 5 minutes in length. Users can also share Threads posts directly to their Instagram Stories, or post them as links on other platforms.
On the safety front, Threads has integrated Instagram's existing suite of safety and user controls and enforces Instagram's Community Guidelines. This article takes an introductory look at Threads and outlines the safety and privacy features available.
Getting Started – Safety Features
To get started with Threads, users can download the app from the App Store or Play Store and log in with their existing Instagram account. Because Threads links to your Instagram account in this way, your Instagram username and verification details carry over, along with other details such as your followers and the accounts you already follow. You can, however, tailor your profile specifically for Threads if you wish. It is important to note that Threads is currently only available on mobile platforms; a desktop version is not yet available.
For users under 18 in the UK, accounts default to a private profile, which limits who can see your activity, in many instances restricting visibility to followers only. Other safety controls let you manage who can mention you or reply to your threads. As on Instagram, Threads allows you to add hidden words to filter out replies containing specific words that may cause harm or are of no interest to you. Additionally, you can unfollow, block, restrict, or report a profile within Threads, and any accounts you have blocked on Instagram are automatically blocked on Threads as well.
Currently, Instagram’s Family Centre supervision tools are also automatically applied across Threads, allowing parents and guardians to help support teens aged 13-17 on the app. Parents can see information on their teen's following, followers, and time spent on the app, as well as view daily time limits and set scheduled breaks.
The app adheres to Instagram's Community Guidelines, ensuring that all content and interactions align with the platform's policies. Some of these policies include:
- Only sharing photos and videos that you have the right to share
- Ensuring that photos and videos are appropriate for a diverse audience
- Fostering meaningful and genuine interactions
- Abiding by the law
- Respecting other users
- Not glorifying self-harm
Instagram states that failure to adhere to these Guidelines may result in ‘deleted content, disabled accounts or other restrictions.’
A final aspect to consider is reporting. Threads has been built to ‘enable positive, productive conversations’, but users should still be aware of the reporting functions available if they experience harmful content online. Threads includes reporting functions for both posts and users, and further independent advice can be found at www.reportharmfulcontent.com.
To coincide with Meta's new platform, we are also pleased to announce that Threads has joined StopNCII.org as a participating platform. Anyone who wants to protect their intimate images online can use StopNCII.org to create a hash (or digital fingerprint) of their image, which is then cross-referenced across Threads so that any corresponding image or video can be detected and removed.