Consume: inappropriate content

Almost half (47%) of internet users report encountering a potential harm while using social media, compared with 3% on news services and 2% on gaming platforms. The most commonly reported potential consumption harms are:

  • hateful, offensive or discriminatory content (64%)
  • animal cruelty (62%)
  • generally offensive or ‘bad’ language (62%)
  • content which negatively impacted on a user’s self-esteem (61%)
  • misinformation (61%)

(Ofcom, 2022)

 


There is a disparity between the ‘harms that people are most concerned about and the actual incidence of them occurring’: although people say they are concerned about a range of potential online harms, the harms most internet users are concerned about tend to be those with the lowest incidence.

Users are most concerned about content depicting the sexual abuse or exploitation of children (86%) and content encouraging extremism, radicalisation or terrorism (79%), but less concerned about generally offensive or ‘bad’ language (41%) and unwelcome friend/follow requests or messages (42%).

 

Read Life on the small screen: What children are watching and why (ofcom.org.uk)

Internet safety risks and considerations

Access

To consume online content, including potentially inappropriate or harmful content, learners usually need access to:

  • an internet-connected device
  • platforms or services providing the content

When supporting learners with this area, it is important to consider who controls their access to these opportunities and resources:

  • Is there a risk of this occurring in school?
  • How is that enabled?
  • Are there steps that could be taken to reduce the risk of this?

If it is an out-of-school risk:

  • How can the school support with this?
  • Is support required for families or learners?
  • Who can support with this, for example Community Learning and Development or Police Scotland?

Potential risks

Being more cyber resilient reduces the risk of internet safety issues arising. We all want the internet to be a more welcoming space for children and young people, and that is why we promote this positive message of safe, smart and kind.

Topics to explore with learners might include:

  • the types of content they intend to view and how to judge if it is potentially inappropriate
  • the risks of unintentionally accessing and consuming inappropriate content and how to reduce the risk of this with cyber resilience
  • whether the potential risks, and associated harms, are the same for everyone – why they might differ and why this matters

App settings guides

YouTube and TikTok are the two most popular video content platforms for children and young people, while Instagram is the most popular social media platform to consume content, such as posts, images and videos.

These guides offer an overview of each app:

Instagram

TikTok

YouTube

Cyber resilience guidance

Devices

The first potential vulnerability when consuming online content is the device not being securely set up. Check with learners that they have:

  • a screenlock that requires a passcode or biometric (face or fingerprint) to log in, to stop unauthorised access (hacking)
  • on Apple devices, the Communication Safety in Messages content filter, found in the ‘Screen Time’ settings, enabled – this blocks potentially nude content on children’s phones

Accounts

Being more cyber resilient reduces the risk of internet safety issues arising.

  • If learners are accessing content on social media, there are usually filter options in the platform’s security and privacy settings – these can be used to reduce content from certain sources, sites or profiles
  • Having a profile to view online content means that the platform can tailor suggested content for the user. Not using an account will mean the platform suggests its most popular content, irrespective of who may be viewing it
  • YouTube: https://support.google.com/youtube/answer/6342839?hl=en&co=GENIE.Platform%3DAndroid
  • TikTok: https://www.tiktok.com/safety/en/content-controls/
  • Instagram: https://about.instagram.com/blog/tips-and-tricks/control-your-instagram-feed

There is a possibility that content creators may attempt to manipulate this process and label their content for an audience it may not be suitable for. However, online platforms do work to reduce this risk.

Report and support

This is where knowledge of the platforms is useful for teachers. Each platform or provider will have its own reporting and settings systems. Having an idea of the platforms your learners are using allows you to better understand and support them with settings and reporting controls.

YouTube Report inappropriate videos, channels, and other content on YouTube – Android – YouTube Help (google.com)

TikTok Report a video | TikTok Help Center

Instagram How to report things | Instagram Help Centre

Resources and information

General and younger learners

This activity from Screen Scotland helps learners analyse and evaluate the purpose and content of videos, like they might do with a book in class: YouTube questions

This page from Internet Matters further explores and explains the potential issue of inappropriate content: What parents need to know about inappropriate content | Internet Matters

This video from BBC Own It explores the reasons that certain content may be suggested on platforms: Joe Tasker: why am I being suggested weird videos? – Own It – BBC

Older learners

Inappropriate content is subjective, and learners may feel that something is offensive or inappropriate because it seems unfair or untrue. There are several fact-checking websites and services, but these are a few reputable ones:

There is also a section of this site on information literacy and tackling misinformation:

Information Literacy – critical thinking online

