Editorial – 06 Feb 2019

Health Secretary Matt Hancock is to meet Adam Mosseri, the worldwide Head of Instagram, tomorrow (07.02.2019) to discuss how images of suicide and self-harm can be kept off the platform.

Users of social media sites such as Facebook or Tumblr could unwittingly be exposed to such images, or find them all too easily, which raises the question: how are the owners of these platforms safeguarding their users? Are they at all?

Placing a minimum age requirement on services has been seen by some as the way forward – Facebook has required its users to be at least 13 for quite some time – but who is to say that works, or that it is only those under 13 who are vulnerable?

Social media can be a place to pretend you have a charmed life just as much as one to openly confess to mental health issues, graphically or not. Either can be a therapy of sorts, but I’d argue that seeing someone on a beach is probably less damaging to your psyche than images of self-harm.

I think there should be a better form of moderation. Obviously the number of users on social media sites nowadays rules out the good old approach of a human admin, but surely there is software that could do the job? Facebook already screens images it deems graphic, and only allows users over the age of 18 to tap through the warning and see the image beneath; Instagram apparently uses a similar approach. Surely it would only be a small stretch to prevent any user from seeing an image deemed too graphic, as sketched below.
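
To make that last point concrete, here is a rough sketch of the kind of gating logic I have in mind. It is not Facebook's or Instagram's real system or API; the classifier score, threshold and field names are all my own assumptions, purely for illustration.

```python
# Hypothetical moderation gate, not any platform's actual implementation.
from dataclasses import dataclass

GRAPHIC_THRESHOLD = 0.8  # assumed classifier confidence above which an image is "graphic"

@dataclass
class User:
    age: int

@dataclass
class Image:
    graphic_score: float  # assumed output of an automated image classifier

def can_view(user: User, image: Image, allow_adult_bypass: bool = True) -> bool:
    """Decide whether an image is shown un-screened.

    Today's approach (as described above): flagged images sit behind a
    warning screen that over-18s can tap through.
    The proposal: pass allow_adult_bypass=False so images deemed too
    graphic are never shown to anyone.
    """
    if image.graphic_score < GRAPHIC_THRESHOLD:
        return True   # not flagged, show normally
    if allow_adult_bypass and user.age >= 18:
        return True   # adult chose to tap through the warning screen
    return False      # screened out for everyone else

if __name__ == "__main__":
    teen, adult = User(age=15), User(age=30)
    flagged = Image(graphic_score=0.93)
    print(can_view(teen, flagged))                             # False
    print(can_view(adult, flagged))                            # True (current behaviour)
    print(can_view(adult, flagged, allow_adult_bypass=False))  # False (the proposal)
```

The point of the sketch is simply that once a platform can flag an image for the warning screen, blocking it outright is a policy switch rather than a new technical capability.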

Lin Mason
