Something needs to be done about the dangers of social media – but the government will have to go further than Ofcom

Regulation is desperately needed, but tackling harmful content online remains one of the biggest challenges facing any government

Jasper Jackson
Wednesday 12 February 2020 16:53 GMT

The world of social media is vast. About 350 million photos are uploaded to Facebook every day, and 500 hours of video are uploaded to YouTube every minute. As many as 6,000 tweets are sent every second.

In among those uploads, posts and messages is a huge amount of extremely unpleasant material: terrorist-recruitment videos, child grooming, harassment and abuse.

We’ve known for a long time that something needs to be done to try to minimise the very real harm this does. On Wednesday, the UK government decided that communications regulator Ofcom would be given the task. Whether it’s up to the challenge is another matter entirely.

The big social media companies, particularly Facebook, YouTube and Twitter, have so far largely avoided this kind of government interference. They already use automated processes to remove content deemed illegal or harmful, or even simply offensive. On top of that, they employ many thousands of people, often badly paid and working in difficult conditions, to moderate content posted by their users. In the third quarter of 2019 alone, Facebook took action against 3.2 million pieces of content deemed to constitute bullying and harassment.

The government has decided that this isn’t enough, and last year set out proposals to tackle “online harms”.

The scale of the challenge is extraordinary. In the past, Ofcom, which currently oversees broadcasting and communications services such as mobile-phone providers, has understandably been resistant to being given responsibility for the internet. Now it is being asked to get a grip on all the ways people are hurting each other online.

Ofcom employs about 1,000 people in total. Its new role will require additional resources, but unless the government is also planning to plough billions of pounds into a huge job-creation scheme, the regulator will not be charged with looking at every single nasty piece of content online.

The proposals suggest that Ofcom’s role will be more focused on setting guidelines and rules, and imposing sanctions when it is clear that companies have failed to meet them. Whether those sanctions will be fines or criminal charges against individuals, and exactly how either could be enforced against huge multinational companies based for the most part in the US, is not yet clear.

Exactly what content will be deemed harmful is also going to pose a huge problem. The social media firms are already relatively good at dealing with illegal or clearly harmful posts, such as child pornography or violent videos, although the spread of the Christchurch gunman’s livestreamed footage last year shows that some extremely big holes remain. Other areas, however, such as cyberbullying, trolling and users discussing self-harm, lie in far more subjective territory.

There are some other thorny issues. It is not just the big companies that will fall under the legislation’s remit. It will apply to all “companies that allow the sharing of user-generated content – for example, through comments, forums or video-sharing”. The government’s insistence that this will only affect “fewer than 5 per cent of UK businesses” does not sound all that reassuring.

Beyond that, there is an argument that focusing on the output of social media – illegal and harmful posts and behaviour – misses the core of the problem: the way these services are designed.

The most toxic aspects of social media, from YouTube’s recommendation algorithms that push people down rabbit holes of radicalisation, to Facebook’s outrage-rewarding newsfeed, stem from the fundamental ways in which they shape the behaviour of their users. It’s impossible to really address the harm that social media can cause without looking at the foundations on which social media platforms are built.

None of this is to say that regulation isn’t desperately needed. The Electoral Commission’s inability to get to grips with online political advertising during the Brexit referendum and recent elections has already shown how inadequate our regulatory systems are.

The internet has created an unprecedented opportunity for people to communicate with each other, and with it an unprecedented volume of harmful content. Trying to set rules that govern what billions of people do and say online, let alone enforce them, is one of the biggest challenges any government has ever faced. Ofcom will have its work cut out.
