I sat down with BBC cyber-security reporter Joe Tidy to discuss the recommendations of our House of Lords report into Democracy and Digital Technologies, and to try to answer the question posed by the webinar: how is the UK Parliament combating online attacks on elections? It was a good opportunity to revisit our report’s recommendations since the Intelligence and Security Committee published their Russia Report. As that report showed, the question could perhaps be phrased “if” rather than “how”.

Our report highlighted the corrosive impact of the loss of trust in the democratic process and argued that the tech monopolies must take responsibility for the misinformation and disinformation promoted by their business models. We refer to these giants as “platforms” and insist that they have a responsibility for the content they push, promote and profit from, and must be held to account if they fail in this.

Our report made 45 recommendations covering, in detail, regulation, regulators, sanctions and more:

Regulation of mis/disinformation

We recommend that the Online Harms Bill (OH Bill) should be introduced within a year of this report’s publication and should make it clear that misinformation and disinformation are in scope.

As part of the OH Bill, Ofcom should produce a code of practice on misinformation – if a piece or pattern of content is identified as misinformation by an accredited fact checker, it should be flagged as misinformation on all platforms. The content should then no longer be recommended to new audiences.
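To make that flow concrete, here is a minimal sketch in Python of the flag-and-demote mechanism the recommendation describes. All names and types are hypothetical illustrations of mine; the report does not prescribe any implementation.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    """A piece of content circulating on a platform (hypothetical model)."""
    content_id: str
    flagged_as_misinformation: bool = False
    reached_by: set = field(default_factory=set)  # users already exposed to it

def flag_misinformation(item: ContentItem, fact_checker_accredited: bool) -> None:
    # Only a finding from an accredited fact checker triggers the flag,
    # which the recommendation says should then apply on all platforms.
    if fact_checker_accredited:
        item.flagged_as_misinformation = True

def may_recommend(item: ContentItem, user_id: str) -> bool:
    # Flagged content is not removed, but it is no longer recommended
    # to audiences who have not already seen it.
    if item.flagged_as_misinformation and user_id not in item.reached_by:
        return False
    return True
```

The point of the sketch is the asymmetry: the content stays up, but the recommendation pathway to new audiences is closed off.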

Fact checking

Ofcom should work with online platforms to agree a common means of accreditation (initially based on the International Fact-Checking Network, IFCN), a system of funding that keeps fact checkers independent of both government and platforms, and an open database of what has been fact checked across platforms and providers.

Content moderation

The Government should establish an independent ombudsman for content moderation, to whom the public can appeal should they feel they have been let down by a platform’s decisions. The ombudsman’s decisions should be binding on the platforms and should create clear standards for future decisions affecting UK users. These standards should be adjudicated by Ofcom. The ombudsman should not prevent platforms removing content which they have due cause to remove.

A joint committee of Parliament should oversee the work of the proposed ombudsman, including setting its budget and holding a power of veto over the appointment of its chief executive.

Political advertising

We recommend that relevant experts in the Advertising Standards Authority (ASA), the Electoral Commission, Ofcom and the UK Statistics Authority (UKSA) should co-operate through a regulatory committee on political advertising. Political parties should work with these regulators to develop a code of practice for political advertising, along with appropriate sanctions, that restricts fundamentally inaccurate advertising during a parliamentary or mayoral election, or a referendum. This regulatory committee should adjudicate breaches of the code.

Imprints

The Government should legislate immediately to introduce imprints on online political material. This could be done through secondary legislation.

Advert libraries

Ofcom should issue a code of practice for online advertising setting out that, in order for platforms to meet their obligations under the ‘duty of care’, they must provide a comprehensive, real-time and publicly accessible database of all adverts on their platform. This code of practice should make use of existing work on best practice.
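As a rough illustration of what a comprehensive, real-time entry in such an ad library might record, a sketch follows. The fields are assumptions of mine, not a schema taken from the report or from any existing code of practice.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AdLibraryEntry:
    """One advert in a public, real-time ad library (illustrative fields only)."""
    ad_id: str
    advertiser: str
    creative_text: str
    first_shown: datetime
    last_shown: datetime
    spend_band: str          # e.g. "GBP 1,000-4,999"
    impressions_band: str    # e.g. "10,000-49,999"
    targeting_summary: str   # audience criteria the advertiser selected
```

Banded spend and impression figures, and a plain-language targeting summary, are one plausible way to make such a library publicly legible without exposing commercially exact numbers.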

Personal data in political campaigns

The Government should legislate to put the Information Commissioner’s Office (ICO) draft code on political campaigners’ use of personal data onto a statutory footing.

Algorithmic recommendation

For harmful but legal content, Ofcom’s codes of practice should focus on the principle that platforms should be liable for the content they rank, recommend or target to users.

Ofcom should issue a code of practice on algorithmic recommending. This should require platforms to audit all substantial changes to their algorithmic recommendation systems for their effects on users with characteristics protected under the Equality Act 2010. Ofcom should work with platforms to establish audits on relevant and appropriate characteristics.
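One way such an audit check could work in practice is to compare how a recommender distributes exposure across groups before and after a substantial change. The sketch below is purely illustrative: the inputs, group labels and threshold are my assumptions, not a methodology set out in the report.

```python
from collections import defaultdict

def exposure_rates(recommendations: list[tuple[str, str]],
                   user_group: dict[str, str]) -> dict[str, float]:
    """Share of recommendation events reaching each group.

    `recommendations` is a list of (user_id, content_id) events;
    `user_group` maps user_id to a group label (hypothetical inputs).
    """
    counts: dict[str, int] = defaultdict(int)
    for user_id, _content in recommendations:
        counts[user_group.get(user_id, "unknown")] += 1
    total = sum(counts.values()) or 1
    return {group: n / total for group, n in counts.items()}

def audit_change(before: dict[str, float], after: dict[str, float],
                 threshold: float = 0.05) -> list[str]:
    """Flag groups whose exposure share shifted by more than `threshold`
    after a substantial change to the recommender (illustrative test only)."""
    return [g for g in set(before) | set(after)
            if abs(after.get(g, 0.0) - before.get(g, 0.0)) > threshold]
```

A real audit would be far richer than a single exposure metric, but the shape is the same: measure before, measure after, and flag disparities for scrutiny.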

Ofcom should be given the powers and resources to undertake periodic audits of the algorithmic recommendation systems used by technology platforms, including access to the data used to train those systems and comprehensive information from the platforms on what content is being recommended.

Platforms v publishers

The report uses the term ‘platforms’ but holds them to a duty of care: they are responsible for the content they promote to large audiences, rather than for the content they merely host. Ofcom should have the power to sanction platforms that fail to comply with their duty of care under the OH Bill. These sanctions should include fines of up to 4% of global turnover and the power to require Internet Service Providers to block serially non-compliant platforms.

Regulatory capacity

The Government should introduce legislation to enact the ICO’s proposal for a committee of regulators that would allow for joint investigations between regulators. This committee should also act as a forum to encourage the sharing of best practice between regulators and support horizon scanning activity.

The Centre for Data Ethics and Innovation should conduct a review of regulatory digital capacity across the Competition and Markets Authority (CMA), ICO, Electoral Commission, ASA and Ofcom to determine their levels of digital expertise. This review should be completed with urgency, to inform the OH Bill before it becomes law.

Freedom of expression

We protect free expression online by focusing on what platforms algorithmically promote rather than on what they host. This means that platforms would not be encouraged to remove harmful but legal content; rather, they would be required not to promote it through their algorithms or recommend it to users. This gives people the freedom to express themselves online but stops harmful content from being amplified to large audiences.

We also support free expression by improving platforms’ content moderation decisions. We do this by requiring greater transparency about what content they take down, so that the rules governing online debate are clearer, and by establishing an online ombudsman empowered to act for users online.

Anonymity online

Ofcom should work with platforms and the Government’s Verify service, or its replacement, to enable platforms to allow users to verify their identities in a way that protects their privacy. Ofcom should encourage platforms to empower users with tools to remove unverified users from their conversations and more easily identify genuine users.

Online voting

We received a small amount of evidence that was in favour of online voting. In the round, however, opinion was overwhelmingly against introducing voting online. We heard that online voting might cause people to question the trustworthiness of election results and create fertile ground for conspiracy theories.

Exercising your democratic vote is an important act that should have some ceremony about it; visiting a polling station, for those for whom this is possible, is an important part of this. We should not seek to substitute or undermine this significant and important act with an online process.

Journalism in a digital world

We recommend that the CMA should conduct a full market investigation into online platforms’ control over digital advertising.

The Government should work urgently to implement those recommendations of the Cairncross Review that it accepts, as well as providing support for news organisations in dealing with the impact of COVID-19.

Education/digital literacy

Ofsted, in partnership with the Department for Education, Ofcom, the ICO and subject associations, should commission a large-scale programme of evaluation of digital media literacy initiatives.

The Department for Education should review the school curriculum to ensure that pupils are equipped with all the skills needed in a modern digital world. Critical digital media literacy should be embedded across the wider curriculum based on the lessons learned from the review of initiatives recommended above. All teachers will need support through Continuous Professional Development to achieve this.

As I wrote in a previous post: it’s in our hands, our (well-washed) hands, and as we type, tap and share, we must take more care. What kind of conversations and discussions do we want to be part of, and on what kind of social media? Do we want rigorous, respectful debates that are open, transparent, accountable and trustworthy?

If not, why not pack away the public square, put away the polling stations, and with muffled cry, let our democracy die.

It’s no one else’s democracy, save ours.

An earlier version of this post is available here.
