What difference would an update in electoral law make to protecting the UK against the threat of online manipulation? Jillian Linton looks at how this has worked in Canada.
From the 2016 American election to the Cambridge Analytica scandal in the United Kingdom, the use of social media to shape political conversations online and manipulate public opinion has increased drastically, undermining our ability to hold free and fair elections. Foreign interference, such as the recent Russian targeted misinformation campaigns aimed at a range of African nations, has played a part in this – but a whole range of actors are taking advantage of these platforms as election tampering and misinformation campaigns continue to appear worldwide.
In the United Kingdom, a two-year DCMS inquiry culminated in a final Disinformation and Fake News report, published in February 2019, which stated unequivocally that “electoral law is not fit for purpose and needs to be changed”. Following the report, the government published an Online Harms White Paper aimed at making “Britain the safest place in the world to be online”. The White Paper acknowledged that the existing patchwork of regulation was insufficient and proposed the creation of a “statutory duty of care” for tech companies. The government also published a finalised code of practice for providers of online social media platforms, giving tech companies the opportunity to begin self-regulation ahead of any binding legislation.
A group of MPs responded to the White Paper, supporting some of its conclusions and plans but ultimately calling for urgent updates to electoral law. They issued the following warning: “Were an election or referendum to take place later this year, campaigns would be fought using electoral law that is wholly inadequate for the digital age.”
And yet here we are. With a snap election called, the White Paper has not been translated into legislation and no new controls are yet in place, meaning this election is taking place with effectively the same (lack of) protections as the 2016 EU referendum. So what difference would an update in electoral law make?
Electoral reform for the digital age: the Canadian case
Canada held a federal election on 21 October 2019. Canada has a similar electoral system to the UK: first-past-the-post constituency-based voting and paper ballots, which make online interference with the physical process of voting difficult. Unlike the UK, however, Canada updated its electoral law in anticipation of the election, passing the Elections Modernization Act (Bill C-76) to extend its electoral legislation to the online sphere and address foreign influence. The changes included criminalising the use of foreign funds for partisan advertising, requiring online platforms to maintain a political ad registry, and placing strict rules and spending caps on third parties involved in partisan political activity.
The Canadian government also established a new cyber security agency, linked up with international bodies, and created the Security and Intelligence Threats to Elections (SITE) Task Force, charged with strengthening election security, tracking both domestic and foreign interference, investigating related criminal activity and advising the government on global disinformation campaigns so that it can respond appropriately. If a breach were to take place during the election, the Critical Election Incident Public Protocol would kick in, informing party officials and citizens of the issue and recommending the best actions to protect themselves.
With all this in place, how did it work in practice?
During the pre-election period, well-funded third-party partisan Facebook pages continued to amass large numbers of followers, sometimes surpassing official political party pages. These pages produced edgy memes, posts and ads, including promoted ‘fact check’ content that claimed to debunk statements by various parties without necessarily providing any proof. Although their follower numbers were lower than those of traditional news outlets, their engagement levels on social media were often higher, with more impressions and shares and none of the accountability or obligation to focus on the truth.
With the new electoral laws in place, as well as Facebook’s own policy updates, misleading ‘fact check’ content was now labelled as advertising and included in the Facebook Ad Library, allowing anyone who viewed an ad to see who had paid for it. For the first time, third parties were subject to spending caps on online political advertising and were required to track and subsequently report their spending. This allowed researchers to track their overall reach and reporters to monitor their claims.
On Twitter, there was some back and forth between independent researchers and Twitter’s own staff about whether automated activity occurred during the election period. While it is clear that there was coordinated activity by groups of accounts, particularly around certain hashtags, it remains disputed whether this was driven by bots or by groups of highly active individuals. Either way, scandals grounded in truth (like Conservative leader Andrew Scheer’s undisclosed dual citizenship or Liberal leader Justin Trudeau’s blackface pictures) were shared widely and often along partisan lines. Fabricated news stories also gained significant traction – including a later debunked sex abuse story about Trudeau that circulated widely on Twitter, despite most of the content linking back to a fake news website based in Buffalo, New York.
Despite these concerns, the protocol raised no alarm, meaning the panel overseeing it judged there was no significant threat to the integrity of the election. Early reports found that, once the election period began, Canadian political activity increased by 800 percent on Twitter and by 250 percent in public Facebook posts, but that misinformation and disinformation did not, in general, feature heavily in the online political landscape.
What could the UK learn from Canada?
With the UK general election drawing to a close, only post-election analysis can reveal what role bots, targeted political advertising and online misinformation may have played. Although the Canadian and British contexts differ, early reports from the Canadian election suggest that stronger legislation can tangibly reduce the threat of online manipulation. Individuals may still create fake news, and people may still seek it out and share it organically, but the Canadian case suggests that tighter election regulation limits the reach misinformation can achieve and allows such activity to be tracked closer to real time. Such regulation also clearly criminalises certain campaigning activity, increasing the risks for political parties and third-party partisan actors who engage in questionable online practices during the election period.
Beyond updating election legislation, the UK must also consider citizen-focused and citizen-led initiatives around digital literacy and data control. Citizens need more tools and training so that they are better able to evaluate the information that comes their way online. They also need a say in how their personal data is operationalised and sold, both inside and outside of the election period. Misinformation is not going anywhere, but as technology evolves, so too should our strategies for dealing with it.