Regulation of the Social Media in Electoral Democracies: A Case of Kenya

Isaac Rutenberg and Abdulmalik Sugow
The emergence of peer-to-peer platforms such
as social media has broken down traditional
barriers to one-to-many communication and
widespread engagement. While the unique
features of social media serve as an attractive
selling point, they are also often the source of
a multitude of problems.
Unlike traditional media, there is a glaring
absence of fact checking and content filtration
on social media platforms, raising concerns
around their level of involvement in
democracies. The recent 2017 elections in
Kenya are believed to have been influenced by
the propagation of fake news and inciting
ethnic messaging disseminated through social
media. While the exact impact of fake news
and hate speech disseminated through social
media in Kenya is unknown, the proliferation
of such content during democratic processes is
worrying. In response, these modern
platforms have become the subject of
attempted regulation by a number of
countries, with these regulations often
302 SLJ 7 (1)
focusing on fake news and hate speech. For
example, Kenya recently criminalised false
publication. In some extreme cases, there
have been knee-jerk government reactions
such as total internet shutdowns. Generally,
attempts at regulation within Kenya remain
disjointed with multiple laws each dealing
with different problematic content.
Using the example of Kenya to demonstrate
the importance of balancing regulation and
freedom, this Article discusses the unique
characteristics of social media that make it a
particularly effective tool for democratic
elections. We argue that an approach that
targets platform design instead of content
may be better suited to the ends of upholding
democratic values.
I. Introduction
Like the printing press, the radio, the television, and the
internet, social media as a phenomenon has revolutionized
society.1 Unlike the aforementioned examples, which were
more-or-less single technological developments that later
allowed for creativity in the content delivered, social media
1 John Downing, ‘Social Movement Media and Media Activism’,
Oxford Research Encyclopedia of Communication (2018).
simultaneously allows for creativity in the content and in the
way that it is distributed and consumed.
The peer-to-peer nature and potential for self-expression are
key aspects of political participation and a unique motivation
for using social media in such contexts.2
Indeed, as a result of these characteristics, social media is
increasingly featuring in politics.3 Unlike traditional media,
there is a glaring absence of fact checking and content
filtration on these platforms, raising concerns around their
level of involvement in democracies.4 The recent 2017 Kenyan
elections are believed to have been influenced by the
propagation of fake news and inciting ethnic messaging
disseminated through social media.5 While the exact impact
2 J Knoll, J Matthes, & R Heiss, ‘The social media political
participation model: A goal systems theory perspective’ [2008]
Convergence 1, 5.
3 Open Society European Policy Institute, Social Media Monitoring
During Elections: Cases and Best Practices to Inform Electoral
Observation Missions, (Open Society Foundations, May 2019), 7.
4 Diana Owen, ‘The Age of Perplexity: Rethinking the World We
Knew. Radically Reassessing ‘The Economic’’, (Penguin
Random House Grupo Editorial, 2017).
5 Portland Communications, ‘The Reality of Fake News in
Kenya’ (2018) https://portland-communications.com/pdf/The-
Reality-of-Fake-News-in-Kenya.pdf> accessed 4 July 2019. See
also, Abdi Latif Dahir, ‘Fake News is Already Disrupting
Kenya’s High-Stakes Elections Campaign’ (Quartz Africa, 25 June
2017),
misinformation-are-upstaging-kenyas-upcoming-high-stakes-
election/> accessed 4 July 2019.
of fake news and hate speech disseminated through social
media in Kenya is unknown,6 the proliferation of such
content during democratic processes is worrying.
In response, these modern platforms have become the subject
of attempted regulation by a number of countries, with these
regulations often focusing on fake news and hate speech.7 For
example, Kenya recently enacted the Computer Misuse and
Cybercrimes Act which criminalises false publication of
communications and communications deemed by the
recipient as offensive.8 In some extreme cases, there have
been knee-jerk government reactions such as total internet
shutdowns.9 Generally, attempts at regulation within Kenya
remain disjointed with multiple laws each dealing with
different problematic content.10 This holds true for a number
6 Noah Miller, ‘The Tyranny of Numbers on Social Media During
Kenya’s 2017 Elections’ (LSE Africa Blog, 4 September 2017)
https://blogs.lse.ac.uk/africaatlse/2017/09/04/the-tyranny-of-
numbers-on-social-media-during-kenyas-2017-elections/ >
accessed 4 July 2019.
7 See Section IV of this Article where these laws are elaborated
on.
8 Computer Misuse and Cybercrimes Act No 5 2018 (Kenya),
sections 22-23.
9 Chipo Dendere, ‘Why Are So Many African Leaders Shutting
Off the Internet in 2019?’ (The Washington Post, 30 January
2019)
cage/wp/2019/01/30/why-are-so-many-african-leaders-shutting-
off-the-internet-in-2019/> accessed 4 July 2019.
10 There exists the Computer Misuse and Cybercrimes Act, the
Kenya Information Communication Act 1998 and, more
recently, the Data Protection Act.
of other jurisdictions, and the variety of policies regulating
digital space continues to expand.11
In light of recent global trends, this Article explores social
media law and policy in the context of democracy, using the
example of Kenya’s electoral process to demonstrate the
importance of balancing regulation and freedom.12 In the
first part, the Article discusses the unique characteristics of
social media that make it a particularly effective tool for
democratic elections. These same characteristics engender
difficulties in regulation. Two primary uses of social media in
political processes are identified and discussed: broadcasting
and engagement. Subsequently, the Article discusses the
current state of social media regulation. It identifies two
types of problematic content as posing a threat to
democracies: fake news and hate speech. Following this
discussion, we canvass global approaches to regulation,
finding that, by and large, regulations adopted are targeted at
11 Select Committee on Communications, Regulating in a Digital
World (House of Lords, 9 March 2019, 2nd Report of Session
2017-19), 3.
12 A difficulty with writing about social media, particularly in an
academic setting, is the rapidly changing nature of the medium.
Specific platforms are mentioned here as instructive examples,
with full knowledge that such platforms may be on the decline
or virtually unused shortly after publication of this work.
Accordingly, it may be for the reader to apply the discussion
herein to the popular platforms that exist at the time of reading.
See, for example, John Downing, ‘Social Movement Media and
Media Activism’, Oxford Research Encyclopedia of Communication
(2018).
content as opposed to the structure or design of platforms
that facilitate dissemination of said content. These content-
centric regulations impose liability on either the user or the
platform, or some combination of the two. The challenges
posed by such laws are discussed and include violations of
free speech and privacy rights. In the final Section, the Article
analyses the state of regulation in Kenya, finding that existing
content-centric regulations fail to effectively and
proportionately achieve the objective of regulating political
speech. As a solution, the Article proposes by way of
example the use of a principles-based approach that
inspires co-regulation (i.e., joint efforts by governments and
platforms).
II. Social Media in Electoral Democracies
In the broadest of contexts, former United States (US)
President Barack Obama is often credited with popularising
the use of social media in elections.13 Slightly over a decade
later, the use of social media in electoral contexts has risen
significantly. Only as recently as 2013 did Kenyan electoral
politics begin the shift from traditional media to social
media.14 This has grown over the past few years to the extent
13 Jennifer Aaker and Victoria Chang, ‘Obama and the power of
social media and technology’ [2010] The European Business
Review,
may-june-obama.pdf> accessed 4 July 2019.
14 Christa Odinga, ‘Use of New Media during the Kenya
Elections’ (Uppsala Universitet, 2013) http://www.diva-
that a recent study identified politics as one of the primary
reasons for social media use in Kenya.15
For purposes of this Article, social media usage in an
electoral democracy is conveniently classified into two
categories: use as a broadcast tool, and use as a forum for
engagement and dialogue.16 These are discussed below.
A) Social Media as a Broadcast Tool
Political candidates, activists, and government officials have
always sought effective means for communicating with the
general public and are finding social media to be a tool with
unmatched benefits. The gatekeepers of traditional media
(editors, journalists, etc.) are entirely absent from social
media, thus allowing complete control over content and
portal.org/smash/get/diva2:633138/FULLTEXT01.pd> accessed
March 12, 2020, 18.
15 Social Media Lab Africa (SMElab Africa), ‘Social Media
Consumption in Kenya: Trends and Practices’ (USIU Africa,
2018)
umption_in_Kenya_report.pdf> accessed 12 March 2020.
16 Hannah O’Morain, ‘What is the role of social media in a
general election campaign? A case study of the 2016 Irish
General Election’ (2016) Maynooth University Research Project
Repository
t/Hannah%20Byrne%20O%27Morain_0.pdf> accessed July 4
2019.
messaging. Just as important, the low cost of engagement17
and wide reach18 account for much of the democratizing
power of social media.
Due to the aforementioned qualities, social media use often
obviates particular hindrances to broadcasting and may be
more useful than traditional media. This is particularly
the case in electoral contexts in democracies such as Kenya
where the freedom of the press is often limited.19 For
example, during the 2017 Kenyan elections, when the
government effectively shut down traditional media outlets
(in particular TV), the online activities of traditional media
houses were largely unaffected. Media houses continued to
post online content regarding the shutdown of their
traditional outlets.20 Similarly, this particular use of social
media was demonstrated in Sudan, when the military
removed President Al Bashir in 2019.21 Sudanese nationals
17 Hunt Allcott and Matthew Gentzkow, ‘Social Media and Fake
News in the 2016 Election’ (2016) 31(2) Journal of Economic
Perspectives, 211, 221.
18 John Ndavula and Joy Mueni, ‘New Media and Political
Marketing in Kenya: The Case of 2013 General Elections’ (2014)
3(6) International Journal of Arts and Commerce 69, 82.
19 Odinga (n 14) 9-10.
20 See Tweets by Daily Nation and Business Daily, accessed
July 30, 2019.
21 BBC News, ‘Omar Al Bashir: Sudan’s ousted president’ (BBC
News, 2019) accessed 12 July 2019.
around the world were kept apprised of developments
through social media.22 Referred to as the #BlueForSudan
movement, the level of engagement was remarkable and
demonstrated considerable and widespread solidarity with
Sudan.23
In addition to its use in electoral contexts, social media
platforms have been recognised as convenient tools of
dissemination by various government entities.24 Through
these platforms, various governmental bodies regularly
update citizens on service delivery and even provide avenues
for public participation in some instances. For example, in
Kenya, following the 2013 elections, the President set up the
Presidential Strategic Communications Unit which kept the
public apprised of the government’s projects on social media
platforms.25
22 Ayen Bior, ‘Sudan’s Social Media Deemed Major Player in
Bashir Ouster’ (VOA News, 18 April 2019)
www.voanews.com/archive/sudans-social-media-deemed-
major-player-bashirs-ouster accessed 4 July 2019.
23 Natalia Faisal and Rym Bendimerad, ‘#BlueforSudan: Why is
social media turning blue for Sudan?’ (Al Jazeera News 13 June
2019) an-social-
media-turning-blue-sudan-190613132528243.html> accessed July
4 2019.
24 David Landsbergen, ‘Government as Part of the Revolution:
Using Social Media to Achieve Public Goals’ (2010) 8(2)
Electronic Journal of e-Government 135, 136-137.
25 Grace Mutung’u, ‘The Influence Industry Data and Digital
Campaigning in Kenya’, (Our Data Ourselves, June 2018)
B) Social Media as a Forum for Engagement and Dialogue
Mass communication is not the only appeal of social media:
the two-way interaction that is a feature of most platforms
also plays a significant role in fostering civic engagement.
Political actors can directly engage with each other and
their target audiences at a peer-to-peer level, in ways that
were not possible on such a wide scale prior to social media.26
Some academics have linked the interactivity of social media
with concepts of active citizenship.27
As a tool for dialogue and civic engagement, social media is
particularly effective in grassroots organising. The popularity
and success of movements such as #MeToo,
#BlackLivesMatter, #ArabSpring, #BlueForSudan, and
#TakeAKnee signals the rise of a ‘hashtag democracy’, i.e. a
global and borderless system of civic engagement in which
any cause can gain power through effective social media
https://cdn.ttc.io/s/ourdataourselves.tacticaltech.org/ttc-
influence-industry-kenya.pdf > accessed 4 July 2019, 5.
26 Missy Graham and Elizabeth Avery, ‘Government Public
Relations and Social Media: An analysis of the Perceptions and
Trends of Social Media Use at the Local Government Level’
(2013) 7(4) Public Relations Journal 1, 1.
27 Gibril Schmiemann, ‘Social Network Sites and Political
Participation: Attributes of the European Union’s Facebook
Pages’ (2015) University of Twente Essays
accessed 4 July 2019, 22.
engagement.28 Hashtags such as #SomeoneTellCNN and
#OccupyParliament have also trended in Kenya in key
political moments.
Mainstream politicking also benefits from the technology.
The design of social media platforms allows political actors to
market their policies and have dialogues with each other,29
thereby influencing perceptions.30 For example, a recent study
found that the primary motivations for political actors’ use of
social media are: visibility, communication, and the creation
of a perception of relevance due to the use of such
platforms.31 These platforms tear down the barriers between
the audience and the author, thereby enabling political actors
to reach out and directly influence users in various ways, e.g.
through intimate posts regarding their private lives, policy
briefs, or simply answering queries from the electorate. In the
2013 Kenyan elections, a number of candidates engaged with
users regularly on Facebook, generating hashtags and catchy
phrases to solicit responses.32
28 Glenda Daniels, ‘Scrutinizing Hashtag Activism in the
#MustFall Protests in South Africa in 2015’ in Bruce Mutsvairo
(ed), Digital Activism in the Social Media Era (Palgrave
Macmillan, 2016).
29 Gunn Enli and Eli Skogerbø, ‘Personalized Campaigns in
Party-Centred Politics’ (2013) 16(5) Information, Communication
and Society 757, 763.
30 O’Morain (n 16).
31 Karen Ross and Tobias Bürger, ‘Face to face(book): Social
media, Political Campaigning and the Unbearable Lightness of
Being There’ (2014) 66(1) Political Science 46, 59.
32 Ndavula and Mueni (n 18) 79.
It is not surprising, then, that the use of social media for
political purposes has steadily increased. An increased level
of political engagement can be construed as beneficial for
most democracies; however, not all uses of social media are
peaceful or beneficial for democracy. As examples, various
social media channels were used in the 2017 Kenyan general
election to aggravate ethnic tensions33 and Facebook was
found to have been used to incite violence in Myanmar.34 The
power of social media platforms lies, in part, in their
independence from any form of censorship and regulation.
This is also the strongest argument for imposing rules of
engagement for the sake of preserving democratic and
human rights. However, imposing such rules is easier said
than done. More often than not, attempts at regulating these
new mediums raise further concerns as is discussed in the
next Section.
33 Robert Muthuri et al. ‘Biometric Technology, Elections and
Privacy: Investigating Privacy Implications of Biometric Voter
Registration in Kenya’s 2017 Election Process’ (Strathmore
Centre for Intellectual Property and Information Technology
Law, 2018)
https://privacyinternational.org/sites/default/files/2018-
06/Biometric%20Technology-Elections-Privacy.pdf> accessed
July 5 2019.
34 Alexandra Stevenson, ‘Facebook Admits It Was Used to Incite
Violence in Myanmar’ (New York Times, 6 November 2018)
facebook.html> accessed July 5, 2019.
III. Social Media Regulation
The very characteristics that make social media particularly
effective in fostering political engagement are the ones that
create opportunities for misuse. It is therefore necessary to
discuss social media regulation, particularly in Africa where
ICT laws and policies are currently being enacted at a
steadily increasing rate.
Technology often outpaces the law in terms of speed of
development. Consequently, existing laws governing the
offline world are used to deal with illegal online conduct. In
this Section, we discuss the application of non-specific laws
to social media activity, showing the lack of nuance that this
engenders. We subsequently discuss particular problematic
content resulting from social media use that may necessitate
slightly more specific regulatory efforts: fake news and hate
speech.
A) Offline Laws in an Online World
Few laws worldwide are written specifically to regulate
social media activities. Instead, laws that exist
independently of social media are typically applied to social
media activities.35 Where the law was originally written to
target and regulate offline activities, the online nature of
social media can present a challenge to effective and fair
application of the law.36 Nevertheless, unless and until social
35 Select Committee on Communications (n 11), 9.
36 Valerie Brannon, ‘Free Speech and the Regulation of Social
Media Content’ (Congressional Research Service Reports, 27
media-specific laws are created, law enforcement and the
courts are forced to apply non-specific laws. In applying non-
specific laws, liability for the content on a social media site
generally falls into two categories: content for which the
posting user is liable, and content for which the social media
site operator is liable.37
i. Primary liability
All or nearly all content on social media falls into the first
category, and any such material can also concurrently fall
into the second category under specific conditions.
Unless a user can show that s/he temporarily or permanently
lost control over a social media account, such as due to
hacking activities, the user is generally liable for any content
posted under that account. This clearly includes original
content created by the user, and also includes reposted
content where the original content was prepared and first
posted by another user. Thus, for example, a user is
responsible and liable for offensive content in an original
tweet. The user is also liable for offensive content in a
“re-tweet” (i.e., a reposting of an original tweet from another
user), just as the originator of the content is also liable.
March 2019)
accessed 5 July 2019.
37 Michal Lavi, ‘Content Providers’ Secondary Liability: A Social
Network Perspective’ (2016) 26 Fordham Intellectual Property,
Media and Entertainment Law Journal 858, 859.
During the 2017 Kenyan elections, the Communications
Authority together with the National Cohesion and
Integration Commission issued guidelines to regulate
political messaging via new media. These guidelines set
limits on the content capable of being shared through social
media, providing that existing ‘offline’ laws such as the Penal
Code and National Cohesion and Integration Act would be
applied in prosecuting offenders.38
Some social media activities result from non-human actions.
Automated social media management software tools are
created to assist users (mainly corporations, but increasingly
politicians and political groups) to manage multiple social
media accounts and platforms.39 These tools may
automatically generate social media content in order to
provide updates on product development, remind consumers
about the existence of a product, highlight trends in a
product or business sector, or report news events, among
various other purposes. Such tools should be used with
caution, as they typically fail to account for important
38 Communications Authority of Kenya and National Cohesion
and Integration Commission, ‘Guidelines on Prevention of
Dissemination of Undesirable Bulk and Premium Rate Political
Messages and Political Social Media Via Electronic
Communications Networks’ (July 2017) https://ca.go.ke/wp-
content/uploads/2018/02/Guidelines-on-Prevention-of-
Dissemination-of-Undesirable-Bulk-and-Premium-Rate-
Political-Messages-and-Political-Social-Media-Content-Via-
Electronic-Networks-1.pdf accessed March 12, 2020.
39 See for example SocialPilot,
accessed 5 July 2019.
contextual factors that a human would understand.
Nevertheless, the social media account holder is responsible
for the legal and non-legal repercussions of content generated
automatically. A further complication in terms of regulating
online activities results from the growing importance and
influence of “bots” in steering online conversations.40 Such
automated posting of content has been recognized as a
serious threat to the trustworthiness of online interactions.41
Bots were found to be particularly prevalent in Kenya,
ranking higher than other influential figures in generating
discourse during election periods.42
ii. Secondary liability
The second category of liability relates to the platform. Social
media platform providers are generally categorized as online
service providers (OSPs).43 As such, so long as the platform
does not actively monitor user content, the platform is
commonly afforded the safe harbour provisions that apply to
other OSPs such as search engines.44 Safe harbour laws
typically have notice and take-down provisions, requiring an
OSP to remove offensive content once notified of its
40 Robert Gorwa and Douglas Guibeault, ‘Unpacking the Social
Media Bot: A Typology to Guide Research and Policy’ (2018)
Policy & Internet, 1, 1-2.
41 Notwithstanding the growing importance of bot activity to the
democratic process, we have elected to explore this topic further
in a future article.
42 Social Media Lab Africa (n 15) 50.
43 Communications Decency Act 1996 (US), section 230.
44 Communications Decency Act 1996, section 230 and Digital
Millennium Copyright Act 1998 (US), section 512.
existence.45 OSPs that fail to remove offending content risk
liability for the content. The offending content can be
material that incites violence or tortious material such as
libellous statements. In Kenya, for example, a recently
enacted Copyright (Amendment) Act created safe harbours
similar to laws first created in the United States at the turn of
the 21st century.46 However, not all countries have or make
use of these safe harbour provisions.
B) Insufficiency in Regulation
The difficulty in governmental regulation of social media has
been attributed to the fact that the internet has a dual nature:
a global commons and a privately owned and operated
space.47 Practically, it is also difficult to adopt laws that can
keep up with the rapid developments that are characteristic
of these platforms. However, some believe that it is not the
place of government to regulate content on these platforms;
that such a task lies with the platforms themselves.48
45 Digital Millennium Copyright Act 1998 (US), section 512 and
e-Commerce Directive 2000/31/EC (EU), articles 12-15.
46 Copyright (Amendment) Act No 20 of 2019 (Kenya).
47 Paulina Wu, ‘Impossible to Regulate: Social Media, Terrorists,
and the Role of the U.N.’ (2015) 16(1) Chicago Journal of
International Law, 281, 292.
48 John Samples, ‘Why the Government Should Not Regulate
Content Moderation of Social Media’ (Cato Institute, Policy
Analysis No. 865, 9 April 2019)
https://www.cato.org/publications/policy-analysis/why-
government-should-not-regulate-content-moderation-social-
media > accessed 9 July 2019.
Arguably, it would suffice that these platforms have their own
regulatory mechanisms in the form of policy guidelines and
terms of use. However, the interests of the private entities
that own and operate these platforms (e.g., profit, goodwill,
political gain) may often stand in contradiction to the
interests of the state.49 Even where these interests may be
aligned, enforcement that may call upon a private entity to
make decisions that would affect users’ rights is problematic
at best. In the face of liability, social media platforms would
be more likely to err on the side of caution and censor their
users where the legality of content is in dispute.50 Fagan
proposed viewing government interference as supplementary
to self-regulatory efforts: where the interests of both parties
align, platform efforts suffice as regulation, and state
intervention would only be suitable where these interests
diverge.51
In Africa, the difficulty of the situation is compounded by the
fact that these platforms are often owned and operated by
foreign entities. Existing laws within African countries rarely
contemplate social media usage, unlike countries such as the
US, where safe harbour provisions are included in
legislation.52 It is no surprise, then, that when faced with
rampant use of social media to oppose government policy or
49 Frank Fagan, ‘Systemic Social Media Regulation’ (2018) 16(1)
Duke Law & Technology Review 393, 396.
50 Rachel Pollack (ed), World Trends in Freedom of Expression &
Media Development Special Digital Focus (UNESCO 2015) 105.
51 Fagan (n 49) 396-397.
52 Communications Decency Act 1996 (US).
action, a number of African governments have opted to
clamp down on social media by blocking, taxing, or making it
difficult to access such platforms.53
Keeping in mind the increasing use of social media in
elections, is it possible and/or desirable to regulate the
content on social media directly?
A problem with discussing the insufficiency of the
regulations surrounding social media is the difficulty in
proposing a definitive solution. Social media is ever-changing
and the problems that abound are always markedly different.
However, the need for regulation is evident from the nature of
the problems faced in a majority of democracies. These
problems are rooted in the nature of social media as an
unfiltered forum where user generated content can be
disseminated to millions of people. At face value, the fact that
content reaches a wide audience is not necessarily
problematic: it is the type of content that is often the
issue.54 Indirect application of tort law (defamation etc.) or
criminal law may not suffice to regulate some of the content
shared, due to its seemingly benign nature and the
anonymity offered by the internet. In fact, a lot of what
53 Abdi Latif Dahir, ‘African strongmen are the biggest
stumbling blocks to internet access on the continent’ (Quartz
Africa, 20 March 2019)
dictatorships-shut-down-the-internet-the-most/> accessed July 5
2019.
54 As has been canvassed in this Article, fake news and hate
speech easily become prevalent.
would be deemed ‘problematic’ content for purposes of this
Article does not necessarily violate any laws (with the
exception of hate speech). Indirect application also fails to
address the way platforms work. These platforms effectively
act as ‘norm entrepreneurs’, deciding what content users are
exposed to, often for commercial purposes.55 Problematic
content may have a far-reaching effect, not because of the
concerted efforts of persons seeking to disseminate it on a
wide scale, but because of the architecture of the platforms
themselves. Aside from this, the large volume of content makes
regulation particularly challenging or impossible.
In a majority of legislative responses, states have sought to
address two particular problematic uses of social media: hate
speech (inclusive of incitement to violence) and
disinformation (more popularly known as fake news). It is
believed that these pose a danger to electoral legitimacy and
political stability, hence attempts at reining them in.
However, exercising control over platforms with a view to
curbing this problematic content is not as straightforward as
simply enacting legislation, as will be discussed in the
following Sections.
a) Hate Speech
Most, if not all, African countries have laws prohibiting hate
speech and criminalising actions or words that may amount
to incitement to violence. For example, in Kenya, the Penal
55 Fagan (n 49) 395.
Code prohibits incitement to violence and disobedience of
the law.56 Within the Constitution, hate speech and
incitement constitute an exception to the freedom of
expression.57 In the context of an election or political
campaign, this provision is particularly important owing to
the background of ethnic violence in Kenya during
elections.58 In Kenya’s case, traditional media (mainly
communal radio) was believed to have been used to fuel
ethnic tensions following a contested election in 2007/8.59 As
social media has been proliferating, there is the concern that
its versatile and borderless nature can provide an avenue for
hate speech and incitement.60
The problems of hate speech and incitement are not novel;
social media has not necessarily brought them about.
However, it has been cited as magnifying an already
56 Penal Code (Kenya), section 96.
57 Article 33(2), Constitution of Kenya (2010).
58 Jennifer Cooke, ‘Background on the Post-Election Crisis in
Kenya’ (Center for Strategic & International Studies, 6 August
2009)
post-election-crisis-kenya> accessed July 8 2019.
59 Sam Howard, ‘Community Radio and Ethnic Violence in
Africa: The Case of Kenya’ (MA Thesis, University of East
Anglia, 2009).
60 Mutuma Ruteere, ‘The State must deal firmly with ethnic
incitement on internet and social media’ (Daily Nation, 13 July
2014)
Ethnic-Incitement/1949942-2382538-format-xhtml-
uyfqb7/index.html> accessed 8 July 2019.
existing vice.61 The problem is also not continent-specific:
social media has been used to incite violence and hatred in
Germany, the US, India, Myanmar, and Sri Lanka, effectively
showing that, while it is a powerful tool for political
participation, it can equally be used to sow discord.62 The use
of social media by persons with nefarious intentions raises
the argument that it ought to be regulated. In particular,
the persons using these platforms for these ends ought to be
held accountable.
This concern is exacerbated by the difficulty of applying the
existing, non-specific laws to social media. For example,
while the wording of the Kenyan Penal Code includes
'publishes' as an actus reus (an act one may argue includes
posting on social media), it may be difficult to identify
offenders due to the anonymity offered by social media. It
may also be difficult to decide where charges lie in instances
where posts are generated by bots. Even where offenders are
identified, the ambiguity of these non-specific laws in relation
to social media leaves room for malicious prosecution.63
61 Zachary Laub, ‘Hate Speech on Social Media: Global
Comparisons’ (Council on Foreign Relations, 7 June 2019)
comparisons> accessed 8 July 2019.
62 ibid.
63 Juliet Nanfuka, ‘New Year, Old Habits: Threats to Freedom of
Expression Online in Kenya’ (CiPESA, 27 January 2016)
https://cipesa.org/2016/01/new-year-old-habits-threats-to-
freedom-of-expression-online-in-kenya/> accessed 8 July 2019.
Attempts at remedying this by 'digitising' these non-specific
laws, extending their application to platforms, often fail to
appreciate the technical and often nuanced nature of the
internet.64 While such laws target users for the content they
post, some problematic content gains traction as a result of
the systems the platforms themselves have put in place. For
example, the commercial models of these platforms generate
engagement by facilitating the formation of idea bubbles and
reinforcing circles that are then targeted by advertisers; they
generate revenue from social networks centred around ultra-
specific ideas, whether illicit or not.65 Some problematic
content may not even be illegal, merely harmful.
In the home countries of some of these platforms, the laws
have developed to accommodate these roles in the form of
liability regimes that offer exemptions in certain instances.66
In many African democracies, however, it is increasingly
difficult to impose control on these platforms, particularly
where there are no social media-specific laws. Even in
countries with laws catering to such platforms, difficulties of
free speech, privacy, private enforcement of rights, and
censorship arise. These problems often stem from the fact
that the 'digitised' laws do not appreciate the contribution of
platform design to the spread of problematic content. This is
most salient with fake news, as discussed below.
64 Select Committee on Communications (n 11) 47.
65 Laub (n 62).
66 Communications Decency Act, 1996 (US).
b) Fake News
In recent years, global trends have shown an uptick in the
proliferation of misinformation and disinformation through
social media.67 Some academics have argued that these have
had an impact on recent elections in the US.68 While
disinformation is the deliberate act of spreading false
information with the intent to deceive, the inaccuracy in
misinformation is often accidental or innocent in nature.69
Nonetheless, both are believed to contribute to the erosion of
basic truths which are essential for democracies; for
electorates to make decisions, there needs to be a consensus
on basic truths.70
The effects of the un-edited content that dominates social
media are profound, particularly in the political context.71
Algorithmically, a majority of social media platforms
facilitate the creation of echo chambers (i.e. where users are
67 Hunt Allcott et al, 'Trends in the Diffusion of Misinformation
on Social Media' (2019) NBER Working Paper No 25500
https://www.nber.org/papers/w25500 accessed 9 July 2019.
68 ibid.
69 Caroline Jack, ‘Lexicon of Lies: Terms for Problematic
Information’ (Data & Society Research Institute, 2019)
.pdf> accessed 9 July 2019.
70 Abdulmalik Sugow, ‘The Right to Be Wrong: Examining the
(Im)possibilities of Regulating Fake News while Preserving the
Freedom of Expression in Kenya’ (2019) 4(1) Strathmore Law
Review 19, 21-22.
71 Open Society European Policy Institute (n 4) 3.
only confronted with content they are likely to agree with)
through suggestions of posts and targeted advertisement.72
By itself, this is to some extent harmless; however, where the
content is problematic (i.e., deliberately or accidentally false),
it is dangerous, as users tend to readily accept as true content
that is in line with their world view.73 Various aspects of
human psychology reinforce this tendency: the repetition of
content by different sources may impute truth in one’s
mind74, and the reaffirmation of pre-existing beliefs (i.e.
homophily) furthers the acceptance of information regardless
of veracity.75 Platforms therefore act as norm entrepreneurs
rather than simply being avenues for speech; their level of
involvement rises to the curation of what speech users are
exposed to, and in what order.76
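The curation dynamic described above can be illustrated, purely schematically, with a short simulation. The sketch below is a deliberately simplified model, not any platform's actual ranking system; all names, topics, and data are hypothetical. It shows how ranking posts by affinity with a user's existing interests pushes agreeable content to the top of a feed and dissenting content to the bottom:

```python
# Schematic model of engagement-driven feed ranking (hypothetical; not any
# platform's real algorithm). Posts are ranked by how many of their topics
# overlap with the topics the user already engages with.

def rank_feed(posts, user_interests):
    """Order posts by overlap between their topics and the user's interests."""
    def affinity(post):
        return len(post["topics"] & user_interests)
    return sorted(posts, key=affinity, reverse=True)

posts = [
    {"id": 1, "topics": {"party-a", "economy"}},
    {"id": 2, "topics": {"party-b", "health"}},   # the only dissenting view
    {"id": 3, "topics": {"party-a", "security"}},
]

feed = rank_feed(posts, {"party-a"})
# Posts 1 and 3, aligned with the existing interest, surface first;
# post 2 is ranked last, so the user is least likely to encounter it.
```

Because users mostly consume the top of the feed, each round of engagement reinforces the same interests, producing the self-reinforcing 'echo chamber' dynamic the text describes.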
Africa has not been immune to the impact of fake news. In
Kenya, during the 2017 general elections, fake news is
believed to have been circulated through social media.77 In
Nigeria, the
President was forced to publicly refute claims that he had
72 Seth Flaxman et al, ‘Filter Bubbles, Echo Chambers, and
Online News Consumption’ (2016) 80 Public Opinion Quarterly
298, 299.
73 Claire Wardle, 'Fake news. It's complicated' (Medium, 16
February 2017)
complicated-d0f773766c79> accessed 9 July 2019.
74 ibid.
75 Miller McPherson et al, 'Birds of a Feather: Homophily in
Social Networks' (2001) 27 Annual Review of Sociology 415,
416-417.
76 Fagan (n 50) 395.
77 Portland Communications (n 5).
passed on and was replaced by a clone.78 Interestingly, a
study found that in Sub-Saharan Africa (compared to the
global north), there was a significantly higher level of
perceived exposure to fake news, a perception that led to
declining trust in the media.79
In the absence of specific laws, applying non-specific laws to
problematic content propagated by users is somewhat of a
last-resort method. In recognition of this, various countries
have adopted laws that seek to directly criminalise the online
publication of false information. For example, Kenya recently
enacted the Computer Misuse and Cybercrimes Act, which
contains such provisions.80 Due to the novelty of these laws,
it is, at the time of writing, too early to judge their efficacy.
However, it is
worth noting that the constitutionality of the Kenyan Act was
unsuccessfully challenged in court by an association of local
bloggers.81 The Act was upheld notwithstanding the
78 Stephanie Busari, ‘How fake news was weaponized in
Nigeria’s elections’ (CNN, 15 February 2019)
elections-intl/index.html> accessed 9 July 2019.
79 Dani Madrid-Morales and Herman Wasserman, '"Fake News",
Disinformation and Media Trust in Africa: A Comparative
Study of Kenya, Nigeria and South Africa' (ICAfrica Biennial
Conference, 2018) accessed 9 July 2019.
80 Computer Misuse and Cybercrimes Act, No 5 of 2018 (Kenya).
81 Bloggers Association of Kenya (Bake) v Attorney General & 3
others (Petition 206 of 2018).
existing critiques of the potential effects on the freedom of
expression.82
In light of the two types of problematic content discussed
herein, it is prudent to note that in a majority of African
democracies, there exists relatively minimal regulation of
campaign messaging. Mandating transparency may do more
than content-based prohibitions to inhibit the spread of
problematic content that isn't necessarily illegal. In the
United States and elsewhere,
political campaigning is regulated to provide a certain level
of transparency particularly in funding and messaging.83
For example, regulations may exist to ensure that a campaign
ad for a particular candidate identify that candidate and, in
some cases, include a message that the candidate endorses
the ad. Also, for example, regulations may require
identification of the source of funds used to pay for a
campaign ad. Such regulations may be platform dependent
and, where the regulation is old and has not been recently
updated, may not apply to social media ads. In other cases,
regulations may not apply to social media ads because such
ads are not broadcast widely and indiscriminately, as is the
case for ads on traditional media platforms.
82 Sugow (n 71).
83 For example, in the US, these regulations are administered by
the Federal Election Commission. See
https://www.fec.gov/legal-resources/regulations/ accessed 9 July
2019.
In Africa, few if any examples exist of regulations that
require transparency in campaign advertising84, and no laws
or court cases have sought such transparency for ads on
social media. This is highly problematic as it allows
campaigns to discreetly produce and distribute ads that are
controversial, likely to incite violence, and/or based on false
information. In the 2017 general election in Kenya, for
example, various websites and social media channels carried
messages that were highly disparaging of one presidential
candidate or the other, and although such messages were not
endorsed by political parties or candidates, they were clearly
meant to stir ethnicity-based distrust.85
Faced with the complexity resulting from insufficient
regulation, it may appear that attempts at governmental
control of social media ought to give way to the self-
regulatory mechanisms adopted by platforms (e.g. the use of
platform guidelines). The allure of this argument is clear:
self-regulation would ease the pressure on the judiciary and
prosecutorial bodies86; platforms are often protected by free
speech laws, so regulating content on their
84 For example, Party Elections Broadcasts and Political
Advertisements Regulations (South Africa).
85 Nanjala Nyabola, ‘Texts, Lies, and Videotape’ (Foreign Policy,
1 August 2017)
and-videotape-kenya-election-fake-news/> accessed 9 July 2019.
86 Article19, ‘Self-regulation and ‘hate speech’ on social media
platforms’ (2018) https://www.article19.org/wp-
content/uploads/2018/03/Self-regulation-and-%E2%80%98hate-
speech%E2%80%99-on-social-media-platforms_March2018.pdf>
accessed 9 July 2019.
platforms isn’t as straightforward87, and platforms are often
best placed to self-regulate.88 However, these mechanisms
have been faulted by various stakeholders in relation to
different content. By and large, the most salient concern is the
chilling effect self-regulation would have on the freedom of
expression.89 Platforms, being operated by private entities,
lack sufficient mandate to make decisions on the legality of
content.90 It is also difficult for them to make such
determinations in the absence of laws that directly address social media.
Therefore, according to those making the counterargument, it
would not be appropriate to leave the task of regulation of
such content to platforms. This is especially the case where
sanctions incentivise platforms to comply. Censorship of
users would dramatically increase in a bid to avoid liability.91
Existing self-regulatory models have been found to be
deficient in accountability and in consistency of stakeholder
engagement, among other issues.92 Content curation,
deployment of algorithms, and enforcement of take down
requests have been cited as lacking transparency.93 This is
compounded by the fact that interests of states and of
platforms can often be at odds.94
87 Samples (n 49).
88 ibid.
89 Article19 (n 87).
90 ibid.
91 Pollack (n 51).
92 Article19 (n 87).
93 Select Committee on Communications (n 11) 59.
94 Fagan (n 50) 396.
In summary, social media has been shown to be a potent
force in political processes both for good and for nefarious
intent. Faced with the widespread use of social media and the
dissemination of hate speech and disinformation thereon, a
number of countries currently lack sufficient regulations in
place to address these issues. Attempts to make use of
existing laws have proven futile in addressing the unique
design of platforms, which often enables problematic content
to take root. Further, the reliance on self-regulatory
mechanisms is also problematic, both in failing to address the
core issue and in creating new issues around freedom of
speech and censorship by private individuals and companies.
For these reasons, there is an urgent need to develop suitable
regulation that addresses social media directly in such a
context. Over the past few years, there have been a number of
developments in Kenya and around the world. In the next
Section, the Article discusses the ongoing developments in
social media regulation.
IV. Ongoing Developments in Regulation
Over recent years, a number of jurisdictions have grappled
with the effects of social media and this has resulted in
various approaches to its regulation. More recently, the
conversation regarding the regulation of online content was
reignited by the alleged manipulation of the 2016 US
elections through social media.95 Following revelations that a
British consulting firm, Cambridge Analytica, used citizens’
data gathered from Facebook to micro-target and interfere in
the US elections, concerns were heightened around the
world.96 As is noted in this Article, Kenya was not immune to
the operations of Cambridge Analytica and had similar
impetus to enhance its cybersecurity and data privacy. A
notable outcome of these concerns around the world was
legislation aimed at securing privacy or stemming the spread
of specific online content. These laws often centred on
problematic content, i.e. they sought to criminalise or prohibit
problematic content like fake news or hate speech, essentially
'digitising' offline laws on such content. In this Section, the
Article discusses these trends, finding that they generally
raise more concerns than they address. Content-centric
regulations are inadequate for two main reasons: they do not
address the design of platforms that enables the spread of the
content they seek to control; and, in attempting to control such
content, they often lead to violations of free speech and
privacy. For purposes of this analysis, this Section begins by
canvassing regulations around the world aimed at imposing
liability on platforms and those aimed at users. The Section
95 Article19, ‘Free speech concerns amid “fake news” fad’ (2018)
amid-fake-news-fad/> accessed 12 July 2019.
96 Emma Graham-Harrison and Carole Cadwalladr, 'Revealed:
50 million Facebook Profiles Harvested For Cambridge
Analytica in Major Data Breach’ (The Guardian, 17 March 2018)
www.theguardian.com/news/2018/mar/17/cambridge-analytica-
facebook-influence-us-election > accessed 13 July 2019.
then concludes by discussing the situation in Kenya in light
of these global trends.
A. Platform-targeted approaches to regulation
In Germany, the Bundestag enacted the Network
Enforcement Act (‘NetzDG’). Through this law, the
government sought to regulate hate speech online by
imposing a burden on platforms to expeditiously take down
‘manifestly unlawful’ content within 24 hours or risk facing a
fine of up to 50 million euros.97 The law also imposes bi-
annual reporting obligations on platforms.98 The context of
this law is the increasing danger of nationalist sentiments in
the wake of a migrant crisis.99 The law has been criticized for
leading to over-removal by platforms, the privatization of
enforcement and, in general, a chilling effect on the
freedom of expression.100 Recognising that the concern of
over-removal was not unfounded, the government sought to
97 Network Enforcement Act 2017 (Germany), sections 3-4.
98 ibid section 2.
99 Article19, ‘Germany: Responding to Hate Speech. Country
Report’ (2018) https://www.article19.org/wp-
content/uploads/2018/07/Germany-Responding-to-
%E2%80%98hate-speech%E2%80%99-v3-WEB.pdf> accessed 12
July 2019.
100 Heidi Tworek and Paddy Leerssen, 'An Analysis of
Germany's NetzDG Law' (2019) Transatlantic Working Group
Working Paper
erssen_April_2019.pdf > accessed 12 July 2019.
make amendments to the law to enable the restoration of
wrongfully removed content.101
Prior to its enactment, Article 19, a free speech advocacy
group, raised the concern that expeditious take downs may
bring about the Streisand effect (i.e. the content would be
circulated further due to the notoriety it received after being
taken down).102 This concern proved true when a post that
had been taken down attracted media coverage that also
included the post's content, essentially defeating the
purpose of expeditious take downs.103 Thus far, Facebook has
been one of the platforms affected by this law with a recent
fine of 2 million Euros for failure to comply with its reporting
obligations.104
In Australia, the government enacted the Criminal Code
Amendment (Sharing of Abhorrent Violent Material) Act,
2019 in the wake of the Christchurch shooting in New
101 Emma Thomasson, 'Germany Looks to Revise Social Media
Law as Europe Watches' (Reuters, 8 March 2018)
hatespeech/germany-looks-to-revise-social-media-law-as-
europe-watches-idUSKCN1GK1BN > accessed 12 July 2019.
102 Tworek and Leerssen (n 101), 3.
103 ibid.
104 Lucinda Southern, ‘What to Know About Europe’s Fight on
Platform Hate Speech’ (Digiday UK, 12 July 2019)
to-know-about-europes-fight-on-platform-hate-speech/> accessed 13 July 2019.
Zealand.105 The law criminalises the failure to expeditiously
remove or cease hosting abhorrent violent material.106 This
would mean that executives of these platforms could face
penal sanctions for content on their platforms, a fact that may
lead to over-removal. In other jurisdictions, governments
have sought to target the users themselves.
B. User-centred approaches to regulation
In response to recent disinformation campaigns, Malaysia
enacted the Anti-Fake News Act, which criminalises the
creation, offering, or publishing of fake news.107 The law
primarily targets the persons responsible for authoring or
sharing the content as opposed to the role and contribution of
the platform.108 However, it also imposes a burden of
expeditious content removal on persons having the ability to
remove such content.109 This appears to apply specifically
to platforms, particularly because no penal
sanctions are contemplated in the provision. The Act has
105 Evelyn Douek, ‘Australia’s New Social Media Law is a Mess’
(Lawfare Blog, 10 April 2019)
law-mess > accessed 13 July 2019.
106 Section 474.34, Criminal Code Amendment (Sharing of
Abhorrent Violent Material) Act 2019 (Australia).
107 Article19, ‘Malaysia: “Anti-Fake News Act”’ (Legal Analysis,
2018)
content/uploads/2018/04/2018.04.22-Malaysia-Fake-News-Legal-
Analysis-FINAL-v3.pdf > accessed 13 July 2019.
108 Anti-Fake News Act 2018 (Law 803) (Malaysia), section 4.
109 ibid section 6.
been criticized for its ambiguity in enacting legal duties of
‘truth’ and for its potential to lead to over-removal by
platforms in a bid to comply.110 This has led to the Act being
repealed altogether.111 A primary concern with most of these
regulations has been their effect on freedom of expression
and the potential for their misuse. For example, in enacting a
duty of ‘truth’, it has been argued that the government
effectively retains the ability to control content that is
contrary to its interests.112
Similarly, in Singapore, the government enacted the
Protection from Online Falsehoods and Manipulation Act,
which criminalises the communication of false statements of
fact.113 Interestingly, the burden imposed on platforms in the
case of this law differs. The law provides that internet
intermediaries may be required to issue correction notices
where the government has deemed speech to be false (i.e.,
indicate to end-users that the content has been deemed
false).114 This is in addition to the requirement of disabling
access by end users, which requirement can also be
110 Article 19, ‘Malaysia: “Anti-Fake News Act”’ (n 109).
111 Anisah Shukry, 'Malaysia to Scrap Anti-Fake News Law Once
Used Against Mahathir' (Bloomberg, 10 October 2019)
https://www.bloomberg.com/news/articles/2019-10-10/malaysia-
to-scrap-anti-fake-news-law-once-used-against-mahathir
accessed 13 March 2020.
112 Article 19, ‘Malaysia: “Anti-Fake News Act”’ (n 109).
113 Protection from Online Falsehoods and Manipulation Act
2019 (Singapore), section 7.
114 ibid section 23.
mandated by the government. The law has also been
criticized for its apparent requirement of platforms to keep
records of what users view.115
C. Kenya’s approach
These global trends have, to a large extent, influenced legal
and policy developments in Africa. In 2018, it was revealed
that Cambridge Analytica, a British consulting firm,
harvested the data of millions of users without their consent
and used that information for political advertising.116 The
effects of this were not exclusive to the US elections; it is
believed that Cambridge Analytica also played a role in the
Kenyan elections in 2017.117 As has been discussed earlier in
this Article, social media’s impact is often borderless and
countries in Africa have been grappling with the same issues
as other democracies: fake news, hate speech and even
115 Jennifer Daskal, 'This "Fake News" Law Threatens Free
Speech. But It Doesn't Stop There' (The New York Times Privacy
Project, 30 May 2019)
singapore.html > accessed 13 July 2019.
116 Graham-Harrison and Cadwalladr (n 97).
117 Larry Madowo, ‘How Cambridge Analytica Poisoned
Kenya’s Democracy’ (The Washington Post, 20 March 2018)
opinions/wp/2018/03/20/how-cambridge-analytica-poisoned-
kenyas-democracy/?noredirect=on&utm_term=.ba663902da68 >
accessed 13 July 2019.
terrorism in some instances.118 In fact, these situations are
often exacerbated because, according to a recent
report, social media is used for political conversations more
in Africa than on other continents.119
In a bid to control social media, African governments have
responded in various ways. In general, these responses range
from internet shutdowns to the detention of online critics.120
More recently, Uganda, Benin, Tanzania, and Zambia
adopted social media taxes121, although Benin recently
repealed its law following protests.122 The taxation of social
media has been criticized as unnecessarily stifling the
freedom of expression.123 An alternative approach adopted in
118 Kate Cox et al, ‘Social Media in Africa – a Double-Edged
Sword for Security and Development’ (UNDP-RAND 2018)
Social-Media-Africa-Research-Report_final_3%20Oct.pdf>
accessed 13 July 2019.
119 Portland Communications, 'How Africa Tweets 2018'
(Portland Communications, July 2018).
120 Babatunde Okunoye, 'In Africa, A New Tactic to Suppress
Online Speech: Taxing Social Media’ (Council on Foreign
Relations, 10 January 2019)
suppress-online-speech-taxing-social-media > accessed 13 July
2019.
121 ibid.
122 Fatima Moosa, ‘Benin Repeals Social Media Tax After
Protests’ (Mail & Guardian, 25 September 2019)
09-25-benin-repeals-social-media-
tax-after-protests > accessed 13 July 2019.
123 Article19, ‘Eastern Africa: New Tax and L icensing Rules for
Social Media Threaten Freedom of Expre ssion’ (2 018)
Kenya and Tanzania has been the revision of licensing
requirements for bloggers and other content generators,
although this approach has received similar criticisms.124 In
Nigeria, the Cybercrimes (Prohibition, Prevention Etc) Act,
2015 was introduced to regulate general cybercrimes. In
Ethiopia, the Computer Crime Proclamation was enacted in
June 2016 to deal with, inter alia, hacking, child pornography,
and dissemination of spam.125 Although none of these efforts
are specifically directed at election or other political
processes, the resulting laws are certainly available in such
contexts.
More particularly, in Kenya, the Computer Misuse and
Cybercrimes Act was enacted in 2018 and provides for the
regulation of problematic content by imposing penal and
monetary sanctions on persons involved in false publication,
child pornography, and computer forgery.126 Prior to its
enactment, critics proposed a revision or removal of a
number of the law's provisions owing to human rights
concerns, i.e. there existed a potential for abuse due to their
broad nature and the punitiveness of the penalties
provided.127 More recently, the government proposed
licensing-rules-for-social-media-threaten-freedom-of-expression/
> accessed 13 July 2019.
124 ibid.
125 958/2016 (Ethiopia).
126 No 5 of 2018 (Kenya).
127 Article19, ‘Kenya: Computer and Cybercrimes Bill 2017’ (Legal
Analysis, 2018)
amendments to the Kenya Information and Communication
Act which directly aim to regulate social media.128 The
amendments provide for, among other things, licensing of
social media platforms, control of user behaviour with
restrictions on the formation of social media groups, and
registration of bloggers.129 Interestingly, the amendments
require that, in order to be licensed, platforms establish a
physical office in Kenya and maintain the stipulated
information at that office.130 The impracticality of these
amendments cannot be overstated, and they have been
criticized for posing a threat to the privacy of users while at
the same time stifling free speech.131
It is difficult to know the precise motivations for the passage
of these laws. In the Kenyan example, it is believed that some
parts of the Computer Misuse and Cybercrimes Act were
content/uploads/2018/04/Kenya-analysis-April-2018.pdf
accessed 13 July 2019.
128 Kenya Information and Communication (Amendment) Bill
2019.
129 ibid, sections 84IA to 84ID.
130 ibid, section 84IA.
131 Jackline Akello, 'The Social Media Bill: Proposed
Amendments to KICA to Regulate Use of Social Media
Platforms’ (CIPIT, 21 November 2019)
proposed-amendments-to-kica-to-regulate-use-of-social-media-
platforms/> accessed 13 March 2020.
influenced by nude photos sent to parliamentarians while the
corresponding bill was being debated.132
From these occurrences, it is possible to infer that social
media regulation faces a specific difficulty: balancing the
freedom of expression with other individual rights and state
interests. Some have suggested that in the development of
the internet, certain norms have been adopted. These norms
include transparency, openness, and accountability.133 Social
media platforms are often built on these norms, alongside
those of free speech and privacy. Self-regulatory efforts have often been
lauded for respecting these norms.134 Some platforms even
specifically state that they seek to adhere to and respect these
norms.135 In adopting regulations, states must be mindful of
the effects such regulations have on freedom of expression and
the economic interests of platforms, all while achieving the
specific objectives of lawmakers and government policies. In
light of this difficulty, there have been varied suggestions for
more effectively reaching an appropriate balance. Such
suggestions include governmental management of social
media platforms or shifting the commercial model of these
132 Dickson Olewe, 'Kenya, Uganda and Tanzania in "Anti-Fake
News Campaign"' (BBC, 16 May 2018) accessed 13 July
2019.
133 Alexandra Paslawsky, 'The Growth of Social Media Norms
and the Governments' Attempt at Regulation' (2017) 35(5)
Fordham International Law Journal 1485, 1535.
134 ibid, 1526.
135 Samples (n 49).
platforms away from data-driven ads.136 Some have also
proposed a combination of policy and legislative tools to
address problems such as fake news.137
However, this Article posits that attempts at social media
regulation which focus on inhibiting certain content will
inevitably encounter the problems discussed herein. In the
final Section, the Article discusses a potential model for
regulation that could largely obviate the free speech and
privacy concerns plaguing current trends.
V. Principles-Based Regulation: A Possible Solution for
Kenya
In the preceding Sections, this Article discussed the unique
challenges posed by social media use in political settings.
These challenges were exemplified by the examples of fake
news and hate speech. The Article also discussed attempts at
regulation and the resultant difficulties. A common thread
identified in regulations adopted around the world has been
the focus on content as opposed to the design of platforms
that enable such content to proliferate. This is despite the fact
that the architecture appears to be more of a problem than
the content; indeed, in some instances the content is clearly
legal, and in many other cases it is arguably legal. This
136 Blayne Haggart and Natasha Tusikov, 'It's time for a new way
to regulate social media platforms' (The Conversation, 2019)
to-regulate-social-media-platforms-109413> accessed 13 July 2019.
137 Sugow (n 71), 45-46.
content-centric approach engenders concerns over free speech and
accountability. In this Section, by way of example, we
propose a principles-based approach that would inspire co-
regulation. This is in light of the cogent argument that
platforms are often best placed to regulate content.
In the enactment of content-centric legislation, Kenya is not
lagging behind. The Computer Misuse and Cybercrimes Act
criminalised false publication, among other forms of
dissemination of problematic content. The Kenya Information
and Communications (Amendment) Bill proposed restrictions
on the formation of social media groups.138 As repeatedly set
out in this Article, the constitutionality of such an approach is
often questioned
in light of the inevitable infringement upon fundamental
rights.139
From the Cambridge Analytica scandal140 to the
dissemination of ethnically charged content on social media
platforms during Kenyan elections, an argument may be
made that the country ought to (or would at least like to)
address the structures in place that enable such content to
proliferate. This would not be in the form of the existing
laws. It would be more suitable to make use of a principles-
based approach at the level of Executive-branch regulation.
Such an approach has been proposed by policy analysts and
committees, as described below.
138 Kenya Information and Communications (Amendment) Bill,
2019, section 84IC.
139 See for example, Sugow (n 71).
140 Madowo (n 118).
Despite the possibility of conflicting interests between
platforms and governments when the former engages in self-
regulation, some believe that platforms are the appropriate
authorities to moderate content.141 Noting that the primary
concerns arising from self-regulation include
transparency, accountability, and the violation of
fundamental rights, there have been proposals for state
adoption of principles to guide regulators in enforcing
compliance and platforms in self-regulating.142 A seminal
example of such a proposal is contained in a report presented
to the House of Lords by the Select Committee on
Communications.143 The report notes that a principles-based
approach would be adaptive and would safeguard the
interoperability of the internet (bearing in mind that multiple
jurisdictions could have radically different rules-based
laws).144 Recognising the appeal of such an approach, the
report proposed ten principles to guide regulators and the
regulated. These include accountability, parity, transparency,
openness, ethical design, and privacy among others.145 Salient
in these principles is the implicit attempt at addressing the
architecture of platforms as opposed to content. By fostering
ethical design, these principles would address concerns of
data privacy, content curation (which plays on heuristics and
homophily), algorithms, and the opaque enforcement of
141 Samples (n 49). See also Fagan (n 50) (generally).
142 Select Committee on Communications (n 11), 15.
143 ibid.
144 ibid, 10 and 15.
145 ibid, 15.
terms of service and guidelines.146 The approach would
ensure ethical design by imposing a duty of care on platforms
to comply with the principles.147 This is contrasted with the
direct attempt at classifying certain conduct as illegal, an
approach found to be wanting both in the report148 and in this
Article. To operationalise these aspirations, the report
proposed the formation of a central Digital Authority, tasked
with inter alia coordination of regulation, assessing existing
regulations, and mandating disclosures to vet compliance.149
Following this report, a similar approach was proposed in
France. Through an interim report on the regulation of social
networks submitted to the French Secretary for Digital
Affairs, a social network services law was proposed.150 The
proposed approach is similar to that detailed in the Select
Committee’s Report, i.e. it also recognises the optimal
positioning of platforms as regulators of content and the
government’s larger role of nudging platforms towards the
adoption of certain architecture.151 The French report also
proposes that regulations ought to be guided by three
conditions: focus on compliance by platforms as opposed to
the illegality of content; focus on systemic actors with larger
146 ibid 23-33.
147 ibid 53-55.
148 ibid 23.
149 ibid 61-63.
150 Interim Mission Report, ‘Regulation of Social Networks
Facebook Experiment’ (State Department, Digital Affairs,
France, May 2019).
151 ibid 2.
impact; and a flexible approach that anticipates rapid
developments in the sector.152 To that end, it proposes
regulations based on five pillars: a grounding in fundamental
freedoms and economic interests of platforms, prescriptive
regulation focused on compliance (to engender transparency
and accountability), dialogue between stakeholders, vesting
of authority in a single independent body, and
regional cooperation.153
These approaches have been echoed by academics, with
Fagan finding that architecture, as opposed to content, ought
to be the focus of regulators.154
These approaches are transferable. In Kenya, the perceived
effects of social media on elections and electoral processes
can be traced to platform architecture (e.g. the Cambridge
Analytica scandal relied on data-driven targeting).
Therefore, adopting the approach Fagan termed ‘Systemic
Social Media Regulation’ may prove beneficial to Kenya and
similar jurisdictions. By focusing on the design of
platforms, states avoid many human rights concerns, address
problematic content that is not illegal, and remain agile in
responding to technological developments.
152 ibid 2-3.
153 ibid 3.
154 Fagan (n 50) 438-439.
VI. Conclusion
In this Article, we have posited that social media platforms
contribute a great deal to political processes. We further
argued that these contributions are not always positive, with
fake news and hate speech being highlighted as particularly
problematic content in need of regulation. In view of the
difficulty of applying non-specific laws to online interactions,
we subsequently mapped out attempts at enacting legislation
to address this type of content. These laws were found to be
deficient in balancing fundamental rights with the objectives
sought. At the heart of this deficiency was the focus on
content as opposed to structure. In response, the Article
discussed trends towards principles-based regulations that
put more emphasis on platform design as opposed to
material content. This was found to be a suitable approach
for Kenya and other similarly positioned democracies.
Bibliography
Books
Daniels G, ‘Scrutinizing Hashtag Activism in the #MustFall
Protests in South Africa in 2015’ in Bruce Mutsvairo (ed),
Digital Activism in the Social Media Era (Palgrave
Macmillan, 2016)
Pollack R (ed), World Trends in Freedom of Expression &
Media Development Special Digital Focus (UNESCO 2015)
Cases
Bloggers Association of Kenya (BAKE) v Attorney General & 3
others (Petition 206 of 2018)
Journal Articles
Aaker J and Chang V, ‘Obama and the power of social media and technology’ [2010] The European Business Review biybj2966/f/tebrmay-june-obama.pdf> accessed 4 July 2019
Allcott H and Gentzkow M, ‘Social Media and Fake News in the 2016 Election’ (2017) 31(2) Journal of Economic Perspectives 211
Allcott H et al, ‘Trends in the Diffusion of Misinformation on Social Media’ (2019) NBER Working Paper No 25500 <https://www.nber.org/papers/w25500> accessed 9 July 2019
Enli G and Skogerbø E, ‘Personalized Campaigns in Party-Centred Politics’ (2013) 16(5) Information, Communication and Society 757
Fagan F, ‘Systemic Social Media Regulation’ (2018) 16(1) Duke Law & Technology Review 393
Flaxman S et al, ‘Filter Bubbles, Echo Chambers, and Online News Consumption’ (2016) 80 Public Opinion Quarterly 298
Graham M and Avery E, ‘Government Public Relations and Social Media: An analysis of the Perceptions and Trends of Social Media Use at the Local Government Level’ (2013) 7(4) Public Relations Journal 1
Gorwa R and Guilbeault D, ‘Unpacking the Social Media Bot: A Typology to Guide Research and Policy’ (2018) Policy & Internet 1
Knoll J, Matthes J and Heiss R, ‘The social media political participation model: A goal systems theory perspective’ [2018] Convergence 1
Landsbergen D, ‘Government as Part of the Revolution:
Using Social Media to Achieve Public Goals’ (2010) 8(2)
Electronic Journal of e-Government 135
Lavi M, ‘Content Providers’ Secondary Liability: A Social
Network Perspective’ (2016) 26 Fordham Intellectual
Property, Media and Entertainment Law Journal 858
McPherson M et al, ‘Birds of a Feather: Homophily in Social Networks’ (2001) 27 Annual Review of Sociology 415
Ndavula J and Mueni J, ‘New Media and Political Marketing
in Kenya: The Case of 2013 General Elections’ (2014) 3(6)
International Journal of Arts and Commerce 69
O’Morain H, ‘What is the role of social media in a general election campaign? A case study of the 2016 Irish General Election’ (2016) Maynooth University Research Project Repository ets/document/Hannah%20Byrne%20O%27Morain_0.pdf> accessed 4 July 2019
Paslawsky A, ‘The Growth of Social Media Norms and the Governments’ Attempt at Regulation’ (2012) 35(5) Fordham International Law Journal 1485
Ross K and Bürger T, ‘Face to face(book): Social media,
Political Campaigning and the Unbearable Lightness of Being
There’ (2014) 66(1) Political Science 46
Sugow A, ‘The Right to Be Wrong: Examining the
(Im)possibilities of Regulating Fake News while Preserving
the Freedom of Expression in Kenya’ (2019) 4(1) Strathmore
Law Review 19
Wu P, ‘Impossible to Regulate: Social Media, Terrorists, and the Role of the U.N.’ (2015) 16(1) Chicago Journal of International Law 281
Legislation
Anti-Fake News Act 2018 (Law 803) (Malaysia)
Communications Decency Act 1996 (US)
Computer Crime Proclamation 2016 (Ethiopia)
Computer Misuse and Cybercrimes Act, No 5 2018 (Kenya)
Constitution of Kenya (2010)
Copyright (Amendment) Act, No. 20 of 2019 (Kenya)
Criminal Code Amendment (Sharing of Abhorrent Violent
Material) Act 2019 (Australia)
Data Protection Act 2019 (Kenya)
Digital Millennium Copyright Act 1998 (US)
e-Commerce Directive 2000 (EU)
Federal Election Commission Regulations (US)
Kenya Information and Communications Act 1998
Kenya Information and Communications (Amendment) Bill 2019
Network Enforcement Act 2017 (Germany)
Party Elections Broadcasts and Political Advertisements
Regulations (South Africa)
Penal Code (Kenya)
Protection from Online Falsehoods and Manipulation Act
2019 (Singapore)
Websites and other sources
Akello J, ‘The Social Media Bill: Proposed Amendments to KICA to Regulate Use of Social Media Platforms’ (CIPIT, 21 November 2019) 21/the-social-media-bill-proposed-amendments-to-kica-to-regulate-use-of-social-media-platforms/> accessed 13 March 2020
Article19, ‘Free speech concerns amid “fake news” fad’ (2018) oncerns-amid-fake-news-fad/> accessed 12 July 2019
Article19, ‘Germany: Responding to Hate Speech. Country Report’ (2018) content/uploads/2018/07/Germany-Responding-to-%E2%80%98hate-speech%E2%80%99-v3-WEB.pdf> accessed 12 July 2019
Article19, ‘Eastern Africa: New Tax and Licensing Rules for Social Media Threaten Freedom of Expression’ (2018) new-tax-and-licensing-rules-for-social-media-threaten-freedom-of-expression/> accessed 13 July 2019
Article19, ‘Kenya: Computer and Cybercrimes Bill 2017’ (Legal Analysis, 2018) g/wp-content/uploads/2018/04/Kenya-analysis-April-2018.pdf accessed 13 July 2019
Article19, ‘Malaysia: “Anti-Fake News Act”’ (Legal Analysis, 2018) content/uploads/2018/04/2018.04.22-Malaysia-Fake-News-Legal-Analysis-FINAL-v3.pdf> accessed 13 July 2019
Article19, ‘Self-regulation and ‘hate speech’ on social media platforms’ (2018) -content/uploads/2018/03/Self-regulation-and-%E2%80%98hate-speech%E2%80%99-on-social-media-platforms_March2018.pdf> accessed 9 July 2019
BBC News, ‘Omar Al Bashir: Sudan’s ousted president’ (BBC News, 2019) 16010445> accessed 12 July 2019
Bior A, ‘Sudan’s Social Media Deemed Major Player in Bashir Ouster’ (VOA News, 18 April 2019) <www.voanews.com/archive/sudans-social-media-deemed-major-player-bashirs-ouster> accessed 4 July 2019
Brannon V, ‘Free Speech and the Regulation of Social Media Content’ (Congressional Research Service Reports, 27 March 2019) 650> accessed 5 July 2019
Business Daily, ‘Twitter Post’ 7263232> accessed 30 July 2019
Busari S, ‘How fake news was weaponized in Nigeria’s elections’ (CNN, 15 February 2019) s-nigeria-elections-intl/index.html> accessed 9 July 2019
Communications Authority of Kenya and National Cohesion and Integration Commission, ‘Guidelines on Prevention of Dissemination of Undesirable Bulk and Premium Rate Political Messages and Political Social Media Content Via Electronic Communications Networks’ (July 2017) /ca.go.ke/wp-content/uploads/2018/02/Guidelines-on-Prevention-of-Dissemination-of-Undesirable-Bulk-and-Premium-Rate-Political-Messages-and-Political-Social-Media-Content-Via-Electronic-Networks-1.pdf accessed 12 March 2020
Cooke J, ‘Background on the Post-Election Crisis in Kenya’ (Center for Strategic & International Studies, 6 August 2009) round-post-election-crisis-kenya> accessed 8 July 2019
Cox K et al, ‘Social Media in Africa – a Double-Edged Sword for Security and Development’ (UNDP-RAND 2018) NDP-RAND-Social-Media-Africa-Research-Report_final_3%20Oct.pdf> accessed 13 July 2019
Dahir A L, ‘African strongmen are the biggest stumbling blocks to internet access on the continent’ (Quartz Africa, 20 March 2019) dictatorships-shut-down-the-internet-the-most/> accessed 5 July 2019
Dahir A L, ‘Fake News is Already Disrupting Kenya’s High-Stakes Elections Campaign’ (Quartz Africa, 25 June 2017) misinformation-are-upstaging-kenyas-upcoming-high-stakes-election/> accessed 4 July 2019
Daily Nation, ‘Twitter Post’ 5680640> accessed 30 July 2019
Daskal J, ‘This ‘Fake News’ Law Threatens Free Speech. But It Doesn’t Stop There’ (The New York Times Privacy Project, 30 May 2019) on/hate-speech-law-singapore.html> accessed 13 July 2019
Dendere C, ‘Why Are So Many African Leaders Shutting Off the Internet in 2019?’ (The Washington Post, 30 January 2019) cage/wp/2019/01/30/why-are-so-many-african-leaders-shutting-off-the-internet-in-2019/> accessed 4 July 2019
Douek E, ‘Australia’s New Social Media Law is a Mess’ (Lawfare Blog, 10 April 2019) new-social-media-law-mess> accessed 13 July 2019
Downing J, ‘Social Movement Media and Media Activism’,
Oxford Research Encyclopedia of Communication (2018)
Faisal N and Bendimerad R, ‘#BlueforSudan: Why is social media turning blue for Sudan?’ (Al Jazeera News, 13 June 2019) udan-social-media-turning-blue-sudan-190613132528243.html> accessed 4 July 2019
Graham-Harrison E and Cadwalladr C, ‘Revealed: 50 million Facebook Profiles Harvested For Cambridge Analytica in Major Data Breach’ (The Guardian, 17 March 2018) <www.theguardian.com/news/2018/mar/17/cambridge-analytica-facebook-influence-us-election> accessed 13 July 2019
Haggart B and Tusikov N, ‘It’s time for a new way to regulate social media platforms’ (The Conversation, 2019) new-way-to-regulate-social-media-platforms-109413> accessed 13 July 2019
Howard S, ‘Community Radio and Ethnic Violence in Africa:
The Case of Kenya’ (MA Thesis, University of East Anglia,
2009)
Interim Mission Report, ‘Regulation of Social Networks
Facebook Experiment’ (State Department, Digital Affairs,
France, May 2019)
Jack C, ‘Lexicon of Lies: Terms for Problematic Information’ (Data & Society Research Institute, 2019) y_LexiconofLies.pdf> accessed 9 July 2019
Laub Z, ‘Hate Speech on Social Media: Global Comparisons’ (Council on Foreign Relations, 7 June 2019) media-global-comparisons> accessed 8 July 2019
Madowo L, ‘How Cambridge Analytica Poisoned Kenya’s Democracy’ (The Washington Post, 20 March 2018) opinions/wp/2018/03/20/how-cambridge-analytica-poisoned-kenyas-democracy/?noredirect=on&utm_term=.ba663902da68> accessed 13 July 2019
Madrid-Morales D and Wasserman H, ‘“Fake News”, Disinformation and Media Trust in Africa: A Comparative Study of Kenya, Nigeria and South Africa’ (ICAfrica Biennial Conference, 2018) news.pdf> accessed 9 July 2019
Miller N, ‘The Tyranny of Numbers on Social Media During Kenya’s 2017 Elections’ (LSE Africa Blog, 4 September 2017) <https://blogs.lse.ac.uk/africaatlse/2017/09/04/the-tyranny-of-numbers-on-social-media-during-kenyas-2017-elections/> accessed 4 July 2019
Moosa F, ‘Benin Repeals Social Media Tax After Protests’ (Mail & Guardian, 25 September 2019) l-media-tax-after-protests> accessed 13 July 2019
Muthuri R et al, ‘Biometric Technology, Elections and Privacy: Investigating Privacy Implications of Biometric Voter Registration in Kenya’s 2017 Election Process’ (Strathmore Centre for Intellectual Property and Information Technology Law, 2018) 2018-06/Biometric%20Technology-Elections-Privacy.pdf> accessed 5 July 2019
Mutung’u G, ‘The Influence Industry Data and Digital Campaigning in Kenya’ (Our Data Ourselves, June 2018) <https://cdn.ttc.io/s/ourdataourselves.tacticaltech.org/ttc-influence-industry-kenya.pdf> accessed 4 July 2019
Nanfuka J, ‘New Year, Old Habits: Threats to Freedom of Expression Online in Kenya’ (CiPESA, 27 January 2016) to-freedom-of-expression-online-in-kenya/> accessed 8 July 2019
Nyabola N, ‘Texts, Lies, and Videotape’ (Foreign Policy, 1 August 2017) texts-lies-and-videotape-kenya-election-fake-news/> accessed 9 July 2019
Odinga C, ‘Use of New Media during the Kenya Elections’ (Uppsala Universitet, 2013) portal.org/smash/get/diva2:633138/FULLTEXT01.pd> accessed 12 March 2020
Okunoye B, ‘In Africa, A New Tactic to Suppress Online Speech: Taxing Social Media’ (Council on Foreign Relations, 10 January 2019) new-tactic-suppress-online-speech-taxing-social-media> accessed 13 July 2019
Olewe D, ‘Kenya, Uganda and Tanzania in “Anti-Fake News Campaign”’ (BBC, 16 May 2018) ws/world-africa-44137769> accessed 13 July 2019
Open Society European Policy Institute, ‘Social Media Monitoring During Elections: Cases and Best Practices to Inform Electoral Observation Missions’ (Open Society Foundations, May 2019)
Owen D, ‘The Age of Perplexity: Rethinking the World We Knew. Radically Reassessing “The Economic”’ (Penguin Random House Grupo Editorial, 2017)
Portland Communications, ‘How Africa Tweets 2018’
(Portland Communications, July 2018)
Portland Communications, ‘The Reality of Fake News in Kenya’ (2018) -communications.com/pdf/The-Reality-of-Fake-News-in-Kenya.pdf> accessed 4 July 2019
Ruteere M, ‘The State must deal firmly with ethnic incitement on internet and social media’ (Daily Nation, 13 July 2014) <https://mobile.nation.co.ke/blogs/Internet-Social-Media-Ethnic-Incitement/1949942-2382538-format-xhtml-uyfqb7/index.html> accessed 8 July 2019
Social Media Lab Africa (SMElab Africa), ‘Social Media Consumption in Kenya: Trends and Practices’ (USIU Africa, 2018) l_Media_Consumption_in_Kenya_report.pdf> accessed 12 March 2020
SocialPilot, accessed 5 July 2019
Schmiemann G, ‘Social Network Sites and Political Participation: Attributes of the European Union’s Facebook Pages’ (2015) University of Twente Essays _MB.pdf> accessed 4 July 2019
Samples J, ‘Why the Government Should Not Regulate Content Moderation of Social Media’ (Cato Institute, Policy Analysis No 865, 9 April 2019) <https://www.cato.org/publications/policy-analysis/why-government-should-not-regulate-content-moderation-social-media> accessed 9 July 2019
Select Committee on Communications, Regulating in a
Digital World (House of Lords, 9 March 2019, 2nd Report of
Session 2017-19)
Shukry A, ‘Malaysia to Scrap Anti-Fake News Law Once Used Against Mahathir’ (Bloomberg, 10 October 2019) <https://www.bloomberg.com/news/articles/2019-10-10/malaysia-to-scrap-anti-fake-news-law-once-used-against-mahathir> accessed 13 March 2020
Southern L, ‘What to Know About Europe’s Fight on Platform Hate Speech’ (Digiday UK, 12 July 2019) to-know-about-europes-fight-on-platform-hate-speech/> accessed 13 July 2019
Stevenson A, ‘Facebook Admits It Was Used to Incite Violence in Myanmar’ (New York Times, 6 November 2018) mar-facebook.html> accessed 5 July 2019
Thomasson E, ‘Germany Looks to Revise Social Media Law as Europe Watches’ (Reuters, 8 March 2018) hatespeech/germany-looks-to-revise-social-media-law-as-europe-watches-idUSKCN1GK1BN> accessed 12 July 2019
Tworek H and Leerssen P, ‘An Analysis of Germany’s NetzDG Law’ (2019) Transatlantic Working Group Working Paper _Tworek_Leerssen_April_2019.pdf> accessed 12 July 2019
Wardle C, ‘Fake news. It’s complicated’ (Medium, 16 February 2017) news-its-complicated-d0f773766c79> accessed 9 July 2019
