Astroturfing (Global)

From Global Informality Project
Location: Worldwide
Author: Anna Bailey and Sergei Samoilenko
Affiliation: University College London and George Mason University

‘Astroturfing’ refers to the practice of artificially creating the impression of widespread public support for a policy, cause, organisation, individual or product, where little or no support in fact exists. Such practices exist globally, but the term itself predominates in English-language political and media discourse, particularly in the USA and UK.

Etymologically, the term derives from AstroTurf, a brand of artificial turf which, when spelt in lower case, is commonly used to refer to artificial grass in general. The metaphorical adaptation of the term is commonly attributed to US Senator Lloyd Bentsen, who in 1985 complained about the large volume of letters he received, ostensibly from members of the public, which he viewed as part of a hidden organised campaign by insurance companies to protect their interests. ‘A fellow from Texas can tell the difference between grassroots and astroturf,’ Bentsen stated (Sager 2009[1]). The concept of ‘astroturfing’ is thus intrinsically bound up with the concepts of ‘grassroots support’ and ‘grassroots activity’. ‘Grassroots activity’ refers to coordinated political or social activity spontaneously generated and organised from the ‘bottom up’ by ordinary citizens. The use of the term ‘grassroots’ in this sense can be traced back to the US Senator Albert Jeremiah Beveridge, who in 1912 referred to the Progressive Party as the party that ‘has come from the grass roots. It has grown from the soil of people’s hard necessities’ (Samoilenko 2014: 189[2]). In the case of astroturfing, the connotation is that genuine grassroots support does not exist but has been artificially replicated, just as astroturf replicates real grass.

A typical pre-internet form of astroturfing was paid-for letter writing campaigns, such as that referred to by Bentsen when he coined the term. Such campaigns were used by corporate clients as a lobbying tool, with the aim of convincing political representatives that their cause enjoyed greater public support than was in fact the case (Lyon and Maxwell 2004: 563-4[3]). However, the growth of internet use, in particular social media and crowdsourcing platforms, has dramatically increased the range and scope of astroturfing behaviours. The cloak of anonymity makes the internet a highly effective platform for astroturfing, while the growing importance of online and crowdsourced information provides a powerful incentive for engaging in it.

The core type of deception involved in astroturfing is identity-based deceit – a false representation of the identity of the author or supporter. However, some forms of astroturfing also involve message-based deceit – the delivery of false or misleading information (Zhang, Carpenter and Myung 2013: 3[4]). The latter is the case, for example, in fake product reviews and other forms of disinformation. Astroturfing involving message-based deceit is often employed in a corporate context, for example to generate positive consumer reviews for one’s own product or service, or to generate negative reviews for that of a rival. One recent study found that nearly one in five reviews on the business review website Yelp were suspected of being fake (Luca and Zervas, forthcoming [5]). Some major companies use sophisticated persona management software to create entire astroturfing ‘armies’ of authentic-looking but nevertheless fake social media accounts, which can be deployed as and when needed (Bienkov 2012[6]).

Astroturfing has been the subject of increasing political and media attention in the twenty-first century, as the growth of internet and social media usage has led to these platforms being exploited by governments and their supporters as tools of information warfare. Some national governments are alleged to employ large armies of hidden paid agents to troll online discussion forums with pro-government views. For example, the Chinese state employs an army of paid online commentators (dubbed the ‘fifty-cent army’ after the amount they are supposedly paid per post) to spread pro-regime propaganda on online forums (Han 2015[7]). State-sponsored trolling is by no means confined to ‘authoritarian’ regimes, but is also employed by Western democracies. For example, the United States Central Command (Centcom), which oversees US armed operations in the Middle East and Central Asia, has awarded contracts to companies to develop persona management software that will allow its military personnel to secretly propagate pro-American propaganda on social media sites via fake online personas (Fielding and Cobain 2011[8]).

Another form of astroturfing that has received increased attention in recent years is the phenomenon of ‘sock puppeting’. In its literal meaning, a sock puppet is a simple form of puppet made by wearing a sock on one’s hand. The gap between fingers and thumb gives the impression of a mouth, and the addition of simple details like eyes makes the sock resemble a face. In political and media discourse, a ‘sock puppet’ refers to an organisation that has the façade of independence, but whose existence is in fact dependent on often concealed funding from another source, thus compromising its independence. The term can also refer to an author who uses a fake persona, often online, to positively review or discuss their own work.

Sock puppets typically champion policies which do not enjoy significant public support, but which are favoured by a government ministry or bureaucratic department. The government can then justify the adoption of these policies by claiming that they are responding to external pressure from civil society. This is by no means a new phenomenon: the 1960s and 1970s saw the rise of single-issue health pressure groups in the UK – ostensibly independent but funded by the government – campaigning on such issues as smoking and alcohol (Berridge 2007: 164[9]). For example, the foundation of the anti-tobacco lobby group Action on Smoking and Health (ASH) in 1971 was actively encouraged by the Department of Health, which also provided the bulk of its funding in its first two decades. ASH provided a source of external pressure for policies the Department of Health itself favoured, and in practice they often worked collaboratively (ibid.: 167-177). At the other end of the spectrum, tobacco companies have responded by creating sock puppets of their own to counter-lobby. For example, in 1993 several major tobacco companies funded the foundation of the National Smokers Alliance (NSA), which purported to be a grassroots organisation representing smokers’ rights (Givel 2007[10]).

In the UK, the charitable sector as a whole has been criticised for becoming increasingly dependent on state funding, and thus risking turning itself into an entire sector of sock puppets (Snowdon 2012[11]). Sock puppeting has even been criticised from within the state itself. The UK’s Department for Communities and Local Government (DCLG) called for other government departments to ‘cease funding “sock puppets” and “fake charities”’ in order to reduce wasteful government spending. The DCLG stated: ‘Many pressure groups – which do not deliver services or help the vulnerable – are now funded by state bodies. In turn, these nominally “independent” groups lobby and call for more state regulation and more state funding’ (DCLG 2012: 11[12]).

It is possible that, with the increased trend towards Freedom of Information (FOI) legislation in Western polities, sock puppeting will become increasingly subject to public exposure as covert funding streams are revealed. However, whether exposure alone is sufficient to curb such an entrenched practice is another matter.

Political campaigns in the future will be progressively threatened by online astroturfing in the form of social bots and other imposters posing as autonomous individuals on the internet, with the intent of promoting a specific agenda. As astroturfing technology develops it is becoming increasingly difficult to distinguish fake personas from real individuals (Bienkov 2012[13]), which poses a threat to open democratic debate as well as the utility of crowdsourcing platforms such as consumer review websites. There is a very real danger, in both politics and business, that participants will be forced to spend money on astroturfing just to remain competitive with their rivals. In game-theoretic terms, astroturfing thus resembles a ‘prisoner’s dilemma’: for each side it is individually rational to spend on disinformation, yet the rival campaigns effectively cancel each other out, leaving both sides worse off than if neither had spent at all. It also represents a ‘social loss’ (Simmons 2011: 187-8[14]), as resources are spent on ‘non-productive’ uses rather than those which generate wealth or add value.
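This prisoner’s-dilemma structure can be sketched with a simple payoff matrix. The numbers below are purely illustrative (they are not drawn from the sources cited here); they only encode the ordering of outcomes described above: a campaign gains by astroturfing while its rival abstains, mutual astroturfing sinks costs that cancel each other out, and mutual abstention leaves both sides better off.

```python
# Illustrative prisoner's-dilemma payoff matrix for two rival campaigns
# deciding whether to spend on astroturfing. Payoff values are
# hypothetical and encode only the ordering described in the text.

# Payoffs are (row player, column player) for each strategy pair.
PAYOFFS = {
    ("honest",    "honest"):    (3, 3),  # no money sunk on disinformation
    ("honest",    "astroturf"): (0, 4),  # rival gains an edge at your expense
    ("astroturf", "honest"):    (4, 0),
    ("astroturf", "astroturf"): (1, 1),  # costs sunk, messages cancel out
}

def best_response(opponent_move):
    """Return the strategy maximising the row player's payoff
    against a fixed opponent move."""
    return max(("honest", "astroturf"),
               key=lambda mine: PAYOFFS[(mine, opponent_move)][0])

# Astroturfing is a dominant strategy: it is the best response
# whatever the rival does...
assert best_response("honest") == "astroturf"
assert best_response("astroturf") == "astroturf"

# ...yet mutual astroturfing (1, 1) is worse for both players than
# mutual honesty (3, 3) -- the 'social loss' described in the text.
assert PAYOFFS[("astroturf", "astroturf")][0] < PAYOFFS[("honest", "honest")][0]
```

With these payoffs, each campaign’s dominant strategy is to astroturf, yet the resulting equilibrium is worse for both than mutual abstention, which is precisely what makes the outcome ‘non-optimal’ in the game-theoretic sense.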

References and Bibliography

  1. Elder, M. 2012. ‘Hacked emails allege Russian youth group Nashi paying bloggers’, 7 February.
  2. Kte'pi, B. 2014. ‘Deception in political social media’, in K. Harvey (Ed.), Encyclopedia of Social Media and Politics, Vol. 4, pp. 357-359. Thousand Oaks, CA: SAGE Publications.
  3. Snowdon, C. 2014. The Sock Doctrine: What Can Be Done about State-Funded Political Activism?, IEA Discussion Paper No. 53. London: Institute of Economic Affairs.


  1. Sager, R. 2009. ‘Keep off the astroturf’, New York Times, 18 August.
  2. Samoilenko, S. 2014. ‘Campaigns, grassroots’, in K. Harvey (Ed.), Encyclopedia of Social Media and Politics, Vol. 3, pp. 189-194. Thousand Oaks, CA: SAGE Publications.
  3. Lyon, T. P. and Maxwell, J. W. 2004. ‘Astroturf: Interest group lobbying and corporate strategy’, Journal of Economics and Management Strategy, 13(4): 561-97.
  4. Zhang, J., Carpenter, D. and Myung, K. 2013. ‘Online astroturfing: A theoretical perspective’, Proceedings of the Nineteenth Americas Conference on Information Systems, Chicago, Illinois, 15-17 August.
  5. Luca, M. and Zervas, G. Forthcoming. ‘Fake It Till You Make It: Reputation, Competition, and Yelp Review Fraud’, Management Science.
  6. Bienkov, A. 2012. ‘Astroturfing: What is it and why does it matter?’, 8 February.
  7. Han, R. 2015. ‘Manufacturing Consent in Cyberspace: China's “Fifty-Cent Army”’, Journal of Current Chinese Affairs, 44(2): 105-34.
  8. Fielding, N. and Cobain, I. 2011. ‘Revealed: US spy operation that manipulates social media’, 17 March.
  9. Berridge, V. 2007. Marketing Health: Smoking and the Discourse of Public Health in Britain, 1945-2000. Oxford: Oxford University Press.
  10. Givel, M. 2007. ‘Consent and Counter-Mobilization: The Case of the National Smokers Alliance’, Journal of Health Communication, 12(4): 339-57.
  11. Snowdon, C. 2012. Sock Puppets: How the Government Lobbies Itself and Why, IEA Discussion Paper No. 39. London: Institute of Economic Affairs.
  12. Department for Communities and Local Government (DCLG). 2012. 50 Ways to Save: Examples of Sensible Spending in Local Government. London: DCLG.
  13. Bienkov, A. 2012. ‘Astroturfing: What is it and why does it matter?’, 8 February.
  14. Simmons, R. T. 2011. Beyond Politics: The Roots of Government Failure. Oakland, CA: The Independent Institute.