
‘Next Generation’ PR: The rise of ‘digital communications’ in a post-neoliberal society


Published on Apr 06, 2024

The notion of ‘communication’ has been normatively established as a concept that contributes to and nourishes individual agency in democracies, by providing a discursive space for political participation and enabling the percolation and circulation of ideas. Despite competing paradigmatic inflections, ‘communication’ as an idea has been seamlessly implanted into the vernacular, academic disciplinary fields, and industry and professional practice. This paper provides a critique of the concept of ‘communication’ in contemporary settings, arguing that new generation public relations (PR) companies are masking their political function through their prominent use of data, and establishing legitimacy through a ‘non-ideological’ representation. Drawing on theories of public language, power and politics by Jürgen Habermas (1994), the paper analyses these media industries in post-neoliberal settings (Davies and Gane 2021). Unlike PR companies that were established in the twentieth century, it argues that hybridised digital communication consortiums like Meltwater, Cision and Onclusive, which publicise software, data and algorithmic information as key tools in the delivery of their services, are largely unseen as ideological ‘PR’, and thus serve to entrench a reified reading of their activities as natural and politically neutral. The growing aura of generative Artificial Intelligence (Gen-AI) highlights the need to examine its inter-relationship with politics, and the way this nexus may serve to limit participation in the public sphere or debase this space rather than give creative expression to the social imaginary in a post-neoliberal society.

Key words: public relations (PR), digital communications, generative Artificial Intelligence (Gen-AI), post-neoliberalism, public sphere


The mainstreaming of ‘generative’ Artificial Intelligence (Gen-AI) has seeded many contested positions in public, political and academic debates. Unlike ‘old’ Artificial Intelligence (AI),1 ‘new’ Gen-AI builds personalised human-like content, and this advanced, seemingly futuristic technological development has opened speculation about its imminent cultural and socio-political impacts. For the most part, the thrall of Gen-AI and its destructive potential has focused on Large Language Models (LLMs) and platforms that mimic human characteristics such as conversation, reasoning and imagination. Much of the fear of this technology rests in Gen-AI’s persuasive potential to ‘churn out misinformation’ and produce ‘deep-fake’ news, photos and videos that support conspiracy theories in ways that could ‘overwhelm content moderation systems’ (Fried 2023).

This prospect has given rise to a wave of consternation, especially concerning ‘the complex interplay between AI, democracy and power’ (Schippers 2020: 35). Warnings about the threat and risk have focused on the disruptive effects of LLMs2 and algorithms in ‘polarizing societies, intensifying political echo chambers and distorting meaningful political debate in digital space’ (Unver 2018: 3). More recent studies have fixated on ChatGPT ‘that enables the production of natural-sounding text’ and the implications for misinformation and harm, in ‘ethics, privacy, and employment displacement’ (Wach et al. 2023: 3). Such concerns have spilled into real-world settings. In gearing up for the 2024 US presidential election, Marion Fernandez (2023) reported for Politico that dozens of Democratic campaign operatives met to discuss ‘how generative AI could produce misinformation and disinformation at a pace and scale campaigns have not experienced before’. They were worried about the ‘flooding zone’ – that period when a new technology opens doors and is comparatively unregulated, while legislators scramble to catch up with the new ways it can be exploited for unscrupulous ends. Part of their response was to prepare themselves on a platform level but also to help voters ‘educate themselves’ (ibid).

In tandem with these technological developments, digital communication(s) has emerged as a branch of industry and an academic field which emphasises data and information services in deploying persuasive strategies for vested interests, in ways that neither resemble nor always identify with more traditional forms of ideological PR.3 Digital communications as a field is growing, as is an active corporate agenda to deregulate the Gen-AI industry so there is less – not more – oversight or scrutiny. The Center for Media and Democracy (CMD 2023)4 recently spotlighted the activities of the American Legislative Exchange Council (ALEC)5, a powerful, right-wing US lobbyist and political force, in relation to the AI industry at its annual meeting in Florida. Here, its ‘communication and technology’ task force was set to debate a resolution ‘in support of free market solutions and enforcement of existing regulations for uses of Artificial Intelligence, which would prevent governments from ensuring that artificial intelligence is developed with diversity, equity, and inclusion (DEI) in mind’ (Armiak 2023). Organisations such as ALEC, by championing a free market approach and promoting deregulation or non-regulation of the rapidly expanding Gen-AI industry, would not only bestow immense power on digital communications firms to profit through the shaping of public opinion, but also entrench a powerful semblance of political neutrality within these communication interventions. This new ordering of the ‘object of public relations’, which aims to embed a ‘discursive monoculture’ (Demetrious 2008: 30) as ‘digital communication’, is significant because old ‘PR’ was reviled as coercive and unethical in influencing the politicisation process in democratic society (Miller and Dinan 2003; Demetrious 2013; Jansen 2017) and, ipso facto, new, seemingly data-based forms should also be scrutinised.

The transmutation of ‘PR’ to ‘digital communication’ is taking place in conditions where neoliberal hegemony has entrenched a widespread belief in the inevitability of market power over the last 50 years (Davies and Gane 2021). Central to this is a recasting of the idea of ‘communication’ from human-to-human to ‘human-machine communication (HMC)’. HMC works with Gen-AI to produce so-called ‘original’ content, and this has led to a call for greater reflection on what this means for public discourse, politics, ethics and society, and the imperative to develop a ‘conceptual framework that can be viewed as a combination of human and machine intelligence, where both work together to achieve a common goal’ (Mourtzis et al. 2023: 174). Guzman and Lewis (2020: 70) argue that this is an urgent theoretical issue for communication scholars requiring a new research agenda due to the ‘blurring ontological boundaries surrounding what constitutes human, machine and communication’. Therefore, the onset of generative Artificial Intelligence, in both its successes and failures, constitutes a major shift in the meaning of ‘communication’ in society, at a perilous formation stage.

This paper argues that while the ‘digital communication’ industry frames its ‘next generation’7 PR services as providing socially beneficial ‘information’, this is not always the case. Masked by their prominent use of data and by the appearance of non-ideological corporate agendas, these digital communication firms do not announce their services as a discursive political intervention and in this respect, are rhetorically opposed to ‘old’ PR practices. However, several examples will show the political use of data, which may not be misinformation, but neither is it benign nor ethically neutral. It will contend that the routine integration of generative Artificial Intelligence by global industries like Meltwater, Cision and Onclusive requires an expanded understanding of their role in the politicisation process in a post-neoliberal setting to fully apprehend their impact. As an overview, this paper is interested in how digital communication industries, such as these, with an agenda to change, neutralise or shape public opinion, are bound up with AI in its various forms, and render normative categories such as ‘disinformation’ or ‘misinformation’ incomplete or obsolete.

The three industry examples of ‘next generation’ PR offer similar algorithmic and data-based services. Meltwater is a ‘software as a service’ and ‘media monitoring company’8 and an ‘early adopter’ of Gen-AI. According to Thun (2023): ‘Meltwater became the world market leader in media monitoring while also succeeding in media intelligence, media relations and media management in general.’ Cision, based in the US, is also a cloud-based PR software platform offering research services such as media monitoring, social listening, targeting and campaigns. In 2016, Cision bought press release distributor PR Newswire for US $841 million.7 US-based Onclusive: Media Monitoring, Media Analysis, PR and Communications, with the tag line ‘on your side’, is another data-focused digital communication company.9

Two essential questions are: what ideological traces of coercive PR industry discourse are present in the self-described services offered by Meltwater, Cision and Onclusive? And what assumptions about data and information do these ‘next generation’ digital communication firms publicise?

In considering these questions, this paper offers a critical and conceptual engagement with the social, cultural, economic and political developments colliding with the field of communication, and develops a crucial exploration of new structures and institutional formations in public relations and what these may mean. In tracing any ‘shifts or changes’ to meaning, this paper is interested in ‘unmasking’ the ideological distortions embedded in digital communication firms and understanding how this may lead to coercion in new forms (Boucher 2021: 2). It argues that it is important not to be solely preoccupied with chatbots like Bard, LLaMA or ChatGPT, and the public face of Gen-AI; rather, the routine, everyday, masked PR activity behind this is equally significant and less scrutinised.

The naturalisation of corporate economics

In the last 50 years, the powerful idea of a bountiful global market-based economy, freed from constraint, has been naturalised and embedded as ‘neoliberalism’ in a myriad of ways. For Robinson and Harris, one of these is that it works to ‘shift the locus of class formation from national to emergent transnational space’ (2000: 32). More broadly, Philip Mirowski (2015) argues that neoliberalism as an idea and practice is everywhere and, to grasp its full complexity, is best conceived as a ‘thought collective’ defined by the centrality of free markets. At the same time, it relegates government to a supplementary and reductive role as an administrator, while economic calculation or monetisation rises more broadly as the basis for determining or judging the value of all human, social and political activities.

Mirowski’s (2015) interpretation of neoliberalism is broadly in line with a Habermasian view of a liberal demos made up of networks and interactions of ‘private persons’ within market structures, with government relegated to an administrative role in the pushing and shoving by vested interests to get the best deal (Habermas 1994). For Habermas: ‘The liberal model hinges, not on the democratic self-determination of deliberating citizens, but on the legal institutionalization of an economic society that is supposed to guarantee an essentially non-political common good by the satisfaction of private preferences’ (ibid: 7). The in-built market-based characteristics of neoliberalism were turbo-charged in the 1970s with an aggressive approach to the transnational movement of capital. Intensification through ‘globalisation’ was achieved by the extension of free enterprise zones, mergers, acquisitions and take-overs, while subcontracting and outsourcing precipitated economic restructuring on a massive scale.

Globalisation, a key pillar of neoliberalism in the last fifty years, led to vast economic and cultural expansion, and was particularly relevant to the export of communication industries. David Miller and William Dinan list the global PR companies in a ranking at the outset of the twenty-first century, citing Fleishman-Hillard, Weber Shandwick Worldwide, Hill & Knowlton and Burson-Marsteller as the top four global players (2003: 198). This multi-national colonisation process was highly significant. It carried PR’s distinctive, ideologically inflected language practices, as well as templated communication tactics and strategies, to new nation states, and social and political settings (Demetrious 2013). In addition, PR was useful to government in softening the ground to implement new neoliberal strategies like privatisation, as in the UK under the prime ministership of Margaret Thatcher (Miller and Dinan 2003: 193).

Nonetheless, challenges and changes to the established neoliberal order have come in the intervening years. William Davies and Nicholas Gane (2021) examine contemporary contexts and delineate the notion of ‘post-neoliberalism’, largely defined by the impacts of the 2008 meltdown in the global financial sector, in tandem with the proliferation of social media and the rise of populism and the political hard right. The rise of post-neoliberal cultures and digital contexts raises questions for PR that are distinct from other periods of neoliberalism. These should be viewed in light of intensifying action to maintain free-market dominance in the AI industry in ways that actively avoid scrutiny and oversight (Armiak 2023).

Next Generation PR ‘digital communications’

The communicative spaces in which PR participates today are characterised by a widespread and increasingly normalised culture of digital disinformation (fake news, conspiracy and hate speech) (Edwards 2021; Ong and Cabañes 2018). These conditions differ substantially from those in the twentieth century10 when disinformation campaigns were largely discreet and associated with corporations with harmful products and/or production processes such as tobacco, forestry, asbestos and energy (Demetrious 2013; Jansen 2017). Large corporations in those settings seeking to extinguish public criticism had access to financial resources, occupational domains such as public relations and a keen understanding of the crucial role of media in democratic society. Some of these corporate campaigns, characterised by a lack of visibility and a high level of manipulative misinformation, were targeted by investigative journalists towards the end of the twentieth century and exposed, for example, in magazines like PR Watch11 which were established to ‘investigate the PR industry and politics’ (2023). Today’s conditions are substantially different, as the means to mount an information or disinformation campaign that reaches a mass audience are widely available.

In the twenty-first century, public relations, as an industry and a political practice, is morphing, and this is accelerating with developments in data-driven information technologies in ways that are uncharted. Indeed, the integration of PR practices with Gen-AI services has profound theoretical and practice-based implications for the field. Arguably, the absorption of PR into AI is seamlessly aligned with an ethically neutral notion of neoliberalism. Clea Bourne (2019: 1) sheds light on this by arguing that PR’s habitus is focused on ‘optimism and futurity’ in positioning it as indispensable for business, but that its association with neoliberalism ‘puts it at odds with’ ethics. Bourne further argues that Artificial Intelligence ‘as a way of life’ has a range of implications and that only some voices are privileged by PR (ibid). Therefore, its potential to leverage advantage for some in society, by excluding others and reifying a particular way to think about the world, goes relatively unnoticed and unchallenged.

Another way PR is smoothly integrated with digital cultures is in its ethically neutral approach to exercising coercive language with market power. Lee Edwards argues (2021: 169) that ‘organised lying’, disinformation and the production of ‘fake news’ are not new but part and parcel of established PR tactics that had their provenance in the twentieth century. Contributing to the conditions that support this activity are social media platforms, which have features for users to signal approval of content such as videos, comments, photographs, links and updates, and to share this more broadly. Edwards argues that conditions today normalise disinformation and that PR must re-orient itself ‘towards democratic rather than organisational ideas’ (ibid). Disinformation, hence, needs to be understood not just in relation to an event or site, but as a cultural phenomenon which has widespread effects in post-neoliberal society. Edwards says: ‘Thus, organised lying has the potential to replace concern for the common good in political debates with a concern for vested interests, while misrepresenting those interests to both the public and to policymakers as the common good’ (ibid).

Fake news, conspiracy theories and invented content are circulating through user-generated social media platforms like Facebook, Twitter and Instagram, amongst others, and are given impetus by new affordances such as algorithmic tracking, trolls and bots, and big data. These developments can be linked to the language, logic and practices of public relations (ibid). Adding to the critique, Anne Cronin argues that, in a ‘commercial democracy’, PR has come to displace, in part, the social contract, ‘with alternative promises of representation, voice, and agency’ (2018: 23). These are valuable appraisals of the occupational domain of public relations; however, the pairing of AI and Gen-AI with PR to form ‘digital communications’ opens an undescribed discursive space somewhere on the continuum between ‘organised lying’ and democratic ideals. Global digital communication firms like Meltwater, Cision and Onclusive use technologies that are problematic because they have a legitimising effect on PR that deflects attention from its ideological, libertarian roots and market-based political agenda, at the same time as producing ethically empty knowledge on an industrial scale, tied to this worldview. The next section discusses the research approach in examining these ‘next generation’ consortiums that offer data-driven services in PR.

Methodological approach

In approaching the research, I undertook a critical analysis of the self-presentation of Meltwater, Cision and Onclusive as text producers, noted their distribution networks and considered the reception intended on their websites. This included general descriptive background, service and product descriptions, user testimonials and onsite media releases. To gain a broader context of how these companies were represented outside the company, I used a Google search for each, appending the terms: AI + communication, AI + public relations, Gen-AI + communication.

The textual analysis draws on a three-part framework that looks at how language patterns reveal relations of power and control (Fowler et al. 1979):

  1. What language is selected and signified, and what words direct or embody a ‘specific view – or theories – of reality’; in other words, how do these ‘cut up the world’? (ibid: 1).

  2. What are the variations, and what are the convergences between the different sites and what gives rise to them? (ibid: 2).

  3. How is language within these individual sites articulated within the context of ‘the perpetuation of social relationships which underpin them’? (ibid: 3).


Meltwater Software Company ‘All-In-One Solution’ is not immediately identifiable as ‘PR’ despite offering a range of products and services that are commonly linked to the occupation, such as crisis communication, risk management and media intelligence and monitoring. Describing itself as providing a ‘suite of solutions’ by ‘harnessing the world’s data in real time to unlock our customer’s competitive edge’, Meltwater promotes its values as ‘fun’, being ‘number one’, ‘respect’ and ‘more’ by which they mean ‘the drive to improve’ (Meltwater 2023). Despite the emphasis on social media analytics and data, Meltwater attempts to appeal to conventional PR and communications expectations, for example: ‘As a modern PR and communications professional your role has been rapidly evolving. Today these seats hold high responsibilities that include traditional and non-traditional media, both on and offline’ (ibid).

By talking about ‘solutions’ rather than a focus on ‘problems’, the framing of Meltwater’s website is non-adversarial, yet the reader is positioned in an unspoken oppositional relationship either with a critic or the public. However, more explicit references to the challenges to which it alludes are present, for example: ‘Don’t let negative sentiment have a negative impact on you. Monitor mentions of your brand, executive team, campaigns and hashtags in real-time.’ Taking this one step further and navigating through the treacherous waters of ‘public opinion’, ‘The Meltwater Academy’ offers free courses on ‘media monitoring’ and ‘crisis and issue management’ amongst others. In this, Meltwater’s remit appears well beyond its stated purpose as a ‘software company’ and its promotional text assumes that mounting a PR campaign is an unproblematic, neutral activity: ‘Unlike traditional advertising or marketing, a PR campaign is designed simply to get people talking. It’s a way to put your brand in the spotlight, drum up some press, and make a lasting impression on your audience.’

Meltwater lists many clients including the PR and communication arms of the theological organisation Seventh-Day Adventist Church, multi-national Coca-Cola, asset manager SimCorp, and investment banking company Saxo Bank, amongst others (Meltwater 2023). A firehose of media intelligence, as well as social listening and data analytics, is also supplied to global PR agencies such as Publicis, Ogilvy and Hill & Knowlton (2023).


Cision: PR Software Platform & Marketing Solutions, like Meltwater, frames the interpretation of its services around dissolving or expunging problems through, for example, an ‘influencer outreach solution’, a ‘global monitoring solution’ and an ‘optimal solution’. All these statements elide the political and ethical implications of communicative interventions, and all assume an entitlement on the part of the client and its service provider to wield discursive power and control. Significantly, Cision incorporates ‘Cision: PR Newswire’, a media company specialising in press or media release distribution to ‘share your story’ (Cision 2023). The website states that only ‘PR Newswire can get your news to 200,000+ newsrooms and direct feeds – 180,000+ journalist and influencer inboxes and 9,000+ web sites and digital media outlets. Trusted media pros count on us’ (ibid).

More broadly, the Cision website deploys market-based meaning embedded in language that resonates with democratic themes, like ‘empowerment’, ‘narrative’ and ‘story’. For example, Cision is ‘empowering brand narratives’ and ‘strategically navigating today’s complex media’ (ibid). Similarly, Cision forms partnerships with clients where ‘stories’ are ‘shaped’, given ‘reach’ and ‘amplified’. To augment its services, Cision offers a ‘Cision Story Kit’ with the tag line: ‘PR pitches can leave a lot to be desired. Cision Story Kit updates the humble pitch for a world of modern storytelling.’

Cision makes a specific reference to Gen-AI and PR through ‘sentiment analysis’. In a tutorial-style page, Cision advises potential client PR practitioners that, when composing a press release:

Ask the AI to draft a release based on a set of bullet points summarizing a new product launch, a corporate announcement or business initiative. Make sure you provide the information the AI will not know, for example what the product might be called or the launch date. If the first response isn’t quite right, provide feedback to improve the AI’s understanding of what’s required. You can give extra context, or provide guidance around tone and length, even ask it to generate a quote from a specific person to include in the release.

Despite this, there is an instance where Cision counsels clients to exercise a level of oversight when surrendering agency and deferring this task to a machine thinking for itself because ‘AI still tends to “hallucinate” facts, including research reports and authors that don’t exist. So be extra vigilant if you’re asking an AI to create text from simple prompts as opposed to re-formatting or summarizing text you’ve provided’ (ibid). However, rather than a concern about ethics, this appears to be little more than advice to mitigate reputational risk.


Onclusive’s homepage greets the user with the tag ‘On your side: Let’s make sense of it all’. Once again, and in a similar way to both Cision and Meltwater, an oblique reference to an opponent or adversary is made. Another shared feature is the liberal use of the term ‘empower’, with its roots in democratic theory, to describe ‘the world’s leading brands and agencies’. The idea of ‘modernising’ communications is a recurring theme across the three sites, as is ‘AI powered sentiment analysis’, whereby clients can ‘receive real-time alerts when peaks of negative sentiment activity, keywords topics, or hashtags arise’:

Identify individuals and key media involved with the spread of negative sentiment news. Create a customized crisis detection dashboard in minutes. Know who triggered the crisis, where it spread, and the impact of the actions you took to prevent it (Onclusive 2023).

Like Meltwater and Cision, Onclusive features client testimonials. Third party endorsements are from ‘Sophos’, ‘Edmunds’ and ‘ConnectWise’ and their representative communications managers, who extol the value and virtue of ‘robust data’, ‘metrics’ and ‘automated platforms’ for PR work (Onclusive 2023).

Splitting the human: ‘The rise of the machine’

Meltwater, Cision and Onclusive’s websites show that two reductive but dominant interpretations of ‘digital communications’ are at work. The first is ‘disinformation’, which is framed as malicious and unethical; the second is ‘data’ or ‘information’, which is framed as the normal provision of a service to clients. The void between these two framings, in both the spectrum of ideas and vocabulary, raises critical questions about the adequacy of the ethical frameworks used to ground thinking, action and practice in relation to the casual application of Gen-AI. This section explores the way this distinction serves to deepen the everyday depoliticising effects of the digital communications industry and highlights the need to develop new categories in this space.

There is little doubt that personalised, human-like Gen-AI can reduce workloads and improve efficiency in an array of fields like teaching, health and transportation (Dhoni 2023). However, the need to safeguard against the effects of ‘predatory technology’ that threaten inclusive and rational public debate and the forging of democratic political agendas cannot be ignored (Eisikovits 2023). These effects may take the form of echo chambers and disinformation like ‘fake news’ that is polarising and distorting (Unver 2018; Schippers 2020; Fernandez 2023). Another concern is where AI becomes an active ‘conscious, self-aware or sentient’ machine with its own agenda (Eisikovits 2023). Nonetheless, Eisikovits (ibid) argues the idea of sentience, such as an AI virtual assistant having a love affair, is less important than the common inclination to fetishise and anthropomorphise AI, which is embedded in the normal business of providing data and services to digital communication clients. The sites examined often show a two-fold anthropomorphism of AI bonded to ‘brands’ that achieves a numbing effect, deflecting attention from any latent damaging features of the technology. Meltwater, Cision and Onclusive routinely refer to ‘brands’ as embodied with human-like qualities to the extent that they have stories, a presence and even the agency to invest and ‘provide solutions to improve quality of life’ (Cision 2023). At the same time, brands also need ‘safeguarding’, ‘protecting and defending’ when they are under attack (ibid). Meltwater extends the metaphorical exchange by stating that it is important ‘to get an accurate pulse of your brand health’ (Meltwater 2023). Cision has a running banner stating that its services include ‘empowering brand narratives’ (Cision 2023). Similarly, this two-fold anthropomorphising is evident in an Onclusive page titled: ‘How brands benefit from exuding real, human emotion’ (Onclusive 2023).
In a discussion of the merits of ChatGPT, Cision cautions clients to check the work before it is sent on to social media because ‘AI still tends to “hallucinate”12 facts, including research reports and authors that don’t exist’ (Cision 2023). These distortions of language are not captured by definitions of ‘disinformation’, nor are they a neutral form of ‘information’.

Geoff Boucher’s work on the Habermasian public sphere sheds light on the industry penchant for anthropomorphism and elicits possible implications for politics and the social imaginary. Boucher argues that rational critical debate in the full Habermasian sense rests on the idea of inter-subjectivity within public dialogue. However, Habermas saw that ‘the main problem confronting the critique of ideology today is not ideological false consciousness but the fragmentation of subjectivity’ (Boucher 2021: 151). Taking up the idea of fractured subjectivities in society, Boucher identifies a three-part typology of pathological defence mechanisms: repression, disavowal and projection.13 Of interest to this study of public language, power and politics is the notion of ‘disavowal’, which involves self-contradiction, as ‘the ego simultaneously affirms and denies its attachment to a prohibited symbolization of a need-interpretation’ (ibid: 151).

This strategy of disavowal as self-contradiction, which makes it impossible to argue with, is manifest on all three sites. For example, Meltwater proclaims: ‘Misinformation & disinformation: What is it and how do you protect your brand against it?’ (2023). Similarly, Cision provides advice about how to ‘track sentiment: Sentiment analysis can help you quickly identify negative mentions of your brand’ (Cision 2023). Onclusive also suggests the ‘constant misinformation being spread is causing even real news to be perceived as fake in many instances that can be remedied by “social listening” to manage PR crises, inform content, & identify influencers’ (Onclusive 2023). These opaque, obstructive language strategies, insistently applied, that deny participation are neither ‘disinformation’ nor ‘information’, and show how these categories are both inadequate and outmoded.

Scratch the surface and there are other troubling discursive dynamics set in motion by the routine production of data and information in digital communications. On the one hand, these ‘next generation’ PR companies Meltwater, Cision and Onclusive give prominence to a lurking violence of misinformation and disinformation; on the other hand, they point to the only admissible solution, which is even more AI, data collection, monitoring and listening. The enclosure of ideas about the threat of disinformation within the company purview is significant because it means that there is no solution outside this forum. Each company’s rhetoric of ‘inclusion’ pinpoints an adversarial relationship between organisation and stakeholders, posed by an outside threat that can only be resolved within that setting. This suggests participants in the site are led to a circular ‘cul de sac’ of ideas or ‘intra’-subjectivity rather than to the open stimulus of Habermasian inter-subjectivity. This, in turn, could hollow out rational critical debate by reducing confrontation and elevating the authority of the institutional site (Boucher 2021: 154), a duality that works against, rather than for, any radical reflection. I am not arguing that all forms of Gen-AI lead to ruin and catastrophe; rather, that we must acknowledge the outmoded ways in which ideas and language are working in relation to digital communication, which require new points of critique.

Algorithmic news + algorithmic politics

The relationship of generative AI and PR to news production is of interest in this study, especially in the gaining of legitimacy through public opinion, which for Habermas is a crucial site for upholding democratic ideals (1995). Historically, global PR firms such as Hill & Knowlton placed great weight on cultivating relationships with the media industry and journalists by preparing a news product: a media or press release that would ‘pre-formulate’ or package ‘a story’ from a position of organisational self-interest. This ‘pre-formulating’ of a story involves the interpretation of organisational self-interest through the lens of independent and objective news value or ‘newsworthiness’ (Sissons 2012: 274). The reproductive cultural potential of commercial self-interest channelled through media releases is discussed by Talita Mollerup-Degn (2020: 19-20), who argues that they should be seen as ‘an expression of social processes and practice and as being part of the construction of social identity, systems of knowledge or influential to political actions’.

Arguably, Gen-AI will play an increasing role in pre-formulating news that sets political agendas within socially complex conditions. The act of distributing a press release on a large scale will occur within the regressive context of ‘the corporatization of the media, the rise of manipulative versions of public relations, the advent of commodity aesthetics, the entrenchment of the special interests of powerful groups and the transformation of mass political formations into party apparatuses staffed by professional apparatchiks’ (Boucher 2021: 61). In relation to this, Boucher (ibid) argues that inter-subjectivity, by which he means open discussion, inclusive debate and the engagement of critique, is important to uphold because reason ‘is social, and it unmasks social unreason ... the “suffering” this causes is protected by ideological distortions’ and, moreover, is emancipatory since ‘it exposes these distortions, points towards equality and provides an alternative to coercion’ (Boucher 2023: 2).

It follows that digital communication firms that produce media or press releases are ideological: they construct ‘news’ as a reality, via discursive intervention to shape public opinion at the behest of special interests, and present closed ideas designed to subdue or discourage inter-subjectivity. Adding to this, the text producers of the media or press release suppress authorship, because to reveal it would expose the institutional sites that have camouflaged ‘news’ as independent or objective. This opacity is further complicated by Gen-AI, because ‘human-machine communication’ is not ‘owned’ in the traditional sense (Guzman and Lewis 2020).

In tandem, changes to newsroom practices in neoliberal conditions have seen an increasingly uncritical or passive approach to incorporating media releases into the published product, ‘with journalists failing to corroborate material they are receiving from sources’ (Sissons 2012: 292). Meltwater, Cision and Onclusive all offer media or press release services, which are political intercessions, as if the act carried no ethical or other responsibilities. Meltwater not only offers instruction on preparing a media pitch, analysing audiences and utilising its database of ‘over 400,000 contacts’ but lists global media company News Corp as one of its clients. Cision specialises in this practice, with a discrete service: PR Newswire. This nexus between large media consortiums and next generation PR companies like Meltwater is significant because, as Amanda Meade reported (2023): ‘News Corp Australia is producing 3,000 articles a week using generative artificial intelligence ... a team of four staff use the technology to generate thousands of local stories each week on weather, fuel prices and traffic conditions.’14 It is the bond between data-driven PR consortiums like Meltwater, Cision and Onclusive and the global news and technology platforms that inform public debate and ultimately policy direction which is increasingly complicated and ethically fraught. This is informed by the sheer scale and scope of the global players and their reach, which ‘will likely lead to more dangerous and unnerving developments in algorithmic politics in comparison to what democratic or authoritarian states may or may not do with AI’ (Unver 2018: 15).

Communication: The ‘next century is already here’15

Companies like Meltwater, Cision and Onclusive are conferred with authority and trusted in post-neoliberal contexts to guide the way to digital prosperity. In his 1989 book The new realities, US management theorist Peter F. Drucker speculated about the transformative potential of information, moving ‘the capital investment analysis from opinion into diagnosis’ (Drucker 1989: 201-202). Information and data in these digital communications companies are endowed with ‘relevance and purpose’ that has just this reifying, apolitical effect (ibid: 202). The opening statement on Meltwater’s home page reads: ‘You’re surrounded by billions of datapoints. Break through the noise with ...’ followed by a rolling adjunct in the following order: ‘Media Intelligence, Consumer Intelligence, Social Listening, Influencer Marketing, Social Media Management, Sales Intelligence, Media Relations, Data and API (Application Programming Interface) Integration. Meltwater.’ The fixed statement on Cision’s homepage is the ‘Strategic Communicator’ and the rolling adjunct includes these five groups: ‘Targets & Cultivates, Monitors & Understands, Measures & Verifies, Protects and Responds, Distributes & Amplifies.’ Onclusive’s homepage contains a single statement: ‘On your side: Let’s make sense of it all’, albeit with a clickable ‘Explore’ button.

Meltwater’s, Cision’s and Onclusive’s extensive online services suggest that they are the ‘new elites’ that Finlayson claimed ‘can accelerate technological innovation and institute the order of artificial intelligences which it sees as our destiny and salvation’ (2021: 171-172). Yet the rise of this ‘next generation’ PR in the form of digital communications appears to be overlooked as a site for critique. One reason for this may be adherence to the theoretical traditions established by Habermas (1994), which focus on how communicative ‘action’ establishes reductive language conditions that invisibly promote the role of markets and profit-making in contemporary society. However, the rise of Gen-AI is taking place within a highly depoliticised public sphere, shaped by complex forms of disinformation and misinformation.

Therefore, an interesting twist is to focus on ‘digital communication’ both as an action-based product of industry, public relations and advertising, and as a market-based, self-referential power structure. Luhmann (1992) described such systems as ‘autopoietic’, marking a shift from the action-based idea of communication, as a product of individual action directed at a ‘goal’, towards the conditions that allow communication to grow. For Luhmann, this self-organisation and autopoiesis is a ‘profound revision of the conceptual framework of communication’ that opens the field to far more investigation. These firms, then, both construct and fortify ‘communication’ as a self-organising, autopoietic system, a framework quite distinct from action-based models that warrants further investigation. Such research might seek to work out how we can live with the new technology in a way that addresses problems of the past and anticipates what may take place in the contexts that have now come to be.


The review of ‘next generation’ digital communication services offered by Meltwater, Cision and Onclusive revealed traces of coercive PR industry discourse, at odds with the representation of data and information as benign and apolitical. Humanistic Gen-AI and data are characterised as politically neutral, and this non-ideological appearance serves to distract from the privileges and potential of the manifest ‘live’ persuasive corporate agendas at play. At the same time, these digital communications companies are conferred with an elevated authority that, through persistent self-contradiction strategies such as disavowal, numbs confrontation and eschews scrutiny through tactics such as anthropomorphism. In doing so they assimilate seamlessly into a futuristic free-market culture in which organised lying is normalised (Bourne 2019; Edwards 2021).

The research also reveals a gap in the language and ideas available to capture an expanded notion of data as political ‘information’, distinct from ‘disinformation’, an agenda that is not actively supported by industry players such as Meltwater, Cision and Onclusive. Quite the opposite. Nonetheless, the imperative to do so is demonstrated by news production services, which increasingly fall within the purview of information-based digital communications. It also shows the need to review action-based communication theories in light of the way ‘PR’ is morphing, self-perpetuating and reproducing through a culture that supports a limited set of reified market-based ideas that mimic modes of inter-subjectivity. New vocabularies must emerge that respond to these changed conditions and provide a foothold from which to hold digital communications firms to account for the services they supply in the chain that shapes public opinion.

This paper argues that Gen-AI’s propulsion through the services of global digital communication consortiums and ‘next generation’ PR is not only a largely unacknowledged site of cultural production with political implications but is also accompanied by a profound level of conceptual and theoretical confusion. More work is required to uncover and understand the dynamics of language, power and politics and how they interact with the neoliberal cultural conditions that, I have argued, characterise contemporary socio-political life in opaque and counterintuitive ways (Demetrious 2022).


1 Luciano Floridi (2023) uses an interesting metaphor to differentiate between old Artificial Intelligence (AI) and new ‘generative’ Artificial Intelligence (Gen-AI) like ChatGPT, Bard and LLaMA. Old AI simply delivered back words in the dictionary but ‘today, artificial intelligence (AI) manages the properties of electromagnetism to process texts with extraordinary success and often with outcomes that are indistinguishable from those that human beings could produce. These AI systems are the so-called large language models (LLMs), and they are rightly causing a sensation’

2 According to Luciano Floridi (2023) Large Language Models (LLMs) ‘are like that trickster: they gobble data in astronomical quantities and regurgitate (what looks to us as) information’

3 QUT, the top-ranked Australian university in media and communication, has no mention of PR in the website promotion of its Master of Digital Communication. Rather than PR, it states it provides students with skills to: ‘Analyse and create data-driven compelling stories via diverse social media platforms (Facebook, Instagram, Twitter and WeChat). Develop a deep understanding of the digital creative economy and how platforms and artificial intelligence are transforming the media and communication environment. Learn how to use and build tools to deliver advanced data visualisation and analysis of audience behaviour’ (QUT 2023)

4 The Center for Media and Democracy (CMD 2023) is a progressive not-for-profit corporate investigator

5 ALEC describes itself as ‘America’s largest non-partisan organization of state legislators dedicated to the principles of limited government, free markets and federalism’ (2023)

6 SaaSworthy is a software review and comparison site which describes Meltwater as ‘a cloud-based public relation software’ company that ‘comes enriched with a myriad of public relation features like article author database, clippings management, digital asset management etc.’ Available online at

7 ‘Next Generation’ is a headline term used to describe Cision’s communications. Available online at generation-cision-communications-cloud/

8 Reuters (2016) reported that Cision, ‘won US anti-trust approval to buy press release distributor PR Newswire’. Available online at

9 In a Forbes review, John Hall (2023) writes: ‘Onclusive tracks and analyzes media coverage, social media engagement, website traffic, and other key metrics. These hard numbers can better enable you to make data-driven decisions and assess the value of PR to your organization’

10 Ong and Cabañes (2018, 14) argue: ‘The work structure of digital disinformation reveals a professionalized and normalized hierarchy headed by disinformation architects who are culturally embedded in the promotional industries, and are dependent on the complicity of professional elites in advertising and PR as well as underpaid digital and creative workers’

11 PR Watch, a quarterly publication of the Center for Media & Democracy, is dedicated to investigative reporting on the public relations industry. It serves ‘citizens, journalists and researchers seeking to recognize and combat manipulative and misleading PR practices’. See

12 The idea of AI ‘hallucinating’ is a common reference point in the emerging discourse that denotes childlike power and the potential to misuse it. See ai-hallucinations/_

13 Geoff Boucher identifies the work of Anna Freud (1967) as the source for this typology, although he states that her work originally identified nine pathological defence mechanisms (2021: 151)

14 The growing reliance of journalists on data-driven communication industries such as Meltwater, Cision and Onclusive is not confined to News Corp Australia which publishes the Australian, Herald-Sun and Daily Telegraph. According to the Australian Newspaper History Group (2022), many other large media companies are using data-driven measurement and PR ‘news products’. For example, ‘The Guardian Australia has a tool called Ophan that is used to boost audiences while The Age uses a platform called My Story’ (ibid)

15 Peter Drucker’s 1989 book, The new realities, which focuses on the intersections of language, culture, technology and commerce, begins the Preface by speculating that, while we yearn to know what is to come (the future), it is already here. He explicitly rejects the idea that the book is concerned with ‘futurism’ and argues it is about ‘realities’


Center for Media and Democracy (2023) Available online at https://www.sourcewatch.org/index.php/Center_for_Media_and_Democracy

Cision (2023) AI and the future of comms teams. Available online at https://www.cision.com/resources/webinars-and-events/ai-future-comms-teams/

Meltwater (2023) See

Onclusive (2023) Onclusive launches AI-driven global sentiment analysis. Available online at com/resources/blog/ai-driven-sentiment-analysis/

Queensland University of Technology (QUT) (2023) Master of Digital Communication. Available online at elections


Armiak, David (2023) ALEC ditches free market and limited government principles at its 50th annual meeting, Exposed by CMD, The Center for Media and Democracy. 26 July. Available online at market-and-limited-government-principles-at-its-50th-annual-meeting/, accessed on 28 August, 2023

Australian Newspaper History Group (2022) Newsletter, No 118, 18 July. Available online at au/__data/assets/pdf_file/0008/1227761/ANHG-Newsletter-No.-118-July-2022.pdf

Boucher, G. (2021) Habermas and literature: The public sphere and the social imaginary, Bloomsbury Publishing USA

Bourne, C. (2019) AI cheerleaders: Public relations, neoliberalism and artificial intelligence, Public Relations Inquiry, Vol. 8, No. 2 pp 109-125

Cronin, A.M. (2018) Public relations capitalism: Promotional culture, publics and commercial democracy, Cham, Switzerland, Palgrave Macmillan

Davies, W. and Gane, N. (2021) Post-neoliberalism? An introduction, Theory, Culture & Society, Vol. 38, No. 6 pp 3-28. Available online at https://doi.org/10.1177/02632764211036722

Demetrious, K. (2008) The object of public relations and its ethical implications for late modern society – a Foucauldian analysis, Ethical Space: The International Journal of Communication Ethics, Vol. 5, No. 4 pp 22-31

Demetrious, K. (2013) Public relations, activism and social change: Speaking up, New York and Abingdon, Oxon, Routledge

Demetrious, K. (2022) Public relations and neoliberalism: The language practices of knowledge formation, Oxford, Oxford University Press

Dhoni, P. (2023) Unleashing the potential: Overcoming hurdles and embracing generative AI in IT workplaces: Advantages, guidelines, and policies, TechRxiv, preprint. Available online at

Dwivedi, Y.K., Hughes, L., Ismagilova, E., Aarts, G., Coombs, C., Crick, T., Duan, Y., Dwivedi, R., Edwards, J., Eirug, A. and Galanos, V. (2021) Artificial Intelligence (AI): Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy, International Journal of Information Management, Vol. 57 p. 101994

Edwards, L. (2021) Organised lying and professional legitimacy: Public relations’ accountability in the disinformation debate, European Journal of Communication, Vol. 36, No. 2 pp 168-182. DOI: 10.1177/0267323120966851

Eisikovits, N. (2023) AI isn’t close to becoming sentient – the real danger lies in how easily we’re prone to anthropomorphize it, The Conversation, 15 March. Available online at danger-lies-in-how-easily-were-prone-to-anthropomorphize-it-200525

Fernandez, M. (2023) Fakery and confusion: Campaigns brace for explosion of AI in 2024, Politico, 18 June. Available online at campaigns-ai-2024-00102463, accessed on 28 August 2023

Finlayson, A. (2021) Neoliberalism, the alt-right and the intellectual dark web, Theory, Culture & Society, Vol. 38, No. 6 pp 167-190

Floridi, L. (2023) AI as agency without intelligence: On ChatGPT, large language models, and other generative models, Philos. Technol, Vol. 36, No. 15. Available online at

Fowler, R., Hodge, B., Kress, G. and Trew, T. (1979) Language and control, London, Routledge

Fried, I. (2023) How AI will turbocharge misinformation – and what we can do about it, Axios, 10 July. Available online at response-measures, accessed on 22 September 2023

Guzman, A.L. and Lewis, S.C. (2020) Artificial intelligence and communication: A human-machine communication research agenda, New Media & Society, Vol. 22, No. 1 pp 70-86

Habermas, J. (1994) Three normative models of democracy, Constellations, Vol. 1, No. 1 pp 1-10

Hall, J. (2023) 7 PR tools that will help you earn and track media attention, Forbes, 14 July. Available online at that-will-help-you-earn-and-track-media-attention/?sh=25e5df0a6150, accessed on 1 September 2023

Jansen, S.C. (2017) Stealth communications: The spectacular rise of public relations, London, John Wiley & Sons

Luhmann, N. (1992) What is communication?, Communication Theory, Vol. 2, No. 3 pp 251-259

Meade, A. (2023) News Corp using AI to produce 3,000 Australian local news stories a week, Guardian. Available online at news-corp-ai-chat-gpt-stories

Miller, D. and Dinan, W. (2003) Global public relations and global capitalism, Demers, D.D. (ed.) Terrorism, globalization and mass communication (Papers presented at the 2002 Center for Global Media Studies Conference), Spokane, Marquette Books pp 193-214

Mirowski, P. (2015) Postface: Defining neoliberalism, Mirowski, P. and Plehwe, D. (eds) The road from Mont Pèlerin: The making of the neoliberal thought collective, Cambridge, MA and London, Harvard University Press pp 417-455

Mollerup-Degn, T. (2020) The power of words: A critical discourse analysis of governmental media releases from Australia and Nauru, Malmö University. Available online at

Mourtzis, D., Angelopoulos, J. and Panopoulos, N. (2023) The future of the human- machine interface (HMI) in society 5.0, Future Internet, Vol. 5, No. 5 pp 162-187

Ong, J.C. and Cabañes, J.V.A. (2018) Architects of networked disinformation: Behind the scenes of troll accounts and fake news production in the Philippines, Scholarworks, UMassAmherst. Available online at

Robinson, W.I. and Harris, J. (2000) Towards a global ruling class? Globalization and the transnational capitalist class, Science & Society, Vol. 64, No. 1 pp 11-54

Schippers, B. (2020) Artificial intelligence and democratic politics, Political Insight, Vol. 11, No. 1 pp 32-35. Available online at

Thun, L. (2023) More than a buzzword: Leveraging AI in day-to-day work, Forbes, 11 July. Available online at ai-in-day-to-day-work.html, accessed on 31 July 2023

Unver, A. (2018) Artificial intelligence, authoritarianism and the future of political systems. EDAM Research Reports. Available online at abstract=3331635

Wach, K., Duong, C.D., Ejdys, J., Kazlauskaitė, R., Korzynski, P., Mazurek, G., Paliszkiewicz, J. and Ziemba, E. (2023) The dark side of generative artificial intelligence: A critical analysis of controversies and risks of ChatGPT, Entrepreneurial Business and Economics Review, Vol. 11, No. 2 pp 7-24

Note on the contributor

Dr Kristin Demetrious is an Associate Professor of Communication at Deakin University, Australia. She is the author of Public relations and neoliberalism: The language practices of knowledge formation (OUP, 2022) and Public relations, activism and social change (Routledge, 2013). Currently a joint editor for Public Relations Inquiry (Sage), she has researched the relations of power in public relations for over twenty years, with particular interest in post-war US public relations, Australian political communication, risk producing industries’ use of PR, PR workplaces and gender, ethics and ethical theory, activism, civil society and social change, as well as social media.
