First Monday

A corpo-civic space: A notion to address social media's corporate/civic hybridity by Carolina Are



Abstract
This article proposes a way to understand the spatial hybridity of social media spaces such as Facebook and Instagram, constructed between a corporate entity and a civic space. Switching the main poles of third space theory to represent ‘corporate’ and ‘civic’ spaces, this essay compares Facebook/Instagram to similar off-line spaces in order to propose that they constitute a ‘corpo-civic’ space. In doing so, it provides recommendations for fairer moderation of user content posted on these platforms, based on international human rights standards and ethics that already exist off-line.

Contents

I. Introduction
II. Off-line spatial hybridity
III. Social media spatial hybridity
IV. A corpo-civic space
V. Discussion
VI. Conclusion

 


 

I. Introduction

This essay compares and contrasts social networking platforms such as Facebook/Instagram to off-line spaces in order to provide a possible solution towards understanding their spatial hybridity, constructed between a corporate entity and a civic space. It does so by applying third space theory [1] to social media [2], switching the theory’s main poles to represent ‘corporate’ and ‘civic’ spaces (Humphreys, 2007; Oldenburg, 1999; Svensson, 2018). While social networks describe themselves as ‘platforms’ (Ball, 2018; Gillespie, 2010; Zuckerberg, 2018), they have been widely conceptualised as virtual or a new set of ‘spaces,’ where people meet and interact (Burkell, et al., 2014; Svensson, 2018; White and Le Cornu, 2011). In comparing off-line spaces presenting similar spatial hybridity to social media, this essay advances the idea of ‘corpo-civic’ social networking spaces in order to provide a framework for their governance not unlike that of similar off-line areas.

Other articles before this pointed out social networking platforms’ role as civic engagement tools (Sloan and Quan-Haase, 2017; Smith, et al., 2014), while also noting their hybridity and the interests at stake in moderating what is posted on them (Baym and boyd, 2012; Bartlett, 2018; Gillespie, 2010; Kaye, 2019; Paasonen, et al., 2019). Yet, no clear attempt has been made so far to move from mere description of that hybridity to defining it in order to provide possible solutions for these spaces’ future governance. As will become apparent in this essay, social media would benefit from better relationships between the governed and the governing, abandoning a top-down approach in favour of more involvement with citizens’ activities (Foucault, 1980) and introducing more government regulation to ensure social networks embody much needed public values. This essay explores the idea that online and social media spaces like Facebook/Instagram are neither a civic nor a corporate space, but a ‘corpo-civic’ space instead.

At the time of writing — Spring 2020 — nation-states worldwide have implemented shutdown measures to protect their citizens from the coronavirus, with a subset of white-collar workers compelled by the global and public emergency to work remotely and social interactions taking place online (Paul, 2020). As a consequence, the entire social space is rendered digital by lockdowns. This has intensified a variety of censorship, privacy and harassment concerns.

Already before the pandemic, social networking platforms were accused of wrongly censoring content that they considered dangerous for their community and their business. Since 2019, Instagram’s murky account deletion and censorship processes such as the ‘vaguely inappropriate content policy’ [3], known amongst users as the ‘shadowban’, have affected athletes, educators, artists, sex workers, the LGBTQIA+ community and people of colour (Cook, 2020). Through this policy, Instagram has been hiding posts it deems sexually suggestive from its Explore page and, in August 2019, it had to apologise to pole dancers [4], carnival dancers [5] and other communities for wrongly censoring their posts (Are, 2019b; Taylor, 2019). Instagram denied intentional targeting of specific communities [6], arguing content and hashtags were moderated “in error” (Are, 2019b). At the time of writing, the platform has either denied that certain censorship techniques exist [7] (Cook, 2020) or refused to provide any further clarity about the reasoning behind them (Are, 2019a, 2019b; Cook, 2020; Kaye, 2019; Paasonen, et al., 2019). The ‘shadowban’ may take new shapes and forms now, at a time when online visibility is the only means of expression and communication allowed to most users [8].

With regards to privacy, video conferencing app Zoom [9] — currently one of the most used services for remote working during the pandemic — has been raising a variety of concerns about its handling of user data and chat infiltrations known as “Zoombombing” [10] (Lorenz and Alba, 2020; Paul, 2020). Additionally, news that the United States’ Center for Disease Control and Prevention (CDC) is looking at mobile advertising industry data to analyse people’s movements in the midst of the pandemic [11] worried users about companies weaponising their data against them for prediction and control purposes (Hern, 2020b). Meanwhile, social media users continue to be trolled online, even when they are sharing their journey of surviving the virus [12] (Mohdin, 2020).

Precisely because the entire social, cultural and political space has now been rendered digital by COVID-19 lockdowns, it is necessary to address how social networking platforms’ spatial hybridity — in between a civic and a corporate space — can be used in a way that does not harm public spaces and their users’ rights. This article proposes a new definition for social media spaces such as Facebook and Instagram. To do so, section II describes off-line spaces presenting a similar spatial hybridity; in section III, social media’s own spatial hybridity is compared to that of off-line spaces; and finally, in section IV, third space theory frames social media spaces as ‘corpo-civic’ spaces, providing recommendations for fairer moderation of user content posted on these platforms, based on international human rights standards.

 

++++++++++

II. Off-line spatial hybridity

This essay applies different notions and instances of off-line spatial hybridity to social networks, and to Facebook/Instagram in particular. These spaces present a spatial conundrum, as they are intended for public use but are owned by private corporations and/or their stockholders. However, before discussing the nature of this spatial hybridity, it is necessary to briefly outline what civic spaces look like off-line, stating what the public can expect from a civic space outside of social media, and to show what off-line spatial hybridity looks like.

Off-line, civic spaces are areas that are intended to be accessible to the public and to bring people together. Moeckli states that a place is public if it is accessible to everyone, meaning that equality — the rule that “everyone has the same rights to access and use public space” — is an inseparable element of the notion of public space [13]. For Humphreys, public or civic spaces are “nondomestic physical sites that are distinguished by their relative accessibility, such as dance clubs, parks, restaurants, bars, cafes, laundromats, and the street” [14]. Privacy International [15] (2019), too, views civic spaces as “settings where people formulate ideas, discuss them with like-minded people and groups, raise dissenting views, consider possible reforms, expose bias and corruption, and organise to advocate for political, economic, social, environmental, and cultural change.” For these spaces to be public and shared, their inhabitants need to follow rules and laws established and enforced by institutions (Firmino and Duarte, 2016). However, a variety of authors have argued that, increasingly, traditional, off-line civic spaces have become privately owned, presenting an ownership conundrum. As a prologue to the discussion of ‘corpo-civic’ hybrid social media spaces, the following section focuses on the mid- and late-twentieth-century emergence of shopping malls, gated communities and public areas governed through private security, in order to showcase different instances of spatial hybridity not unlike the one presented by social media platforms.

Both Sorkin (1992) and Crawford (1992) talk about the increasing ‘mallisation’ of the United States in the 1990s, with cities becoming a succession of malls, department stores and chains. Crawford, in particular, argues that changes in American city planning focusing heavily on spaces encouraging consumption have meant that public areas have become a commodity. She argues that shopping malls repackaged the city into a safer and cleaner suburban hub, turning these privately owned spaces into “a community and social center” [16]. Because of their corporate ownership and this focus on consumption, shopping malls present a spatial hybridity: they can be seen as a semi-public space, as they are an accessible area for public gathering, but owned by a private entity with an aim to sell goods and experiences.

Another example of off-line spaces presenting a spatial hybridity can be found in gated communities. Gated communities are housing developments or residential areas “with restricted access that makes normally public spaces private,” and feature walls, fences, guarded entrances and physical barriers [17]. For Ergun and Kulkul, “gated communities” exemplify the conflict between public and private zones in civic spaces, with the term signifying “a semi-public space which lies within the dichotomy of public and private space and is accepted as a buffer zone in which the attributes and uses of all types of spaces are interwoven” [18]. For them, the gated community is pseudo-public for its residents to the exclusion of non-residents, reproducing a common conflict in urban space “that incorporates the concepts of both public and private areas” [19].

A further example of a spatial hybrid is presented by civic or public spaces governed and surveilled through private security — be that through private security firms, or surveillance cameras — in what Firmino and Duarte call a “third territorial layer” [20]. The authors draw attention to the “unnegotiated” gaze of cameras, which are owned by unknown private actors and look over spaces that society expects to be under State control, with the tacit acceptance and support of the State [21]. Sorkin, too, mentions an “obsession with ‘security’” in spaces such as malls, resulting in increasing “manipulation and surveillance” of citizens, changing the role of city planning from integrator to segregator that excludes and/or herds undesirables [22].

The spatial hybridities presented by the above spaces raise questions about corporations’ and private entities’ interests in public life and in citizens’ everyday life, creating spaces in between a civic and a corporate entity. Indeed, these spaces are often accessible to everyone, but imply both interaction with peers and consumption or use of corporate services and goods, or at the very least interaction with a private actor. This brings Rose to argue that citizenship no longer means mere interaction with the State, but that it entails “active engagement in a diversified and disperse variety of private, corporate and quasi-corporate practices, of which working and shopping are paradigmatic” [23].

The securitisation of off-line public spaces — be they privately owned, or privately surveilled — and its resulting exclusion of ‘undesirables’ are, for Moeckli (2016), preventive rather than punitive measures. While he argues that the State and private entities are and should be allowed to exclude people from spaces on a legal basis — e.g., when a crime is committed, or when a rule is broken — measures that exclude people from public space on a preventive basis without legal justification “interfere with (a range of) fundamental rights” [24]. These preventive measures are often directed against people who are perceived to be a risk, such as youths, foreigners and the like, in a clear violation of human rights that can make public spaces increasingly exclusionary [25].

Despite the privatisation and securitisation of off-line public spaces, however, the private businesses governing them still have a responsibility towards society, and are expected to follow rules set out by governments and to respect people’s rights.

Human rights are protected by the rule of law at an international level, as stated by the Universal Declaration of Human Rights (United Nations, 1948). Laws are a prerequisite for upholding liberty and equality: “By requiring authorities to act in accordance with laws declared publicly in clear terms in advance, the rule of law enables people to plan and act as autonomous rational beings and thus to exercise their liberties” [26]. Therefore, public spaces — whether they are privately owned or not — are expected to be governed by a clear set of rules equally applied to everyone, excluding people only once those rules are violated [27].

Aside from being subject to national and international law, private enterprises can be expected to have responsibilities and obligations towards society (Wulfson, 2001). For Griffin and Prakash (2014), this corporate responsibility manifests itself in a variety of ways: not only through philanthropy, but also through paying more than minimum wage, providing health care benefits, creating retirement funds or educational opportunities for employees, pollution abatement and the like. Brunk (2012), too, talks of “consumer perceived ethicality” in relation to a brand’s image, stating that consumers perceive brands and corporate entities as ethical when they abide by the law, avoid doing harm and have a positive impact on the community.

Off-line hybrid spaces are therefore still expected to follow a set of rules and behave ethically. However, as will become apparent in the following section, similar responsibilities and duties are not always expected of social media — and therefore online — spaces presenting these spatial hybridities.

 

++++++++++

III. Social media spatial hybridity

Social networking companies seem to prefer to define themselves as communications utilities or technology companies (Ball, 2018; Zuckerberg, 2018). They have been viewed as broadcast tools to voice one’s freedom of expression, a space for political discussion, connected with elections and political campaigns and movements such as the Arab Spring, #MeToo movement, #BlackLivesMatter and #OccupyWallStreet (Sloan and Quan-Haase, 2017; Smith, et al., 2014), offering “an opportunity for marginalised people to represent themselves” [28]. The founder of Facebook, Mark Zuckerberg (2018), said he started working in the technology sector as “it can be a democratizing force for putting power in people’s hands.”

Yet, a variety of authors disagree with the notion of social networks as a mere platform or utility. Social media’s hybridity became quickly apparent, initially with regards to the viewership and accessibility of the content posted on them, in what Baym and boyd defined a “conundrum of visibility” [29]. They argued that these platforms “complicate the very nature of public life” as they “blur boundaries between presence and absence, time and space, control and freedom, personal and mass communication, private and public, and virtual and real” [30].

The hybridity of social networking platforms is exemplified by Gillespie’s (2010) argument that the word ‘platform’ is nothing other than a smart business strategy by social media companies, allowing them to appeal to users, creators and advertisers while also maintaining enough freedom from policy-makers and evading the responsibility to regulate content. For Gillespie, the idea of the ‘platform’ performs a quadruple duty: it fits with the egalitarian idea of giving everyone a voice, positions platforms as facilitators with no motive other than making content available, and also makes them appealing to advertisers as hosts for their content too. Finally, the term is also valuable in legal environments, positioning platforms as neutral with regards to the content posted on them, “a vehicle for art rather than its producer or patron, where liability should fall to the users themselves” [31]. Platforms also allow for the sale and trading of goods (Busch, et al., 2018), and can be viewed as an online marketplace, similar to a mall or a department store.

For the purpose of this essay, social networking platforms will be understood as virtual spaces. For Burkell, et al., publicness is about space: something is public “if it occurs in a space (real or virtual), where there can be no expectation of freedom from observation by others,” because by existing in that space citizens surrender any claim to privacy [32]. White and Le Cornu [33] (2011), too, write that people ‘meet’ on social media, creating “an impression of location and of social space”.

However, social media present a spatial hybridity similar to the aforementioned off-line spaces, where different actors come into play. Authors such as Burkell, et al. (2014) and Svensson (2018) have already pointed out their spatial hybridities, referring to social networks and their connected subcultures as a ‘liminal civic space’. For Svensson, the liminality is somewhere in-between acceptability and unacceptability of content posted (Svensson, 2018); for Burkell, et al., social media spaces “occupy a liminal territory between ‘open’ and ‘closed’” to users from different groups [34].

This essay wishes to shift the conversation about social media’s hybridity away from visibility and towards the expressive and ownership tensions within their spaces. In 2012, Baym and boyd predicted that social media were going to “redefine publicness” as we know it [35]. While the authors were referring to posts’ visibility, this essay argues that social media’s redefinition of publicness has materialised through the tension arising from their function in public life paired with their corporate ownership. Indeed, audiences are no longer doubting the ‘publicness’ of posting content on social media: they assume that the information posted on social networking platforms can travel with no boundaries, resulting in users losing control of their content whether they are posting from a space they consider private or from a public one — unless they deliberately decide to set their accounts to ‘private’, only for their contacts to see (Burkell, et al., 2014; Instagram, n.d.).

Despite the similarities that social networking platforms present with off-line spatially hybrid spaces, they are not governed in the same way, with comparable rules and safeguards, partly because of the fast, exponential growth they experienced and partly due to issues, breaches and abuse happening in different jurisdictions (Hardaker and McGlashan, 2016). This has produced a variety of crucial governance issues, three of which this essay will address: monopolies, human rights issues and lack of clarity towards users, both in how data is collected and used and in how content is censored.

A variety of authors have been voicing concerns about the concentration of social media ownership in a few hands. For van Dijck, et al. (2019) [36], the companies attracting the main worries about concentration are Alphabet-Google, Amazon, Facebook, Apple and Microsoft (GAFAM), which they call ‘The Big Five’, an elite of gatekeepers in digital markets. Bartlett warns about private companies making decisions based on “shareholder interest, or the political views of the founders” [37], while Kaye, too, writes that:

“Today, a few private companies, driven to expand shareholder value, control social media. And yet the rules of speech for public space, in theory, should be made by relevant political communities, not private companies that lack democratic accountability and oversight. If left alone, the companies will gain ever greater power over expression in the public sphere.” [38]

Concerns over monopolies become even greater when the social media algorithms governing what audiences see come into the picture. Greenfield (2013) [39] cautioned against one-size-fits-all autonomous systems such as algorithms set by private businesses regulating wider civic spaces and resources: indeed, if a handful of private, corporate companies run social networks, the same algorithms may be applied to the bulk of content automatically, without the use of human judgement for determining context or nuance, in order to preserve corporate interests. Similarly to off-line preventive exclusion measures, algorithms are targeted to help companies limit financial damage or improve returns, and they can work on a pre-emptive basis, without using active, real-time human judgement, “applying machine-learning algorithms to historic data to infer and thereby predict future behavior” [40].

Social networking platforms have also been accused of restricting user rights such as freedom of expression. In order to limit the damage from problematic content being posted on them, they have introduced community guidelines, enforced through algorithmic moderation which, as already mentioned above, has meant that a handful of platforms apply the same moderation techniques to the majority of online content (Kaye, 2019; Paasonen, et al., 2019; van Dijck, et al., 2019; Gillespie, 2010). At times, this has resulted in algorithms replicating or even amplifying off-line inequality and enforcing unfair censorship (Kaye, 2019; Kumar, 2019; Paasonen, et al., 2019).

Social media infrastructure is replicating off-line privilege and discrimination, particularly of the gendered and racial kind, echoing early social critiques of the Internet (Harvey, 2019; Lawson, 2018; Iandoli and Norris, 1997). Even before Facebook became a permanent fixture in our lives, Iandoli and Norris (1997), for instance, warned about information overloads and the possibility that the Internet and its by-products would replicate off-line inequalities. Paasonen, et al., too, write that women’s strategies to minimise risks off-line (e.g., policing how and how often they communicate) are being re-adopted online, and that “the Web rapidly reproduced and retrenched gender, sexed, and raced power relations” [41].

A further issue arising from social networking platforms’ spatial hybridity is the lack of clarity in social media moderation. The corporate companies owning social media have so far not shone any particular light on their judgement to allow or remove content: they offer no case law on platform guidelines, nor any reasoning behind their decisions (Kaye, 2019). Kumar (2019) [42], too, writes that while platforms like YouTube do tweak their rules each time a crisis emerges, they miss “any formalised process of stakeholder participation” in deciding what content stays or goes off-line. This raises “critical questions about precarity of creator labour and the exploitative nature of the relationship between platforms and ‘produsers’” [43] and increases doubts over the idea that social media platforms’ main aim was to give users a voice [44].

The algorithms that run social media have been routinely deemed as opaque, with a variety of authors drawing attention to the lack of clarity and consistency in moderation (Bartlett, 2018; Kaye, 2019; Paasonen, et al., 2019; Kumar, 2019). For instance, Paasonen, et al. write that there is an “instability in how, and with what kinds of motives, things get tagged and flagged, and how ensuing boundaries of acceptability are being drawn.” [45]

While in the 1970s and 1980s so-called ‘techno-utopians’ were sceptical towards any form of state regulation of the Internet, the economic, political and social relevance of the medium and of its intended and unintended by-products created a tension between the need for regulation and the idea of the Internet as a decentralised structure (Busch, et al., 2018). Although the space is now regulated nationally and internationally (Busch, et al., 2018), social media regulation still has to catch up with the medium’s exponential growth.

Similar to how off-line spaces saw private corporations enter public areas through malls and/or private security and insurance, corporate entities are in charge of our data, speech and content (Bartlett, 2018; Kaye, 2019). However, off-line spaces are governed through a set of laws and subject to public expectations of ethics and responsibility, and exclusion from them needs to be consistent with the law and with respecting people’s human rights. Instead, social networking sites are currently governed by a set of in-platform laws made by private businesses that exclude ‘undesirable’ users and leave individual users in precarious charge of their own safety (Kaye, 2019; Kumar, 2019; Paasonen, et al., 2019), in an exclusion from public space that can be compared to medieval ‘banishment’, or sending people away from a specified area [46].

Due to the above issues, therefore, this essay argues that there is a discrepancy between the regulation of off-line hybrid spaces and the regulation of similar social media spaces, to the detriment of users’ rights and of fair and consistent social media governance.

 

++++++++++

IV. A corpo-civic space

This essay advances a possible solution towards fairer governance of hybrid spaces such as social media by adapting third space theory to the Facebook/Instagram space, providing recommendations based on ethics and on international human rights standards to govern said spaces.

Spatially ambiguous spaces have been described by previous authors as ‘third’ spaces or places in a variety of contexts, mainly in relation to a buffer zone between what is understood as a space for work and as citizens’ homes. For Oldenburg, for instance, third places are public spaces beyond work or home where individuals can interact and meet informally: “The third place is a generic designation for a great variety of public spaces that host the regular, voluntary, informal, and happily anticipated gatherings of individuals” [47]. These spaces are, for the author, crucial to the development of communities and the strengthening of society. If we consider these third places or spaces as still public and civic areas — and therefore as areas that are still under government control — then the entrance of these private, corporate actors in our civic life adds a new problematic layer to spatial governance.

Both Humphreys (2007) and Svensson (2018) apply third space theory to explain the spatial characteristics of social media platforms, stating third spaces are public spaces that host gatherings beyond home and work, where people are on a level playing field and discuss topics informally without branding the spaces as political settings. They argue that third spaces — both online and off-line — are an essential part of sociality. While this essay accepts some elements of third space theory for social media — their ‘otherness’ compared to work or home and their essential nature towards social interactions — it also updates this theory for the post-pandemic age of social media giants.

‘Digital’ third space theory can be understood differently during the coronavirus pandemic. Indeed, while social media platforms do provide a ‘buffer space’ between different modes, these modes are no longer ‘work’ and ‘home’, precisely because, as Baym and boyd (2012) argue, social media blur the boundaries between public and private and the entire working day has to be carried out from home, which becomes a work space and which is no longer fully private. Therefore, the discussion should no longer be focused on whether these platforms are public or private in terms of visibility of what is posted on them, or about the dynamics of a third space in between ‘home’ and ‘work’: it should instead shift to reconciling the platforms’ public function with the private nature of their ownership, and to the rules and expectations these companies should follow in the space they have created.

This essay therefore wishes to adapt third space theory to our current context, switching the poles of ‘work’ and ‘home’ with ‘corporate’ and ‘civic’ spaces [48]. In doing so, it wishes to advance a view of social media platforms as ‘corpo-civic’ spaces, to create a balance between platform duties and user expectations at a time when social networks are both a social and a work space, owned and administered by private corporations (Figure 1).

 

Figure 1: Corpo-civic spaces, 2020.
 

If a civic space is like a “public square” accessible to everyone (Moeckli, 2016; Smith, et al., 2014), a space for people to express their own freedom of speech (Kaye, 2019), and a setting where users formulate ideas, discussing them with both like-minded people and those who disagree (Privacy International, 2019), social networks definitely present elements of the description of civic spaces.

Yet, these platforms are also corporate entities, owned by corporations profiting from users’ data, looking to appeal to advertisers and limit the damage of bad publicity (Kaye, 2019; Paasonen, et al., 2019; Bartlett, 2018; Gillespie, 2010). Therefore, this piece argues that, as private companies, they have to follow the set of norms, rules and laws applicable to off-line businesses presenting similar spatial hybridity. Considering that off-line public spaces used by citizens for civic purposes are supposed to be governed in a way that respects citizens’ human rights, social media spaces should behave ethically, agree to be regulated through national and/or transnational agreements and use international human rights standards for content moderation instead of preventively banishing undesirable users.

Similar to actors ruling off-line hybrid spaces, social networking companies should be expected to behave ethically. Iandoli and Norris (1997) already predicted the current situation, providing guidelines to prevent social networking and Internet companies from behaving unethically. These guidelines include building accessible infrastructure to prevent the creation of further social divides and creating easy-to-use technology (Iandoli and Norris, 1997). Greenfield (2008), too, devised ethical guidelines for computing and related systems, stating that these systems should be harmless to users, that they should be transparent with regards to their ownership and clear with customers about when these systems are operating on them or their content; he also added that users should be able to opt out of these systems, although this particular principle is now unlikely to be realised in the age of social media, especially during the coronavirus pandemic. Therefore, if off-line businesses are expected to behave ethically, and to be transparent and clear when making decisions about their customers, it would be fair to say that similar expectations should be required of social networking platforms.

What is more, like off-line businesses, social media should be respecting international human rights law and the rights of their customers as quasi-citizens (Binns, 2019; Greenfield, 2013, 2008; Kaye, 2019; Yeung, 2018). Kaye (2019), in particular, states that human rights standards should be part of social media content moderation norms. He argues that social networks should promote user agency and diversity, and that they should be transparent, decentralising their decision-making to promote freedom of expression (Kaye, 2019). He adds that social media companies should be subject to industry-wide oversight, and to government control.

The importance of human rights guidelines is recognised beyond nation-state jurisdictions. Article 10 of the European Convention on Human Rights, for instance, states that everyone has the right to freedom of expression and information subject to restrictions that are “in accordance with law” and “necessary in a democratic society” (European Court of Human Rights, 2010). Following the logic of Article 10 and previous case law, publishing content that is shocking or offensive (within reason) is still within the European Convention on Human Rights’ scope (European Court of Human Rights, 2010; Selected judgments of the European Court of Human Rights, 1997, “Case of Oberschlick [no. 2] v. Austria”).

Considering that in off-line corpo-civic spaces the exclusion or punishment of citizens without legal reason constitutes a violation of human rights, this essay draws on ethics and human rights law to define users’ rights and companies’ responsibilities within a corpo-civic space on social media. Therefore, this piece argues that in the corpo-civic spaces social media users currently find themselves in, they should be able to:

  1. Post different, challenging, shocking opinions without being censored if they do not infringe or limit other people’s rights;
  2. Share content that is within community guidelines even if it is ‘borderline’;
  3. Not be discriminated against according to gender, race, sexual orientation, religion and the like, according to Article 14 of the European Convention on Human Rights (European Court of Human Rights, 2010);
  4. Receive accurate, thorough and fair explanations on why and how their content is being used or moderated;
  5. Appeal or opt out of decisions made about them if they think they are unfair.

The above points would be more consistent with international human rights standards governing other types of speech and forms of expression, and would more successfully deliver on social media founders’ promise to give users a platform that empowers them (Bartlett, 2018; Zuckerberg, 2018). In order to deliver on this promise while maintaining their role as private companies, social media giants operating in a corpo-civic space should:

  1. Avoid harming and discriminating against their users;
  2. Be transparent about their decisions, moderation teams and bias;
  3. Provide as much clarity as possible to both individual users and to the wider world about the ins and outs of their moderation and data usage, allowing for repeated appeals if necessary;
  4. Work with user communities to introduce moderation that is fair and diverse and represents as many perspectives as possible;
  5. Promote accountability about their actions and decisions with users, the industry and governments.

 

++++++++++

V. Discussion

This article has applied ‘third space theory’ to social media platforms with a focus on ownership, comparing off-line businesses with social networking giants with the aim of providing a framework to reconcile platforms’ spatial hybridity between a public space and a corporate entity. The notion of social media platforms as a corpo-civic space has considered the fact that, although social networking platforms are currently owned by a handful of private corporations, said corporations find themselves performing certain civic space functions, such as providing a space for people to discuss and debate ideas, and to organise to create movements (Kaye, 2019; Privacy International, 2019; Sloan and Quan-Haase, 2017).

However, in proposing the notion of social media as a corpo-civic space, this article does not propose turning social networking platforms into not-for-profit organisations. It only states that, at present, the concentration of ownership of the main social networking platforms in too few hands is raising the above-stated concerns about social media moderation (Kaye, 2019; Paasonen, et al., 2019; van Dijck, et al., 2019), and that such concerns need to be addressed from the position society currently finds itself in — namely, using a space with civic characteristics largely run by private companies at a time when the entire social space has moved online.

To this end, and following concerns about social media governance and moderation, this article argues it is not sustainable for social networking platforms to maintain their power and legitimacy in democracies without admitting to their civic role. While Facebook recently announced the creation of an independent oversight board, made up of journalists, judges and politicians and aimed at promoting freedom of expression on the platform [49], the issue of social media governance goes beyond one social network and needs tighter — although transparent, clear and fair — government regulation to ensure recommendations are implemented [50] (Hern, 2020a). As both Kaye (2019) and Paasonen, et al. (2019) state, this is a democracy problem: it is therefore necessary for governments to break monopolies to prevent similar, ineffective and discriminatory styles of moderation from being the only way in which social media companies govern their spaces.

Of course, the idea of corpo-civic spaces assumes that users would have access to said spaces, and is limited to areas where citizens can benefit from the civic nature of social media. Furthermore, it is necessary to note that different countries, with different human rights standards and political systems, understand social spaces and their relationship with citizens’ freedoms differently. However, precisely because of this, a shared international understanding of social networking spaces and of the rules and regulations they should follow is needed. This article argues that the definition it proposes would be a fair compromise between the status quo and fairer moderation conditions, striking a balance between social media corporations’ business interests and user rights.

Finally, the idea of social networks as a corpo-civic space presents an obvious temporal limitation: both community guidelines governing social networking platforms and government legislation related to them are in constant, international flux. Therefore, the notion of a corpo-civic space only works until governments or international law decide to define social networking platforms as a different type of space.

 

++++++++++

VI. Conclusion

To conclude, in the words of David Kaye, writing about social media giants:

“Today, a few private companies, driven to expand shareholder value, control social media. And yet the rules of speech for public space, in theory, should be made by relevant political communities, not private companies that lack democratic accountability and oversight. If left alone, the companies will gain ever greater power over expression in the public sphere.” [51]

This article hopes that referring to social networking companies as a corpo-civic space — owned and administered largely by private corporations while also performing a civic function, such as ensuring healthy public debate and enabling people to organise and create movements — will contribute towards possible new ways of understanding the ever-evolving space of social media moderation.

Defining social networks as a corpo-civic space takes note of the status quo and of current controversies in moderation, while proposing a state-regulated moderation to uphold users’ human rights and social networks’ initial and self-stated mission to put power in people’s hands.

 

About the author

Carolina Are is a third-year Ph.D. student at the City University of London, focusing on online abuse, conspiracy theories, disinformation, content moderation and algorithm bias.
E-mail: Carolina [dot] Are [at] city [dot] ac [dot] uk

 

Notes

1. Third spaces or places are areas between work and home where informal gatherings take place. For Oldenburg: “The third place is a generic designation for a great variety of public spaces that host the regular, voluntary, informal, and happily anticipated gatherings of individuals” (Oldenburg, 1999, p. 16).

2. Humphreys, 2007, at http://dx.doi.org/10.1111/j.1083-6101.2007.00399.x.

3. https://techcrunch.com/2019/04/10/instagram-borderline/.

4. https://bloggeronpole.com/2019/07/instagram-apologises-to-pole-dancers-about-the-shadowban/.

5. https://www.vice.com/en_in/article/7xg5dd/instagram-apologises-for-blocking-caribbean-carnival-content.

6. https://bloggeronpole.com/2019/07/instagram-denies-censorship-of-pole-dancers-and-sex-workers/.

7. https://www.huffingtonpost.co.uk/entry/instagram-shadow-banning-is-real.

8. Are, 2019a, 2019b; Constine, 2019; Paasonen, et al., 2019, p. 62.

9. https://www.theguardian.com/technology/2020/apr/02/zoom-technology-security-coronavirus-video-conferencing.

10. https://www.nytimes.com/2020/04/03/technology/zoom-harassment-abuse-racism-fbi-warning.html.

11. https://www.theguardian.com/technology/2020/apr/02/experts-warn-of-privacy-risk-as-us-uses-gps-to-fight-coronavirus-spread.

12. https://www.theguardian.com/world/2020/mar/20/dont-take-any-chances-warning-of-woman-with-covid-19-shared-online.

13. Moeckli, 2016, p. 320.

14. Humphreys, 2007, p. 344.

15. https://privacyinternational.org/long-read/2852/protecting-civic-spaces.

16. Crawford, 1992, p. 23.

17. Blakely and Snyder, 1998, p. 53.

18. Ergun and Kulkul, 2019, p. 777.

19. Ibid.

20. Firmino and Duarte, 2016, p. 747.

21. Firmino and Duarte, 2016, p. 745.

22. Sorkin, 1992, p. xiii. Nikolas Rose, too, states how administering the “marginalia” and identifying risky individuals to exclude has become a key part of controlling public space (Rose, 2000, p. 333).

23. Rose, 2000, p. 327.

24. Moeckli, 2016, p. 139.

25. Moeckli, 2016, p. 139; Brown, 2013.

26. Moeckli, 2016, p. 131.

27. Ibid.

28. Vivienne, 2016, p. 10.

29. Baym and boyd, 2012, p. 322.

30. Baym and boyd, 2012, p. 320.

31. Gillespie, 2010, p. 358.

32. Burkell, et al., 2014, p. 977.

33. https://firstmonday.org/article/view/3171/3049.

34. Burkell, et al., 2014, p. 975.

35. Baym and boyd, 2012, p. 328.

36. https://policyreview.info/articles/analysis/transnational-materialities.

37. Bartlett, 2018, p. 147.

38. Kaye, 2019, p. 112.

39. https://urbanomnibus.net/2013/10/against-the-smart-city/.

40. Yeung, 2018, p. 508.

41. Paasonen, et al., 2019, p. 144.

42. https://policyreview.info/articles/analysis/algorithmic-dance-youtubes-adpocalypse-and-gatekeeping-cultural-content-digital.

43. Kumar, 2019. The author uses the term “produsers” to capture the dual role of social media users, who produce the content that helps social media platforms to sell and, essentially, to exist.

44. https://policyreview.info/articles/analysis/algorithmic-dance-youtubes-adpocalypse-and-gatekeeping-cultural-content-digital.

45. Paasonen, et al., 2019, p. 6.

46. Moeckli, 2016, p. 68.

47. Oldenburg, 1999, p. 16.

48. This essay switches the poles of third space theory from ‘work’ and ‘home’ to ‘corporate’ and ‘civic’ because of the hybrid nature of spaces in between the corporate and the civic, such as social media. Indeed, moving on from the ‘public’ versus ‘private’ debate related to a person’s visibility or privacy within a space, this essay applies the ‘thirdness’ element of said liminal spaces in between two different settings to the tensions arising from these spaces’ ownership, as the governance issues caused by corporate actors overseeing a civic space are now more relevant to governing social media than mere understandings of privacy.

49. https://www.theguardian.com/technology/2020/may/06/facebook-oversight-board-freedom-expression-helle-thorning-schmidt-alan-rusbridger.

50. Government regulation of speech is of course relative depending on the State in question and on political systems, and can be perceived as a form of censorship. However, for the purposes of this essay, and with regards to the ‘corporate’ and ‘civic’ tension pointed out in social media settings, it would be fair to say that a form of transparent, clear and fair government control would be more appropriate than leaving social media governance to private interests.

51. Kaye, 2019, p. 52.

 

References

Carolina Are, 2019a. “Instagram denies censorship of pole dancers and sex workers,” Bloggeronpole.com (23 July), at https://bloggeronpole.com/2019/07/instagram-denies-censorship-of-pole-dancers-and-sex-workers/, accessed 6 April 2020.

Carolina Are, 2019b. “Instagram apologises to pole dancers about the shadowban,” Bloggeronpole.com (31 July), at https://bloggeronpole.com/2019/07/instagram-apologises-to-pole-dancers-about-the-shadowban/, accessed 6 April 2020.

James Ball, 2017. Post-truth: How bullshit conquered the world. London: Biteback Publishing.

Nancy K. Baym and danah boyd, 2012. “Socially mediated publicness: An introduction,” Journal of Broadcasting & Electronic Media, volume 56, number 3, pp. 320–329.
doi: https://doi.org/10.1080/08838151.2012.705200, accessed 21 May 2020.

Reuben Binns, 2019. “Human judgement in algorithmic loops: Individual justice and automated decision-making,” SSRN (11 September), at https://ssrn.com/abstract=3452030, accessed on 8 April 2020.
doi: http://dx.doi.org/10.2139/ssrn.3452030, accessed 21 May 2020.

Edward J. Blakely and Mary Gail Snyder, 1998. “Separate places: Crime and security in gated communities,” In: Marcus Felson and Richard B. Peiser (editors). Reducing crime through real estate development and management. Washington, D.C.: Urban Land Institute, pp. 53–70.

Donna Marie Brown, 2013. “Young people, anti-social behaviour and public space: The role of Community wardens in policing the ‘ASBO generation’.” Urban Studies, volume 50, number 3, pp. 538–555.
doi: https://doi.org/10.1177/0042098012468899, accessed 21 May 2020.

Katja H. Brunk, 2012. “Un/ethical company and brand perceptions: Conceptualising and operationalising consumer meanings,” Journal of Business Ethics, volume 111, number 4, pp. 551–565.
doi: https://doi.org/10.1007/s10551-012-1339-x, accessed 21 May 2020.

Jacquelyn Burkell, Alexandre Fortier, Lorraine (Lola) Yeung Cheryl Wong and Jennifer Lynn Simpson, 2014. “Facebook: Public space, or private space?” Information, Communication & Society, volume 17, number 8, pp. 974–985.

Andreas Busch, Patrick Theiner and Yana Breindl, 2018. “Internet censorship in liberal democracies: Learning from autocracies?” In: Julia Schwanholz, Todd Graham and Peter-Tobias Stoll (editors). Managing democracy in the digital age: Internet regulation, social media use, and online civic engagement. Cham, Switzerland: Springer, pp. 11–28.
doi: https://doi.org/10.1007/978-3-319-61708-4_2, accessed 21 May 2020.

Jesselyn Cook, 2020. “Instagram’s CEO says shadow banning ‘is not a thing.’ That’s not true,” Huffington Post (25 February), at https://www.huffingtonpost.co.uk/entry/instagram-shadow-banning-is-real, accessed 7 April 2020.

John Constine, 2019. “Instagram now demotes vaguely ‘inappropriate’ content,” TechCrunch (10 April), at https://techcrunch.com/2019/04/10/instagram-borderline/, accessed 21 May 2020.

Margaret Crawford, 1992. “The world in a shopping mall,” In: Michael Sorkin (editor). Variations on a theme park: The new American city and the end of public space. New York: Hill and Wang, pp. 3–30.

Ayça Ergun and Ceren Kulkul, 2019. “Defining semi-public space: A case study in the gated communities of Yaşamkent, Ankara,” Turkish Studies, volume 20, number 5, pp. 776–793.
doi: https://doi.org/10.1080/14683849.2018.1556565, accessed 21 May 2020.

European Court of Human Rights, 2010. “European convention on human rights, as amended by Protocols Nos. 11 and 14, supplemented by Protocols Nos. 1, 4, 6, 7, 12, 13 and 16,” at https://www.echr.coe.int/Documents/Convention_ENG.pdf, accessed 5 April 2020.

Rodrigo Firmino and Fabio Duarte, 2016. “Private video monitoring of public spaces: The construction of new invisible territories,” Urban Studies, volume 53, number 4, pp. 741–754.
doi: https://doi.org/10.1177/0042098014567064, accessed 21 May 2020.

Michel Foucault, 1980. Power/knowledge: Selected interviews and other writings, 1972–1977. Edited and translated by Colin Gordon. New York: Pantheon Books.

Tarleton Gillespie, 2010. “The politics of ‘platforms’,” New Media & Society, volume 12, number 3, pp. 347–364.
doi: https://doi.org/10.1177/1461444809342738, accessed 21 May 2020.

Adam Greenfield, 2013. “Against the smart city.” Urban Omnibus (23 October), at https://urbanomnibus.net/2013/10/against-the-smart-city/, accessed 21 April 2020.

Adam Greenfield, 2008. “Some guidelines for the ethical development of ubiquitous computing,” Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, volume 366, number 1881 (31 July), pp. 3,823–3,831.
doi: https://doi.org/10.1098/rsta.2008.0123, accessed 21 May 2020.

Jennifer J. Griffin and Aseem Prakash, 2014. “Corporate responsibility: Initiatives and mechanisms,” Business & Society, volume 53, number 4, pp. 465–482.
doi: https://doi.org/10.1177/0007650313478975, accessed 21 May 2020.

Claire Hardaker and Mark McGlashan, 2016. “‘Real men don’t hate women’: Twitter rape threats and group identity.” Journal of Pragmatics, volume 91, pp. 80–93.
doi: https://doi.org/10.1016/j.pragma.2015.11.005, accessed 21 May 2020.

Alison Harvey, 2019. “Tits or GTFO: The aggressive architecture of the Internet,” Flowjournal.org (24 May), at http://www.flowjournal.org/2019/05/tits-or-gtfo-the-aggressive-architecture-of-the-internet-alison-harvey-university-of-leicester/, accessed 6 April 2020.

Alex Hern, 2020a. “Journalists, politicians and judges to sit on Facebook’s free speech panel,” Guardian (6 May), at https://www.theguardian.com/technology/2020/may/06/facebook-oversight-board-freedom-expression-helle-thorning-schmidt-alan-rusbridger, accessed 7 May 2020.

Alex Hern, 2020b. “Experts warn of privacy risk as US uses GPS to fight coronavirus spread,” Guardian (2 April), at https://www.theguardian.com/technology/2020/apr/02/experts-warn-of-privacy-risk-as-us-uses-gps-to-fight-coronavirus-spread, accessed 1 May 2020.

Lee Humphreys, 2007. “Mobile social networks and social practice: A case study of dodgeball.” Journal of Computer Mediated Communication, volume 13, number 1, pp. 341–360.
doi: http://dx.doi.org/10.1111/j.1083-6101.2007.00399.x, accessed 20 April 2020.

Ce Ce Iandoli and Wendy Norris, 1997. “Contradictions and asides: A social critique of the Internet,” Journal of Technology Studies, volume 23, number 2, pp. 35–41.
doi: http://dx.doi.org/10.21061/jots.v23i2.a.5, accessed 21 May 2020.

Instagram, n.d. “Controlling your visibility,” at https://help.instagram.com/116024195217477, accessed 18 May 2020.

David Kaye, 2019. Speech police: The global struggle to govern The Internet. New York: Columbia Global Reports.

Sangeet Kumar, 2019. “The algorithmic dance: YouTube’s Adpocalypse and the gatekeeping of cultural content on digital platforms,” Internet Policy Review, volume 8, number 2.
doi: http://dx.doi.org/10.14763/2019.2.1417, accessed 3 April 2020.

Caitlin E. Lawson, 2018. “Platform vulnerabilities: Harassment and misogynoir in the digital attack on Leslie Jones,” Information, Communication & Society, volume 21, number 6, pp. 818–833.
doi: https://doi.org/10.1080/1369118X.2018.1437203, accessed 21 May 2020.

Taylor Lorenz and Davey Alba, 2020. “‘Zoombombing’ becomes a dangerous organized effort,” New York Times (3 April), at https://www.nytimes.com/2020/04/03/technology/zoom-harassment-abuse-racism-fbi-warning.html, accessed 1 May 2020.

Daniel Moeckli, 2016. Exclusion from public space: A comparative constitutional analysis. Cambridge: Cambridge University Press.
doi: https://doi.org/10.1017/CBO9781316650875, accessed 21 May 2020.

Aamna Mohdin, 2020. “Woman who filmed coronavirus warning receives online abuse,” Guardian (20 March), at https://www.theguardian.com/world/2020/mar/20/dont-take-any-chances-warning-of-woman-with-covid-19-shared-online, accessed 1 May 2020.

Ray Oldenburg, 1999. The great good place: Cafés, coffee shops, bookstores, bars, hair salons, and other hangouts at the heart of a community. New York: Marlowe.

Susanna Paasonen, Kylie Jarrett and Ben Light, 2019. #NSFW: Sex, humor, and risk in social media. Cambridge, Mass.: MIT Press.

Kari Paul, 2020. “‘Zoom is malware’: Why experts worry about the video conferencing platform.” Guardian (2 April), at https://www.theguardian.com/technology/2020/apr/02/zoom-technology-security-coronavirus-video-conferencing, accessed 1 May 2020.

Privacy International, 2019. “Protecting civic spaces” (1 May), at https://privacyinternational.org/long-read/2852/protecting-civic-spaces, accessed 8 April 2020.

Nikolas Rose, 2000. “Government and control,” British Journal of Criminology, volume 40, number 2, pp. 321–339.
doi: https://doi.org/10.1093/bjc/40.2.321, accessed 21 May 2020.

Selected judgments of the European Court of Human Rights, 1997. “Case of Oberschlick (no. 2) v. Austria,” International Journal of Human Rights, volume 1, number 3, pp. 89–91.

Luke Sloan and Anabel Quan-Haase (editors), 2017. Sage handbook of social media research methods. London: Sage.
doi: https://dx.doi.org/10.4135/9781473983847, accessed 21 May 2020.

Marc A. Smith, Lee Rainie, Ben Shneiderman and Itai Himelboim, 2014. “Mapping Twitter topic networks: From polarized crowds to community clusters,” Pew Research Center (20 February), at https://www.pewresearch.org/internet/2014/02/20/mapping-twitter-topic-networks-from-polarized-crowds-to-community-clusters/, accessed 4 April 2020.

Michael Sorkin (editor), 1992. Variations on a theme park: The new American city and the end of public space. New York: Hill and Wang.

Göran Svensson, 2018. “Social media as civic space for media criticism and journalism hate.” In: Julia Schwanholz, Todd Graham and Peter-Tobias Stoll (editors). Managing democracy in the digital age: Internet regulation, social media use, and online civic engagement. Cham, Switzerland: Springer, pp. 201–221.
doi: https://doi.org/10.1007/978-3-319-61708-4_11, accessed 21 May 2020.

Sharine Taylor, 2019. “Instagram apologises for blocking Caribbean carnival content,” Vice (31 July), at https://www.vice.com/en_in/article/7xg5dd/instagram-apologises-for-blocking-caribbean-carnival-content, accessed 5 April 2020.

United Nations, 1948. “Universal Declaration of Human Rights” (10 December), at https://www.un.org/en/universal-declaration-human-rights/, accessed 21 May 2020.

José van Dijck, David Nieborg and Thomas Poell, 2019. “Reframing platform power,” Internet Policy Review, volume 8, number 2.
doi: https://doi.org/10.14763/2019.2.1414, accessed 21 May 2020.

Sonja Vivienne, 2016. Digital identity and everyday activism: Sharing private stories with networked publics. Basingstoke: Palgrave Macmillan.
doi: https://doi.org/10.1057/9781137500748, accessed 21 May 2020.

Myrna Wulfson, 2001. “The ethics of corporate social responsibility and philanthropic ventures,” Journal of Business Ethics, volume 29, numbers 1–2, pp. 135–145.
doi: https://doi.org/10.1023/A:1006459329221, accessed 21 May 2020.

Karen Yeung, 2018. “Algorithmic regulation: A critical interrogation,” Regulation & Governance, volume 12, number 4, pp. 505–523.
doi: https://doi.org/10.1111/rego.12158, accessed 21 May 2020.

Mark Zuckerberg, 2018. “A blueprint for content governance and enforcement,” Facebook (15 November), at https://www.facebook.com/notes/mark-zuckerberg/a-blueprint-for-content-governance-and-enforcement/10156443129621634/, accessed 6 April 2020.

 


Editorial history

Received 14 April 2020; revised 8 May 2020; revised 14 May 2020; accepted 20 May 2020.


Creative Commons License
This paper is licensed under a Creative Commons Attribution 4.0 International License.

A corpo-civic space: A notion to address social media’s corporate/civic hybridity
by Carolina Are.
First Monday, Volume 25, Number 6 - 1 June 2020
https://firstmonday.org/ojs/index.php/fm/article/download/10603/9549
doi: http://dx.doi.org/10.5210/fm.v25i6.10603