First Monday

Misuse or misdesign? Yik Yak on college campuses and the moral dimensions of technology design by Qinglan Li and Ioana Literat

Yik Yak, a location-based, anonymous social media app, has been gaining negative attention as a platform that often gives voice to bullying, racism and sexism on college campuses across the country. Integrating research on digital anonymity and cyberbullying, this paper analyzes the key features of Yik Yak and discusses the ethical dimensions of technology design, as illustrated by the Yik Yak case study. Based on this analysis and integrating previous research findings on interaction in digital spaces, we conclude by providing a set of guidelines for integrating ethical considerations into the process of designing social apps, and offer a few directions for further research in this area.


Theoretical background
Yik Yak and the socio-ethical dimensions of technology design
Towards a more ethical design process: Recommendations for mobile app design




In late 2014, a series of student demonstrations against racial intolerance were carried out at Colgate University — an elite liberal arts college in Hamilton, New York, with a largely white student body — prompted in part by racist messages posted on the anonymous social media app Yik Yak. The student organization behind this protest, the Association of Critical Collegians (ACC), held a campus sit-in for three straight days, provided a forum for minority students to share their feelings and personal stories, and presented a proposed action plan to the college administrators (Stone and Kingkade, 2014). On the surface, the demonstrations felt like a massive success: a quarter of the entire student body attended the sit-in and the school administration responded with 21 measures to be implemented. However, a parallel protest emerged on Yik Yak, where anonymous users in the campus community disparaged the sit-in, posting and upvoting increasingly racist and vicious messages targeting minority students at Colgate. Soon enough, the messages on Yik Yak started to become personal and violent, mentioning key ACC members by name and threatening them with physical violence. The university contacted both local and state police regarding the situation, but no actions could be taken against the posters, because Yik Yak refused to provide any identifying information about its users. As a result of the threats, some students were forced to leave school for a short period for safety reasons, and many decided to move home (New, 2014).

Yik Yak, the app at the core of this conflict, was created in late 2013 by two recent college graduates from Furman University in an attempt to create a more democratic social media network, where users can freely express their thoughts and have the posts read widely without needing a large number of followers or friends (Mahler, 2015). The premise of the app was simple: say anything you want publicly and anonymously, and see what others are thinking within a 1.5 to 10-mile radius around you. Or, in Yik Yak terminology, choose a “herd” (such as the name of your school), post anonymous messages (called “yaks”) within this “herd,” as well as upvote or downvote other users’ posts. Prior to officially shutting down on 28 April 2017, the app had gained significant popularity on college campuses across the United States, easily surpassing similar anonymity-based apps like Whisper and Secret.

However, in view of its features — particularly, the combination of anonymity and hyperlocality — Yik Yak stirred a great deal of controversy due to the racist, homophobic and misogynist “yaks” that users posted at hundreds of universities, including threats of mass violence on more than a dozen college campuses, such as the University of North Carolina (Brown, 2015), Michigan State University (Kozlowski, 2015), Penn State University (DiStefano, 2014) and others (Ward, 2015). Gang rape threats were posted on Yik Yak at Kenyon College and Middlebury College (Mahler, 2015). At the University of Mary Washington, a representative of the feminist group on campus was eventually murdered after months of relentless Yik Yak threats (Kingkade, 2015). As one New York Times article accurately put it, “think of [Yik Yak] as a virtual community bulletin board — or maybe a virtual bathroom wall at the student union. Much of the chatter is harmless. Some of it is not” (Mahler, 2015).

By synthesizing literature from a variety of fields and integrating relevant findings on virtual anonymity and cyberbullying, this article crafts a close analysis of the key features that might facilitate the prominence of uncivil behavior on Yik Yak. We suggest that, in the case of Yik Yak, the whole is greater than the sum of its parts, as it is the particular combination of three main features — anonymity, hyperlocality and voting aggregation — that can turn the app into fertile ground for vitriol, particularly in the sociocultural context of the college campus. Looking at the lessons that we can draw from this now extinct but at one point very successful app, our interest is in the moral dimensions of technology design. Are digital technologies like social media morally neutral — a blank canvas open to use and misuse, according to different aims and contexts — or does the design itself promote specific behaviors and attitudes, from an ethical and social standpoint?

It is also important to clarify that we do not wish to imply that all content on Yik Yak is negative or hurtful; indeed, as recent empirical research has shown, the majority of yaks are trivial and benign (Black, et al., 2016) and posting behavior might not be all that different from that on non-anonymous feeds such as Twitter (Saveski, et al., 2016). The anonymity feature of Yik Yak motivates some users seeking a space for honest expression and exposure to diverse opinions (Kang, et al., 2016). Schlesinger, et al. (2017) find that the combination of the platform’s anonymity, ephemerality, and hyperlocality — which they refer to collectively as “situated anonymity” — can provide a forum for sharing honest opinions, a space for trying on individual identities, a “support structure for coping” with minor or major psychosocial challenges [1], and encouragement for “participants to commit to an online community by giving them an active role in the creation of an emergent, iterative group identity” [2]. But, at the same time, as these incidents on college campuses indicate, instances of cyberbullying, uncivil behavior and hate speech were frequent on the app (Dewey, 2014; Sigl, 2015), and Horsman (2016) explored the legal and regulatory challenges that anonymity poses in such instances, along with their implications for the design of applications like Yik Yak. Due to the severity of the consequences that these processes can have — for the victims as well as for the social and cultural life of the campus — we find this analysis of the moral dimensions of app design both necessary and valuable.



Theoretical background

Anonymity in computer-mediated communication

The concept of anonymity has been widely investigated in computer-mediated communication (CMC) environments, both domestically and internationally (Barlett, 2015; Barlett, et al., 2016; Boczkowski, 1999; Christopherson, 2007; Hardaker, 2010; Kang, et al., 2013; Sproull and Kiesler, 1991, 1986; Suler, 2004). On the Internet, the lack of visual and auditory cues, the asynchronous and ephemeral nature of communication, and the isolation from one’s communication partners combine to create a very particular communication environment. Given the fluid, multifaceted nature of personal identity (McRobbie, 1994; Hall, 1987), researchers have been interested in what motivates and affects the ways people display themselves differently online (boyd, 2014; Cover, 2015), particularly when participating anonymously (Ellison, et al., 2016; Kang, et al., 2016). Studies show that there is a propensity for anonymous online users to suspend their consciousness of consequences and thus adopt a sense of disinhibition in their expressions (Mason, 2008; Sproull and Kiesler, 1991; Suler, 2004). Combined with the asynchronous nature of online communications, anonymity in CMC seems to lead to a series of behavioral outcomes that are drastically different from face-to-face communication, as research has demonstrated across various online platforms and digital applications (Bernstein, et al., 2011; Rogers, 2010; Schoenebeck, 2013; Sproull and Kiesler, 1991).

Studies have shown that anonymity, as well as the disinhibition it leads to, can result in either positive or negative behaviors depending on context (Christopherson, 2007; Kang, et al., 2016; Suler, 2004); when the disinhibition is “toxic,” it encourages cyberbullying behavior (Mason, 2008). In general, anonymity has been found to foster a sense of impunity, a loss of self-awareness and a likelihood of acting upon normally inhibited impulses in a way that is markedly inconsistent with a person’s off-line self (Siegel, et al., 1986). Anonymity also reduces empathic responses in cyberspace, where virtual distance not only separates the bully from the victim but also greatly reduces the social and affective cues that generate empathy (Williard, 2005; Trolley, et al., 2006; Mason, 2008). In particular, researchers have found that the lack of eye contact and accountability plays a significant role in the sense of anonymity that leads to toxic disinhibition and aggressive online behavior (Lapidot-Lefler and Barak, 2012; DeAndrea, et al., 2012). With the protection of distance and anonymity, CMC users can exercise aggression against other real humans without feeling accountable for their actions (Hardaker, 2010). According to Suler (2004), “dissociative anonymity” is one of six main factors that interact with each other in creating this effect: when people have the opportunity to separate their actions online from their in-person lifestyle and identity, they feel less vulnerable about self-disclosing and acting out. In a process of dissociation, they no longer have to feel responsible for their behaviors, through an internalized perception of their online self as a “compartmentalized self,” one that makes the actor feel “as if superego restrictions and moral cognitive processes have been temporarily suspended from the online psyche” [3].

It is important to recognize that this “disconnect” is not always negative. By granting people the ability to control the extent to which others have access to the self, anonymity also builds up the privacy that we, as social beings, require for sustainable psychological well-being (Ellison, et al., 2016; Kang, et al., 2016; Werner, et al., 1992). More specifically, as Pederson (1997) showed with his factor analysis, anonymity provides three key functions related to privacy: recovery (the sense of refuge and relaxation resulting from active self-contemplation), catharsis (emotional purging by expressing unhindered thoughts and feelings to others), and autonomy (the chance to experiment with new behaviors without fear of social consequences). The ability to express thoughts and emotions that would otherwise be socially unacceptable can be healthy, as it allows people to take a different perspective on life, explore unfamiliar interests, or even find support groups that give them a sense of belonging. Online anonymity can enable users to experiment with new behaviors and identities, which is particularly important for young people, who often use the Internet anonymously to explore their identity, a healthy step in their journey to adulthood (Maczewski, 2002). For instance, LGBTQ persons might find solace and support by participating online and connecting with others in an anonymous manner (Craig and McInroy, 2014), and Ellison, et al. (2016) find that teens strategically use anonymous social media platforms to pursue goals related to healthy psychosocial development, such as relationships, self-learning, and identity management.

Beyond these individual-level effects, anonymity can also influence how individuals behave within groups. As early as 1969, Zimbardo’s classic deindividuation theory argued that the anonymity afforded by crowds leads to a deindividuated state in which an individual loses his or her sense of self-awareness and becomes more likely to engage in anti-normative or anti-social behavior (Zimbardo, 1969). Research has shown that anonymity in group contexts — both online and off-line — can lead to group polarization (Sia, et al., 2002), bystander apathy (Kraft, 2011; Markey, et al., 2002), and social loafing (Hoeksema-van Orden, et al., 1998; Sheppard and Taylor, 1999). The social identity model of deindividuation effects, also known as SIDE theory, is a reinterpretation of classic deindividuation theory that places significant emphasis on the context-specific variables in a social situation. SIDE suggests that the ability to individuate each member of a group undermines the perceptual unity of the group (Spears and Lea, 1992); as anonymity obscures personal features and interpersonal differences, the relative importance of interpersonal concerns is diminished, in favor of a focus on the group as a whole (Spears and Lea, 1992). SIDE theory encompasses two components that speak to the role of anonymity in CMC. The first is a cognitive component, which focuses on how group dynamics and individual behavior within groups are mediated by anonymity and the strength of an individual’s identification with the group (Joinson, 2003; Lea, et al., 2001; Spears, et al., 2001). The second is a strategic component, which involves the intentional use of anonymity in CMC as an attempt to take advantage of the benefits it affords (e.g., equalization, improving status) (Le Hénaff, et al., 2015; Spears and Lea, 1994).
Probing further into the relationship between individual characteristics and anonymity as predicted by the SIDE model, Christie and Dill (2016) analyzed the influence of individual characteristics on interpersonal relations in CMC, finding that individual levels of self-esteem, sense of autonomy, and social anxiousness affect the impact of anonymity on interpersonal interactions in CMC — those high in self-esteem and low in social anxiousness were more likely to exhibit aggressive behaviors when anonymous, and those high in conformity were more likely to fall into line with emerging group norms. These findings have important implications for leveraging these characteristics when designing anonymous online communication platforms or applications like Yik Yak.

Anonymity and cyberbullying on social and mobile media

Recent Pew studies show that approximately 40 percent of Internet users report being the subject of online abuse at some point, a percentage much higher than suspected, with underrepresented users most often targeted (Aboujaoude, et al., 2015; Drake, 2014; Duggan, 2014). To understand the impact of anonymity on incivility in CMC contexts, many empirical studies have analyzed the comments sections of news Web sites where commenters can hide their identity. For example, by analyzing 900 randomly chosen user comments on articles about immigration, half from newspapers that allowed anonymous postings, such as the Los Angeles Times and the Houston Chronicle, and half from those that did not, such as USA Today and the Wall Street Journal, researchers found that the level of civility improved dramatically in online conversations when anonymity was removed, suggesting a strong link between anonymity and incivility (Santana, 2014). More specifically, news agencies have found that the conversations on their Web sites tend to be more civil when readers are required to log in with their Facebook ID (Hill, 2012). By allowing anonymity on some news articles on its Web site while removing it on some blogs, the Los Angeles Times was able to compare the civility of the dialogue. The difference in discourse observed under anonymous vs. non-anonymous settings was “stunning,” according to its online managing editor Jimmy Orr: “On the articles, it immediately plunged into the lowest common denominator — racism, threats, vulgarity. It was night-and-day” (Sonderman, 2011).

Although social media is not usually anonymous, the possibility for people to hide or falsify at least part of their identity makes it a hotbed for incivility and, in many cases, for threats and harassment targeted at specific individuals. Facebook, Twitter and YouTube are all among the most popular breeding grounds for bullying behavior, especially among young people; indeed, nine in ten teenagers say they have witnessed cruelty by their peers on social networks (Lenhart, et al., 2011). Online bullying can take on a variety of forms through various media channels (Ybarra and Mitchell, 2004). While traditional forms of bullying may lead to a range of problems, such as internalizing disorders (Hawker and Boulton, 2000), externalizing disorders (Nansel, et al., 2003), social difficulties (Forero, et al., 1999), physical health problems (Kumpulainen, et al., 1998) and suicide ideation (Kim, et al., 2005), cyberbullying has in some cases been shown to exacerbate these consequences. For example, cyberbullying has been associated with higher depression rates and other negative mental health symptoms, and has evoked stronger negative feelings, over and above traditional bullying victimization (Gradinger, et al., 2009; Juvonen and Gross, 2008; Reid, et al., 2004).

Studies have shown that the more time adolescents spend online, the more likely they are to be cyberbullied (Berson, et al., 2007); given the constant connectivity provided by mobile devices, cyberbullying — especially on mobile apps — is on the rise (Görzig and Frumkin, 2013). Looking at cyberbullying via mobile devices, Smith, et al. (2008) found that cyberbullying incidents were most common through phone calls, text messages and instant messages (IM). Moreover, cyberbullying on mobile platforms was found to be associated with negative mental health consequences such as greater psychological difficulties and intensity of harm (Görzig and Frumkin, 2013), which suggests that cyberbullying on mobile devices has particular features that might exacerbate its negative mental health consequences (Gradinger, et al., 2009).

Bullying is often seen as a group process (Salmivalli, 2010), and research utilizing naturalistic observations has found that peers are present in 85 percent of all bullying episodes (Hawkins, et al., 2001). As a result, researchers have studied how bystanders react to bullying. For example, Salmivalli, et al. (1996) found that assistants of bullies join the ringleader, while reinforcers provide positive feedback to the bully by laughing or cheering. Online, bystanders can easily witness and engage in cyberbullying behaviors (Chang, 2010; Kowalski, et al., 2008; Ševčíková, et al., 2012), such as forwarding or posting harmful messages, but often do not realize that these actions make them participants (Kraft, 2011). Those who might want to defend a victim or confront a bully in an online environment may be unaware of other witnesses and unable to see their reactions, which may then lead to the diffusion of responsibility caused by the bystander effect, discussed previously. Researchers agree that a key factor influencing both the reinforcing and the defensive bystander behavior online is the anonymity afforded by digital spaces (Barlett, 2015; Barlett, et al., 2016; Joinson, 2003; Suler, 2004; McKenna and Bargh, 2000). This, coupled with the lack of adult supervision, decreased chances of retaliation, and the pervasive, “always on” quality of the Internet, can create an attractive environment for bullies (Juvonen and Gross, 2008; Wright, 2013) and leave victims with little respite or refuge (Rogers, 2010; Slonje and Smith, 2008). The lack of facial cues and the immunity from social responsibility can create conditions that either encourage intentional or unwitting aggression that reinforces cyberbullying, or, on the other hand, allow participants to provide social and emotional support without fear of confrontation with the perpetrator (Baker, 2014).



Yik Yak and the socio-ethical dimensions of technology design

Turning our attention to Yik Yak as a case study, what can we learn from this extensive body of literature that would facilitate a better understanding of the dynamics of anonymous hyperlocal apps like Yik Yak? And, taking this further, what does this tell us about the moral dimensions of technology design? We have chosen to focus on Yik Yak as a case study not only because it was arguably the most prominent and successful among anonymous social media — enjoying particular popularity on college campuses — but also because, as we will argue here, its specific features combined to create a perfect storm of digital incivility, with important social and ethical implications that could extend to other mobile technologies with similar features. In the following, we offer a close critical analysis of these key features — anonymity (“yakkers”); hyperlocality (“herds”); and reliance on voting aggregation (“upvotes”, “downvotes”, “yakarma”) — in order to discuss the way in which the design of social technologies might shape user behavior, and the ethical implications thereof.

Anonymity (“yakkers”)

As Potts (2014) aptly puts it, Yik Yak perhaps offered the greatest guarantee of anonymity combined with the least accountability ever seen on the social Web. Upon opening up the Yik Yak app for the first time, all that was needed in order to start posting was a verifiable phone number. Users could create a username — called a “handle” — which could be changed at any time, but no identifiable information was requested or displayed to others when interacting within the app.

As reviewed in the theoretical background, anonymity has been found to spur aggressive behavior and incivility, particularly in digital contexts, where the online disinhibition effect intensifies these dynamics. Furthermore, users who see that ideas or attitudes which are otherwise socially unacceptable are being freely posted and even upvoted on Yik Yak might be encouraged to participate. Such users might post their own comments or even take advantage of the app’s anonymity protection in order to “one-up” the other posters. Indeed, research shows that anonymous digital environments like Yik Yak are especially effective at encouraging group polarization and instances of “one-upmanship,” as like-minded individuals, under the shield of anonymity, become more extreme in their thinking after observing or participating in a group discussion (Sia, et al., 2002). The anonymity of Yik Yak also facilitates the bystander effect, as users become less likely to take action in defense of the victim when they notice personal threats and bullying behavior on the anonymous app.

Hyperlocality (“herd”)

Another key feature of Yik Yak, and an important complement to its anonymity, was its hyperlocality. Although users could “peek” into the Yik Yak “herd” at other campuses, only those who were physically in the vicinity of a campus could actually post to that “herd”. The radius varied from 1.5 to 10 miles, depending on population density and number of users. The rhetoric of the app emphasized hyperlocality; the description of the app on Apple’s App Store read: “Yik Yak is a location-based social network that helps you connect with the people right around you. By letting you express yourself, exchange thoughts, and explore your world, Yik Yak helps you feel at home within your local community.” (Interestingly, anonymity was not even mentioned in the official descriptions of the app, neither here nor on the Yik Yak Web site.) The few empirical studies existing on Yik Yak have confirmed the paramount importance of location in terms of the social function of “yaks” within a college “herd” (Northcut, 2015; Black, et al., 2016). For example, by coding “yaks” at Missouri University of Science and Technology, Northcut (2015) found that location is integral to understanding one-third of the yaks in the observed “herd.” Similarly, in a comprehensive content analysis of Yik Yak, Black, et al. (2016) collected 4,001 anonymous posts from 42 different U.S. college campuses and confirmed that location-specific posts are the most frequently observed kind of yak across communities.

Thus, hyperlocality is key, especially in combination with the anonymity feature. When users know that anonymous comments are coming from people who are physically close, the negative effects of anonymity are amplified, and cyberbullying messages can become actual threats given the physical proximity. Anonymity is a crucial characteristic that distinguishes cyberbullying from traditional bullying, in which the perpetrators are usually known by others in school or in other social contexts (Anderson and Sturm, 2007). However, what makes Yik Yak particularly dangerous is its ability to combine characteristics of cyberbullying and traditional bullying, by enabling both anonymity and location-based communication. Thus, the cyberbullies can be protected by the shield of anonymity, while the victims are further tormented by the knowledge that their harassers may be people they know, or are at least members of the same community in close physical proximity.

Voting aggregation (“upvotes”, “downvotes”, “yakarma”)

In a style similar to Reddit’s, yaks can be “upvoted” or “downvoted”: yaks with the most upvotes move up in the herd’s feed, whereas yaks with the most downvotes move down. In addition, a content regulation feature — introduced in response to complaints regarding cyberbullying and hate speech — stipulated that any post receiving more than five “downvotes” would be automatically removed from the feed. A “yakarma” score was also introduced to indicate user activity and the amount of feedback both given and received, offering further incentives for user participation.

In cases of cyberbullying, the upvote-downvote feature of Yik Yak essentially creates an anonymous online replica of the group dynamics that characterize traditional bullying situations, as described by Salmivalli, et al. (1996). The bully in this case is the anonymous “yakker” who posts the bullying message, the victim is the targeted individual, while the other users of the app within the same “herd” are bystanders who can either cheer for the bully by “upvoting” the post, or support the victim by “downvoting” it. Because of the anonymous nature of Yik Yak and the ease of upvoting and downvoting, it is extremely easy for a bystander to become a supporter of the bully, requiring very little effort and entailing no risk or accountability. In fact, the most damaging effect of the app may be for the victim to see a bullying post “upvoted” by other users — as in the case of the leader of Colgate University’s Association of Critical Collegians, mentioned at the beginning of this article — given that the original poster and the upvoters are anonymous peers in close physical proximity. Furthermore, research has shown that bullying is more likely to continue if peer bystanders are present when it occurs (O’Connell, et al., 1999) and, due to the “perfect storm” combination of anonymity, hyperlocality and the user-regulated feed, this kind of reinforcement process was especially prevalent on Yik Yak.

Discussion: Misuse or misdesign?

The creators of Yik Yak claimed they came up with the idea after noticing that there were only a handful of popular Twitter accounts at their university, almost all belonging to prominent students like athletes (Mahler, 2015). By providing an anonymous and open platform for widespread hyperlocal participation, Yik Yak thus aimed to give everyone a voice and, in a sense, equalize participation surrounding campus-related issues. And indeed, as mentioned in the theoretical background, anonymity can have a positive effect in certain contexts, especially in terms of democratizing participation and encouraging freedom of expression (Kang, et al., 2016; Maczewski, 2002; Ren, et al., 2010). Yet, while Yik Yak could and did enable benign and sometimes even useful communication (Black, et al., 2016; Hess, 2015), the app also seemed to frequently invite and facilitate anti-normative behavior such as cyberbullying, hate speech and incivility, as illustrated by the numerous controversies that Yik Yak stirred up on college campuses. In this context, a complicated yet valuable question to ask is: is this a misuse of an otherwise neutral technology, or a problematic technological design?

In his famous essay “Do artifacts have politics?”, Langdon Winner (1980) proposed a “theory of technological politics” that urges us to pay attention to the characteristics of technological systems. Winner illustrated how seemingly neutral technologies can indeed be inherently authoritarian or democratic, and how technological systems can reinforce certain power dynamics. In Winner’s words, “the adoption of a given technical system unavoidably brings with it conditions for human relationships that have a distinctive political cast — for example, centralized or decentralized, egalitarian or inegalitarian, repressive or liberating” [4]. However, beyond the features of the technology, the larger sociocultural context is a crucial determinant of technology use, and Yik Yak is a telling example in this case. The app is inextricably linked to social dynamics and institutionalized patterns of power and authority including deep-rooted discrimination based on race and gender. The vitriol on Yik Yak not only mirrors the larger context of societal racism, sexism, homophobia, class politics, and the way these tensions flare up, but also shapes and reforms public discourse, especially in the transitional college years, which often bring about the clash of different values within the confines — physical, intellectual, emotional — of the college campus. Thus, while Yik Yak, as with other similar mobile technologies, is arguably an egalitarian system with potentially liberating features, its practical use, particularly in college contexts, provides evidence for the tendency of open anonymous technological systems to amplify inequality (Taylor, 2014) and increase status differences (Weisband, 1994) where social hierarchies are in place.

As argued above, the social and cultural context plays a central role in shaping the use of technologies, and individuals have agency as users of technologies; the classic cultural deterministic argument — guns don’t kill people, people do — stresses the moral neutrality of technological artifacts, and their openness, their versatility of use. However, as Sacasas (2014) argued, when considering the moral implications of any given technology, asking only how people will use a certain tool is far from sufficient, as technologies possess an “inescapable, layered, and multi-faceted” moral dimension (Sacasas, 2014). Indeed, we need to think about how the use of a certain technology will impact its users. What habits will the use of the technology instill? What behaviors will it cultivate? What will the use of the technology encourage users to notice or ignore? What desire does the use of this technology generate or dissipate? Using the example of a hammer, Sacasas (2014) notes:

A hammer may indeed be used to either build a house or bash someone’s head in. In this view, technology is morally neutral and the only morally relevant question is this: What will I do with this tool? But is this really the only morally relevant question one could ask? For instance, pursuing the example of the hammer, might I not also ask how having the hammer in hand encourages me to perceive the world around me? Or, what feelings having a hammer in hand arouses?

In this sense, it is worth considering the kind of mindset mobile communication technologies like Yik Yak put their users in, and the kinds of uses they invite and enable. As we have argued here, the app’s features — especially the potent combination of anonymity, hyperlocality and riskless social endorsement via voting aggregation — create a specific kind of environment, one that is particularly fertile for anti-normative behavior. Therefore, based on existing research on digital anonymity and cyberbullying, we suggest that these consequences are not incidental, but rather inherent in this combination of features and therefore in the design of the app itself. Rather than merely enabling incivility, hate speech and cyberbullying, a technology with design characteristics like Yik Yak’s — especially on mobile platforms that are “always on” — in fact encourages these hateful activities and normalizes their consequences. As we move toward an increasingly mobile, digital and connected society, app developers should be particularly mindful of the inherent moral direction of the technology they design, and exercise caution and control over the kinds of activities and thoughts their product may induce.



Towards a more ethical design process: Recommendations for mobile app design

As the above analysis shows, the design and features of a technology can have a direct and substantial impact on the kinds of behaviors it encourages. We echo Bowler, et al.’s (2015) call that “interface and interaction design ... should take into account and seek to mediate a complex set of social values to have an effect on cyberbullying behaviors” [5].

Learning from the case of Yik Yak, we offer a few guidelines for designing socially responsible digital platforms, emphasizing the incorporation of moral considerations into the design process.

1) Think about the ethical dimensions of technology use when defining goals

Designers of social technologies should start thinking about the moral dimension of their product at the initial ideation stage, when defining goals. Several questions posed by Sacasas (2014) in his ethics inquiry prove especially valuable here:

What habits will this app instill?

What kind of behaviors will the use of this app cultivate or displace?

What kind of mindset will the use of this app put its users in?

How will the use of this app affect how its users think of other people?

What feelings does the use of this app generate in its users towards others?

What desires does the use of this app generate?

What possibilities for action does this app foreclose? Is it good that these actions are foreclosed?

What are the potential harms to the users themselves, to others, or to the world that might result from the use of this app?

Of course, depending on the nature of their technology and the services it provides, there may be other, more specific questions to ask. Designers should also determine how these goals would translate into actual features and functionality, and consider carefully whether users will see these goals in the same way. It is worth remembering that these choices are not exclusively about design per se, but about the mindset designers put their users in by implementing or leaving out a certain feature. Participatory design recommendations can provide further insight into the interaction between design choices and user values and behavior (Bowler, et al., 2015); Fleischmann, et al. (2017) further this call to build from the “bottom up,” developing a professional code of ethics that guides design based on users’ values and realities.

2) Research the use and misuse of similar technologies

It is essential for any technology developer to research the uses and misuses of similar technologies, particularly in terms of their social, cultural and ethical consequences. Had Yik Yak’s developers learned from the experience of Whisper, Secret and Juicy Campus, they might have prevented some of the most negative consequences of Yik Yak use by introducing safety features and moderation mechanisms from the very launch of the app. For example, the founder of Juicy Campus, Matt Ivester, shut down the Web site in 2008 after a few failed attempts to urge users to be less cruel. His final note read: “It gets to you when you realize that the site you thought would be kind of fun is really more hurtful than interesting or helpful. From an emotional standpoint, I’m a good nice person and I found myself in this position where I was running this company that was a cyberbullying platform” (Reinsberg, 2013). Similarly, the founders of one of the most successful anonymous social media apps, Secret, decided to shut down the app after dealing with cyberbullying, racism and sexism (Constine, 2016; Thielman, 2015). It is worth noting that the farewell message posted by Yik Yak’s founders on 28 April 2017 did not make any mention of these negative effects and controversies; rather, it framed the decision to shut down the app in very positive terms, and spoke of gratitude, passion and future moves for the company (Statt, 2017).

3) Set up clear community guidelines and regulations

Designers of any social platform should ensure that users have a clear and easy way to report or flag threatening comments, and should make community guidelines — as well as the consequences of not adhering to them — clear from the beginning. The Yik Yak team implemented these much too late, once the reputation of the app and its potential for anti-normative behavior were already established. Yik Yak did not have a “report” feature until its developers were pressured by the widespread media coverage of cyberbullying incidents and racial controversies surrounding its use. Moreover, it was not until 2015 that Yik Yak introduced a set of rules that would automatically appear on a user’s screen before their first yak. The first rule read: “You do not bully or specifically target others. This includes but is not limited to defaming, abusing, harassing, stalking, and threatening others.” The app also warned users that “if your yaks are repeatedly reported or flagged, you will be suspended” and, as mentioned previously, a yak that received five downvotes would be permanently removed from the feed. Later, content filters were added to prevent full names from being posted on the app, and to issue warning messages when users typed certain sensitive words like “bomb” (Hess, 2015). However, while these are laudable safeguards, they were made available much too late to realistically cause a positive shift in user behavior.
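To make these moderation mechanics concrete, they can be sketched in code. This is a hypothetical illustration, not Yik Yak’s actual implementation: the downvote-removal rule and the “bomb” keyword come from the account above, while the report-suspension threshold and the full-name heuristic are assumptions of our own, since the app’s real logic was never disclosed.

```python
# Hypothetical sketch of the moderation rules described above.
DOWNVOTE_REMOVAL_THRESHOLD = 5   # "five downvotes" removal rule
REPORT_SUSPENSION_THRESHOLD = 3  # assumed value; no threshold was published

SENSITIVE_WORDS = {"bomb"}       # example keyword cited in the article

def should_remove(post):
    """A yak is permanently removed once it accrues five downvotes."""
    return post["downvotes"] >= DOWNVOTE_REMOVAL_THRESHOLD

def should_suspend(user):
    """Users whose yaks are repeatedly reported are suspended."""
    return user["reports"] >= REPORT_SUSPENSION_THRESHOLD

def content_warnings(text):
    """Flag sensitive keywords and likely full names
    (here crudely approximated as two adjacent capitalized words)."""
    warnings = []
    words = text.split()
    for w in words:
        if w.lower().strip(".,!?") in SENSITIVE_WORDS:
            warnings.append("sensitive word: " + w)
    for a, b in zip(words, words[1:]):
        if a.istitle() and b.istitle():
            warnings.append("possible full name: " + a + " " + b)
    return warnings
```

Even this toy version shows why such filters arrived too late to reshape norms: they act on individual posts after the fact, while the behavior they target is social and cumulative.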

4) Set up mechanisms for social accountability

Mechanisms for ensuring social accountability are crucial for any social interaction platform. Research has shown that requiring users to set up a public profile — even if this is just a username and a profile picture — generally decreases aggressive behavior by reminding users of the humanity and individuality of their fellow community members (Santana, 2014). Again, Yik Yak was slow to realize the importance of such features. A later update of the app required users to have a handle, a profile picture and a short bio — all of which had previously been optional. The update also allowed users to connect Yik Yak with their Snapchat, Instagram, Twitter, Tumblr, Pinterest, Facebook and LinkedIn accounts (Horn, 2016). Of course, these safeguards were imperfect, as this information could be falsified and users could continue to post anonymously; nonetheless, they represented a step in the right direction.

5) Test, iterate, improve

Designers should test their prototype with the target audience and collect extensive feedback from potential users. This is an essential step from the perspective of ethical design: if any unforeseen uses — especially anti-normative ones — emerge in these testing phases, developers should adjust their plans by adding, deleting or tweaking specific features. Furthermore, beyond the target user group, it is also important to consider how the technology could be used differently by other kinds of users, and how it might be used by individuals versus groups. In addition to users, designers should also consider other relevant stakeholders that might be involved in or affected by the use of their app; in Yik Yak’s case, for example, especially given its connection to the context of higher education, it would have been worth consulting with educational researchers, counselors and school administrators. Finally, the iteration and improvement process should not stop once the technology is launched. Post-launch, it is crucial to provide a clear and helpful feedback system and customer service contacts, as designers continue to monitor uses and problems, improving the platform according to this feedback in a timely manner.




As we have aimed to illustrate here, the design of social technologies — in combination, of course, with other contextual factors — can have a significant impact on the types of uses and behaviors that are facilitated; therefore, moral considerations should be a crucial part of the process of technological design. This is particularly important when it comes to digital technologies and social media, as digital participation is most often framed in an idealized way. As several scholars have noted (see, for instance, Fish, et al., 2011; Jenkins and Carpentier, 2013; Kelty, 2012; Literat, 2016), the claim that all digital participation is necessarily positive and invariably empowering is problematic, hiding the complexities and diversity inherent in participatory experiences with digital tools.

At the same time, further empirical research is needed in order to better understand the impacts of anonymous social media like Yik Yak, and the ways that participants experience these platforms. Given the negative reputation of apps like Yik Yak, it is particularly important to achieve a more nuanced understanding of the interactions that take place, and to consider the positive effects that participation on these apps might have as well, especially in terms of equalizing participation and strengthening community ties. Furthermore, it would be interesting for scholars to look at the role of cultural factors in the use of anonymous and hyperlocal platforms, and to investigate whether the use of such technologies would differ in other contexts or user groups.

Last but not least, the legal implications of anonymous social media use remain a valuable topic for continued examination. For example, can someone sue for harassment or defamation experienced on anonymous networks like Yik Yak? Can and should the app creators be held legally responsible if a tragedy happens, like at the University of Mary Washington? Given the sustained popularity of anonymous and semi-anonymous apps on college campuses — where apps like Kik, Blind and Whisper hope to fill the void recently left by Yik Yak — future research should also look at how schools, educators and administrators might proactively address or monitor the use of digital technologies that could negatively affect the functioning of educational institutions. End of article


About the authors

Qinglan (Angel) Li obtained her M.A. degree at Teachers College, Columbia University.
E-mail: ql2229 [at] tc [dot] columbia [dot] edu

Ioana Literat is Assistant Professor in the Communication, Media & Learning Technologies Design program at Teachers College of Columbia University.
E-mail: il2311 [at] tc [dot] columbia [dot] edu



1. Schlesinger, et al., 2017, p. 6,917.

2. Schlesinger, et al., 2017, p. 6,921.

3. Suler, 2004, p. 322.

4. Winner, 1980, p. 128.

5. Bowler, et al., 2015, p. 1,289.



E. Aboujaoude, M.W. Savage, V. Starcevic, and W.O. Salame, 2015. “Cyberbullying: Review of an old problem gone viral,” Journal of Adolescent Health, volume 57, number 1, pp. 10–18.
doi:, accessed 19 June 2017.

T. Anderson and B. Sturm, 2007. “Cyberbullying from playground to computer,” Young Adult Library Services, volume 5, number 2, pp. 24–27.

M. Baker, 2014. “Cyberbullying and the bystander: What promotes or inhibits adolescent participation?” unpublished doctoral dissertation, University of Exeter, at, accessed 19 June 2017.

C.P. Barlett, 2015. “Anonymously hurting others online: The effect of anonymity on cyberbullying frequency,” Psychology of Popular Media Culture, volume 4, number 2, pp. 70–79.
doi:, accessed 19 June 2017.

C.P. Barlett, D.A. Gentile, and C. Chew, 2016. “Predicting cyberbullying from anonymity,” Psychology of Popular Media Culture, volume 5, number 2, pp. 171–180.
doi:, accessed 19 June 2017.

M.S. Bernstein, A. Monroy-Hernández, D. Harry, P. André, K. Panovich, and G. Vargas, 2011. “4chan and /b/: An analysis of anonymity and ephemerality in a large online community,” Proceedings of the Fifth International AAAI Conference on Weblogs and Social Media, pp. 50–57, and at, accessed 19 June 2017.

I.R. Berson, M.J. Berson, and J.M. Ferron, 2007. “Emerging risks of violence in the digital age: Lessons for educators from an online study of adolescent girls in the United States,” Journal of School Violence, volume 1, number 2, pp. 51–71.
doi:, accessed 19 June 2017.

E.W. Black, K. Mezzina, and L.A. Thompson, 2016. “Anonymous social media — Understanding the content and context of Yik Yak,” Computers in Human Behavior, volume 57, pp. 17–22.
doi:, accessed 19 June 2017.

P.J. Boczkowski, 1999. “Mutual shaping of users and technologies in a national virtual community,” Journal of Communication, volume 49, number 2, pp. 86–108.
doi:, accessed 19 June 2017.

L. Bowler, C. Knobel, and E. Mattern, 2015. “From cyberbullying to well-being: A narrative-based participatory approach to values-oriented design for social media,” Journal of the Association for Information Science and Technology, volume 66, number 6, pp. 1,274–1,293.
doi:, accessed 19 June 2017.

d. boyd, 2014. It’s complicated: The social lives of networked teens. New Haven, Conn.: Yale University Press.

W.N. Brown, 2015. “Schools banning phone Yik Yak for promoting hate speech” (6 February), at, accessed 19 June 2017.

C. Chang, 2010. “Internet safety survey: Who will protect the children?” Berkeley Technology Law Journal, volume 25, number 1, pp. 501–527, and at, accessed 19 June 2017.

C. Christie and E. Dill, 2016. “Evaluating peers in cyberspace: The impact of anonymity,” Computers in Human Behavior, volume 55, part A, pp. 292–299.
doi:, accessed 19 June 2017.

K.M. Christopherson, 2007. “The positive and negative implications of anonymity in Internet social interactions: ‘On the Internet, nobody knows you’re a dog’,” Computers in Human Behavior, volume 23, number 6, pp. 3,038–3,056.
doi:, accessed 19 June 2017.

J. Constine, 2016. “Yik Yak’s CTO drops out as the hyped anonymous app stagnates,” TechCrunch (6 April), at, accessed 19 June 2017.

R. Cover, 2015. Digital identities: Creating and communicating the online self. Boston: Academic Press/Elsevier.

S.L. Craig and L. McInroy, 2014. “You can form a part of yourself online: The influence of new media on identity development and coming out for LGBTQ youth,” Journal of Gay & Lesbian Mental Health, volume 18, number 1, pp. 95–109.
doi:, accessed 19 June 2017.

D.C. DeAndrea, S. Tom Tong, Y. Liang, T.R. Levine, and J.B. Walther, 2012. “When do people misrepresent themselves to others? The effects of social desirability, ground truth, and accountability on deceptive self-presentations,” Journal of Communication, volume 62, number 3, pp. 400–417.
doi:, accessed 19 June 2017.

C. Dewey, 2014. “How do you solve a problem like Yik Yak?” Washington Post (7 October), at, accessed 19 June 2017.

J.N. DiStefano, 2014. “Penn State student arrested in Yik Yak social media threat,” Morning Call (13 October), at, accessed 19 June 2017.

B. Drake, 2014. “The darkest side of online harassment: Menacing behavior,” Pew Research Center (1 June), at, accessed 19 June 2017.

M. Duggan, 2014. “Online harassment: Summary of findings,” Pew Research Center (22 October), at, accessed 19 June 2017.

N.B. Ellison, L. Blackwell, C. Lampe, and P. Trieu, 2016. “‘The question exists, but you don’t exist with it’: Strategic anonymity in the social lives of adolescents,” Social Media + Society, volume 2, number 4.
doi:, accessed 19 June 2017.

K.R. Fleischmann, C. Hui, and W.A. Wallace, 2017. “The societal responsibilities of computational modelers: Human values and professional codes of ethics,” Journal of the Association for Information Science and Technology, volume 68, number 3, pp. 543–552.
doi:, accessed 19 June 2017.

A. Fish, L.F.R. Murillo, L. Nguyen, A. Panofsky, and C.M. Kelty, 2011. “Birds of the internet: Towards a field guide to the organization and governance of participation,” Journal of Cultural Economy, volume 4, number 2, pp. 157–187.
doi:, accessed 19 June 2017.

R. Forero, L. McLellan, C. Rissel, and A. Baumann, 1999. “Bullying behaviour and psychosocial health among school students in New South Wales, Australia: Cross sectional survey,” British Medical Journal, volume 319, number 7206 (7 August), pp. 344–348, and at, accessed 19 June 2017.
doi:, accessed 19 June 2017.

A. Görzig and L. Frumkin, 2013. “Cyberbullying experiences on-the-go: When social media can become distressing,” Cyberpsychology, volume 7, number 1, article 4.
doi:, accessed 19 June 2017.

P. Gradinger, D. Strohmeier, and C. Spiel, 2009. “Traditional bullying and cyberbullying: Identification of risk groups for adjustment problems,” Zeitschrift für Psychologie/Journal of Psychology, volume 217, pp. 205–213.
doi:, accessed 19 June 2017.

S. Hall, 1987. “Minimal selves,” In: Identity: The real me: Postmodernism and the question of identity. London: Institute of Contemporary Arts, pp. 44–46.

C. Hardaker, 2010. “Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions,” Journal of Politeness Research, volume 6, number 2, pp. 215–242.
doi:, accessed 19 June 2017.

D.S.J. Hawker and M.J. Boulton, 2000. “Twenty years’ research on peer victimization and psychosocial maladjustment: A meta-analytic review of cross-sectional studies,” Journal of Child Psychology and Psychiatry, volume 41, number 4, pp. 441–455.
doi:, accessed 19 June 2017.

D.L. Hawkins, D.J. Pepler, and W.M. Craig, 2001. “Naturalistic observations of peer interventions in bullying,” Social Development, volume 10, number 4, pp. 512–527.
doi:, accessed 19 June 2017.

A. Hess, 2015. “The upside of Yik Yak,” Slate (10 March), at, accessed 19 June 2017.

A. Hill, 2012. “If you can’t say anything nice, come log on to the Internet,” Mercury News (29 August), at 21427144/if-you-cant-say-anything-nice-come-log, accessed 19 June 2017.

C.Y.D. Hoeksema-van Orden, A.W.K. Gaillard, and B.P. Buunk, 1998. “Social loafing under fatigue,” Journal of Personality and Social Psychology, volume 75, number 5, pp. 1,179–1,190.
doi:, accessed 19 June 2017.

L. Horn, 2016. “Yik Yak is no longer anonymous,” Vocativ (18 August), at, accessed 19 June 2017.

G. Horsman, 2016. “The challenges surrounding the regulation of anonymous communication provision in the United Kingdom,” Computers & Security, volume 56, pp. 151–162.
doi:, accessed 19 June 2017.

H. Jenkins and N. Carpentier, 2013. “Theorizing participatory intensities: A conversation about participation and politics,” Convergence, volume 19, number 3, pp. 265–286.
doi:, accessed 19 June 2017.

A.N. Joinson, 2003. Understanding the psychology of Internet behaviour: Virtual worlds, real lives. Basingstoke: Palgrave Macmillan.

J. Juvonen and E.F. Gross, 2008. “Extending the school grounds? Bullying experiences in cyberspace,” Journal of School Health, volume 78, number 9, pp. 496–505.
doi:, accessed 19 June 2017.

R. Kang, L. Dabbish, and K. Sutton, 2016. “Strangers on your phone: Why people use anonymous communication applications,” CSCW ’16: Computer Supported Cooperative Work and Social Computing, pp. 359–370.
doi:, accessed 19 June 2017.

R. Kang, S. Brown, and S. Kiesler, 2013. “Why do people seek anonymity on the internet? Informing policy and design,” CHI ’13: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2,657–2,666.
doi:, accessed 19 June 2017.

C. Kelty, 2012. “From participation to power,” In: A. Delwiche and J.J. Henderson (editors). Participatory cultures handbook. New York: Routledge, pp. 22–31.

Y.S. Kim, Y.-J. Koh, and B. Leventhal, 2005. “School bullying and suicidal risk in Korean middle school students,” Pediatrics, volume 115, number 2, pp. 357–363.
doi:, accessed 19 June 2017.

T. Kingkade, 2015. “Complaint claims university where student was killed failed to act on relentless Yik Yak threats,” Huffington Post (7 May), at, accessed 19 June 2017.

R.M. Kowalski, S.P. Limber, and P.W. Agatston, 2008. Cyber bullying: Bullying in the digital age. Malden, Mass.: Blackwell.

K. Kozlowski, 2015. “Yik Yak catches flak from Mich. universities,” Detroit News (10 April), at, accessed 19 June 2017.

E. Kraft, 2011. “Online bystanders: Are they the key to preventing cyberbullying?” at, accessed 19 June 2017.

N. Lapidot-Lefler and A. Barak, 2012. “Effects of anonymity, invisibility, and lack of eye-contact on toxic online disinhibition,” Computers in Human Behavior, volume 28, number 2, pp. 434–443.
doi:, accessed 19 June 2017.

B. Le Hénaff, N. Michinov, O. Le Bohec, and M. Delaval, 2015. “Social gaming is inSIDE: Impact of anonymity and group identity on performance in a team game-based learning environment,” Computers & Education, volume 82, pp. 84–95.
doi:, accessed 19 June 2017.

M. Lea, R. Spears, and D. de Groot, 2001. “Knowing me, knowing you: Anonymity effects on social identity processes within groups,” Personality and Social Psychology Bulletin, volume 27, number 5, pp. 526–537.

A. Lenhart, M. Madden, A. Smith, K. Purcell, K. Zickuhr, and L. Rainie, 2011. “Teens, kindness and cruelty on social network sites,” Pew Research Center (9 November), at, accessed 19 June 2017.

I. Literat, 2016. “Interrogating participation across disciplinary boundaries: Lessons from political philosophy, cultural studies, education, and art,” New Media & Society, volume 18, number 8, pp. 1,787–1,803.
doi:, accessed 19 June 2017.

M. Maczewski, 2002. “Exploring identities through the Internet: Youth experiences online,” Child and Youth Care Forum, volume 31, number 2, pp. 111–129.
doi:, accessed 19 June 2017.

J. Mahler, 2015. “Who spewed that abuse? Anonymous Yik Yak app isn’t telling,” New York Times (8 March), at, accessed 19 June 2017.

P.M. Markey, S.M. Wells, and C.N. Markey, 2002. “Social and personality psychology in the culture of cyberspace,” Advances in Psychology Research, volume 9, pp. 9–94.

K.L. Mason, 2008. “Cyberbullying: A preliminary assessment for school personnel,” Psychology in the Schools, volume 45, number 4, pp. 323–348.
doi:, accessed 19 June 2017.

K.Y.A. McKenna and J.A. Bargh, 2000. “Plan 9 from cyberspace: The implications of the Internet for personality and social psychology,” Personality and Social Psychology Review, volume 4, number 1, pp. 57–75.
doi:, accessed 19 June 2017.

A. McRobbie, 1994. Postmodernism and popular culture. London: Routledge.

T.R. Nansel, M.D. Overpeck, D.L. Haynie, W.J. Ruan, and P.C. Scheidt, 2003. “Relationships between bullying and violence among US youth,” Archives of Pediatrics and Adolescent Medicine, volume 157, number 4, pp. 348–353.
doi:, accessed 19 June 2017.

J. New, 2014. “‘Can you hear us now?’” Inside Higher Ed (24 September), at, accessed 19 June 2017.

K.M. Northcut, 2015. “Dark side or insight? Yik Yak and culture on campus,” Proceedings of the 2015 IEEE International Professional Communication Conference (IPCC).
doi:, accessed 19 June 2017.

P. O’Connell, D. Pepler, and W. Craig, 1999. “Peer involvement in bullying: Insights and challenges for intervention,” Journal of Adolescence, volume 22, number 4, pp. 437–452.
doi:, accessed 19 June 2017.

D.M. Pederson, 1997. “Psychological functions of privacy,” Journal of Environmental Psychology, volume 17, number 2, pp. 147–156.
doi:, accessed 19 June 2017.

L. Potts, 2014. Social media in disaster response: How experience architects can build for adaptation. New York: Routledge.

P. Reid, J. Monsen, and I. Rivers, 2004. “Psychology’s contribution to understanding and managing bullying within schools,” Educational Psychology in Practice, volume 20, number 3, pp. 241–258.
doi:, accessed 19 June 2017.

H. Reinsberg, 2013. “How Juicy Campus’ founder became the poster boy for Internet niceness,” BuzzFeed News (19 November), at, accessed 19 June 2017.

Y. Ren, R. Kraut, S. Kiesler, and P. Resnick, 2010. “Encouraging commitment in online communities,” In: R.E. Kraut and P. Resnick (editors). Building successful online communities: Evidence-based social design. Cambridge, Mass.: MIT Press, pp. 77–124.

V. Rogers, 2010. Cyberbullying: Activities to help children and teens to stay safe in a texting, twittering, social networking world. London: Jessica Kingsley Publishers.

M. Sacasas, 2014. “Do artifacts have ethics?” (29 November), at, accessed 19 June 2017.

C. Salmivalli, 2010. “Bullying and the peer group: A review,” Aggression and Violent Behavior, volume 15, number 2, pp. 112–120.
doi:, accessed 19 June 2017.

C. Salmivalli, K. Lagerspetz, K. Björkqvist, K. Österman, and A. Kaukiainen, 1996. “Bullying as a group process: Participant roles and their relations to social status within the group,” Aggressive Behavior, volume 22, number 1, pp. 1–15.

A.D. Santana, 2014. “Virtuous or vitriolic: The effect of anonymity on civility in online newspaper reader comment boards,” Journalism Practice, volume 8, number 1, pp. 18–33.
doi:, accessed 19 June 2017.

M. Saveski, S. Chou, and D. Roy, 2016. “Tracking the Yak: An empirical study of Yik Yak,” MIT Media Lab, at, accessed 19 June 2017.

A. Schlesinger, E. Chandrasekharan, C.A. Masden, A.S. Bruckman, W.K. Edwards, and R.E. Grinter, 2017. “Situated anonymity: Impacts of anonymity, ephemerality, and hyper-locality on social media,” CHI ’17: Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 6,912–6,924.
doi:, accessed 19 June 2017.

A. Ševčíková, D. Šmahel, and M. Otavová, 2012. “The perception of cyberbullying in adolescent victims,” Emotional and Behavioural Difficulties, volume 17, numbers 3–4, pp. 319–328.
doi:, accessed 19 June 2017.

S.Y. Schoenebeck, 2013. “The secret life of online moms: Anonymity and disinhibition on,” Proceedings of the Seventh International AAAI Conference on Weblogs and Social Media, pp. 555–562.

J.A. Sheppard and K.M. Taylor, 1999. “Social loafing and expectancy-value theory,” Personality and Social Psychology Bulletin, volume 25, number 9, pp. 1,147–1,158.
doi:, accessed 19 June 2017.

C.-L. Sia, B.C.Y. Tan, and K.-K. Wei, 2002. “Group polarization and computer-mediated communications: Effects of communication cues, social presence, and anonymity,” Information Systems Research, volume 13, number 1, pp. 70–90.
doi:, accessed 19 June 2017.

J. Siegel, V. Dubrovsky, S. Kiesler, and T.W. McGuire, 1986. “Group processes in computer-mediated communication,” Organizational Behavior and Human Decision Processes, volume 37, number 2, pp. 157–187.
doi:, accessed 19 June 2017.

J. Sigl, 2015. “Viewpoint: Hate speech on Yik Yak is a Catch-22,” USA Today (19 May), at, accessed 19 June 2017.

R. Slonje and P.K. Smith, 2008. “Cyberbullying: Another main type of bullying?” Scandinavian Journal of Psychology, volume 49, number 2, pp. 147–154.
doi:, accessed 19 June 2017.

J. Sonderman, 2011. “News sites using Facebook comments see higher quality discussion, more referrals,” (18 August), at, accessed 19 June 2017.

R. Spears and M. Lea, 1994. “Panacea or panopticon? The hidden power in computer-mediated communication,” Communication Research, volume 21, number 4, pp. 427–459.
doi:, accessed 19 June 2017.

R. Spears and M. Lea, 1992. “Social influence and the influence of the ‘social’ in computer-mediated communication,” In: M. Lea (editor). Contexts of computer-mediated communication. New York: Harvester Wheatsheaf, pp. 30–65.

R. Spears, M. Lea, and T. Postmes, 2001. “Social psychological theories of computer-mediated communication: Social pain or social gain,” In: W.P. Robinson and H. Giles (editors). New handbook of language and social psychology. New York: Wiley, pp. 601–623.

L. Sproull and S. Kiesler, 1991. Connections: New ways of working in the networked organization. Cambridge, Mass.: MIT Press.

L. Sproull and S. Kiesler, 1986. “Reducing social context cues: Electronic mail in organizational communications,” Management Science, volume 32, number 11, pp. 1,492–1,512.
doi:, accessed 19 June 2017.

N. Statt, 2017. “Yik Yak, once valued at $400 million, shuts down and sells off engineers for $1 million,” The Verge (28 April), at, accessed 29 June 2017.

A. Stone and T. Kingkade, 2014. “Racist posts on Yik Yak prompt student protest at Colgate University,” Huffington Post (24 September), at, accessed 19 June 2017.

J. Suler, 2004. “The online disinhibition effect,” CyberPsychology & Behavior, volume 7, number 3, pp. 321–326.
doi:, accessed 19 June 2017.

A. Taylor, 2014. The people’s platform: Taking back power and culture in the digital age. New York: Metropolitan Books.

S. Thielman, 2015. “Controversial anonymous networking app Secret to close down,” Guardian (29 April), at, accessed 19 June 2017.

B. Trolley, C. Hanel, and L.L. Shields, 2006. Demystifying & deescalating cyber bullying in the schools: A resource guide for counselors, educators and parents. Bangor, Me.:

M. Ward, 2015. “Yik Yak threats on college campuses: Missouri arrests highlight growing problem,” International Business Times (12 November), at, accessed 19 June 2017.

S. Weisband, 1994. “Overcoming social awareness in computer-supported groups: Does anonymity really help?” Computer-Supported Cooperative Work, volume 2, number 4, pp. 285–297.
doi:, accessed 19 June 2017.

C.M. Werner, I. Altman, and B.B. Brown, 1992. “A transactional approach to interpersonal relations: Physical environment, social context, and temporal qualities,” Journal of Social and Personal Relationships, volume 9, number 2, pp. 297–323.
doi:, accessed 19 June 2017.

N. Willard, 2005. “Educator’s guide to cyberbullying and cyberthreats,” at, accessed 19 June 2017.

L. Winner, 1980. “Do artifacts have politics?” Daedalus, volume 109, number 1, pp. 121–136.

M.F. Wright, 2013. “The relationship between young adults’ beliefs about anonymity and subsequent cyber aggression,” Cyberpsychology, Behavior, and Social Networking, volume 16, number 12, pp. 858–862.
doi:, accessed 19 June 2017.

M. Ybarra and K.J. Mitchell, 2004. “Online aggressor/targets, aggressors, and targets: A comparison of associated youth characteristics,” Journal of Child Psychology and Psychiatry, volume 45, number 7, pp. 1,308–1,316.
doi:, accessed 19 June 2017.

P.G. Zimbardo, 1969. “The human choice: Individuation, reason, and order versus deindividuation, impulse, and chaos,” In: W.J. Arnold and D. Levine (editors). Nebraska symposium on motivation, 1969. Lincoln: University of Nebraska Press, pp. 237–307.


Editorial history

Received 9 September 2016; revised 28 May 2017; accepted 30 May 2017.

Creative Commons License
“Misuse or misdesign? Yik Yak on college campuses and the moral dimensions of technology design” by Qinglan Li & Ioana Literat is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Misuse or misdesign? Yik Yak on college campuses and the moral dimensions of technology design
by Qinglan Li and Ioana Literat.
First Monday, Volume 22, Number 7 - 3 July 2017