First Monday

A self-efficacy informed approach to anonymously locating digital disruptors by James F. Popham

Young, politically motivated, and technologically savvy individuals have been instrumental in bringing about social change through the first decades of the twenty-first century. These tech-savvy “disruptors” anonymously champion counter-hegemonic discourse and ideology by manipulating networked forms of communication. The shielding effects of these anonymous interactions also pose significant challenges for the observation and study of disruptors. The current study proposes that elements of the theory of self-efficacy, particularly mastery experiences, can be leveraged to anonymously locate disruptors within a generalized sample based on their reported mastery of computing tasks. It employs an adapted version of the Computer Self-Efficacy Scale with a large non-random sample to test this hypothesis. Principal component analysis identifies three components — “simple,” “moderate,” and “difficult” task efficacy — that account for a majority of the variation in the sample. Components are then compared with other measures of technological skill and Internet usage characteristics to further confirm the scale’s effectiveness in locating disruptors.






A growing theme in the sociological study of the Internet is the role of “disruptors” in challenging the homogeneity of digital communities at a multitude of scales [1]. These entities play an active but largely anonymous role in shaping the topography of the Internet as well as its content to form and reform populist narratives on social, leisure, and political issues. To illustrate, one must look no further than Facebook, which has both actively and passively mediated social movements as a host for discursive exchange and also a de facto technological censor (Jackson, 2014). The same can be said for individuals who actively seek to inject, alter, or otherwise affect popular discourse and the general nature of the Internet through communicative actions, such as those undertaken by the loose collective known as Anonymous (Wiedemann, 2014). Thus, through the processes of informationalization, relatively few technologically savvy individuals who are also socially motivated now have the potential to effect real change through digital communications. Castells [2] labels these individuals as “actors of social change:” young, motivated, and technologically savvy individuals who leverage the tools of mass self-communication to champion resistance against entrenched forms of authority and engage in networks of counter-hegemonic exchange using Internet and communication technologies (ICT).

These disruptors leverage the anonymity of Internet technology to form “involvement shields” [3] that empower and legitimize disruptive, and often criminalized, activities online. For example, many experts point to disruptors and their online activism as catalyzing the Arab Spring and the Black Lives Matter movements of 2011 and 2013, respectively (e.g., Bonilla and Rosa, 2015; Gerbaudo, 2012). Generally speaking, these actions stem from an increasingly customizable Internet where actors have a greater capacity to control/divert information and reprogram knowledge networks. Advancements like social networking and Web 2.0 empower individuals to create, engage, dismiss, and generally customize information sources, shaping newsfeeds to tautologically reflect and confirm personal ideology (McLoughlin and Lee, 2010; Andrejevic, 2011). A principal argument here is the role of self-realized technological mastery, which empowers actors to leverage this control by manipulating the information constituted on networked communication platforms (Theocharis, 2012; Wolover, 2016). While they may not necessarily engage in physical acts associated with protest (Lindsey, 2013), their heightened capacities for communication built upon extensive digital networks and technological mastery play instrumental roles in shaping discourse amongst those who will (Carroll and Hackett, 2006).

For academia, these observations pose a challenging paradigm: While the anonymity afforded by the Internet empowers greater levels of communication and dissent, it also shields the disruptors from direct observation. The question, then, is one of identification — how might these actors be anonymously located in order to better understand their ideas, opinions, and activities? This reflects challenges posed by the ongoing incorporation of technology into day-to-day activities, which develops a certain level of technological proficiency (or savviness) amongst large portions of the population, particularly those who use technology at great frequency (Galor and Moav, 2000; Warschauer and Matuchniak, 2010). Thus, with the rapid popularization of Internet technologies in the twenty-first century it becomes increasingly difficult to parse out the savvy actors from the masses. As a result disruptors are often poorly defined in research — for instance, Castells [4] defines them as “younger groups of the population (aged 16–34 mainly), those who are technically savvy in digital communication, and those more prone to rebellion against what they perceive as an unbearable social order.” Further complicating this challenge is the high rate of technological turnover, as measures of tech savviness often rely on concepts and technologies that become obsolete before the research is published (Lievrouw, 2012).

This paper is built upon the assumption that disruptors are a particular subset of the population possessing a high level of competency with digital technologies over and above reasonably-experienced computer users. Discussed by Fieldhouse and Nicholas [5], this competency espouses “a common sense approach to and awareness of the problems and pitfalls of exploring the highways of the Internet.” In so doing, information-wise actors develop the “ability to exercise judgment, discernment, and prudence” whilst engaging in online communications. They are able to sift through and parse out misleading, incorrect, harmful, or malicious information, monitoring both the media and the medium (Tufekci and Wilson, 2012; Bolton, et al., 2013); as Katz and Gonzalez (2016) state, these competencies are requisite of meaningful digital connectivity.

The problem of locating disruptors can be addressed through parallel research that frames the technological gap as a digital divide related to individual self-efficacy with computers and communications technology. Self-efficacy is a relational measure of one’s self-confidence in their abilities within a given field, and its cascading effect on self-directed pedagogy — termed “generative capability” by Bandura [6]. The self-efficacy theory posits that there are four sources for this cascade: mastery experience, based on successful task attainment; vicarious experience, or drawing from the success of social models; social persuasion, external positive appraisals and motivations; and affective arousal, the physiological response to attaining goals and objectives (Bandura, 1977). The pedagogical mechanics that establish pathways to the competencies requisite of disruptors have often been connected to the individual’s measure of technological self-efficacy, and particularly their mastery experiences (e.g., Plaisance, 2016; Hatlevik, et al., 2015; Bronstein, 2014; Shank and Cotton, 2014; etc.).

To this end, Bandura [7] notes that mastery experiences are the “most effective way of creating a strong sense of efficacy.” By his account, attaining circumscribed goals through action, particularly those of great complexity, builds a robust belief in one’s ability to undertake related and more advanced tasks in the future. From this success they develop mastery, or the cognitive, behavioural, and self-regulatory toolkit required to effectively recreate and improve upon the positive results that they had previously experienced (Bandura, 1995). The resiliency of these toolkits depends on the measure of difficulty associated with the related tasks: Something that an actor easily achieves does not have the same bearing on their sense of agency as a situation that requires long-term commitment and perseverance (Shillair, et al., 2015).

Given the steep learning curve requisite of deep immersion online (Hatlevik, et al., 2015), the self-efficacy theory generally, and mastery experiences specifically, provide a fitting model for parsing the technologically savvy disruptor. Indeed, Bandura [8] suggested in a revisit of the self-efficacy theory that “the new realities of the information era require advanced cognitive and self-management competencies to fulfill complex occupational roles and to manage the maze of demands of contemporary life,” effectively dovetailing self-efficacy with digital modernization. Drawing on the knowledge present in modern applications of self-efficacy theories (Eastin and LaRose, 2000; Durndell and Haag, 2002; Bélanger and Carter, 2009; Kim and Glassman, 2013; McKinley and Ruppel, 2014), and particularly the notion of mastery experiences, this paper therefore attempts to discern the technologically savvy disruptor from the relatively capable masses through adaptation of a well-established self-efficacy scale, and then validates the findings by comparing the social media usage metrics of respondents with high reported technological self-efficacy to those with lower ratings.




In this context we can infer that technologically savvy actors present higher levels of self-efficacy with particularly difficult computing-related tasks. The rapid turnover of technological innovation encountered through the information revolution, coupled with the increasing saturation of Internet technologies in institutions of all forms, has firmly entrenched a digital frontier in constant flux, requiring specialized knowledge to fully navigate. An important point drawn from technological efficacy is that individuals will also make efforts to modify and control the institutions that they interact with, based on their perceived self-efficacy in that domain [9]. As tech-savvy individuals develop capacities related to negotiating the information available online, they also begin to better negotiate problems of choice and ultimately become highly selective in their knowledge-generating activities. While most individuals can navigate some of these technologies effectively, the actors capable of disruption should likely possess multiple competencies. In order to keep abreast of this rapid state of change one must engage in ongoing learning that surpasses general knowledge (Dahlman and Westphal, 1981; Morahan-Martin and Schumacher, 2000; Durndell and Haag, 2002). The “tech savvy” label therefore constitutes someone who has a high sense of self-efficacy in relation to digital ability, and a capacity to engage in ongoing learning (e.g., Eastin and Larose, 2000; Norris, 2001; Teo, et al., 2011).

Moreover, self-efficacy research in relation to ICT has repeatedly confirmed the significance of mastery experiences in driving this knowledge development (e.g., Liu and Li, 2016; Revelo, et al., 2017). As a primarily individual and anonymized experience, Web-driven learning relies on internalized performance cues that most directly relate with Bandura’s (2001) discussion of the relationship between ICT mastery experiences and self-management/self-renewal. Therefore, an adapted model of self-efficacy focussing on indicators of mastery experience with digital and information technologies could play a significant role in predicting their generative capability and ultimately their level of tech-savviness (Eastin and Larose, 2000; Whitty and McLaughlin, 2007; Liu and Li, 2016; etc.). To wit, a recent meta-analysis and study conducted by France Bélanger and Lemuria Carter (2009) explored the factors that played a predictive role in the type of e-government services that Internet users accessed. Their study confirmed those who indicated greater familiarity with every-day Internet tools were more likely to engage in new online behaviours and communities than those who had limited exposure to computing. Additionally, Hatlevik, et al. (2015) measured a series of indicators influencing digital competence, including self-efficacy, and drew a significant relationship between competence and the quality of online information collected by secondary students.

More specifically, several studies have directly connected self-efficacy with technological capacities. For example, Durndell and Haag’s (2002) work contrasted levels of computer self-efficacy against computer anxiety, attitudes toward the Internet, and overall Internet experience. Their measure for self-efficacy was based upon a foundational computing questionnaire that had been validated earlier by Torkzadeh and Koufteros (1994). Durndell and Haag’s (2002) adaptation of the Computer Self Efficacy Scale (CSE) included 29 statements and sought to measure four factors of computer usage (Beginner Skills; Mainframe Skills; Advanced Skills; File/Software Skills), premised on Bandura’s (1986) assertion that mastery should be measured under situational circumstance. Their research indicated that respondents’ level of computing self-efficacy had a significant predictive effect on their level of anxiety with computers and also their reported levels of Internet use.

Recently, Kim and Glassman (2013) connected self-efficacy with computer usage by applying a modernized scale. As in the current study, Kim and Glassman identify the importance of developing a set of measures that reflect modern technological advancements in order to better differentiate groupings of Internet users and identify the significance of self-efficacy on their generative capacities. Interestingly, Kim and Glassman [10] posit that the “distal experiences” of individuals outside of computing expertise also influence self-efficacy, and thus their Internet Self-Efficacy Scale (ISS) focuses on differential emotive/interpretative measures rather than strictly technological ones. Their resulting study identifies five factors of Internet self-efficacy (reactive/generative; differentiation; organization; communication; search) that significantly correlate with test measures (‘personal information outcome’ and ‘Internet anxiety’). Kim and Glassman’s (2013) work again reinforces the value of self-efficacy models, and particularly measures of mastery experiences, as effective tools for identifying the technologically savvy individual.

Interestingly, studies of self-efficacy in relation to technological competency tend to polarize into one of two approaches: The authors either rely on standardized general measures of self-efficacy or aim for modernized measures that encapsulate technological advancements. In the case of the former, generalized measures may fall short of clarifying observed relationships, and are limited in their capacity to further explore specific concepts such as mastery experience. Moreover, the best-established generalized measures of self-efficacy are designed to test its generative effect on learning and may not be directly applicable to technology (Compeau and Higgins, 1995). On the other hand, modernized scales emphasizing technological abilities have been effectively operationalized in a number of capacities; however, as might be predicted, they quickly fall victim to the temporal limitations of academic research when contrasted against the rate of technological change. For instance, blogging plays a central role in the survey that informed Kim and Glassman (2013; described earlier), a format that technology writers frequently cite as dead or replaced by micro-blogging (Heet, 2016). A set of measures for technological self-efficacy is therefore needed that bridges these challenges: specific enough to identify tech-savvy disruptors yet not beholden to temporal limitation.

A second important relationship in the literature is the extent to which online competence influences trust of the Internet, as well as trust in one’s self to mitigate risk associated with online activities. The inference here is that individuals acting upon a sense of technological self-efficacy will report a heightened level of distrust and/or utilize specialized tools to mitigate the risks that they associate with interacting online (Tamjidyamcholo, et al., 2013; Celik and Yesilyurt, 2013). Again, several key studies have drawn connections between self-efficacy and trust. For example, Hsu, et al. (2007) conclude that individuals will only participate in virtual communities when they have a sense of trust that the “underlying technology infrastructure and control mechanisms are capable of facilitating transactions according to its confident expectations” [11]. Further, Lwin and Williams (2003) demonstrate the antecedents to identity protection measures undertaken by online consumers; these include trust in one’s abilities, as well as mistrust of exposure to elements of behavioural control, both of which are significantly related to an individual’s interpretation of their safety online. This relationship was again demonstrated by Livingstone and Helsper (2009), who explore the relationship between opportunity and risk for teenagers when acting online. The authors found that while more advanced users exposed themselves to greater risks (associated with sharing personal information online), they did so knowingly and developed a series of strategies to mitigate said risks. Overall, these and other studies (Eynon and Malmberg, 2011; Staksrud and Livingstone, 2009; Davies and Eynon, 2013; etc.) illustrate that as technological self-efficacy increases, trust for the Internet decreases and risk mitigation activities increase. Thus the assumption of a covariate relationship between measures of efficacy and trust in one’s capacity to mitigate online risks provides a valuable tool for validating the process of identifying technologically savvy individuals and linking them with disruptors.




Electronic survey

Data to test these hypotheses was collected via an electronic survey administered using an online platform (fluidsurveys.com). The use of digital tools has increasingly become an accepted practice over the past decade, and has been repeatedly validated through empirical studies, particularly when the subject is relevant to the means of study and especially when youth are the targeted group (Andrews, et al., 2003; Van Selm and Jankowski, 2006). Data collection consisted of three overlapping campaigns beginning in April 2015 and concluding the following September, and yielded a non-random sample of 614 valid responses. Generalized demographics are presented in Table 1. The online survey consisted of demographic questions, general Internet usage questions, and a 20-point questionnaire for evaluating technological self-efficacy. These were collected as part of a greater, 109-measure survey used for a dissertation.


Table 1: General demographics of survey respondents.
Variable                                   n     %      Mean   SD
Currently enrolled PSE                     414   67.3
Graduated PSE                              163   26.5
Other (not completed)                      36    5.9
Active on social media                     614   99.8
Years using Internet-connected device      615   100    14.6   4.08
Overall expertise with computers           613   99.7   4.46   .75


Self-efficacy and computing

The principal means of testing for self-efficacy was adapted from Torkzadeh and Koufteros’ (1994) Computer Self Efficacy Scale (CSE); however, due to the age of this study a modernized version was sought. Durndell and Haag’s (2002) revision of the study eliminated several dated concepts from the CSE and incorporated modern computing. All told, their adapted CSE presents participants with a 29-point index of statements related to using and/or manipulating ICTs, and asks participants to rate their response to each statement using a five-point Likert scale. Durndell and Haag’s (2002) work has been utilized in upwards of 400 studies of computer self-efficacy; while some adaptations have been made, these changes are largely related to the subject of study rather than improvement to the scale (Joyce and Kirakowski, 2015). To that end, Hargittai (2010) and Hargittai and Hsieh (2012) make the case for modernizing instruments measuring Internet capacity due to “the fast-changing nature of Internet tools and services” [12]. Consequently, some adaptations were made to the CSE to include modern social network-related terms, and to eliminate obscure concepts (e.g., “handling a floppy disc correctly”). In total, 21 measures were retained and adapted, which asked users to rate their comfort with a given task using a five-point Likert scale. The tasks increase in difficulty, from “using an operating system” to “using digital currencies (Bitcoin),” and Likert-style responses were categorized from one (“extremely uncomfortable”) to five (“extremely comfortable”). Herein, this measure is referred to as the Adapted Computer Efficacy Scale (ACES — Appendix A).

Measurements of online trust

The measures of trust were derived from a questionnaire that was designed and validated by Lwin and Williams (2003). This matrix is a 23-point attitudinal self-report scale and was adapted from earlier work presented in Ajzen (1991). The results from this study informed Lwin and Williams’ (2003) development of a “privacy calculus” [13], a model that has subsequently gained traction as a tool for assessing an individual’s trust of the online environment (Schreiner and Hess, 2015). For the purposes of this study, Lwin and Williams’ (2003) questionnaire was adapted in several ways: the measures were modified to address a) changes to Internet usage with a focus on the rapid integration of social media into personal lives; and b) recent revelations of wide-scale collection of personal information by state and corporate interests. Additionally, the trust index was formatted as a five-point Likert-style scale to ensure continuity with the CSE explored above. One subscale of this measure, identified in previous research (Popham, 2016), was retained for the purposes of the current study. This component, indicating individuals’ sense of their capacity to take protective action against data theft (labelled protective actions), comprises six variables with factor loadings all greater than .600 and an eigenvalue of 3.37 (Appendix B). Participants’ scores for these variables were summed and converted into a scale for the purposes of convergent validity.

Internet usage characteristics

As outlined earlier in discussions of a network society perspective, technological efficacy is an indicator of experience users will have in online realms (Davies and Eynon, 2013). The reciprocal relationship of increased capacity and increased exposure suggests that individuals who are more tech-savvy will have greater exposure and more regular usage of technically advanced services and Web sites (Staksrud and Livingstone, 2009). Participants in the current study were asked if they are active on social networking sites and, if they responded positively, they were then presented with a second question which listed a number of social network services as potential options. These options ranged from widely popular services (e.g., Facebook) to more specialized ones (e.g., Tumblr). A follow-up question asked participants to indicate the frequency that they used each selected site using a five-point scale ranging from more than once a day to less than once a week. The responses to these questions were used to produce two measures of Internet use; the first provides a measure of the total number of Web services used by respondents, and the second a measure of their overall usage frequency. Further, to avoid excessive listwise deletions, these constructs were limited to the six largest social media Web sites: Facebook, Twitter, Instagram, YouTube, Reddit, and LinkedIn (Alexa.com, 2017).
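As a rough illustration of how these two usage constructs can be derived from raw survey responses, the sketch below builds both measures with pandas. The column names and response data are hypothetical, not the study’s actual variables; the logic simply mirrors the description above (a count of services used, and a sum of per-site frequency ratings).

```python
import pandas as pd

# Hypothetical survey extract: per-site usage frequency on the five-point
# scale (5 = more than once a day ... 1 = less than once a week).
# NaN means the respondent did not select that site. Illustrative only.
BIG_SIX = ["facebook", "twitter", "instagram", "youtube", "reddit", "linkedin"]

df = pd.DataFrame({
    "facebook":  [5, 4, None],
    "twitter":   [3, None, None],
    "instagram": [5, 2, None],
    "youtube":   [4, 5, 1],
    "reddit":    [None, 3, None],
    "linkedin":  [1, None, None],
})

# Measure 1: total number of the six largest services each respondent uses.
df["sites_used"] = df[BIG_SIX].notna().sum(axis=1)

# Measure 2: overall usage frequency, summed across the sites they selected
# (pandas skips NaN cells by default when summing row-wise).
df["usage_frequency"] = df[BIG_SIX].sum(axis=1)

print(df[["sites_used", "usage_frequency"]])
```

Limiting the sum to the six named columns reflects the listwise-deletion concern noted above: respondents who skip a niche platform do not lose their row entirely.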




Principal component analysis

Given the breadth of the ACES, a principal components analysis (PCA) dimensional reduction process was used to identify any significant characteristics within the index. This approach assumes that difficult-to-measure latent patterns are present (i.e., a faithful measure of one’s mastery experiences, etc.), and also that communality exists within the index. Ultimately this process provides an effective, limited subset of variables that is highly reflective of the variables from which it was drawn (DiStefano, et al., 2009). In total, 20 of the 21 ACES measures were used for PCA. One variable — “your overall expertise with computers” — was retained for use as a comparator due to its relative breadth compared to the remainder of ACES variables.

The suitability of the ACES index for PCA was then assessed prior to analysis. Inspection of the correlation matrix showed that all variables had at least one correlation coefficient greater than 0.3, and the overall Kaiser-Meyer-Olkin (KMO) measure was 0.91 with individual KMO measures all greater than 0.8 (Kaiser, 1974). Bartlett’s Test of Sphericity was statistically significant (p < 0.05), indicating that the data was likely factorizable. The ACES PCA revealed three components which had eigenvalues greater than one and which explained 40 percent, 13 percent, and eight percent of the total variance, respectively, and 60 percent cumulatively. Based on the linearly increasing difficulty of tasks presented in the ACES, it was assumed that some correlation between factors may be present and therefore an oblique rotation (Direct Oblimin) of the solution was used. The resulting pattern matrix presented a moderately ‘simple structure’ with all loadings for each component above 0.6 (Jolliffe, 2002); however, it should be noted that several moderately strong cross-loadings persist, particularly with component 1. These results are consistent with stepwise levels of difficulty in the ACES, with strong loadings of “Difficult” tasks on Component 1 (λ = 7.94, α = .903), “Simple” tasks on Component 2 (λ = 2.53, α = .863), and “Moderate” tasks on Component 3 (λ = 1.63, α = .730).
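The component-retention decision described above — keeping components whose eigenvalues exceed one (the Kaiser criterion) — can be illustrated with a minimal numpy sketch on synthetic Likert-style data shaped like the ACES. The three-factor structure, item counts, and noise level below are assumptions for demonstration only; in practice, the KMO/Bartlett checks and the Direct Oblimin rotation would be handled by a dedicated package (e.g., factor_analyzer) or statistical software such as SPSS.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for the 20 retained ACES items: three latent "task
# difficulty" factors plus item-level noise, loosely mimicking the structure
# the paper reports. This is a sketch, not the study's data.
n, items_per_factor = 600, [8, 8, 4]
latent = rng.normal(size=(n, 3))
cols = []
for f, k in enumerate(items_per_factor):
    for _ in range(k):
        cols.append(latent[:, f] + 0.6 * rng.normal(size=n))
X = np.column_stack(cols)

# PCA via eigendecomposition of the correlation matrix — the usual basis
# for the Kaiser criterion (retain components with eigenvalues > 1).
R = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(R)[::-1]      # sorted descending
retained = int((eigvals > 1).sum())
explained = eigvals / eigvals.sum()        # proportion of total variance

print(f"components retained (Kaiser criterion): {retained}")
print("variance explained by first three:", np.round(explained[:3], 2))
```

Because the synthetic items are built from three independent latent factors, the eigenvalue-greater-than-one rule recovers exactly three components, paralleling the three-component solution reported for the ACES.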


Table 2: ACES pattern matrix.
Variable                                      Component 1   Component 2   Component 3
1. Assembling a computer from parts           .832          .450          .005
2. Formatting a hard drive                    .830          .336          -.033
3. Installing an operating system             .798          .489          .027
4. Writing a computer program                 .776          .250          .162
5. Using Rich Site Summaries (RSS)            .739          .256          .366
6. Using digital currencies (e.g., Bitcoin)   .705          .140          .201
7. Creating a Web site                        .705          .363          .426
8. Avoiding online fraud (e.g., phishing)     .647          .511          .253
9. Using a Web browser                        .167          .774          .239
10. Bookmarking a Web site                    .144          .757          .236
11. Researching an issue or question          .248          .732          .239
12. Using an operating system                 .513          .730          .308
13. Downloading files                         .457          .722          .262
14. Setting/changing software preferences     .495          .721          .148
15. Adding a Web resource to favourites       .269          .676          .310
16. Doing an advanced file search             .554          .645          .214
17. Maintaining a Web site                    .353          .310          .790
18. Tagging photos in social media            .007          .420          .718
19. Creating digital A/V content              .034          .436          .687
20. Using social media                        .496          .163          .673


Convergent validity


Two procedures were undertaken to test the validity of the ACES model. The first, a series of ANOVA tests, investigated the nature of the relationship between each subset and the respondent’s self-rated overall experience with computers. Fit tests were conducted for each subset; notably, the overall experience scores for the “moderately comfortable” and “extremely comfortable” groupings in all subsets were not normally distributed. While this violates one of the assumptions of ANOVA, the large sample size and relatively balanced distribution of respondents within the groups suggests that the tests will be robust to non-normality (Glass, et al., 1972; Field, 2013). Additionally, no responses were collected for category one, or “extremely uncomfortable,” limiting ANOVA tests to four categories rather than five. The results of each one-way ANOVA hypothesis test are presented in Table 3.


Table 3: ANOVA for ACES subsets by self-rated overall computing abilities.
Subset                                 df        F       η²     p
Simple tasks and computing skills      3, 546    161.5   .470   .000
Moderate tasks and computing skills    3, 546    14.6    .074   .000
Difficult tasks and computing skills   3, 546    82.4    .312   .000


The ANOVA tests demonstrated significant differences between the reference variable groupings and each subset; scoring F(3,546) = 161.5, p < .0005 for simple tasks, F(3,546) = 14.6, p < .0005 for moderate tasks, and F(3,546) = 82.4, p < .0005 for difficult tasks. These observations were paired with very large effect sizes for the simple tasks (η² = .470) and difficult tasks (η² = .312) components, as well as a medium effect size for the moderate tasks component (η² = .074) (Cohen, 1988). Furthermore, a Games-Howell post-hoc analysis identified linear, positive, and statistically significant relationships (p < .005) for each subset score between overall expertise categories. Generally, this demonstrates that respondents from each successive category of overall expertise with computers have higher mean scores in the dependent variable.
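The validation step above can be sketched with scipy: a one-way ANOVA across four simulated expertise groups, with eta squared computed from the sums of squares. The group means, sizes, and spread are illustrative assumptions only (chosen so that means rise with expertise, mimicking the linear pattern reported); the Games-Howell post hoc comparison would require an additional package such as pingouin and is omitted here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Simulated subset scores grouped by four self-rated expertise categories
# (category one, "extremely uncomfortable," drew no responses in the study).
# (mean, n) pairs are hypothetical.
groups = [rng.normal(loc=mu, scale=3.0, size=n)
          for mu, n in [(20, 60), (24, 180), (28, 220), (32, 90)]]

# One-way ANOVA: does mean subset score differ across expertise categories?
f_stat, p_value = stats.f_oneway(*groups)

# Eta squared: between-groups sum of squares over total sum of squares.
all_scores = np.concatenate(groups)
grand_mean = all_scores.mean()
ss_between = sum(len(g) * (g.mean() - grand_mean) ** 2 for g in groups)
ss_total = ((all_scores - grand_mean) ** 2).sum()
eta_squared = ss_between / ss_total

print(f"F = {f_stat:.1f}, p = {p_value:.2g}, eta^2 = {eta_squared:.3f}")
```

With group means spaced this far apart the simulated F statistic and effect size are large, as in the simple- and difficult-task results reported above; shrinking the gaps between means would reproduce the smaller moderate-task effect.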

Bivariate analysis

A correlation matrix was produced to test the effectiveness of the ACES. Specifically, the component-based scales for “simple,” “moderate,” and “difficult” computing tasks were compared against the composite scores for the number of social media sites used (limited to the six largest sites listed above); the frequency that these sites were used; and the composite score for trust in one’s ability to protect their identity and actions online. The results are presented in Table 4. As was anticipated, the self-efficacy variables produce moderate-to-strong correlations with one another in a generally linear fashion, indicating the predicted stepwise incremental change. Additionally, moderate relationships are noted between the moderate tasks group and both social media variables but are generally absent or weak for the simple and difficult tasks groups. Finally, a moderate relationship is present between protective actions and the difficult tasks grouping.


Table 4: Bivariate analysis of ACES groupings through correlation.
Notes: * Correlation is significant at the 0.05 level; ** Correlation is significant at the 0.01 level.
Variable                                 n     M      SD     X2       X3       X4       X5      X6
X1 Simple tasks                          615   36.–   –      .603**   –        .129**   .089*   .281**
X2 Moderate tasks                        615   14.4   3.56            .347**   .352**   .308**  .168**
X3 Difficult tasks                       615   21.9   9.10                     .052     .042    .399**
X4 Social media use (Biggest 6)          615   2.6    1.44                              .893**  .018
X5 Social media frequency (Biggest 6)    566   11.1   5.17                                      .015
X6 Protective actions                    615   17.8   4.74
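The convergent-validity logic behind Table 4 can be sketched as follows: Pearson correlations on synthetic composite scores, where one pair is constructed to covary (as with difficult tasks and protective actions) and another is left independent (as with difficult tasks and social media use). The variable names and data are illustrative, not the study’s.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic composites standing in for three of Table 4's constructs.
n = 500
difficult = rng.normal(size=n)
protective = 0.4 * difficult + rng.normal(size=n)   # built-in association
social_media = rng.normal(size=n)                   # independent of 'difficult'

# Pearson r with two-tailed p-value for each pair.
r_dp, p_dp = stats.pearsonr(difficult, protective)
r_ds, p_ds = stats.pearsonr(difficult, social_media)

print(f"difficult vs. protective actions: r = {r_dp:.2f} (p = {p_dp:.2g})")
print(f"difficult vs. social media use:   r = {r_ds:.2f} (p = {p_ds:.2g})")
```

The first pair yields a moderate, significant coefficient while the second hovers near zero, paralleling the pattern of a moderate difficult-tasks/protective-actions correlation alongside a near-nil difficult-tasks/social-media correlation.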





This study confirms that an Adapted Computer Efficacy Scale provides an effective tool for parsing technologically savvy individuals through the use of relational ICT concepts. The data presented above illustrates the effectiveness of the three-component structure of the Adapted Computer Efficacy Scale (ACES) for locating and identifying technologically savvy individuals using updated concepts and terminology that primarily focus on ICT mastery experiences. Moreover, it provides insights into their actions as disruptors, particularly in relation to their sense of trust and indicators of social media usage.

Self-efficacy and tech-savviness

These concepts were arrayed in a stepwise manner reflective of their relative difficulty, and primarily focussed on tasks/activities indicative of mastery experiences. In alignment with past ICT self-efficacy research (e.g., Durndell and Haag, 2002; Kim and Glassman, 2013), factor analysis indicated three discrete and statistically significant user-groups, qualitatively labeled according to the level of difficulty of the tasks associated with each group. Specifically, the “simple” group’s high efficacy scores primarily consisted of fundamental computing concepts which might be considered requisite for general populist technology usage; whereas the “moderate” group included more interactive procedures that are generally less optimized than modern fundamentals; and the “difficult” group incorporated the most technically difficult and abstract concepts. It is important to note that these factors correlate in a positive linear fashion, suggesting that respondents are efficacious in their grouping as well as all groupings of lower difficulty (Wei, et al., 2011). Moreover, understanding the scale as one of stepwise difficulty helps to mitigate the cross-loadings described above: An individual with a higher level of technological capabilities will certainly express self-efficacy at all three levels, rather than relegating their capacity to one. These findings were further validated by ANOVA and post-hoc test findings that indicated a linear relationship between technical ability and one’s mastery experiences.

Tech-savviness and social media participation

Correlations between self-efficacy grouping and Web usage characteristics presented interesting results. Usage of the six largest social media platforms, and level of participation therein, was most significantly correlated with the simple and moderate task groups, while correlation with the difficult group was virtually nil. One explanation may be that the level of optimization and ultimately simplification of social media platforms most closely aligns with the skills and mastery experiences of these groups (Wang, et al., 2015); however, these results might also indicate that technologically savvy disruptors desist or abstain from using social media. This second proposition is further supported by the correlation between self-efficacy grouping and the protective actions variable — unlike the Web usage relationships, this measure is most strongly correlated with the difficult efficacy grouping, indicating that those with the greatest level of technical mastery were most likely to feel confident in their ability to mask their online presence and avoid Web-borne threats (Stutzman, 2006; Young and Quan-Haase, 2013).
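The group-by-group correlation analysis can be illustrated with a short sketch. The data and effect directions below are synthetic assumptions chosen only to mirror the pattern described above (a weak link between difficult-task efficacy and social media use, a stronger link to protective action); they are not the study’s estimates.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # synthetic respondents

# Synthetic component scores and usage measures; the coefficients below are
# illustrative assumptions, not values drawn from the study.
difficult = rng.normal(size=n)                       # "difficult" task efficacy
social_use = -0.1 * difficult + rng.normal(size=n)   # weak negative link
protective = 0.6 * difficult + rng.normal(size=n)    # stronger positive link

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    return float(np.mean(x * y))

r_use = pearson_r(difficult, social_use)
r_protect = pearson_r(difficult, protective)
print(round(r_use, 2), round(r_protect, 2))
```

Under these assumptions the efficacy-to-protective-action correlation clearly dominates the efficacy-to-usage correlation, reproducing the asymmetry the passage describes.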


Given the empirical format of this paper it is important to note the potential for sample frame error. The electronic survey tool used to collect data for this study included a series of questions about music piracy, and as such true disruptors may have been discouraged from participating given the relative legal risks. Moreover, the self-selecting nature of data collection limited the breadth of respondents and therefore the sample’s generalizability. This is exemplified in a recent study by Dupree, et al. [14] wherein the authors found that Internet users adopt different “privacy personas” relative to their level of efficacy with technology, and choose to disengage with different online activities (including forms of social networks) accordingly. Paired with the generative capability principle discussed above, we can surmise that technological self-efficacy is instrumental in further retrenching critical consideration of information sources and engagement therein (Hocevar, et al., 2014); and may, therefore, limit the significance of social media as a measure of online activities. Future studies would benefit from a purpose-built survey that more directly examines social media usage characteristics to better understand how disruptors operate; it may also be worthwhile to develop a purposive sampling framework that assesses known computer experts, such as ICT developers and white-hat/black-hat/grey-hat hackers.

Second, it is also important to note that the pace of technological growth entails an inherent risk to the future applicability of the measures detailed in the ACES. While every effort was made to develop a relational set of mastery experiences that are somewhat resilient to technological change, it becomes exceedingly difficult to anticipate where technology may take the public next (Kim and Glassman, 2013). For example, this study is grounded in “traditional” computing concepts and fails to incorporate mobile technologies, which are rapidly becoming the primary tool for social media usage (Lella and Lipsman, 2016). Future studies should therefore consider including concepts related to mobile computing such as “unlocking” and “rooting” phones.

Third, conceptual applications should also be considered. The current study limited its exploration of digital activities to social media, and therefore may have overlooked other forms of action that politically motivated individuals may engage in. While the subversive use of established media by counter-power groups has occurred throughout history across multiple forms (e.g., punk rock, zines, pirate radio, etc. — see Hemphill and Leskowitz, 2012), the modern Internet affords the opportunity to engage in many more, and more granular, forms of protest. Specifically, rebellion against social order might entail any number of actions, from relatively benign “hashtag activism” (Fang, 2015) to acts of digital subterfuge and hacktivism (Karatzogianni, 2016). The negotiation of these differences remains relatively untested and ambiguously defined (Vlavo, 2015; Popham, 2018); entering into the fray of this discussion was beyond the scope of the current study. An illustrative example raised by Bandura, et al. [15] is one’s career choice. Actors who have a “strong sense of personal efficacy consider a wide range of career options, show greater interest in them, prepare themselves better for different careers and have greater staying power in their chosen pursuits.” Similarly, as tech-savvy individuals develop capacities related to negotiating the information available online, they may also begin to better negotiate problems of choice and ultimately become highly selective in their knowledge-generating activities. Future studies should consider pairing these activities with the technologically savvy to further test the significance of ICT skill in engaging in or encouraging social change.




The omnipresence of the Internet in everyday social interaction is now a fact of the twenty-first century. This era of hyper-connectivity has given rise to the primacy of social media as a tool for communication that extends beyond traditional boundaries, and in so doing provides a foundational platform by which counter-hegemonic ideas might gain traction and, eventually, legitimation. These communicative evolutions have created opportunities for a new generation of disruptors: leaders who can dynamically shape knowledge networks to conform with ideological shifts (Sheller, 2004). This paper contributes toward the study of these disruptors by providing an empirical methodology for anonymously identifying said actors.

Specifically, operationalizing the ACES provided a discrete set of components based on relational computing mastery experiences. To this end respondents with the highest rating of self-efficacy in the “difficult” grouping, and thus those most likely to align with the disruptors defined above, self-identified competency in technologically abstract or difficult tasks. Future studies focussing on the technologically savvy may therefore rely on these eight measures to identify actors: Assembling a computer from parts; formatting a hard drive; installing an operating system; writing a computer program; using rich site summaries (e.g., RSS); using digital currencies (e.g., Bitcoin); creating a Web site; and protecting themselves from phishing. Furthermore, this paper demonstrates that these disruptors have a strong sense of efficacy in their capacity to obfuscate their personal information online and otherwise take protective action against Web-borne threats to their personal identity. Interestingly, the most technologically savvy self-efficacy group also indicated that they used the most popular social media services with lower frequency and intensity. Taken together, these two observations suggest that technologically savvy disruptors may be conscious of “silent listeners” on corporate-controlled social media services (Stutzman, et al., 2012; Tufekci and Wilson, 2012). Future studies might consider investigating further the factors influencing desistance and/or abstinence from the major services that now dominate the spectrum of social media services.


About the author

James F. Popham, Ph.D., is an assistant professor of criminology at Wilfrid Laurier University. His research focuses on deviance and social control in digital realms, and his current interests include studies of deviant online personality creation; critical interpretations of personal data privacy as they relate to different sectors; the mediating role of the Internet in harmful interactions; and empowerment through communication.
E-mail: jpopham [at] wlu [dot] ca



The author would like to acknowledge Hongming Cheng, Carolyn Brooks, Jennifer Poudrier, Robert Hudson, and Joan Brockman for their support in developing this research.



1. Rimal, et al., 2015, p. 863.

2. Castells, 2015, p. 9.

3. Goldsmith and Brewer, 2015, p. 116.

4. Castells, 2015, p. 223.

5. Fieldhouse and Nicholas, 2008, p. 49.

6. Bandura, 1977, p. 37.

7. Bandura, 1995, p. 3.

8. Bandura, 1997, p. xi.

9. “Selection processes,” Bandura, 1997, p. 160.

10. Kim and Glassman, 2013, p. 1,422.

11. Hsu, et al., 2007, p. 160.

12. Hargittai and Hsieh, 2012, p. 96.

13. Lwin and Williams, 2003, p. 269.

14. Dupree, et al., 2016, p. 1.

15. Bandura, et al., 2001, p. 188.



I. Ajzen, 1991. “The theory of planned behavior,” Organizational Behavior and Human Decision Processes, volume 50, number 2, pp. 179–211.
doi: https://doi.org/10.1016/0749-5978(91)90020-T, accessed 8 July 2018.

Alexa.com, 2017. “The top 500 sites on the Web: Global,” at http://www.alexa.com/topsites, accessed 8 July 2018.

M. Andrejevic, 2011. “Exploitation in the data mine,” In: C. Fuchs, K. Boersma, A. Albrechtslund, and M. Sandoval (editors). Internet and surveillance: The challenges of Web 2.0 and social media. New York: Routledge, pp. 71–88.

A. Bandura, 2001. “Social cognitive theory of mass communication,” Media Psychology, volume 3, number 3, pp. 265–299.
doi: https://doi.org/10.1207/S1532785XMEP0303_03, accessed 8 July 2018.

A. Bandura (editor), 1995. Self-efficacy in changing societies. Cambridge: Cambridge University Press.

A. Bandura, 1986. “The explanatory and predictive scope of self-efficacy theory,” Journal of Social and Clinical Psychology, volume 4, number 3, pp. 359–373.
doi: https://doi.org/10.1521/jscp.1986.4.3.359, accessed 8 July 2018.

A. Bandura, 1977. “Self-efficacy: Toward a unifying theory of behavioral change,” Psychological Review, volume 84, number 2, pp. 191–215.
doi: http://dx.doi.org/10.1037/0033-295X.84.2.191, accessed 8 July 2018.

A. Bandura, C. Barbaranelli, G. V. Caprara, and C. Pastorelli, 2001. “Self-efficacy beliefs as shapers of children’s aspirations and career trajectories,” Child Development, volume 72, number 1, pp. 187–206.
doi: https://doi.org/10.1111/1467-8624.00273, accessed 8 July 2018.

F. Bélanger and L. Carter, 2009. “The impact of the digital divide on e-government use,” Communications of the ACM, volume 52, number 4, pp. 132–135.
doi: https://doi.org/10.1145/1498765.1498801, accessed 8 July 2018.

Y. Bonilla and J. Rosa, 2015. “#Ferguson: Digital protest, hashtag ethnography, and the racial politics of social media in the United States,” American Ethnologist, volume 42, number 1, pp. 4–17.
doi: https://doi.org/10.1111/amet.12112, accessed 8 July 2018.

W. K. Carroll and R.A. Hackett, 2006. “Democratic media activism through the lens of social movement theory,” Media, Culture & Society, volume 28, number 1, pp. 83–104.
doi: https://doi.org/10.1177/0163443706059289, accessed 8 July 2018.

M. Castells, 2015. Networks of outrage and hope: Social movements in the Internet age. Second edition. Cambridge: Polity Press.

V. Celik and E. Yesilyurt, 2013. “Attitudes to technology, perceived computer self-efficacy and computer anxiety as predictors of computer supported education,” Computers & Education, volume 60, number 1, pp. 148–158.
doi: https://doi.org/10.1016/j.compedu.2012.06.008, accessed 8 July 2018.

J. Cohen, 1988. Statistical power analysis for the behavioral sciences. Second edition. Hillsdale, N.J.: L. Erlbaum Associates.

C. J. Dahlman and L.E. Westphal, 1981. “The meaning of technological mastery in relation to transfer of technology,” Annals of the American Academy of Political and Social Science, volume 458, number 1, pp. 12–26.
doi: https://doi.org/10.1177/000271628145800102, accessed 8 July 2018.

C. Davies and R. Eynon, 2013. “Studies of the Internet in learning and education: Broadening the disciplinary landscape of research,” In: W.H. Dutton (editor). Oxford handbook of Internet studies. Oxford: Oxford University Press, pp. 328–351.
doi: https://doi.org/10.1093/oxfordhb/9780199589074.013.0016, accessed 8 July 2018.

C. DiStefano, M. Zhu, and D. Mindrila, 2009. “Understanding and using factor scores: Considerations for the applied researcher,” Practical Assessment, Research & Evaluation, volume 14, number 20, pp. 1–11, and at https://pareonline.net/pdf/v14n20.pdf, accessed 8 July 2018.

A. Durndell and Z. Haag, 2002. “Computer self efficacy, computer anxiety, attitudes towards the Internet and reported experience with the Internet, by gender, in an East European sample,” Computers in Human Behavior, volume 18, number 5, pp. 521–535.
doi: https://doi.org/10.1016/S0747-5632(02)00006-7, accessed 8 July 2018.

J. L. Dupree, R. Devries, D. M. Berry, and E. Lank, 2016. “Privacy personas: Clustering users via attitudes and behaviors toward security practices,” CHI ’16: Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 5,228–5,239.
doi: https://doi.org/10.1145/2858036.2858214, accessed 8 July 2018.

M. S. Eastin and R. LaRose, 2000. “Internet self-efficacy and the psychology of the digital divide,” Journal of Computer-Mediated Communication, volume 6, number 1.
doi: https://doi.org/10.1111/j.1083-6101.2000.tb00110.x, accessed 8 July 2018.

R. Eynon and L.-E. Malmberg, 2012. “Understanding the online information-seeking behaviours of young people: The role of networks of support,” Journal of Computer Assisted Learning, volume 28, number 6, pp. 514–529.
doi: https://doi.org/10.1111/j.1365-2729.2011.00460.x, accessed 8 July 2018.

A. Field, 2013. Discovering statistics using IBM SPSS statistics. Fourth edition. London: Sage.

M. Fieldhouse and D. Nicholas, 2008. “Digital literacy as information savvy: The road to information literacy,” In: C. Lankshear and M. Knobel (editors). Digital literacies: Concepts, policies and practices. New York: Peter Lang, pp. 47–72.

O. Galor and O. Moav, 2000. “Ability-biased technological transition, wage inequality, and economic growth,” Quarterly Journal of Economics, volume 115, number 2, pp. 469–497.
doi: https://doi.org/10.1162/003355300554827, accessed 8 July 2018.

P. Gerbaudo, 2012. Tweets and the streets: Social media and contemporary activism. London: Pluto Press.

G. V. Glass, P. D. Peckham, and J. R. Sanders, 1972. “Consequences of failure to meet assumptions underlying the fixed effects analyses of variance and covariance,” Review of Educational Research, volume 42, number 3, pp. 237–288.
doi: https://doi.org/10.3102/00346543042003237, accessed 8 July 2018.

E. Hargittai, 2010. “Digital na(t)ives? Variation in Internet skills and uses among members of the ‘net generation’,” Sociological Inquiry, volume 80, number 1, pp. 92–113.
doi: https://doi.org/10.1111/j.1475-682X.2009.00317.x, accessed 8 July 2018.

E. Hargittai and Y. P. Hsieh, 2012. “Succinct survey measures of Web-use skills,” Social Science Computer Review, volume 30, number 1, pp. 95–107.
doi: https://doi.org/10.1177/0894439310397146, accessed 8 July 2018.

O. E. Hatlevik, G. B. Guðmundsdóttir, and M. Loi, 2015. “Digital diversity among upper secondary students: A multilevel analysis of the relationship between cultural capital, self-efficacy, strategic use of information and digital competence,” Computers & Education, volume 81, pp. 345–353.
doi: https://doi.org/10.1016/j.compedu.2014.10.019, accessed 8 July 2018.

K. P. Hocevar, A. J. Flanagin, and M. J. Metzger, 2014. “Social media self-efficacy and information evaluation online,” Computers in Human Behavior, volume 39, pp. 254–262.
doi: https://doi.org/10.1016/j.chb.2014.07.020, accessed 8 July 2018.

M.-H. Hsu, T. L. Ju, C.-H. Yen, and C.-M. Chang, 2007. “Knowledge sharing behavior in virtual communities: The relationship between trust, self-efficacy, and outcome expectations,” International Journal of Human-Computer Studies, volume 65, number 2, pp. 153–169.
doi: https://doi.org/10.1016/j.ijhcs.2006.09.003, accessed 8 July 2018.

B. F. Jackson, 2014. “Censorship and freedom of expression in the age of Facebook,” New Mexico Law Review, volume 44, number 1, pp. 121–148, and at http://digitalrepository.unm.edu/nmlr/vol44/iss1/6/, accessed 8 July 2018.

I.T. Jolliffe, 2002. Principal component analysis. Second edition. New York: Springer.

V. S. Katz and C. Gonzalez, 2016. “Toward meaningful connectivity: Using multilevel communication research to reframe digital inequality,” Journal of Communication, volume 66, number 2, pp. 236–249.
doi: https://doi.org/10.1111/jcom.12214, accessed 8 July 2018.

Y. Kim and M. Glassman, 2013. “Beyond search and communication: Development and validation of the Internet Self-efficacy Scale (ISS),” Computers in Human Behavior, volume 29, number 4, pp. 1,421–1,429.
doi: https://doi.org/10.1016/j.chb.2013.01.018, accessed 8 July 2018.

L. A. Lievrouw, 2012. “The next decade in Internet time: Ways ahead for new media studies,” Information, Communication & Society, volume 15, number 5, pp. 616–638.
doi: https://doi.org/10.1080/1369118X.2012.675691, accessed 8 July 2018.

R. A. Lindsey, 2013. “What the Arab Spring tells us about the future of social media in revolutionary movements,” Small Wars Journal, volume 9, number 7, at http://smallwarsjournal.com/jrnl/art/what-the-arab-spring-tells-us-about-the-future-of-social-media-in-revolutionary-movements, accessed 8 July 2018.

X. Liu and X. Li, 2016. “What motivates online disagreement expression: The influence of self-efficacy, mastery experience, vicarious experience, and verbal persuasion,” In: X. Li (editor). Emerging media: Uses and dynamics. New York: Routledge, pp. 197–221.

S. Livingstone and E. Helsper, 2010. “Balancing opportunities and risks in teenagers’ use of the Internet: The role of online skills and Internet self-efficacy,” New Media & Society, volume 12, number 2, pp. 309–329.
doi: https://doi.org/10.1177/1461444809342697, accessed 8 July 2018.

M. O. Lwin and J. D. Williams, 2003. “A model integrating the multidimensional developmental theory of privacy and theory of planned behavior to examine fabrication of information online,” Marketing Letters, volume 14, number 4, pp. 257–272.
doi: https://doi.org/10.1023/B:MARK.0000012471.31858.e5, accessed 8 July 2018.

C. J. McKinley and E. K. Ruppel, 2014. “Exploring how perceived threat and self-efficacy contribute to college students’ use and perceptions of online mental health resources,” Computers in Human Behavior, volume 34, pp. 101–109.
doi: https://doi.org/10.1016/j.chb.2014.01.038, accessed 8 July 2018.

C. McLoughlin and M. J. W. Lee, 2010. “Personalised and self regulated learning in the Web 2.0 era: International exemplars of innovative pedagogy using social software,” Australasian Journal of Educational Technology, volume 26, number 1, pp. 28–43, and at https://ajet.org.au/index.php/AJET/article/view/1100, accessed 8 July 2018.

J. Morahan-Martin and P. Schumacher, 2000. “Incidence and correlates of pathological Internet use among college students,” Computers in Human Behavior, volume 16, number 1, pp. 13–29.
doi: https://doi.org/10.1016/S0747-5632(99)00049-7, accessed 8 July 2018.

P. Norris, 2001. Digital divide: Civic engagement, information poverty, and the Internet worldwide. Cambridge: Cambridge University Press.

J. Popham, 2018. “Microdeviation: Observations on the significance of lesser harms in shaping the nature of cyberspace,” Deviant Behavior, volume 39, number 2, pp. 159–169.
doi: https://doi.org/10.1080/01639625.2016.1263085, accessed 8 July 2018.

J. Popham, 2016. “The Internet as a catalyst for microdeviation: An integrated theory of digital music piracy,” Ph.D. thesis, Department of Sociology, University of Saskatchewan, at http://hdl.handle.net/10388/ETD-2016-04-2553, accessed 8 July 2018.

R. N. Rimal, A. H. Chung, and N. Dhungana, 2015. “Media as educator, media as disruptor: Conceptualizing the role of social context in media effects,” Journal of Communication, volume 65, number 5, pp. 863–887.
doi: https://doi.org/10.1111/jcom.12175, accessed 8 July 2018.

R. A. Revelo, C. D. Schmitz, D. T. Le, and M. C. Loui, 2017. “Self-efficacy as a long-term outcome of a general education course on digital technologies,” IEEE Transactions on Education, volume 60, number 3, pp. 198–204.
doi: https://doi.org/10.1109/TE.2016.2635624, accessed 8 July 2018.

M. Schreiner and T. Hess, 2015. “Why are consumers willing to pay for privacy? An application of the privacy-freemium model to media companies,” ECIS 2015 Completed Research Papers, number 164, at http://aisel.aisnet.org/ecis2015_cr/164/, accessed 8 July 2018.
doi: https://doi.org/10.18151/7217470, accessed 8 July 2018.

M. Sheller, 2004. “Mobile publics: Beyond the network perspective,” Environment and Planning D: Society and Space, volume 22, number 1, pp. 39–52.
doi: https://doi.org/10.1068/d324t, accessed 8 July 2018.

R. Shillair, S. R. Cotten, H.-Y. Tsai, S. Alhabash, R. LaRose, and N. J. Rifon, 2015. “Online safety begins with you and me: Convincing Internet users to protect themselves,” Computers in Human Behavior, volume 48, pp. 199–207.
doi: https://doi.org/10.1016/j.chb.2015.01.046, accessed 8 July 2018.

E. Staksrud and S. Livingstone, 2009. “Children and online risk: Powerless victims or resourceful participants?” Information, Communication & Society, volume 12, number 3, pp. 364–387.
doi: https://doi.org/10.1080/13691180802635455, accessed 8 July 2018.

F. Stutzman, 2006. “An evaluation of identity-sharing behavior in social network communities,” Journal of the International Digital Media and Arts Association, volume 3, number 1, pp. 10–18.

A. Tamjidyamcholo, M.S. Bin Baba, H. Tamjid, and R. Gholipour, 2013. “Information security — Professional perceptions of knowledge-sharing intention under self-efficacy, trust, reciprocity, and shared-language,” Computers & Education, volume 68, pp. 223–232.
doi: https://doi.org/10.1016/j.compedu.2013.05.010, accessed 8 July 2018.

Y. Theocharis, 2012. “Cuts, tweets, solidarity and mobilisation: How the Internet shaped the student occupations,” Parliamentary Affairs, volume 65, number 1, pp. 162–194.
doi: https://doi.org/10.1093/pa/gsr049, accessed 8 July 2018.

G. Torkzadeh and X. Koufteros, 1994. “Factorial validity of a computer self-efficacy scale and the impact of computer training,” Educational and Psychological Measurement, volume 54, number 3, pp. 813–821.
doi: https://doi.org/10.1177/0013164494054003028, accessed 8 July 2018.

Z. Tufekci and C. Wilson, 2012. “Social media and the decision to participate in political protest: Observations from Tahrir Square,” Journal of Communication, volume 62, number 2, pp. 363–379.
doi: https://doi.org/10.1111/j.1460-2466.2012.01629.x, accessed 8 July 2018.

Y. Wang, M. Niiya, G. Mark, S. M. Reich, and M. Warschauer, 2015. “Coming of age (digitally): An ecological view of social media use among college students,” CSCW ’15: Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing, pp. 571–582.
doi: https://doi.org/10.1145/2675133.2675271, accessed 8 July 2018.

M. Warschauer and T. Matuchniak, 2010. “New technology and digital worlds: Analyzing evidence of equity in access, use, and outcomes,” Review of Research in Education, volume 34, number 1, pp. 179–225.
doi: https://doi.org/10.3102/0091732X09349791, accessed 8 July 2018.

K.-K. Wei, H.-H. Teo, H. C. Chan, and B. C. Y. Tan, 2011. “Conceptualizing and testing a social cognitive model of the digital divide,” Information Systems Research, volume 22, number 1, pp. 170–187.
doi: https://doi.org/10.1287/isre.1090.0273, accessed 8 July 2018.

M. T. Whitty and D. McLaughlin, 2007. “Online recreation: The relationship between loneliness, Internet self-efficacy and the use of the Internet for entertainment purposes,” Computers in Human Behavior, volume 23, number 3, pp. 1,435–1,446.
doi: https://doi.org/10.1016/j.chb.2005.05.003, accessed 8 July 2018.

C. Wiedemann, 2014. “Between swarm, network, and multitude: Anonymous and the infrastructures of the common,” Distinktion: Journal of Social Theory, volume 15, number 3, pp. 309–326.
doi: https://doi.org/10.1080/1600910X.2014.895768, accessed 8 July 2018.

D. J. Wolover, 2016. “An issue of attribution: The Tunisian revolution, media interaction, and agency,” New Media & Society, volume 18, number 2, pp. 185–200.
doi: https://doi.org/10.1177/1461444814541216, accessed 8 July 2018.

A. L. Young and A. Quan-Haase, 2013. “Privacy protection strategies on Facebook: The Internet privacy paradox revisited,” Information, Communication & Society, volume 16, number 4, pp. 479–500.
doi: https://doi.org/10.1080/1369118X.2013.777757, accessed 8 July 2018.


Appendix A: Adapted Computer Self Efficacy Scale (ACES)

Using a five-point scale, with one being lowest (not comfortable at all) and five being highest (completely at ease), tell us how comfortable you are with each of the following tasks:

  • Your overall expertise with computers
  • Using an operating system
  • Using a Web browser
  • Using social media
  • Researching an issue or question
  • Downloading files
  • Creating a Web site
  • Writing a computer program
  • Formatting a hard drive
  • Installing an operating system
  • Assembling a computer from parts
  • Bookmarking a Web site
  • Doing an advanced file search
  • Adding a Web resource to favourites
  • Tagging photos in social media
  • Setting/changing software preferences
  • Maintaining a Web site
  • Creating digital A/V content
  • Avoiding online fraud (e.g., phishing)
  • Using Rich Site Summaries (RSS)
  • Using digital currencies (e.g., Bitcoin)
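A simple scoring sketch for the scale follows, assuming each item is answered on the 1–5 comfort scale above. The subscale shown uses the eight “difficult” items reported in the conclusion; treating a subscale score as the mean of its item ratings is an assumption introduced for illustration, not the study’s scoring procedure.

```python
# Hypothetical scoring helper for the ACES "difficult" subscale.
DIFFICULT_ITEMS = [
    "Assembling a computer from parts",
    "Formatting a hard drive",
    "Installing an operating system",
    "Writing a computer program",
    "Using Rich Site Summaries (RSS)",
    "Using digital currencies (e.g., Bitcoin)",
    "Creating a Web site",
    "Avoiding online fraud (e.g., phishing)",
]

def subscale_score(responses, items=DIFFICULT_ITEMS):
    """Mean of a respondent's 1-5 ratings over the given items."""
    vals = [responses[item] for item in items]
    if any(not 1 <= v <= 5 for v in vals):
        raise ValueError("responses must be on the 1-5 scale")
    return sum(vals) / len(vals)

# Example respondent: rates every "difficult" item 4 except two rated 2.
respondent = {item: 4 for item in DIFFICULT_ITEMS}
respondent["Writing a computer program"] = 2
respondent["Using digital currencies (e.g., Bitcoin)"] = 2
print(subscale_score(respondent))  # → 3.5
```

Analogous helpers could score the “simple” and “moderate” groupings once their item memberships are fixed by the factor solution.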


Appendix B: Adapted Privacy Calculus Scale: Protective actions

Using a five-point scale, with one being lowest (strongly disagree) and five being highest (strongly agree), tell us how much you agree with each of the following statements:

  • I know how to protect my personal information from private corporations
  • I know how to protect my personal information from identity thieves
  • I know how to protect my personal information from government agencies
  • I know how to protect my personal information online
  • I am confident that I can avoid being caught doing something wrong online
  • I frequently use technology to hide my digital trail


Editorial history

Received 28 July 2017; accepted 10 July 2018.

Copyright © 2018, James F. Popham.

A self-efficacy informed approach to anonymously locating digital disruptors
by James F. Popham.
First Monday, Volume 23, Number 8 - 6 August 2018
doi: http://dx.doi.org/10.5210/fm.v23i8.8046

A Great Cities Initiative of the University of Illinois at Chicago University Library.

© First Monday, 1995-2020. ISSN 1396-0466.