Research into information overload has been extensive and cross–disciplinary, producing a multitude of suggested causes and proposed solutions. I argue that many of the conclusions arrived at by existing research, while laudable in their inventiveness and/or practicality, miss the mark by viewing information overload as a problem that can be understood (or even solved) by purely rational means. Such a perspective lacks a critical understanding of human information usage: much as economic models that depend on rationality for their explanations or projections fail (often spectacularly, as recent history attests), models that rely too heavily upon the same rational behavior, and not heavily enough upon the interplay of actual social dynamics — power, reputation, norms, and others — in their attempts to explain, project, or address information overload prove bankrupt as well. Furthermore, even research that displays greater awareness of the social context in which overload exists often reveals a similar rationality in its conceptualization. That is, the same “social” approaches that offer potential advantages over their “non–social” counterparts in mitigating information overload paradoxically raise new problems, requiring a reappraisal of overload that takes social issues into account holistically.
“Information,” scope, and volume
Definitions, concepts, and contexts
Past as prologue: History, symptoms, and effects
Paradox lost, paradox regained
Homo informaticus: Rationality and the “information person”
Toward a holistic social theory of information overload
Both the popular and academic presses warn of the growing problems created by an ever–increasing flow of information. The Economist (2010) warns us of “monstrous amounts of data” in a new special report. A recent article in IEEE Spectrum refers to our “infoglut” as “the disease of the new millennium”. CNN, in turn, pleads, “How can we cope with information overload?” (Mollman, 2010). It would seem that information overload — or the cognitive overload to which information superabundance contributes — continues to track the growth curve of information itself, distracting attention, hampering decision–making, and lowering productivity in and out of the workplace. Interest, often business–related, in isolating the causes of this “ill” and combating it in organizations has spawned copious research, working groups, and indeed entire industries (decision support, knowledge management, social networking tools, etc.) committed to the fight.
“Information,” scope, and volume
Mason, et al. (1995) describe an “epistemic hierarchy” in which information is produced. First the mind draws distinctions and creates data out of chaos. Such data, categorized, organized, and rationalized based on the processing mind’s perspective, undergoes a transformation into information. Subsequent scrutiny to authenticate or verify it results in knowledge. Beyond that, it becomes broadly codified and integrated both with knowledge from other disciplines and with a culture or society. The resultant byproduct, wisdom, is thus both figuratively and literally the farthest thing from data, involving “forgetting as much as remembering and is made up of insights and understandings as to what is true, right, and lasting”. This paper deals primarily with data and information, the two of which are often conflated. It also examines information overload chiefly as it occurs within the organizational sphere, in part because of the abundance of research pertaining to this context, but also because of the greater standardization of information use and thus the increased potential to draw general conclusions from it. However, a judicious number of non–business examples appear where relevant. Lastly, this paper does examine multiple overload “categories,” in part because they contribute insight germane to the overarching conversation, and in part due to the ambiguities surrounding the concepts and terminology themselves.
Definitions, concepts, and contexts
Much like art or obscenity, the concept of information overload is difficult to define, but we “can all recognize the condition … when we see it”. Less subjective definitions exist; one of the clearer ones notes that information overload conveys “the simple notion of receiving too much information”. Scholarly terminology runs the gamut: “cognitive overload (Vollman, 1991), sensory overload (Libowski, 1975), communication overload (Meier, 1963), knowledge overload (Hunt and Newman, 1997), and information fatigue syndrome (Wurman, 2001)”. Edmunds and Morris (2000) add “analysis paralysis” to the list. On top of this, Wurman argues that such overload leads, paradoxically, to information anxiety: the reaction to the gap between all the information we understand and what we think we ought to understand, the “black hole between data and knowledge”. Others suggest “interaction” or “transaction overload” (Mathiassen and Sørensen, 2002). More recently, researchers advocate the broader rubric of “technology overload,” under which would fall “system feature overload,” communication overload, and information overload (Karr–Wisniewski and Lu, 2010).
Whether examined broadly or constrained within a single rubric such as business or management, the context in which information overload occurs varies widely. “Information overload is frequently referred to in the literature of a range of disciplines such as medicine, business studies, and the social sciences as well as in computing and information science”. Within an organization, overload has been examined among decision–makers and individual contributors alike, as well as within specific business areas (accounting, management of information systems (MIS), organizational science, and marketing) (Eppler and Mengis, 2004). Edmunds and Morris note that Butcher (1998) details three dimensions of management research into information overload: personal information overload and its implications for problem–solving and decision–making; organizational information overload, in which a surfeit of “documents” creates a sclerotic effect on productivity; and customer information overload’s effect on spending habits.
Conceptual approaches that attempt to quantify information overload show even greater disparity. Eppler and Mengis (2004) outline four general categories: characteristics of the information itself (both qualitative and quantitative); limitations in the individual’s ability to process information (compared to the amount of information received); organizational issues such as formal or informal processes; and information technology characteristics that govern how information is generated, transmitted, and received. In addition, subjective approaches examine individuals’ emotional responses (anxiety, confusion, low motivation) to information overload as a qualitative measure of its effects.
Individual cognitive capacity is often presented as a central cause of information overload. Miller’s seminal work illuminates human limitations both in “bandwidth” and in numeric processing: a human can process about seven “chunks” of data at a time, and tends to subitize items in groups of fewer than seven but estimate items in groups of greater than seven (Miller, 1956). Other cognitive bounds may benefit from a computer science conceptualization. Multitasking, frequently cited as a contributing factor in information overload, may be thought of in terms of a central processing unit’s ability to handle different tasks simultaneously. Further, interruptions and distractions, also examined in overload analyses, can be considered analogous to the context switch that a computer must undergo every time it sets aside one task and returns to another. Each of these limitations brings a quantifiable cost to bear on the individual’s information processing capacity.
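The context–switch analogy can be made concrete with a toy model. The sketch below is illustrative only — it is not drawn from the studies cited here, and the fixed per–interruption cost and all numbers are hypothetical assumptions chosen merely to show the shape of the effect.

```python
def effective_capacity(total_minutes, interruptions, switch_cost_minutes):
    """Minutes left for actual information processing after paying a
    fixed refocusing ("context switch") cost for every interruption."""
    overhead = interruptions * switch_cost_minutes
    return max(total_minutes - overhead, 0)

# An eight-hour day with a hypothetical two-minute refocusing cost
# per interruption:
print(effective_capacity(480, 10, 2))   # occasional interruptions: 460
print(effective_capacity(480, 100, 2))  # constant interruptions: 280
```

However crude, the model captures the claim in the text: the cost is not the interruption itself but the quantifiable processing capacity lost to each switch.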
Characteristics of the information itself may produce or exacerbate overload. Issues of volume or quantity, where supply exceeds processing capacity, pose problems (Eppler and Mengis, 2004). Data rate plays into this as well, as do signal–to–noise dynamics as observed by Klapp (Edmunds and Morris, 2000). The temporal approach of Schick, et al. (1990) causally links overload to supply (or, more specifically, the amount of information processing required) and the amount of time an individual has at his disposal to perform such processing. Information (or even data) organization factors may affect information overload as well (Zaki and Hoffman, 1988). Qualitative traits such as uncertainty, diversity, ambiguity, novelty, complexity, intensity, and quality or value can all result in overload (Eppler and Mengis, 2004). Usage patterns or preferences such as multitasking or polychronicity may also influence factors affecting overload (Hecht and Allen, 2005).
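Reduced to its simplest form, the temporal approach amounts to a comparison of two quantities. The following sketch is an interpretation of that condition, not a formula given by its authors:

```python
def is_overloaded(processing_required, time_available):
    """Temporal view of overload: the condition holds when the
    processing a body of information demands exceeds the time
    available for it. Units are arbitrary but must match."""
    return processing_required > time_available

print(is_overloaded(10, 8))  # more work than time: overload
print(is_overloaded(5, 8))   # slack remains: no overload
```

The formulation is deliberately spare; the qualitative traits listed above (ambiguity, novelty, complexity) can be read as factors that inflate the processing requirement without changing the information’s volume.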
Organizational factors may present themselves in information overload. If organizations can be considered information processing systems (O’Reilly III, 1980), they become subject to some of the same cognitive “computing” limitations outlined earlier with regard to individual causes of information overload. Yet in paradoxical analogy to a high–performing organization exhibiting strength greater than the sum of its parts, organizations may actually display magnified susceptibility to information overload. In addition to individual causes of overload, the need among managers to share, verify, or preemptively store information contributes to overload (Edmunds and Morris, 2000). More generally, organizational conditions such as increased collaboration and information centralization (or, conversely, disintermediation) may play a role as well (Eppler and Mengis, 2004).
Lastly, few would argue against technology’s impact on information overload, although debate persists over whether that impact has been, on the whole, positive or negative. It is perhaps the only cause or contributor to information overload also used, paradoxically, as a tool to mitigate the problem it helped create (Schultz and Vandenbosch, 1998). A large body of research on systems such as e–mail exists; nearly a quarter–century ago, researchers hypothesized that e–mail not only accelerates the exchange of information, but also leads to the exchange of new information (Sproull and Kiesler, 1986). Many other e–mail studies demonstrate quantitative and qualitative impacts (Edmunds and Morris, 2000). Extensive analysis has also been performed on so–called “push” technologies, Internet/intranet/extranet deployments, high–storage capacity, lower duplication costs, and speed of access (Eppler and Mengis, 2004).
Past as prologue: History, symptoms, and effects
The symptoms, if not the terminology, of information overload are hardly novel, dating back at least as far as the late nineteenth century, when indications of the burgeoning problem began to appear. In a recent letter to the editor of the Chronicle of Higher Education, the dean of an information school remarked that
“[p]roblems associated with information overflow or overload have been of concern for a long time. In 1881, Dr. George Beard wrote American nervousness: Its causes and consequences, a supplement to nervous exhaustion. Beard believed that a chief cause of nervous exhaustion was the proliferation of reading material brought about by the invention of the high–speed printing press in the nineteenth century. With the increased output of the periodical press — newspapers and magazines — there was suddenly too much to read in too little time. At its most benign, the nervousness could result in headaches or dyspepsia; but Beard warned that at its most acute, it could lead to insanity” (Cloonan, 2010).
Likewise, Edmunds and Morris (2000) point to the advertisement of a desk designed specifically for filing documents in the 1880s, and an account of the drastic growth in case law over the same period, as examples. Without making overly light of the problem’s seriousness at its humble origins, more pressing issues in the Industrial Age likely distracted attention from information overload, and the relatively high cost (and thus slow pace) of information technology innovation arguably served to dampen the problem’s growth. The proportion of any labor force employed specifically as information workers would have been minuscule — Shenk estimates four percent in 1850 (Edmunds and Morris, 2000), compared with 95 percent in 2000 (Mason, et al., 1995). Technological advances, including a steady doubling of processing power and the advent of the Internet, have powered dramatic increases in the amount of information accessible and a counterintuitive pressure, particularly in business realms, to obtain more information (Edmunds and Morris, 2000). The resulting abundance of — and desire for more (and/or higher quality) — information has come to be perceived in some circles, paradoxically, as the source of as much productivity loss as gain.
Categorized, the effects of information overload evoke a sort of overload of their own. Edmunds and Morris (2000) counterpose the abundance of information and the dearth of useful information: the promise of plentiful information, dimmed by the difficulty — and hence the cost in time and effort — of culling through it all in order to discover the items one needs. Overload can manifest itself in a number of ways: information retrieval limitations (e.g., deteriorating search strategies, difficulty in identifying relevant information, problems reaching target audience), non–standard information processing and organization (e.g., inconsistent and non–discrete categorization, insufficient analysis, misinterpretation), lower effectiveness in decision–making (e.g., decreased quality or accuracy, reduced efficiency), or individual discomfort (e.g., increased stress, increased acceptance of error, decrease in learning) (Eppler and Mengis, 2004).
A variety of countermeasures to information overload’s effects exist, some prescribed, some observed, and some merely posited. Personal factors (e.g., improved time management, augmented information literacy, improved personal information management), information characteristics (e.g., increased quality, organization, visualization, or interfaces), task/process parameters (e.g., uniform procedures, information handling strategies), organizational design (e.g., coordination, hiring/scheduling decisions), and information technology are all named as mitigating factors (Eppler and Mengis, 2004). Similarly, data delivered in summarized form (as opposed to raw) increased the quality of subsequent decisions (Chervany and Dickson, 1974), and arguably reduced overload (although there were unintended negative consequences as well). The need to keep current, and the requisite effort to do so, might be mitigated with systematic reviews of information (Edmunds and Morris, 2000). Some advocate technological approaches, which range from the practical — such as improving information retrieval techniques for greater precision and recall (Montebello, 1998), enhanced filters, or better mixes of “pushed” versus “pulled” information (Holtz, 2008) — to the fanciful, including intelligent agents (Mathiassen and Sørensen, 2002), Semantic Web designs (Breslin, et al., 2009), and more. Cognitive traits can work against overload as well. Higher amounts of polychronicity, when neither excessive nor deficient, can improve job fit and well–being (Hecht and Allen, 2005). Heavy media multitaskers show lower performance in task–switching than light media multitaskers (Ophir, et al., 2009). Hybrid solutions exist as well: in a study on a groupware implementation, the lack of increase in information overload is attributed to the technology itself and the human propensity toward selectivity as a filtering mechanism (Schultz and Vandenbosch, 1998).
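For readers unfamiliar with the retrieval terms used above, precision and recall are simple ratios. This minimal sketch, with made–up document identifiers, shows how a filter’s output can be scored against the set of items actually needed:

```python
def precision_recall(retrieved, relevant):
    """Precision: what share of retrieved items were actually needed.
    Recall: what share of needed items were actually retrieved."""
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# A filter surfaces four documents, only two of the three needed ones:
p, r = precision_recall(["d1", "d2", "d3", "d4"], ["d3", "d4", "d5"])
print(p, r)  # precision 0.5, recall ~0.67
```

The overload connection is direct: low precision means wading through noise, while low recall means the culling the text describes has discarded items one needed.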
Paradox lost, paradox regained
In spite of decades of research yielding many strategies for mitigating information overload, the problem still exists to an extent that suggests intractability. Brynjolfsson’s identification of the Productivity Paradox — the existence of an orders–of–magnitude increase in IT–delivered computing power with no consequent increase in business productivity (1993) — was explained and at least partially refuted in later years by, among others, Brynjolfsson himself (Karr–Wisniewski and Lu, 2010). Yet we find ourselves in a similar quandary now. The crush of ready information produces productivity losses and gains, both real and perceived, and many of the factors cited above can contribute to either effect. How have we come so far and yet achieved so little insight?
One might argue that the rate of technological change has simply outpaced human cognitive ability, that new tools and strategies to stem the flow of information do not scale to match the increase in volume. Some have postulated that new technologies (specifically, social media) have accretively added to the mess or even created an entirely new type of overload: “social information overload” (Passant, et al., 2009), and look to technological improvements for answers. However, as technology becomes increasingly tailored to better facilitate social paradigms, we cannot hope for insight into this paradox without examining the problem in a social context. A deterministic, rational approach, where people behave predictably according to preordained rationales, no longer works.
Homo informaticus: Rationality and the “information person”
Fundamental modern economic theory, crystallized in Adam Smith’s Wealth of nations and later conceptualized by John Stuart Mill, depends upon a conceit of agents who act in predictable ways. Economists’ “basic unit of study”, then, became an archetype of a human individual embodying specific traits brought to bear upon the economy in question. Rudiments of most modern Western theory
“were associated with the concept of economic man, the cause and consequence of economic activity. During the earliest periods economic man was ‘a relatively low–level abstraction thought to be descriptive of human nature. This description stressed self–interestedness, the securing of pleasure and the avoidance of pain, and rational calculation based on excellent knowledge of market conditions’”.
Simon (1955) highlights the idealized characteristics of economic man and the rationality whereby he acts: “This man is assumed to have knowledge of the relevant aspects of his environment which, if not absolutely complete, is at least impressively clear and voluminous. He is assumed also to have a well–organized and stable system of preferences, and a skill in computation that enables him to calculate, for the alternative courses of action that are available to him, which of these will permit him to reach the highest attainable point on his preference scale”. Despite the instructive benefits of a simplified, rationalized research subject, conclusions drawn from it rely upon a significant set of implicit assumptions: a fixed modus operandi that reliably places the interests of one’s self before those of others, a focus (per Mill) purely on pleasure as a “good”, and an intellectual calculus that not only possesses keen market insight, but also makes efficacious use of such insight.
Such assumptions, as recent global economic woes attest, can prove problematic. An individual does not always behave “rationally,” in any sense of the word, and moreover, even the possession of complete information does not guarantee that an individual will either inform himself completely or act optimally — much less predictably — given such information. Capurro (2005) observes that “there is no possibility for us to fill the gap between information and knowledge and, consequently, between trust and anxiety. There is no mood–free rational economy. Even more, moods are not the opposite to rationality but rationality itself is already in a mood of a knower who trusts (or not) sense data and his/her (imperfect) predicting capacity”. Economic rationality (the term), then, is as much a construct as economic man; human rationality in itself is as dependent on mood as any other behavioral trait.
Cognizant of such failings, and dubious of the model’s suitability as a keystone for subsequent theorization, Simon proposed a behavior model of rational choice to “replace the global rationality of economic man with a kind of rational behavior that is compatible with the access to information and the computational capacities that are actually possessed by organisms, including man, in the kinds of environments in which such organisms exist”. Drawing in part from psychological theory of rational behavior, Simon constructed definitions of “approximate” rationality that aimed to create a more realistic actor within the model, a “choosing organism of limited knowledge and ability”. This new model would take into account, among other things, variation in the information–gathering process, acknowledging both cost and the related extent of the process (given a non–zero cost). Economic man, he surmised, does not have to act rationally to the point of (absolute) optimization — he only has to act rationally enough: “Under favorable circumstances, [a simple pay–off] procedure may require the individual to gather only a small amount of information — an insignificant part of the whole mapping … If the search for an a having the desirable properties is successful, he is certain that he cannot better his choice by securing additional information”. By injecting a step of ascertaining how far the information–gathering task must extend, Simon rendered the rational actor’s decision–making process variable rather than fixed, streamlining the process under favorable conditions but complicating it otherwise.
Simon’s subsequent work yielded the concept of “bounded rationality,” the sense that our rationality is always influenced by other factors: “… [T]he execution of our rational capacities makes use of our resources, and our temporal, computational, and motivational resources, and whatever else we need for deciding, are always limited”. “Satisficing,” taking in only as much information as needed to reach an acceptable choice, has been described as both a common adjustment in overload situations and an exemplar of bounded rationality (Bawden and Robinson, 2008).
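Satisficing can be contrasted with exhaustive optimization in a few lines of code. The sketch below is a hypothetical illustration — the payoffs, scoring, and aspiration level are invented for the example, not taken from Simon. A satisficer stops at the first acceptable option; an optimizer must examine every one:

```python
def satisfice(options, evaluate, aspiration):
    """Return the first option meeting the aspiration level,
    plus the number of options examined before stopping."""
    for examined, option in enumerate(options, start=1):
        if evaluate(option) >= aspiration:
            return option, examined
    return None, len(options)  # nothing met the aspiration level

def optimize(options, evaluate):
    """Exhaustive search: examine everything, return the best."""
    return max(options, key=evaluate), len(options)

payoffs = [3, 9, 5, 10, 1]  # hypothetical payoff scores
print(satisfice(payoffs, lambda x: x, aspiration=8))  # accepts 9 after 2 looks
print(optimize(payoffs, lambda x: x))                 # finds 10, examines all 5
```

The trade–off mirrors the adjustment described above: the satisficer forgoes the best outcome in exchange for processing far less information.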
Presciently, Simon’s Behavioral Model also envisions some of the socio–technical challenges still prevalent in the issues surrounding information overload. He notes that his attention to the information–gathering and –processing phases of decision–making may
“suggest approaches to rational choice in areas that appear to be far beyond the capacities of existing or prospective computing equipment. The comparison of the I.Q. of a computer with that of a human being is very difficult. If one were to factor the scores made by each on a comprehensive intelligence test, one would undoubtedly find that in those factors on which the one scored as a genius the other would appear a moron — and conversely. A survey of possible definitions of rationality might suggest directions for the design and use of computing equipment with reasonably good scores on some of the factors of intelligence in which present computers are moronic.” 
Aware of the strengths and weaknesses in both human and non–human computational abilities, Simon notes that each party’s set of attributes complements that of the other, raising important implications for design, not to mention use — anticipating to some small degree Ackerman’s (2000) socio–technical gap.
Roberts’ apt search for an analogous “information man” reflects just such a gap. Information man, he asserts, represents a basic unit of study for information users — essentially actors within an economy of information. Similar to economic man, information man displays the same habits and predilections (engaging in “rational” acts based on complete knowledge of information sources in seeking out “optimal” information, dwelling in an infocentric world shaped only by information use and unaffected by outside factors, and acting only within a conspicuously artificial environment such as a formal information system of a single organization). Information scientists, therefore, have conceptualized only the most primitive form of information man, based on the same set of assumptions manifested in early economic man, and, by theorizing on such a construct, have achieved only the same unenlightening results as economists who based theories on rudimentary concepts of rationality. “The frustratingly dead–end character of user studies based upon simplistic behavioural assumptions, and of quantitative work unillumined by systematically sought explanation, has led to developments which broadly parallel those observed in economics, although over a much shorter time span”. Rigid exclusion of qualitative studies has prevented the capture of specific information about the user and the organization within which she interacts, blocking a fuller understanding of information man.
Attempts at a modern rationality as it applies to information overload proliferate, and many of them incorporate social dynamics. Some question basic assumptions about rationality itself in the design of information systems, arguing that the design process benefits from an increased awareness — and conscious manipulation — of symbolism as it pertains to an organization’s socio–technical interactions (Hirschheim and Newman, 1991). Kumar, et al. (1998) expand on Kling’s first and second rationalities (the first, an econo– and technocentric rationality where humans and systems work in harmony toward the economic interest of the organization; the second, a more bounded rationality that allows for investigation of human and social phenomena, in which power and politics play a role and on occasion work against the interests of the greater organization) to construct a third way that takes into account the advances of the second rationality, but with an increased emphasis on cooperation and trust, which, in a real–world setting, emerged as the predominant values driving socioeconomic behavior (Kumar, et al., 1998). However, although analyzing information economies through such a framework offers significant insight, it raises the same concerns as those associated with previous rationalities — even bounded rationality still projects the biases of the framework onto the object of analysis. Furthermore, such a framework displays worrisome similarities to the first rationality (in which humans and systems work harmoniously), except that now, instead of basing itself on perfect cooperation, the model accepts a slightly less idealized version. By contrast, so–called “local rationality” examines a manager’s propensity to apply rationality to pieces of information that may not be the ones most salient to a decision’s broader context.
Thus, the very presence of information may have dysfunctional consequences even if decision–makers do not process it incorrectly, suggesting that information’s effect on decision–making and its effect on performance are distinct and can diverge. On a larger scale, it offers insight as to why information–“naïve” organizations can outperform those with comparatively sophisticated information systems (Glazer, et al., 1992), and it underscores the vexing inconsistencies inherent in rationality–based analysis.
Such perspectives, however, have existed in the minority of the relevant literature, marginalized (bounded?) by other rationalities, purely technical approaches, or cognitive analyses. Most analysis dependent upon social dynamics tends to display a form of “rationality” by focusing on a single social attribute or set of attributes, resulting in arguments underpinned by the implicit assumption that the particular dynamic studied exists in a vacuum, unaffected by other social factors (or, for that matter, technological or cognitive ones). Yet there exists a large enough body of literature to support a more holistic social theorization of information overload.
Toward a holistic social theory of information overload
As Brown and Duguid (2000) established a decade ago, technology design and use stand to gain when examined within their encapsulating social context. But as our information technology platform grows in sophistication, its closer approximations of social interaction tend to heighten rather than resolve many social issues. Existing research — even work not directly focused on information overload — shows that issues of cooperation, motivation, social networks, power/politics, reputation, knowledge sharing, notifications, and norms consistently reveal themselves in contexts of information use that affects overload. These issues often work in concert or at odds with one another, facilitating, magnifying, or counteracting each other’s effects upon information overload.
In organizations, cooperation varies in situations of abundance versus scarcity, and yet there is no scholarly consensus on the correlation between scarcity and cooperation. Applying a social dilemma perspective to the question, Aquino and Reed (1998) suggest that scarcity in an organizational setting can create the perception of a divergence of interest between an individual and the group, which can result in competition and conflict that negatively impact the organization’s operations. Such effects are moderated by two factors: ability of the members to communicate, and the distribution of access to the shared resources (Aquino and Reed, 1998). But in an information context, do such dynamics transfer reliably? Information is, in many cases, infinitely replicable, hence non–rivalrous, and in fact a potential catalyst of information generation. Should information overload, then, lead to greater amounts of cooperation? No; in fact, just as with physical resources, the opposite is likely true. A study done on knowledge sharing among managers in the People’s Republic of China revealed that individual factors, more than other variables, had significant impacts: greed decreased sharing, and self–efficacy increased it (Lu, et al., 2006).
Knowledge sharing, itself a social act — and one that can be fraught with ritualistic undertones (Traweek, 1988; MacKenzie with Spinardi, 1996) — brings its social characteristics to bear on information use and, by extension, overload. Lu, et al. (2006) also found that organizational support increased utilization of information and communication technologies (ICT), resulting in more knowledge sharing, which might influence overload either positively or negatively. Further, the distinction between types of knowledge can complicate things as well. High–quality information can have low value (Zhao, et al., 2008) because of lack of relevance or other factors. A failure to capture or meaningfully render implicit (tacit) knowledge can result in an inability to learn from past experience (Zhao, et al., 2008); unfortunately, organization–supported knowledge sharing through higher ICT utilization proves more effective in the dissemination of explicit, rather than implicit, knowledge (Lu, et al., 2006). Tacit knowledge may also impair knowledge renewal (Rong and Grover, 2009). Increased awareness of the social aspects underpinning knowledge sharing, then, may aid in the act itself, although the sharing of implicit knowledge still poses greater challenges.
Social network characteristics and analysis play an important role in overload as well. Managers who spent more time gathering information were more likely to perceive a strategic issue in a context of uncertainty as a threat; this was mitigated by how diverse a body of information they found (Anderson and Nichols, 2007). It stands to reason, therefore, that the amount of reliance placed on one’s social network and the composition of one’s network (in terms of degree, tie strengths, etc.) would have powerful impacts upon the manager’s ability to find information — and the level of diversity of the information retrieved — and thus her perception of the issue itself. Furthermore, the manager’s level of information overload, both real and perceived, would exert significant weight in this equation as well: would a manager in an overload situation be more/equally/less likely to rely on her social network than upon other sources?
Anderson (2008) examines this question in an empirical study of managers’ information gathering behaviors, positing that individual differences in motivation define a manager’s willingness to maximize the information–gathering benefits of her social network. The study’s results demonstrate that individual network characteristics do affect the information benefits one can derive, but that these effects are stronger for managers motivated to utilize them. The findings imply that favorable network characteristics, coupled with the motivation to exploit them, result in improved information gathering. Cognizant that better and/or more information is not a panacea, one can argue with some credibility that information overload, narrowly defined, might be mitigated by social network characteristics and the motivation to take advantage of them.
However, social qualities and motivation in and of themselves do not necessarily confer benefits. Robert and Dennis (2005) uncover an intriguing paradox about media with varying levels of “social presence,” the ability to convey the psychological impression that people are physically present. The use of so–called rich media high in social presence increases motivation but decreases the ability to process information, whereas lean media low in social presence decreases motivation but increases the ability to process information. Thus rich media (high in social presence) has the simultaneous, contradictory capacity to enhance and hamper performance. Such a paradox poses a fascinating question for investigators of information overload: if any medium at any level of social presence — from e–mail to face–to–face communication — raises the risk of information overload (either via decreased motivation or decreased processing capability), what can we infer about the pervasive nature of information overload itself?
Issues of power also present themselves prominently in the information overload equation, and the results are paradoxical as well. Perceived information overload intensity seems to have a stronger relation “to power distance than to the volume of written information or number of information transactions processed by an individual” (Kock, et al., 2009). This “information overload paradox” (Kock, et al., 2009) highlights the lack of attention given to power relations in analyzing perceived overload. Similarly, knowledge renewal is strongly correlated with an IT individual’s perception of her department’s dynamism, career satisfaction (both intrinsic and extrinsic), tolerance of ambiguity, and level of “delegation” (knowledge sharing via organizational structural means) (Rong and Grover, 2009). Further, tacit knowledge, because of its extreme difficulty of transfer, may actually hamper knowledge renewal based on issues of authority: specifically, the perception that an individual does not truly “own” the knowledge and is instead merely a “user” of it (Rong and Grover, 2009). Elements of power relations run through the motivation to renew knowledge: the size of one’s social network, the relative dynamism of one’s department, and the amount of tacit knowledge one possesses.
Reputation, too, can be seen through a lens of power. In an internal knowledge market flush with information, attention becomes the resource in contention (Hansen and Haas, 2001). The more selective an approach a knowledge supplier takes, filtering low quality information and providing only the highest quality items, the higher the supplier’s reputation as a valued resource and the more attention it garners. For lower–quality resources, a vicious cycle ensues: the more information of lower utility it posts, the less time information–overloaded consumers can afford to spend culling through it all. Ultimately the consumers abandon the resource, putting their attention on a resource with higher–quality information. The low–quality resource’s response is to publish more (lower–quality) information in an attempt to win back those overloaded consumers (Hansen and Haas, 2001). As it becomes more difficult to glean knowledge from the sheer amounts of information, therefore, the supplier with a reputation for the capability to minimize overload — by providing a lower volume of higher–quality information — exerts the most power.
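The vicious cycle Hansen and Haas describe is, in effect, a feedback loop, and its logic can be sketched as a toy simulation. This is a minimal sketch only: the two suppliers, quality scores, culling–cost discount, and update rules below are all illustrative assumptions of mine, not parameters or data drawn from their study.

```python
# A toy sketch of the attention-market feedback loop described by
# Hansen and Haas (2001). All numbers and update rules are illustrative
# assumptions, not drawn from their study.

def simulate(rounds=20):
    # Supplier A filters aggressively: few documents, high average quality.
    # Supplier B floods the market: many documents, lower average quality.
    docs = {"A": 5.0, "B": 20.0}
    quality = {"A": 0.9, "B": 0.4}
    share = {"A": 0.5, "B": 0.5}  # consumers' initial attention shares
    for _ in range(rounds):
        # Value per unit of attention: quality, discounted by the culling
        # cost an overloaded consumer pays to sift a supplier's volume.
        value = {s: quality[s] / (1.0 + 0.05 * docs[s]) for s in docs}
        total = sum(value.values())
        # Consumers gradually shift attention toward the better value.
        share = {s: 0.5 * share[s] + 0.5 * value[s] / total for s in docs}
        # The supplier losing attention reacts by publishing still more,
        # which only worsens its value proposition (the vicious cycle).
        loser = min(share, key=share.get)
        docs[loser] *= 1.2
    return share

shares = simulate()
# The selective, high-quality supplier ends up with the dominant share.
```

Under these assumptions the unselective supplier’s response of publishing more only accelerates its loss of attention, reproducing in miniature the dynamic described above.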
Or does it? In a modern enterprise environment, such power assumes a multivalent quality. Zammuto, et al. (2007) postulate that simulation, an Enterprise 2.0 “affordance” that offers the capability to explore what–if scenarios, creates a kind of virtual data reality whose infinitely replicable nature may ease decision making or add to an accumulation of data. It “can favor or shape a variety of uses … from empowering action to information overload” (Zammuto, et al., 2007). Speculative data models may result in higher–quality knowledge, but by their nature they require higher quantities as well, increasing information overload for all who seek to benefit from the knowledge, thus throwing off the old rationality of the competitive knowledge market. In such cases, power relations shift too: as business intelligence moves increasingly into the hands of information “consumers” (in the form of ad hoc simulative reporting), knowledge becomes decentralized and personalized, and the model becomes even more “pull”–oriented than before. Control of overload, therefore, reverts almost entirely back to the consumer in this case, although the greater technological power (in the form of virtually endless data “realities”) increases its likelihood. Furthermore, issues of identity, authority and motivation could be expected to appear, as high–currency agents in the old knowledge economy adjust to the new equilibrium. Suppliers accustomed to attention (and its concomitant benefits) and consumers who enjoyed higher status/productivity/self–perception based on their knowledge connections would inevitably undergo some type of transition, although the impact of such a change is not clear.
Similarly, questions abound when examining notifications and norms in this context. In today’s overloaded work environment, multitasking during meetings via the use of smart phones and laptops has grown prevalent. Individuals who perceive themselves as suffering from overload tend to “e–multitask” more frequently (Stephens and Davis, 2009). Equally or perhaps more significantly, individuals who observe others’ e–multitasking behavior, and who deem that behavior acceptable, will increase their own meeting multitasking behaviors; this holds at both the individual and organizational levels (Stephens and Davis, 2009). This has important implications because, unlike other overload–related behaviors, multitasking effectively may depend more on cognitive than social factors: working memory, fluid intelligence, and attention are predictors of multitasking performance, but polychronicity itself and extraversion are not (Konig, et al., 2005). This finding exemplifies one of the central paradoxes of the contemporary overloaded work environment: social pressures provoke behavior that the group or individual perceives as mitigating problems, when in actuality such behavior either depends upon individual social, psychological, or technological characteristics for its efficacy, or exacerbates such problems outright.
Co–evolution of tool and user can contribute to such uncertainty as well. E–multitasking, discussed previously, can take the form of multi–communication: the conduct of simultaneous, overlapping conversations. This activity, now a commonplace interaction in business settings, is accompanied by its own set of norms and affected by perceptions about one’s own effectiveness at it. As errors made during interactions may influence future interactions (Reinsch, et al., 2008), potential risks may be higher while, paradoxically, the act of multi–communication inherently divides — and thus reduces — the attention paid to each conversation. Again, if attention is power in more rational contexts of overload, adaptations of both user and tool can turn such a notion on its head, changing the nature of overload itself.
O’Reilly’s (1980) notion of the organization as an information processing system can be expanded: information technology is no longer just a tool for automating existing processes but an enabler of changes to the organization itself that in turn lead to productivity gains (Zammuto, et al., 2007). The IT organization’s evolution from the purely technical to the socio–technical has precipitated changes not only in the way information is generated, shared, and gathered, but also in the way it is absorbed. In contrast to other learning theories such as behaviorism, cognitivism, or constructivism, connectivism reflects the exponential increase in knowledge itself and its rapidly shrinking “half–life,” emphasizing the accelerating pace of knowledge acquisition and the increased role that social connections play (Siemens, 2004). Most individuals will work in multiple fields during their working lives rather than a single one, necessitating at least one major retraining phase. Learning will be done more informally, via ad hoc methods such as on–the–job training (with “the job” a concept decreasingly divorced from physical locale) or by otherwise experiential means. Social networks, communities of practice, and the technology required to facilitate such connections will increase in importance. The “pipe” will become central, rather than merely the content flowing within it (Siemens, 2004). Visible manifestations of our increasing interconnectedness in the workplace already exist, in the tools used as much as in the way individuals and groups use them.
Evidence of connectivist principles (if not their formal use) is visible in socio–technical communities (STC), typically consisting of social relations between individuals or groups focused on a specific interest or problem, within a specific institution or organization (Jahnke, 2010). Moreover, STCs can decrease the complexity and information overload within an organization, possibly by enabling individuals to obtain only the information they need at a given time (Jahnke, 2010). By leveraging one’s connection to a community generating knowledge in a particular area, an information consumer may receive a “low flow” of high–quality information. However, it is entirely possible that as an STC expands, the flow of information will increase beyond optimal (or even acceptable) thresholds, requiring the development of new techniques or strategies.
If multi–communication and connectivism represent signs of co–evolutionary change, one could envision that behavioral and cognitive shifts promoted by co–evolution have begun to alter the types of overload experienced within the workplace. Research has identified “supertaskers,” individuals who demonstrate markedly superior performance in dual–tasking, suffering none of the ordinary performance penalties involved in switching from one task to the next (Watson and Strayer, 2010). Supertasking ability has not been linked definitively to neurological or genetic factors (Watson and Strayer, 2010); might we attribute it to co–evolution as well, and if so, how do individual propensities toward overload change in such a paradigm?
This paper does not endeavor to be a complete assessment of information overload, its related types, contributing factors, or potential solutions. It is certainly not the first to examine issues of information (or even information overload) within a social context, nor to re–examine rationality in this area. It proceeds, however, from the recognition that new perspectives on familiar problems can be illuminating, and in the hope that it may serve as one.
A proliferation of complex social variables interconnects the various manifestations and effects of information overload. As with any other dynamic system, a single change can initiate dramatic and far–reaching reverberations. Variables like the ones examined in this paper, then, form their own “social network,” with the same uniqueness and capacity for metamorphosis. To zero in on individual components can push others out of the frame.
Analysis of social dynamics as they relate to information overload has shown a similar inclination toward narrowness, examining a specific paradigm in depth to the reduction or exclusion of others. While this approach has produced some instructive results, its narrowness has come at the expense of a broader understanding of information overload in a holistic social context. Curiously, research done in this way finds itself vulnerable to some of the same criticisms of rationality–based models that it itself may have leveled. This is not to impugn the value of previous research on the topic; rather, it is to suggest a new way of thinking about overload, one that acknowledges the restrictiveness of examining specific social dynamics in isolation and the necessity of crosscutting analyses that seek to understand with greater clarity how the interplay between such variables affects the whole. Paradoxically, this may require expanding the scope of relevant research in order to view emergent patterns in all their completeness — resulting, perhaps, in its own new form of information overload to confront as well.
About the author
Anthony Lincoln is a graduate student at the UC Berkeley School of Information. He has over 15 years of experience in information management for private–sector companies including Netscape Communications and public–sector institutions such as Lawrence Berkeley National Laboratory.
Direct comments to alincoln [at] ischool [dot] berkeley [dot] edu.
1. Zeldes, 2009, p. 30.
2. Mason, et al., 1995, p. 52.
3. Despite the primacy and ubiquity of the phrase “information overload”, many of the works surveyed in this paper use the terms “data,” “information,” and in some cases “knowledge” loosely and on occasion interchangeably. (Although “wisdom,” mercifully, is not.) I will attempt to normalize where possible and rationalize when necessary.
4. Edmunds and Morris, 2000, p. 18.
5. Eppler and Mengis, 2004, p. 325.
7. They also credit “information fatigue syndrome” to Oppenheim (1997).
8. Wurman, 2001, p. 14.
9. Edmunds and Morris, 2000, p. 18.
10. Miller utilizes “chunks” rather than bits because of phoneme differences in words.
11. This is, of course, not strictly true: time–division multiplexing, one of the more common ways for a computer to multitask, allocates slices of CPU time to each job in small enough quantities (typically on the order of microseconds) as to be imperceptible to a human operator.
12. Exemplified perhaps by Babbage’s difference engine, a mechanical computer just a bit ahead of its time.
13. A plausible (if somewhat melodramatic) modern conclusion is that we are all information workers now.
14. Compared to perceived information overload, perceived underload correlates with improved decision–making, despite lower satisfaction (O’Reilly III, 1980).
15. Roberts, 1982, p. 94.
16. Thought initially to be a pejorative for Mill’s concept (Wikipedia, n.d.).
17. Gould and Kolb, 1964, p. 223.
18. Roberts, 1982, p. 94.
19. Simon, 1955, p. 99.
21. Simon, 1955, p. 114.
22. Simon, 1955, pp. 106–107.
23. Spohn, 2002, p. 12. But see Bolton and Ockenfels (2009), who show, conversely, that investigating perfect reputation systems may result in valuable insights despite their simplicity.
24. Simon, 1955, p. 114.
25. Roberts, 1982, p. 102.
26. These individuals bring to mind generational distinctions in tool utilization, epitomized by ubiquitous thumb–texting youths and their parents who hunt and peck with an index finger.
M.S. Ackerman, 2000. “The intellectual challenge of CSCW: The gap between social requirements and technical feasibility,” Human–Computer Interaction, volume 15, numbers 2–3, pp. 181–203.
M.H. Anderson, 2008. “Social networks and the cognitive motivation to realize network opportunities: A study of managers’ information gathering behaviors,” Journal of Organizational Behavior, volume 29, number 1, pp. 51–78. http://dx.doi.org/10.1002/job.459
M.H. Anderson and M.L. Nichols, 2007. “Information gathering and changes in threat and opportunity perceptions,” Journal of Management Studies, volume 44, number 3, pp. 367–387. http://dx.doi.org/10.1111/j.1467-6486.2006.00678.x
K. Aquino and A. Reed II, 1998. “A social dilemma perspective on cooperative behavior in organizations: The effects of scarcity, communication, and unequal access on the use of a shared resource,” Group & Organization Management, volume 23, number 4, pp. 390–413. http://dx.doi.org/10.1177/1059601198234004
D. Bawden and L. Robinson, 2008. “The dark side of information: Overload, anxiety and other paradoxes and pathologies,” Journal of Information Science, volume 35, number 2, pp. 180–191. http://dx.doi.org/10.1177/0165551508095781
G.E. Bolton and A. Ockenfels, 2009. “The limits of trust in economic transactions: Investigations of perfect reputation systems,” In: K.S. Cook (editor). eTrust: Forming relationships in the online world. New York: Russell Sage Foundation, pp. 15–36, and at http://www.russellsage.org/sites/all/files/Cook_eTrust_Chap1.pdf, accessed 24 February 2011.
J.G. Breslin, A. Passant, and S. Decker, 2009. “Social Web applications in enterprise” (chapter 12), In: J.G. Breslin, A. Passant, and S. Decker. The social semantic Web. New York: Springer–Verlag, pp. 251–267.
J.S. Brown and P. Duguid, 2000. The social life of information. Boston: Harvard Business School Press.
R. Capurro, 2005. “Between trust and anxiety: On the moods of information policy,” at http://www.capurro.de/lincoln.html, accessed 25 April 2010.
N.L. Chervany and G.W. Dickson, 1974. “An experimental evaluation of information overload in a production environment,” Management Science, volume 20, number 10, pp. 1,335–1,344.
M.V. Cloonan, 2010. “Information overload in the 19th Century,” Chronicle of Higher Education, volume 56, number 25 (28 February), p. B18.
Economist, 2010. “A special report on managing information: Data, data everywhere,” Economist (25 February), at http://www.economist.com/node/15557443?story_id=15557443, accessed 23 March 2010.
A. Edmunds and A. Morris, 2000. “The problem of information overload in business organisations: A review of the literature,” International Journal of Information Management, volume 20, number 1, pp. 17–28. http://dx.doi.org/10.1016/S0268-4012(99)00051-1
M. Eppler and J. Mengis, 2004. “The concept of information overload: A review of literature from organization science, accounting, marketing, MIS, and related disciplines,” Information Society, volume 20, number 5, pp. 325–344. http://dx.doi.org/10.1080/01972240490507974
R. Glazer, J.H. Steckel, and R.S. Winer, 1992. “Locally rational decision making: The distracting effect of information on managerial performance,” Management Science, volume 38, number 2, pp. 212–226. http://dx.doi.org/10.1287/mnsc.38.2.212
M.T. Hansen and M.R. Haas, 2001. “Competing for attention in knowledge markets: Electronic document dissemination in a management consulting company,” Administrative Science Quarterly, volume 46, number 1, pp. 1–28. http://dx.doi.org/10.2307/2667123
T.D. Hecht and N.J. Allen, 2005. “Exploring links between polychronicity and well–being from the perspective of person–job fit: Does it matter if you prefer to do only one thing at a time?” Organizational Behavior and Human Decision Processes, volume 98, number 2, pp. 155–178. http://dx.doi.org/10.1016/j.obhdp.2005.07.004
R. Hirschheim and M. Newman, 1991. “Symbolism and information systems development: Myth, metaphor and magic,” Information Systems Research, volume 2, number 1, pp. 29–62. http://dx.doi.org/10.1287/isre.2.1.29
S. Holtz, 2008. “Bring your intranet into the 21st century,” Communication World, volume 25, number 1, p. 14.
I. Jahnke, 2010. “Dynamics of social roles in a knowledge management community,” Computers in Human Behavior, volume 26, number 4, pp. 533–546. http://dx.doi.org/10.1016/j.chb.2009.08.010
P. Karr–Wisniewski and Y. Lu, 2010. “When more is too much: Operationalizing technology overload and exploring its impact on knowledge worker productivity,” Computers in Human Behavior, volume 26, number 5, pp. 1,061–1,072.
N. Kock, A.R. Del Aguila–Obra, and A. Padilla–Meléndez, 2009. “The information overload paradox: A structural equation modeling analysis of data from New Zealand, Spain, and the USA,” Journal of Global Information Management, volume 17, number 3, pp. 1–19. http://dx.doi.org/10.4018/jgim.2009070101
C.J. Konig, M. Buhner, and G. Murling, 2005. “Working memory, fluid intelligence, and attention are predictors of multitasking performance, but polychronicity and extraversion are not,” Human Performance, volume 18, number 3, pp. 243–266. http://dx.doi.org/10.1207/s15327043hup1803_3
K. Kumar, H.G. van Dissel, and P. Bielli, 1998. “The merchant of Prato–revisited: Toward a third rationality of information systems,” MIS Quarterly, volume 22, number 2, pp. 199–226. http://dx.doi.org/10.2307/249395
L. Lu, K. Leung, and P.T. Koch, 2006. “Managerial knowledge sharing: The role of individual, interpersonal, and organizational factors,” Management and Organization Review, volume 2, number 1, pp. 15–41, and at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=889983, accessed 1 May 2010.
D.A. MacKenzie with G. Spinardi, 1996. “Tacit knowledge and the uninvention of nuclear weapons” (chapter 10), In: D.A. MacKenzie. Knowing machines: Essays on technical change. Cambridge, Mass.: MIT Press, pp. 215–260.
R.O. Mason, F.M. Mason, and M.J. Culnan, 1995. Ethics of information management. Thousand Oaks, Calif.: Sage.
L. Mathiassen and C. Sørensen, 2002. “A task–based theory of information services,” Proceedings of the Twenty–fifth Information Systems Research Seminar in Scandinavia (IRIS ’25), at http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.101.7508&rep=rep&1type=pdf, accessed 24 February 2011.
G.A. Miller, 1956. “The magical number seven, plus or minus two: Some limits on our capacity for processing information,” Psychological Review, volume 63, number 2, pp. 81–97, and at http://www.musanim.com/miller1956/, accessed 24 February 2011.
S. Mollman, 2010. “How can we cope with information overload?” CNN.com (4 February), at http://www.cnn.com/2010/TECH/02/03/content.overload/index.html, accessed 31 March 2010.
M. Montebello, 1998. “Information overload — An IR problem?” at http://www.computer.org/portal/web/csdl/doi/10.1109/SPIRE.1998.712984, accessed 24 April 2010.
C.A. O’Reilly III, 1980. “Individuals and information overload in organizations: Is more necessarily better?” Academy of Management Journal, volume 23, number 4, pp. 684–696. http://dx.doi.org/10.2307/255556
E. Ophir, C. Nass, and A.D. Wagner, 2009. “Cognitive control in media multitaskers,” Proceedings of the National Academy of Sciences, volume 106, number 37 (15 September), pp. 15,583–15,587.
A. Passant, P. Kärger, M. Hausenblas, D. Olmedilla, A. Polleres, and S. Decker, 2009. “Enabling trust and privacy on the social Web,” paper presented at the W3C Workshop on the Future of Social Networking (15–16 January 2009, Barcelona), at http://www.w3.org/2008/09/msnws/papers/trustprivacy.html, accessed 24 February 2011.
N.L. Reinsch, Jr., J.W. Turner, and C.H. Tinsley, 2008. “Multicommunicating: A practice whose time has come?” Academy of Management Review, volume 33, number 2, pp. 391–403. http://dx.doi.org/10.5465/AMR.2008.31193450
L.P. Robert and A.R. Dennis, 2005. “Paradox of richness: A cognitive model of media choice,” IEEE Transactions on Professional Communication, volume 48, number 1, pp. 10–21. http://dx.doi.org/10.1109/TPC.2004.843292
N. Roberts, 1982. “A search for information man,” Social Science Information Studies, volume 2, number 2, pp. 93–104. http://dx.doi.org/10.1016/0143-6236(82)90003-5
G. Rong and V. Grover, 2009. “Keeping up–to–date with information technology: Testing a model of technological knowledge renewal effectiveness for IT professionals,” Information & Management, volume 46, number 7, pp. 376–387. http://dx.doi.org/10.1016/j.im.2009.07.002
A.G. Schick, L.A. Gordon, and S. Haka, 1990. “Information overload: A temporal approach,” Accounting, Organizations and Society, volume 15, number 3, pp. 199–220. http://dx.doi.org/10.1016/0361-3682(90)90005-F
U. Schultz and B. Vandenbosch, 1998. “Information overload in a groupware environment: Now you see it, now you don’t,” Journal of Organizational Computing, volume 8, number 2, pp. 127–148.
G. Siemens, 2004. “Connectivism: A learning theory for the digital age,” elearnspace (12 December), at http://www.elearnspace.org/Articles/connectivism.htm, accessed 19 April 2010.
H.A. Simon, 1955. “A behavioral model of rational choice,” Quarterly Journal of Economics, volume 69, number 1, pp. 99–118. http://dx.doi.org/10.2307/1884852
W. Spohn, 2002. “The many facets of the theory of rationality,” Croatian Journal of Philosophy, volume 2, number 6, pp. 249–264.
L. Sproull and S. Kiesler, 1986. “Reducing social context cues: Electronic mail in organizational communication,” Management Science, volume 32, number 11, pp. 1,492–1,512.
K.K. Stephens and J. Davis, 2009. “The social influences on electronic multitasking in organizational meetings,” Management Communication Quarterly, volume 23, number 1, pp. 63–83. http://dx.doi.org/10.1177/0893318909335417
S. Traweek, 1988. Beamtimes and lifetimes: The world of high energy physicists. Cambridge, Mass.: Harvard University Press.
J.M. Watson and D.L. Strayer, 2010. “Supertaskers: Profiles in extraordinary multitasking ability,” Psychonomic Bulletin & Review, volume 17, number 4, pp. 479–485. http://dx.doi.org/10.3758/PBR.17.4.479
Wikipedia, n.d. “Homo economicus,” at http://en.wikipedia.org/wiki/Homo_economicus, accessed 24 February 2011.
R.S. Wurman, 2001. Information anxiety 2. Indianapolis, Ind.: Que.
A.S. Zaki and R.C. Hoffman, 1988. “Information type and its impact on information dissemination,” Journal of Management Information Systems, volume 5, number 2, pp. 71–81.
R.F. Zammuto, T.L. Griffith, A. Majchrzak, D.J. Dougherty, and S. Faraj, 2007. “Information technology and the changing fabric of organization,” Organization Science, volume 18, number 5, pp. 749–762. http://dx.doi.org/10.1287/orsc.1070.0307
N. Zeldes, 2009. “Infoglut,” IEEE Spectrum, volume 46, number 10, pp. 30–55. http://dx.doi.org/10.1109/MSPEC.2009.5267994
Y. Zhao, L.C.M. Tang, M.J. Darlington, S.A. Austin, and S.J. Culley, 2008. “High value information in engineering organisations,” International Journal of Information Management, volume 28, number 4, pp. 246–258. http://dx.doi.org/10.1016/j.ijinfomgt.2007.09.007
Received 23 June 2010; accepted 24 February 2011.
“FYI: TMI: Toward a holistic social theory of information overload” by Anthony Lincoln is licensed under a Creative Commons Attribution–NonCommercial–NoDerivs 3.0 Unported License.
First Monday, Volume 16, Number 3 - 7 March 2011