
Reinventing academic publishing online. Part I: Rigor, relevance and practice



Abstract
While current computing practice abounds with innovations like online auctions, blogs, wikis, Twitter, social networks and online social games, few if any genuinely new theories have taken root in the corresponding “top” academic journals. Those creating computing progress increasingly see these journals as unreadable, outdated and irrelevant. Yet as technology practice creates, technology theory is if anything becoming more conformist and less relevant. We attribute this to the erroneous assumption that research rigor is excellence, a myth contradicted by the scientific method itself. Excess rigor supports the demands of appointment, grant and promotion committees, but is drying up the wells of academic inspiration. Part I of this paper chronicles the inevitable limits of what can only be called a feudal academic knowledge exchange system, with trends like exclusivity, slowness, narrowness, conservatism, self–involvement and inaccessibility. We predict an upcoming social upheaval in academic publishing as it shifts from a feudal to a democratic form, from knowledge managed by the few to knowledge managed by the many. The trigger will be socio–technical advances. The drive will be that only democratic knowledge exchange can scale up to support the breadth, speed and flexibility modern cross–disciplinary research needs. Part II suggests the sort of socio–technical design needed to bring this transformation about.

Contents

The role of academic knowledge exchange
Feudal knowledge exchange trends
Cross–disciplinary research
Conclusions

 


 

The role of academic knowledge exchange

Introduction

Caveat lector: Previous iterations of what you’re about to read have been dismissed by information systems (IS) editors and reviewers since a first draft written in 1999 after an ISWorld rigor/relevance discussion. Many years of rejection confirm it as unpublishable in IS. This seems partly because high–level papers always have faults, and partly because suggesting to its tailors that the emperor of academic publishing wears only the fig leaf of rigor is unwise. If you find the academic publishing system “excellently attired” please read no further, as here we argue it has serious problems that need addressing. Yet our target is not the many good authors, wise reviewers and supportive editors in our field, many of whom are personal friends. Our target is the feudal knowledge exchange system they currently work under. While academia covers many disciplines, our evidential case is the field of technology use — readers must judge for themselves in their own fields. Yet since much the same case has been made in the field of quantum physics (Smolin, 2006), our conclusions may benefit others.

Part I argues that the current gate–keeping model of academic publishing is performing poorly as knowledge expands and interacts, and that academic publishing must reinvent itself to be inclusive and democratic rather than exclusive and plutocratic. Part II suggests a design to do this using already successful socio–technical tools.

Knowledge exchange systems

Following Willinsky’s (2000) knowledge exchange model, a knowledge exchange system (KES) aims to produce quality human knowledge by:

  1. Development: To create new knowledge that was not there before. Does the system foster tomorrow’s important ideas today? Is it research at the cutting edge?

  2. Discrimination: To discriminate good quality knowledge by peer review. Is the knowledge likely true? Are the arguments logical and the claims valid?

  3. Dissemination: To disseminate and present knowledge well. Are readers educated? Is its knowledge useful, well written, clear, and timely?

A knowledge exchange system succeeds if it produces good knowledge whether physically or electronically. The definition includes non–academic systems like Wikipedia if they create, discriminate and disseminate knowledge. Certainly an academic journal is expected to encourage new research, to separate good from bad research, and to educate its readers (Paul, 2005).

We envision a KES as an orchard whose research “fruit” arises from its roles:

  1. Development: New ideas enter the academic world like seeds, initially small and fragile, needing time and support to grow. Just as one may not know what a seed will become until it sprouts, so a new idea may be unclear until it develops. As an orchard that is not watered or fertilized gives only stunted fruit, so papers need reviews to develop. As an orchard that plants no new trees will soon have only old ones, so a KES that plants no new theories will soon become intellectually barren.

  2. Discrimination: This role is like culling weeds or pruning diseased tree branches, without which orchards get overgrown and disease spreads. Likewise a KES that doesn’t weed out bad research may be overgrown by falsehood, since errors breed errors as weeds breed weeds.

  3. Dissemination: This role compares to packaging and delivering fruit to the customer. As fruit must look good in the shop, so published works should look professional, and as fruit must get from orchard to shop before it rots, so journals must publish a paper while it is still relevant.

The alternative, and it must be said established, view is that academics are the keepers of “guarded channels of knowledge” [1], who protect high quality knowledge as soldiers guard a castle. Hence it is not surprising that many journals today have the “under siege” mentality of castle owners [2]. In social terms, those who manage the “memes” of accepted knowledge have built around themselves protective knowledge walls of jargon and custom so strong that only initiated insiders can cross them. Academic knowledge has become the monopoly of a protective class or caste, who zealously guard its access [3].

While science may once have consisted of amateurs cultivating private knowledge gardens, today it is organized into specialist fiefdoms that defend themselves vigorously. Academics are now gate–keepers of feudal knowledge castles, not humble knowledge gardeners. They have for over a century successfully organized, specialized and built walls against error. However the problem with castles, whether physical or intellectual, is that they dominate the landscape, they make the majority subservient and apathetic, and battles for their power reduce productivity. As research grows, knowledge feudalism, like its physical counterpart, is a social advance that has had its day.

The theory–practice divide

While research progress seems continuous and rational, in reality it is marked by ongoing discontinuities (Bryson, 2003) and occasional revolutions (Kuhn, 1970). It is easy to forget how “obvious” inventions like the cell phone and e–mail were beyond the zeitgeist of their times (Smith, et al., 2002). The predictions of 1995 for the future of software, for example, did not include open source development, although Linux was there for all to see (Campbell–Kelly, 2008). Breakthroughs like chat rooms, blogs, text messaging and wikis are not media–rich, yet these simple distributed systems were “killer applications”, while touted co–located systems like IBM’s Group Systems have faded into irrelevance (Nunamaker, et al., 1991).

Google, with its simple white screen and one entry box, scooped the search engine field, not Yahoo with its multimedia graphics. People investing in Internet bandwidth expecting a multimedia surge lost money, as did those investing in multimedia helmets for virtual reality games. Video phones are not sweeping the world, despite technology and marketing, but rather the reverse, as texting can be more popular than calling.

Certainly media richness is important, but who foresaw social gaming, where “richness” is created by human interaction not the medium? The usability theories of the day, plus 25,000 hours of user testing, predicted that Mr. Clippy’s friendly graphical help would be a huge success (Horvitz, 2004), but it was one of the biggest software flops of 2001 (PC Magazine, 2001). Asked why plain text products like blogs and e–mail succeeded while multimedia, user–friendly products like Mr. Clippy failed, mainstream IS theory is strangely silent. Microsoft still seems unaware of the problem (Pratley, 2004): that Mr. Clippy was socially impolite (Whitworth, 2005).

If it is any consolation, the pattern that practice leads while theory bleeds has a long history in computing. Over 25 years ago pundits proclaimed that paper was “dead”, to be replaced by the electronic “paperless office” (Toffler, 1980), yet today we use more paper than ever before. James Martin predicted program generators would make programmers obsolete, yet programming today is a thriving industry. A three–day work week “leisure society” was supposed to arise as machines took over human work, but workers today are busier than ever (Golden and Figart, 2000). E–mail was supposed to be only for routine tasks, the Internet was supposed to collapse without central control, computer AI smart–help was supposed to replace people, and so on.

Each case had a grain of truth, but for technology use the predictive power of theory has been low and the gap between theory and practice is widening. In Eric Raymond’s (1997) analogy, the bazaar of technology practice is booming while the cathedral of technology theory is declining, because one is open and one is closed.

Bridging the divide

Given a theory/practice divide, can theory or practice go it alone (Kock, et al., 2002)? Theory going it alone gives metaphysical speculations about the number of angels on a pinhead, while practice going it alone means painful trial–and–error evolution. Theory and practice should work together, giving two paths to progress:

  1. The way of practice: Find what works by intuitive trial and error, then explain it with theory later. Here theory, like the icing on a cake, is applied after a practical advance is made.

  2. The way of theory: Develop a new theoretical vision, then realize it in practice. Now theory, like the recipe used in baking a cake, is used before the progress occurs.

In the first approach, practice innovates then theory explains, while in the second theory envisions then practice builds. Successful progress normally involves both ways, e.g., rockets were first built without theory, but now theory is critical to space rocket launches. Neither approach is “better” as progress needs both. In the field of information technology however the theory/practice relationship seems broken, as if rocket builders found that the less they knew of rocket science the better their rockets flew.

Today’s researchers often first build a new Web site, interface or tool, then look around for a convenient theory in order to publish, i.e., theory now merely accessorizes practice. Indeed the “all power to the IT artifact” approach of IS (Benbasat and Zmud, 2003) directs scholars to theorize about how IT artifacts are built, how they are used, and how they impact organizations, so one first needs an artifact, then a theory. Conversely IS/IT theorists increasingly meet a “show me don’t tell me” response to their ideas.

Yet what is a theory but the distillation of previous practice? If physicists had treated Einstein this way he would have had to build a particle accelerator to be heard. In the IS marriage of theory and practice the partners barely speak to each other — practice finds theory barren and theory complains that practice never listens (Klein and Hirschheim, 2003).

However a pragmatic “try it and see” approach working alone has serious limits. While first pickings from the tree of knowledge may come easily from its lower branches, soon running around the tree gives only an occasional windfall. One needs the ladder of theory to reach the higher fruit. Black box approaches struggle with complex systems which by definition have more ways to go wrong than to go right, e.g., imagine managing a space shuttle or nuclear program by trial and error!

Yet creating a new online global society means building a socio–technical system as complex as any space program, as socio–technical systems need both social and technical performance to succeed (Whitworth and Moor, 2009c). We cannot expect to progress by trial and error alone. If theory and practice are the two legs of scientific progress, a crippled theory leg is a serious problem. We now suggest the main cause of this is unbalanced rigor.

The rigor problem

If rigorous work is less likely to have errors, it would seem that more rigor is inevitably better. However research has two types of error, not one:

  1. Type I. Errors of commission: things done that are wrong.

  2. Type II. Errors of omission: things not done that would have been right.

A Type I error claims a false result as true, while a Type II error rejects a true result as false (Rosenthal and Rosnow, 1991) [4]. So journals can err in two ways not one, namely by:

  1. Publishing what is later shown to be wrong (error of commission)

  2. Not publishing what is later shown to be right (error of omission)

While the latter are often overlooked, opportunity costs (value lost by opportunities missed) are a major cause of business failure (Bowman, 2005). WordPerfect lost its dominance of word processing not by faults made but by missing the usability opportunity that Microsoft Word took. Similarly the hypertext academic community dismissed Berners–Lee’s World Wide Web idea, seeing HTML as too simple a tag language and failing to see its enormous potential (Berners–Lee, 2000). Rejecting the idea behind the World Wide Web was a Type II error by that academic community, which missed the chance to be part of progress. The point is that Type II errors are real errors with real consequences.

These error types trade off, so reducing one increases the other, e.g., a journal can reduce Type I errors to 0 percent by rejecting all submissions, but this also raises Type II errors to 100 percent as nothing useful is published. The commonsense principle is that to win a lottery (get value) you must buy a ticket (take risk). In academic publishing the rigor problem occurs when reducing Type I error increases Type II error more, i.e., when more rigor lowers KES performance.
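A minimal simulation makes this tradeoff concrete. Here peer review is sketched as a noisy estimate of a paper’s true quality compared against an acceptance cutoff; the quality scale, noise level and cutoff values are illustrative assumptions, not measurements of any real journal:

```python
import random

def review_simulation(cutoff, n=100_000, noise=1.0, seed=1):
    """Model review as 'perceived quality = true quality + noise'.
    Accepting a bad paper is a Type I error (commission); rejecting
    a good one is a Type II error (omission). Returns both rates."""
    rng = random.Random(seed)
    type1 = type2 = good = bad = 0
    for _ in range(n):
        quality = rng.gauss(0.0, 1.0)                # true merit
        perceived = quality + rng.gauss(0.0, noise)  # reviewer estimate
        accepted = perceived > cutoff
        if quality > 0:                # a "good" paper
            good += 1
            type2 += not accepted      # omission: good work rejected
        else:                          # a "bad" paper
            bad += 1
            type1 += accepted          # commission: bad work published
    return type1 / bad, type2 / good

for cutoff in (0.0, 1.0, 2.0, 3.0):
    t1, t2 = review_simulation(cutoff)
    print(f"cutoff {cutoff:.1f}: Type I {t1:.1%}, Type II {t2:.1%}")
```

As the cutoff rises the rate of published errors falls toward zero while the share of good work rejected climbs toward 100 percent, the “reject all submissions” limit described above.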

Good knowledge exchange reduces both Type I and Type II errors, i.e., avoids faults and takes opportunities, e.g., rejecting a paper with nine good ideas and one bad one in the name of rigor trades nine opportunities for one risk. Given the overall KES goal of advancing knowledge, this may be a bad deal.

The one–dimensional rigor “ladder” is not the path to better research (Davenport and Markus, 1999). If rigor is merely a hygiene factor to relevance, it only has value when combined with it. While food without hygiene may give sickness and death, hygiene without food gives certain death. Likewise, although rigorously reducing Type I errors improves journal health, avoiding Type II relevance errors is critical to survival.

We believe in rigor, but see system performance as a mix of many criteria (Whitworth, et al., 2008), which “bite back” if one criterion is exclusively pursued at the expense of others (Tenner, 1996). The better model of knowledge exchange performance is an efficient frontier — a line of many points that defines the best rigor one can get for a given value of relevance (Keeney and Raiffa, 1976). Pursuing rigor alone produces rigor mortis in the theory leg of scientific progress.
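The frontier itself is easy to sketch. Given some hypothetical journal policies scored on rigor and relevance (both scores invented for illustration), the efficient ones are those that no other policy beats on both criteria at once:

```python
def efficient_frontier(points):
    """Keep the Pareto-efficient (rigor, relevance) pairs: those not
    matched-or-beaten on both criteria by some different point."""
    return [p for p in points
            if not any(q[0] >= p[0] and q[1] >= p[1] and q != p
                       for q in points)]

# Hypothetical policies scored as (rigor, relevance):
policies = [(9, 1), (8, 4), (7, 6), (6, 5), (5, 7), (3, 8), (2, 3)]
print(efficient_frontier(policies))
# -> [(9, 1), (8, 4), (7, 6), (5, 7), (3, 8)]: (6, 5) and (2, 3) are
#    dominated, since another policy offers at least as much of both.
```

Pursuing rigor alone picks the corner point (9, 1); the rest of the frontier shows mixes that give up a little rigor for much more relevance.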

The role of research

If excess rigor reduces innovation and causes theory to lag behind practice, in IS at least, why not change the strategy? Surely academics prefer to ride the technology wave rather than struggle along behind it?

Yet these problems have been apparent for some time now (Szajna, 1994; Robey and Markus, 1998; Davenport and Markus, 1999). The lack of real change suggests this is a social problem, not an information one. Originally the primary goal of academia was to produce, assess and disseminate knowledge, and its secondary role was to help allocate business resources. Today one could argue that the role of resource allocation supersedes the role of knowledge growth.

In the big business of university management, department ranks, research funds, PhD scholarships and library allocations all depend on publishing (Rainer and Miller, 2005). While the nominal goal of research is to seek the truth, publishing today is the primary screening mechanism for academic appointments, grants and promotions (Katerattanakul, et al., 2003). To say the goal of academic publishing is to develop, select and diffuse knowledge is naïve when scholarly journals drive all university hiring and firing (Lowry, et al., 2007).

When a system becomes the mechanism for power, profit and control, idealized goals like the search for truth can easily take a back seat. Authors may not personally want their work locked away in expensive journals that only endowed western universities can afford, but business exclusivity requires it. Authors may personally see others as colleagues in a cooperative research journey, but the system frames them as competition for jobs and grants. As academia becomes a business, new ideas become threats to power rather than opportunities for knowledge growth. Journals become the gatekeepers of academic power rather than cultivators of knowledge, and theories become battle weapons in promotion arenas, rather than plows in knowledge fields.

That most mainstream IS journals now have single–digit acceptance rates illustrates how far we have moved from publishing to grow knowledge to publishing to allocate resources. Can a system where rejection is the norm claim its primary goal is producing knowledge value? University courses also aim to exchange knowledge, but a course with a 90 percent failure rate would be morally unacceptable. Yet mainstream IS academic knowledge exchange works this way. It is simply not true that the 90+ percent of papers rejected by the “main” journals have minimal value. Indeed we suggest it is precisely the useful new ideas that are blocked. This exclusivity at the highest level has had, we argue, a toxic effect on academic research creativity. The exchange of academic knowledge has become a system of authority and control.

One can justify distributing rare economic resources to the few, as there is not enough to go around, but one cannot justify distributing knowledge this way, as giving knowledge away does not diminish it. While physical resources are distributed by a zero–sum model, information resources follow a non–zero–sum model (Wright, 2000), where the more one gives the more synergy is created (Whitworth, 2009a). Economic scarcity is no argument for knowledge exclusivity.

Conformity training

The modern academic system has become almost a training ground for conformity. PhD students spend three to six years as apprentices under senior direction, then another three to six years seeking the security of a tenured appointment. At both stages, criticizing the establishment is unwise if one wants a career. It is not surprising that six to 12 years of such training produces people who toe the party line. While one might expect young researchers to make breakthroughs, a paper written after an ICIS 2005 rigor/relevance debate explicitly advises them not to:

“So for now, unfortunately, I would not recommend PhD students or junior faculty to aim for ‘IS research that really matters.’ My recommendation … would be to stick to their career paths. … not too much research that really matters seems publishable.” (Desouza, et al., 2006)

Due to publishing pressure senior IS leaders explicitly advise new faculty not to innovate if they want a career! As the word “unfortunately” suggests, they take no responsibility for a system that actively drives innovators out to make their breakthroughs in practice, e.g., the movement of automatic indexing from universities to commercial enterprises like Google (Arms, 2008).

While the low–risk, low–gain strategy of risk avoidance may work for tasks like routine factory production, it fails dismally in areas like new technology development, where success requires a high–risk, high–gain strategy. If academia chooses the security of rigor it will lower its externally perceived value:

“The publish or perish syndrome has devalued the original purpose of research in the university … it has led business and political leaders to doubt whether the expected value of research in defense, health, and prosperity have actually occurred.” [5]

Paradoxically, while academic motives like truth make academia good business, business motives like seeking promotion make academia bankrupt. When an academic system becomes a business system it loses both academic and business value, and when business goals overpower academic goals, both fail.

Changing the system

Can this system change itself? IS academics traditionally judge journal importance by measures like internal expert perceptions, number of citations and publication numbers (Hamilton and Ives, 1982). These internally generated and self–reinforcing measures all favor the status quo. As an academic publishing review notes: “What gives this enterprise its peculiar cast is the fact that the producers of knowledge are also its primary consumers.” [6]

Suggestions to make journal rating systems more relevant by adding criteria like timeliness (Rainer and Miller, 2005), submission–to–publication times (Snodgrass, 2003), readership size and reader–rated usefulness (Nerur, et al., 2005) have had little effect.

Current research into journal quality illustrates the contrast between science as a search for gain and science as a search for truth. While accepting that “science can be perceived as a social network which accumulates, distributes and processes new knowledge” [7], they see journal “quality” in terms of stakeholder gains:

  1. So authors can publish in quality journals (for better career impact);

  2. So readers can select quality journals (to save time);

  3. So tenure and promotion committees can choose staff (more easily); and,

  4. So libraries can more easily choose quality publications [8].

The analysis contains no mention of the community good of uncovering the truth, or of any reality beyond individual gains. “Quality” is assumed to equate to rigor:

“It is well known that higher quality journals tend to have more stringent review and quality controls; thus, the findings contained within their articles often have more validity and reliability than those in lower quality journals.” [9]

Yet, as argued, equating quality with rigor is an error, as quality needs both rigor and relevance. When academia incestuously rates itself by citation studies and expert ratings it can easily become a self–reinforcing system disconnected from external reality (Katerattanakul, et al., 2003).

The IS case

The information systems discipline, which addresses how people and organizations use technology to process, transmit and store information, provides an interesting case.

The IS diaspora

As more people use computers for more tasks one might expect to find IS a rapidly growing discipline, but this is not so. While practice has over the last decade innovated systems like eBay, Wikipedia and YouTube, IS academia, crippled by rigor, has hobbled along behind as best it could. As it wandered into the desert of irrelevance inevitably its value came under question, and IS research funding dried up (Robey and Markus, 1998):

“Due to the discontinuity in transferring knowledge created by IS academia to all the IS practitioners, the sources of funding for IS research efforts are few and they too are in jeopardy.” [10]

IS research combines technical, human and social constructs, so requires more complex methods, measures and analyses, which take more time. When “slow” disciplines like IS rigorously self–mortify, their publication rate drops relative to more specialized disciplines at comparable rigor levels (Valacich, et al., 2006), causing lower promotion, tenure and grant rates for the IS field relative to others (Kozar, et al., 2006). Following the exclusive religion of rigor brought famine rather than prosperity to IS.

Consequently, while increasing technology usage around the world created more jobs, applications and research, IS faculty have been cut back or redeployed. The IS discipline, by its own strategy, has managed to shrink itself in an expanding market. The growth in students, staff and research was absorbed by neighbor disciplines like business, engineering, health, education and computing, which added IT groups (Klein and Hirschheim, 2003), e.g., health informatics arose to do a job that sterile IS research failed to do.

As “retrenchments of IS faculty have been a reality for some years now” [11], refugees from disbanded information systems groups now exist in scattered discipline enclaves, from engineering to psychology, often under the IT name, e.g., the first author is in a science school while the second is in liberal arts. The originally cross–disciplinary “information systems” is increasingly the business sub–discipline of management information systems (MIS). At the same time, “… the concepts upon which IS is focused are becoming increasingly similar to other business disciplines” (Hovorka, et al., 2009). Since 1990 the role of IS in computing curricula has shrunk while the IT curriculum has expanded into its place [12].

This discipline diaspora arose partly from outside assault but also from a myopic internal vision directed at finding the holy grail of “IS identity” by strictly following the religion of rigor (Benbasat and Zmud, 2003). This navel–gazing pointed the discipline into itself, when it should have been looking outside itself. There was a major strategic failure of vision and leadership in IS, as a growing academic discipline should be a melting pot of new ideas, not a stagnant pool of old ones.

How rigor constricts

Even respected IS journal editors recognize there is a problem: “Research publications in IS do not appear to be publishing the right sort or content of research.” [13] The cause we suggest is social conformity to old theories. For example two well known theories in IS are:

  1. Technology acceptance model (TAM), which suggests that users assess technology by ease of use and usefulness (Davis, 1989).

  2. Media richness theory (MRT), which links “rich” media to rich interactions (Daft, et al., 1987).

These theories dominate the IS theoretical landscape, even though they are over 20 years old and showing their age, e.g., MRT’s “richness” dimension suggests that people won’t use “lean” media like e–mail in “rich” social relations, but today friends often text and chat. Either plain–text e–mail is “multimedia” rich or MRT over–simplifies human communication (Whitworth, 2009b). Likewise TAM’s prediction that ease of use and usefulness define technology acceptance is valid but incomplete, as it omits criteria like security, reliability and privacy — critical on today’s Internet.

Certainly TAM has been “extended” by many factors, like playfulness, credibility, attractiveness, self–efficacy, behavioral control, user satisfaction, enjoyment and trust (Moon and Kim, 2001; Heijden, 2003; Ong, et al., 2004; Taylor and Todd, 1995; Shih, 2004; Yu, et al., 2005; Pavlou, 2003). There is a flavor of TAM for every taste or need, but how all these variations work together is unclear, as none of these grafts onto the TAM tree has “taken”. These many minor “tweaks” to a major model have cancelled each other out, leading to: “… a state of theoretical chaos and confusion in which it is not clear which version of the many versions of TAM is the commonly accepted one.” [14]

The unified theory of acceptance and use of technology (UTAUT) (Venkatesh, et al., 2003) aimed to authoritatively “upgrade” TAM. While replete with scholarly detail, it merely renamed TAM’s usefulness construct to performance expectancy, renamed the ease of use construct to effort expectancy, then combined this “face–lifted” TAM with eight other equally old psychological and sociological constructs to create a “new” model. Such attempts to re–animate old theories produce zombie theories that live briefly then die without issue. An earlier example was process gain theory (Nunamaker, et al., 1991; Vogel, 1993), which tweaked and briefly successfully resurrected Steiner’s (1972) earlier process loss theory.

Computer practitioners are not fooled:

“Despite their claims of attempting to tackle futuristic problems, many computer science academics continue to pursue fruitless avenues of research and solve problems of interest to, well, no one. In a constant attempt to create a façade of relevance and attract funding, they reinvent their research by simply changing the terminology used in old papers to reflect the new industry trends. It’s an easy way to get papers published that no one reads.” [15]

The problem lies not with “old but good” theories but with a system that seems unable to grow new ones around them. Given the enormous changes of the last decade in computing, the lack of matching theoretical innovation over the same period is nothing short of astounding.

Is the inability to spawn new ideas for a new computing generation because none are available? Why do new authors believe that the only way to get a new idea past the current gatekeepers is to graft it onto an old one, like TAM? The process is hidden so we don’t know, but in the first author’s direct experience an experiment validating an alternative to TAM was editorially rejected by JAIS in 2005 on the grounds that papers critical of TAM never passed review. Essentially the same paper was then sent to and published in a good non–IS venue (Whitworth, et al., 2008). Incredibly, in 2007 a series of JAIS articles wondered whether TAM had “over–conquered” IS (Benbasat and Barki, 2007; Straub, 2007; Venkatesh, et al., 2007). To be clear, when aging theories deny publishing sunlight to new ones it is not conquest, it is exclusion.

Do IS journals say they welcome new theories but in fact reject them in the name of rigor? MISQ recently editorially rejected a latent categorization method analysis of 180 U.S. and European journals which concluded that community factors underlie IS publishing (Larsen, et al., 2008). Shortly after, MISQ published a same–method paper using only three top U.S. journals — which found the IT artifact central to IS research [16], following a senior editor’s theory (Benbasat and Zmud, 2003).

Readers can judge for themselves why a broad study with a new conclusion was rejected but a narrow study with an old conclusion accepted. Does the future of IS lie with artifacts or communities? The reader can again decide, but artifacts are the lower technology level and communities are the higher social level, and as socio–technical systems evolve it is the higher levels that increasingly drive progress (Whitworth, 2009b).

The reality is that it is hard to publish a new theory in mainstream IS, if “new” means not an old theory tweak and “theory” means more than speculative conjecture. Innovation is not a term that comes to mind as one reviews technology use theory, yet in technology use practice precisely the opposite is true. That progress is coming from practice — not theory — suggests that theory has its priorities wrong.

 

++++++++++

Feudal knowledge exchange trends

We have described a feudal knowledge exchange system run by the few for the few, supported ideologically by the church of rigor, financed by university factories of knowledge, whose goal is to dominate and defend the purity of specialized intellectual fiefdoms. We now outline some inevitable trends of such a system, again for the IS case.

Exclusive

A KES is exclusive when its dominant information flows are narrow in scope and contribution. In competitive economics scarcity reflects demand, so high journal rejection rates become quality indicators. This creates a self–reinforcing system, where exclusive journals that reject more attract more, as their exclusivity makes them more attractive. When journal “impact factor” is number of citations divided by number of publications, publishing many papers dilutes a journal’s citation ranking (Lamp, et al., 2007). When exclusivity is based on rigor, avoiding faults becomes more important than new ideas. Wrongly accepting a paper with a fault gives reputation consequences, while wrongly rejecting a useful paper leaves no evidence, as it doesn’t see the light of day.
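To see the dilution arithmetic with invented numbers, take the usual definition:

$$\text{impact factor} = \frac{\text{citations received}}{\text{papers published}}$$

A journal whose 60 papers attract 300 citations scores 5.0; had it published 120 papers attracting the same 300 citations it would score 2.5. Restricting output thus directly protects rank, whatever it does for knowledge growth.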

So while the IS field has changed considerably over the last decade, its journal rankings have remained remarkably static (Rainer and Miller, 2005), and attempts to create more “A” journals have struggled (Avison, et al., 2006; Gallivan and Benbunan–Fich, 2007; Paul, 2007b). Yet a handful of mainstream journals generating the majority of “impact” IS papers, at say 60 papers a year, poorly represents a field with potentially over 10,000 researchers. Also MISQ and ISR often have repeat contributors, usually senior professors who edit or review for these journals and know the norms. That reviewers are recruited mostly through editors’ informal connections invites criticism of an “old boy network” that replicates itself in its own image (Furnham, 1990).

The trend is for a few exclusive top journals to dominate the theoretical landscape. The alternative, proposed in Part II, is a more democratic system.

Outdated

A KES is outdated when its information flows mainly address issues that are no longer current. Lack of timeliness due to publication delay is a Type II opportunity loss. What use is quality that is too late to affect things, when others have either solved or bypassed the problem? MISQ recently noted its backlog of accepted papers awaiting publication stretched for over a year (Saunders, 2005). Today such delays are not unusual. Add to that one–two years of review, and one–two years of data gathering and paper writing, and “newborn” academic papers are already three–five years old at birth — an extraordinarily long gestation period by any definition. Many journal papers are out of date before they are even published.

Rigor is easier to maintain for known content. A review found IS researchers in 1990 focused on the issues practitioners faced a decade earlier (Szajna, 1994), and the situation is likely much the same today. When academic journals seek the topics that interest their editorial boards, they become records of knowledge past rather than knowledge creators.

The rigor justification, that truly good papers will end up published somewhere so nothing is lost by Type II errors, is simply not true. In the glacial world of academic publishing one rejection can delay publication by two–four years. Of the authors of rejected good papers, some despair, some move to greener pastures, but most just conform to reviewer “suggestions”. If rejectees do not try again, publishing delayed, like justice delayed, becomes publishing denied; indeed some leave academia for good:

“For young scholars constant rejection leaves them disillusioned and disheartened, especially if they perceive the review process as erratic and destructive. Some leave the academic game after investing much of their lives in equipping themselves to play it.” [17]

The opposite of outdated is current, and only an open access electronic system, as described in Part II, can keep up with the modern rate of change.

Conservative

A KES is conservative if it resists change and innovation. A rising rigor bar means that new theories face a greater burden of proof than old ones (Avison, et al., 2006). That new theories respect the old is reasonable, but when they face critiques that old theories don’t answer either, then those who have climbed the tree of knowledge have pulled the ladder up behind them. New theories rarely rise like Venus from the sea, fully formed and faultless. Usually new ideas begin imperfect and only develop over time with help from others. So if anything, the bias should be the other way. When new theories must be fully proved before they can even be proposed as research questions, then we have got science backwards. As Einstein is said to have said: “If we knew what we were doing, it wouldn’t be called research, would it?”

Faculty on the tenure clock today face long journal review times and low acceptance rates, but if anything need more publication “notches” on their curriculum vitae belt to survive. Since university tenure committee members often rate candidates outside their specialty, the easy way to do this is to count the number of papers in rated journals. Such committees rarely assess content directly, by actually reading the papers. Numbers are ostensibly more “objective”, and also, conveniently, save time.

Hence for authors, a ground–breaking paper involving years of work that changes the field and a trivial spinoff of a prior work both count as “one”. When what is measured is “hits” not knowledge value, it pays authors to increase hits rather than knowledge value, by publishing in “least–publishable–units”, making overlapping variants of the same work, publishing in groups, and by “milking” breakthroughs rather than going on to explore more — in other words by specializing.

Authors who innovate risk their careers, as even their successful innovations may not flourish until after their tenure decision. It should not be this way. Innovators are the “whistle blowers” of academia — they challenge false claims of knowledge profits. A system that rejects its own agents of change rejects its own progress.

New ideas by definition contradict the agreed norm, so can be expected to polarize reviewers. A proposal that offends no one probably changes nothing. Yet in academic hiring one bad reference can kill an appointment [18], and in journal submissions and grant proposals a “perfect” application must get a perfect score: not one person can dislike it. Yet if no one dislikes your work you probably aren’t doing anything worthwhile. Indeed a hallmark of innovation is that it polarizes people — some love it and some hate it. The tick–box scoring system of most grant reviews weeds out creativity.

A hundred years ago Einstein invented special relativity while working in the Swiss Patent Office, because no university would appoint him. Yet he revolutionized physics. Is the academic system today any more inviting to unorthodoxy? Would Einstein today be both unable to get a job and unable to publish? If so, the community of science is the loser. To choose attractive conformity over thorny innovation is a strategic error of the highest degree. This is the policy that has already produced a recession in IS. Should academia in general follow the same path? As the 2003 National Academy of Sciences President notes:

“We have developed an incentive system for young scientists that is much too risk averse. … of peers who claim that they admire scientific risk taking, but who generally invest in safe science when allocating resources. … This helps to explain why so many of our best young people are doing “me too” science.” [19]

Part II explores how to change this.

Unread

A KES is unread when most of its information flows are not read or understood. Practitioner readership of journals like MISQ has been in sharp decline for some time (Benbasat and Zmud, 1999). A survey of 476 readers of 130 management journals found that 90 percent of academic articles are not even read by journal subscribers (Siggelkow, 2001). As a past President of the ACM notes “… about two million scholarly papers in science and engineering are published each year by 72,000 journals; the vast majority of these papers are read by a few hundred people at most; in most disciplines well over half the papers are never cited by another author.” [20]

Content apathy is illustrated when academic journals invite letters to the editor but receive none. Does “publish or perish” produce academics interested in their own work but not that of others? For example, authors who publish in conferences but then don’t attend, or authors unconcerned that you find their ideas wrong, as long as you cite them. Conversely, why do so many try to publish if so few read their work? Is it a naïve belief that others care, or a cynical view that as long as one is published, who cares who reads?

Readers want knowledge value for their reading effort. More rigor means more complex papers that take more effort to read for often the same value. If journals feel obliged to publish the (n+1)th rigorous paper on a topic, whether it adds value or not, readers get less semantic bang for their cognitive buck, as authors recycle the same ideas in ever more sophisticated ways. If rigor increases paper complexity and reduces the new ideas per paper, readers can redress the imbalance by reducing reading effort, e.g., skimming titles or abstracts rather than the whole paper. The rigor trend predicts readers skimming rather than digesting, which biases against new ideas.

If the democratic KES outlined in Part II lets everyone publish, won’t that worsen the not–reading problem, as there will be more to read? It would — if the motivation didn’t change, but it will. While in a risk–avoiding system more papers are more error to avoid, in a value–seeking system more papers are more potential value. Readers will use electronic tools, like Google Scholar, to do positive searches. While the literature seems huge, a search on a specific research topic may produce only a handful of relevant papers. Even imperfect papers may have good parts or stimulate new ideas. When the motive moves from following normative ideas to finding useful knowledge, more people will read a greater variety of papers.

The opposite of apathy is involvement and participation, and in Part II we suggest that socio–technical tools can turn readers from passive recipients of pre–selected “quality” to active participants in value generation.

Inaccessible

A KES is inaccessible when most of its potential users cannot write to it or read it. In academia, to contribute one must pass the reviewer firewall. Yet reviewers, who labor unpaid and unknown, are also often overworked. When reviewing, one choice is to accept the paper, but if other reviewers find faults this can be professionally embarrassing. In a rigor–admiring environment the safe option is to find faults, as while praising when others condemn implies naiveté, a scathing review among praises is commendable rigor. Finding only faults passes the task of recognizing potential to others, just as finding no faults passes on the task of recognizing error. Accepting all submissions and rejecting all submissions are both lazy reviewing, as is the trend to one–line reviews.

For an anonymous reviewer to spend time growing a paper is not just time–consuming but also invisible. If a reviewer’s advice is ignored they waste their time, while if it is taken up the authors get credit for the reviewer’s ideas. An AIS President summed up the trend a decade ago: “Allegations are often made that reviews are not timely, that their quality is low, that they are not supporting and affirming of authors, and that they reflect the prejudices of an ‘elite’ who control the journals … Based on my own experiences, I believe the allegations have some foundation.” [21] The rigor trend predicts negatively driven reviewing based on denying faults rather than growing value. In contrast, the democratic KES outlined in Part II can report review contributions and still respect anonymity, which increases incentives for quality reviewing.

Specialized

Rigor is easier to maintain for restricted content, so it pays to erect and defend specialist knowledge castles. Cross–disciplinary research, where academics cross into other fields, rarely survives specialist critique, as when researchers move into related fields: “No matter how original and useful their insights, their work will be technically unimpressive to specialists in the domain.” [22] Yet cross–disciplinary areas are historically where knowledge expands, e.g., computing arose at the intersection of mathematics and engineering. The feudal KES approach favors specialization rather than integration, as castles are built to exclude, not to connect. While new areas start out open, soon they too build knowledge walls to exclude and dominate their domain.

That opening up knowledge exchange improves academic performance is illustrated by a quasi “experiment” carried out in 1999, when the Association for Information Systems introduced two online journals: the first a rigorous and traditional double–blind peer review journal (JAIS), and the second the “lighter” Communications of the AIS (CAIS), which under Paul Gray gave authors the choice of a light one–person or a full three–person review. Strangely, in 2001 CAIS was rated significantly higher (18th) than JAIS (30th) in journal impact rankings (Barnes, 2005; Mylonopoulos and Theoharakis, 2001). It is noteworthy that in 2003 CAIS published 95 articles while JAIS published 16. One can conclude that reducing rigor increased academic publishing performance.

As more rigorous and exclusive “specialties” emerge, the expected trend is an academic publishing system that produces more and more about less and less. The alternative proposed in Part II is to tear down the walls to instead allow more and more about more and more.

The end point

Under a rigor trend top journals will be exclusive in participation, innovation averse, few in number, outdated in content, restricted in scope, largely unread and increasingly specialized. Authors will duplicate, imitate and supplicate rather than innovate. They will recycle old theories under catchy new labels, develop minor “tweaks” to gatekeeper theories and never rock the boat of received opinion. Reviewers will deny, critique and oppose author attempts to publish while readers will graze, skim and browse the old ideas in new clothes that get through — if they read them at all. The feudal answer to more people writing is more rejections and more people not reading. The expected end point will be journals that are more rigorous than relevant, authors more prolific than productive, reviewers denying not inspiring, and readers grazing but not digesting. The reader can decide if this applies to their field.

This final vision of journals as exclusive and isolated castles of specialist knowledge, manned by editor–sovereigns and reviewer–barons, raising the barricade of rigor against a mass assault by peasant–authors seeking tenure knighthoods, is not inspiring.

In feudalism an elite few manage the valued resources. When the resource is knowledge “truth” becomes what its self–appointed guardians say it is, and innovation is rejected along with error. Is not “Let them publish elsewhere” the knowledge equivalent of Marie Antoinette’s “Let them eat cake”? A system where the few choose what is best for the many to read cannot be sustained as in the end people must choose for themselves.

In feudalism the faces change but the system stays the same, as in the feudal motto “The king is dead. Long live the king.” In academic publishing the same occurs. Fred Davis’ TAM paper was once rejected for a conference but today is part of the system that rejects tomorrow’s theories. Why must each academic innovation generation storm the “Bastille” of their predecessors? Why not transfer knowledge power as democracies transfer economic power — by a peaceful majority decision? Democracies shift power by common consent, not expensive political battles, so why can’t a KES?

The worry that opening the gates of the knowledge citadel will let in a flood of error confuses democracy with anarchy. Government by the people does not mean no rules, it just means new rules. It does not destroy hierarchies, just opens them to all by merit. To the academic realists now playing the publishing game, this is “the way it is”, and ideas of knowledge democracy are unreal idealism. Yet the same would have been said of physical democracy in the middle ages. Social change emerges as individuals evolve.

The cracks in the current system are already showing, and First Monday may be one of them. A democratic knowledge economy will outperform its feudal equivalent for the same reason that democratic physical economies outperform feudal ones — that people produce more when control is shared. One driving force for this change will be the breadth and speed of knowledge exchange required by cross–disciplinary research.

 

++++++++++

Cross–disciplinary research

In multi–disciplinary research academic specialists work side by side in the belief that specialty ideas will cross–fertilize, but increased specialization reduces this likelihood. In contrast cross–disciplinary research uses faculty trained in more than one discipline to merge knowledge across specialties. Cross–disciplinary teams have both cross–trained generalists and discipline specialists.

The nexus of technology use

We identify cross–disciplinary research at the nexus of technology use as an area of knowledge expansion. Terms like Web science (Fischetti, 2006), socio–technical systems (Whitworth and Moor, 2009c), information communication technology (ICT), information systems, social computing, information science, informatics and Science 2.0 (Shneiderman, 2008) all point to a nascent “knowledge flower” growing at the crossroads of technology use (Figure 1).

If knowledge grows at the intersection of disciplines then it should grow at the point of technology use, as many disciplines intersect there. A decade ago one might have picked IS for this new cross–disciplinary field, but a business sub–discipline is unlikely to capture the middle ground of many disciplines. While how this knowledge crossroads will evolve is uncertain, that it will expand is not in doubt. To capture this expanding knowledge middle ground requires a meta–discipline that cuts across other disciplines.

To grow cross–disciplinary people academia needs cross–disciplinary centers, to foster research creativity and attract gifted faculty and students seeking to travel across knowledge borders. Already many universities have cross–disciplinary centers to develop better grants. A cross–disciplinary technology use curriculum would combine a technology core with another discipline major, e.g., music and computing, accounting and computing, etc. Such a “discipline of disciplines” would attract staff and students from foreign fields like psychology, engineering, computer science, information science, health science, education, business and mathematics, unlike IS which bled into neighbor disciplines. A cross–disciplinary “electronic knowledge portal” could become the “Singapore” of academic knowledge exchange — the place people go to get to other knowledge places.

 

Figure 1: The cross–disciplinary “knowledge flower” of technology use.

 

Driving this trend will be student numbers. “Hard” subjects like computing have traditionally struggled to attract women, who are now the majority of university students. This is not because women can’t learn technology, but because they often choose not to. The problem of too few women in technology will be solved by changing the nature of technology, not by changing the nature of women. As technology morphs into socio–technology, the traditional choice of social or technical, model or geek, will give way to a new option: social and technical, as illustrated by social network systems like Facebook. As computing recognizes the value of social knowledge, young women of ability will increasingly choose to study humanized computing.

Cross–disciplinary crunch

As the number of knowledge specialties increases the number of cross–disciplinary connections grows geometrically. The two–dimensional Figure 1 cannot illustrate this: its eight disciplines already give 28 pairwise overlaps, and 247 potential combinations of two or more, not just the eight shown. A science with hundreds of distinct disciplines has tens of thousands of knowledge intersections, each potentially another specialty.
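A short computation of this growth; the eight disciplines match Figure 1, while the 200–field science is an assumed illustration:

```python
from math import comb

def intersections(n):
    """Potential cross-disciplinary fields among n specialties:
    every combination of two or more disciplines, i.e., 2^n minus
    the n singletons and the empty set."""
    return 2 ** n - n - 1

print(comb(8, 2))        # 28 pairwise overlaps among 8 disciplines
print(intersections(8))  # 247 combinations of two or more of them
print(comb(200, 2))      # 19900 pairwise overlaps in a 200-field science
```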

Should each develop its own special interest group, conferences and journals? Indeed, already we see almost a new industry of intersection journals, led by IGI, with titles like the International Journal of Computational Models and Algorithms in Medicine (IJCMAM) (computer science plus medicine), or the International Journal of Adult Vocational Education and Technology (IJAVET) (technology plus education). This knowledge expansion satisfies the needs of the many to publish, but retains the tradition of dividing knowledge into artificial and disconnected fiefdoms, i.e., it merely adds mini–castles around the major ones.

The cross–disciplinary crunch will occur when the knowledge generated at discipline intersections exceeds that generated by specialty nodes. When progress created in the open fields between castles exceeds that generated within the castles, as in technology use today, the castles will be unnecessary to all but those within them. As the feudal knowledge system isolates and purifies, it will be seen as many now see feudal aristocracies — symbols of a bygone era. When the bandwidth of cross–disciplinary knowledge exchange exceeds that of specialty knowledge exchange, the feudal academic knowledge system will collapse.

Knowledge expansion at the intersection of disciplines is a chance for evolutionary progress, rather than a sign of failure. Building walls to protect knowledge is necessary in a land of bandits and thieves, but in a land of earnest artisans it only reduces beneficial synergies, and forces each specialty to reinvent the intellectual wheels of others.

A social network diagram based upon the citations of 120 IS journals in 2003 shows clearly that there are now more connections than nodes [23]. While the generalist Communications of the ACM is central and influential, the rigorous “pure” IS cluster centered around MISQ “… is largely isolated from other journals in the network.” [24] For an interconnected knowledge network the driving need is to exchange knowledge not to guard it, which will create new search engine–based forms of “cyber–scholarship” (Arms, 2008).

 

++++++++++

Conclusions

The demands of cross–disciplinary research suggest that academia should:

  1. Replace the myth that rigor is excellence with research as a risk–opportunity mix;

  2. Reduce business influence on the grounds that academic truth is good business; and,

  3. Reinvent academic publishing as a democratic open knowledge exchange system.

Socio–technologies like wikis show what is possible when communities activate, but wikis are not the academic answer, as they neither attribute credit nor allocate accountability, nor offer anonymous review. The easy options in academic publishing have already been tried, so Part II of this paper suggests a socio–technical hybrid.

A democratic KES would reaffirm academia’s original goal of publishing knowledge freely for mutual critique and benefit. The search for knowledge should be open not closed, dynamic not static, inclusive not exclusive, current not outdated, affirming not denying, innovative not conservative and most of all, living not dead. To achieve this, academics must hold to the goal of knowledge growth. If we do our duty as others do theirs, progress will occur naturally. Lest academia forget, its very reason to exist is to grow knowledge, not to guard it, nor to profit from it.

 

About the authors

Brian Whitworth is a Senior Lecturer at Massey University (Albany), Auckland, New Zealand. He holds a B.Sc. (Mathematics), B.A. (Psychology), M.A. (neuro–psychology), and an Information Systems Ph.D. He has published in journals like Small Group Research, Group Decision & Negotiation, Database for Advances in IS, Communications of the AIS, IEEE Computer, Behavior and Information Technology (BIT), Communications of the ACM and IEEE Transactions on Systems, Man and Cybernetics. Topics include generating online agreement, voting before discussing, legitimate by design, spam and the socio–technical gap and the web of system performance. He, with Aldo de Moor, edited the Handbook of Research on Socio–Technical Design and Social Networking Systems (Hershey, Pa.: Information Science Reference, 2009). See http://brianwhitworth.com.

Rob Friedman is an associate professor of humanities and information technology and directs the science, technology and society program at New Jersey Institute of Technology. His research examines science and culture, socio–technical systems design, and technology’s role in education. He is first author of a reference guide to the theory and research supporting the field of technology and innovation management, Principle Concepts of Technology and Innovation Management: Critical Research Models (Hershey, Pa.: IGI Publishing, 2008). Friedman serves as editor–in–chief of the ACM Special Interest Group for Information Technology Education’s peer–reviewed SIGITE Newsletter. He teaches graduate and undergraduate courses on socio–technical systems in their cultural contexts.

 

Acknowledgements

Thanks to Paul Gray for a penetrating critique, to Marilyn Tremaine for insights, to Karen Patten and Elizabeth Whitworth for early help, to Jeff Axup for useful comments, to Peter Denning for a valuable re–direction, to our First Monday reviewers for critical insights, and to the first author’s IIMS colleagues for advice, comment and tolerance.

 

Notes

1. Lyytinen, et al., 2007, p. 317.

2. Grudin, 2004, p. 20.

3. Csikszentmihalyi, 1999, p. 320.

4. See http://researchroadmap.org/content/Element/ErrorType for details.

5. Denning, 1997, p. 132.

6. PHER, 1998, p. 3.

7. Lowry, et al., 2007, p. 358.

8. Lowry, et al., 2007, p. 352.

9. Ibid.

10. Bakshi, et al., 2007, p. 139.

11. Darroch and Toleman, 2007, p. 1072.

12. Denning, 2008, Figure 1.

13. Paul, 2007a, p. 194.

14. Benbasat and Barki, 2007, p. 2.

15. Gorton, 2008, p. 99.

16. Sidorova, et al., 2008, Figure 4.

17. Weber, 1999, p. 4.

18. Smolin, 2006, p. 342.

19. http://video.nationalacademies.org/ramgen/news/042803.rm.

20. Denning, 1997, p. 132.

21. Weber, 1999, p. 1.

22. Smolin, 2006, p. 343.

23. Polites and Watson, 2008, Figure 1, p. 97.

24. Polites and Watson, 2008, p. 98.

 

References

W. Arms, 2008. “Cyberscholarship: High performance computing meets digital libraries,” Journal of Electronic Publishing, volume 11, number 1, pp. 1–6. http://dx.doi.org/10.3998/3336451.0011.103

D. Avison, G. Fitzgerald, and P. Powell, 2006. “An opportunity for editors of I.S. journals to relate their experiences and offer advice,” European Journal of Information Systems, volume 15, number 3, pp. 241–243. http://dx.doi.org/10.1057/palgrave.ejis.3000625

S. Bakshi and S. Krishna, 2007. “Crisis in the information systems discipline: A reflection,” Proceedings of the 18th Australasian Conference on Information Systems (5–7 December, Toowoomba), pp. 132–141, and at http://www.acis2007.usq.edu.au/assets/papers/32.pdf.

S. Barnes, 2005. “Assessing the value of IS journals,” Communications of the ACM, volume 48, number 1, pp. 110–112.

I. Benbasat and H. Barki, 2007. “Quo vadis, TAM?” Journal of the Association for Information Systems (JAIS), volume 8, number 4, at http://aisel.aisnet.org/jais/vol8/iss4/16.

I. Benbasat and R. Zmud, 2003. “The identity crisis within the IS discipline: Defining and communicating the discipline’s core properties,” MIS Quarterly, volume 27, number 2, pp. 183–194, and at http://www.misq.org/archivist/vol/no27/issue2/Benbasat.html.

I. Benbasat and R. Zmud, 1999. “Empirical research in information systems: The practice of relevance,” MIS Quarterly, volume 23, number 1, pp. 3–16. http://dx.doi.org/10.2307/249403

T. Berners–Lee, 2000. Weaving the Web: The original design and ultimate destiny of the World Wide Web. New York: HarperCollins.

C. Bowman, 2005. Intangibles: Exploring the full depth of issues. Sarnia, Ontario: Grafiks Marketing & Communication.

B. Bryson, 2003. A short history of nearly everything. New York: Broadway Books.

M. Campbell–Kelly, 2008. “Will the future of software be open source?” Communications of the ACM, volume 51, number 10, pp. 21–23. http://dx.doi.org/10.1145/1400181.1400189

M. Csikszentmihalyi, 1999. “Implications of a systems perspective for the study of creativity,” In: R. Sternberg (editor). Handbook of creativity. Cambridge: Cambridge University Press, pp. 313–335.

R. Daft, R. Lengel, and L. Trevino, 1987. “Message equivocality, media selection, and manager performance: Implications for information systems,” MIS Quarterly, volume 11, number 3, pp. 354–366. http://dx.doi.org/10.2307/248682

F. Darroch and M. Toleman, 2007. “Bridging the IS academic–practitioner relationship divide: A review, a theoretical framework, and an example of interaction,” Proceedings of the 18th Australasian Conference on Information Systems, pp. 1069–1078.

T. Davenport and L. Markus, 1999. “Rigor vs. relevance revisited: Response to Benbasat and Zmud,” MIS Quarterly, volume 23, number 1, pp. 19–23. http://dx.doi.org/10.2307/249405

F. Davis, 1989. “Perceived usefulness, perceived ease of use, and user acceptance of information technology,” MIS Quarterly, volume 13, number 3, pp. 319–340. http://dx.doi.org/10.2307/249008

P. Denning, 2008. “The computing field: Structure,” In: B. Wah (editor). Wiley encyclopedia of computer science and engineering. Hoboken, N.J.: Wiley, and at http://cs.gmu.edu/cne/pjd/PUBS/ENC/cs08.pdf.

P. Denning, 1997. “A new social contract for research,” Communications of the ACM, volume 40, number 2, pp. 132–134. http://dx.doi.org/10.1145/253671.253755

K. Desouza, O. El–Sawy, R. Galliers, C. Loebbecke, and R. Watson, 2006. “Beyond rigor and relevance towards responsibility and reverberation: Information systems research that really matters,” Communications of the AIS, volume 17, pp. 341–353.

M. Fischetti, 2006. “A science of the Web begins,” Scientific American (2 November), at http://www.scientificamerican.com/.

A. Furnham, 1990. “Quantifying quality: An argument in favor of citation counts,” Journal of Further and Higher Education, volume 14, number 2, pp. 105–110. http://dx.doi.org/10.1080/0309877900140208

M. Gallivan and R. Benbunan–Fich, 2007. “Analyzing IS research productivity: An inclusive approach to global IS scholarship,” European Journal of Information Systems, volume 16, pp. 36–53. http://dx.doi.org/10.1057/palgrave.ejis.3000667

L. Golden and D. Figart (editors), 2000. Working time: International trends, theory, and policy perspectives. London: Routledge.

I. Gorton, 2008. “XML does real programmers a service,” Computer, volume 41, number 9, pp. 99–100. http://dx.doi.org/10.1109/MC.2008.404

J. Grudin, 2004. “Crossing the divide,” ACM Transactions on Computer–Human Interaction, volume 11, number 1, pp. 1–25. http://dx.doi.org/10.1145/972648.972649

S. Hamilton and B. Ives, 1982. “The journal communication system for MIS research,” ACM SIGMIS Database, volume 14, number 2, pp. 3–14. http://dx.doi.org/10.1145/1040676.1040677

H. van der Heijden, 2003. “Factors influencing the usage of websites: The case of a generic portal in the Netherlands,” Information & Management, volume 40, number 6, pp. 541–549. http://dx.doi.org/10.1016/S0378-7206(02)00079-4

E. Horvitz, 2004. “Lumiere Project: Bayesian reasoning for automated assistance,” at http://research.microsoft.com/en-us/um/people/horvitz/lum.htm.

D. Hovorka, K. Larsen, and D. Monarchi, 2009. “Conceptual convergences: Positioning information systems among the business disciplines,” European Conference on Information Systems (ECIS) (Verona, 8–10 June), at http://epublications.bond.edu.au/business_pubs/92/.

P. Katerattanakul, B. Han, and S. Hong, 2003. “Objective quality ranking of computing journals,” Communications of the ACM, volume 46, number 10, pp. 111–114. http://dx.doi.org/10.1145/944217.944221

R. Keeney and H. Raiffa, 1976. Decisions with multiple objectives: Preferences and value tradeoffs. New York: Wiley.

H. Klein and R. Hirschheim, 2003. “Crisis in the IS field? A critical reflection on the state of the discipline,” Journal of the AIS, volume 4, number 10, pp. 237–293.

N. Kock, P. Gray, R. Hoving, H. Klein, M. Myers, and J. Rockart, 2002. “IS research relevance revisited: Subtle accomplishment, unfulfilled promise, or serial hypocrisy?” Communications of the AIS, volume 8, pp. 330–346.

K. Kozar, K. Larsen, and D. Straub, 2006. “Leveling the playing field: A comparative analysis of business school journal productivity,” Communications of the AIS, volume 17, pp. 524–538.

T. Kuhn, 1970. The structure of scientific revolutions. Second edition, enlarged. Chicago: University of Chicago Press.

J. Lamp, S. Milton, L. Dawson, and J. Fisher, 2007. “RQF publication quality measures: Methodological issues,” ACIS 2007: Proceedings of the 18th Australasian Conference on Information Systems (University of Southern Queensland, Toowoomba), pp. 478–486, and at http://www.deakin.edu.au/dro/view/DU:30008019.

K. Larsen, D. Monarchi, D. Hovorka, and C. Bailey, 2008. “Analyzing unstructured text data: Using latent categorization to identify intellectual communities in information systems,” Decision Support Systems, volume 45, number 4, pp. 884–896. http://dx.doi.org/10.1016/j.dss.2008.02.009

P. Lowry, S. Humphreys, J. Malwitz, and J. Nix, 2007. “A scientometric study of the perceived quality of business and technical communication journals,” IEEE Transactions on Professional Communication, volume 50, number 4, pp. 352–378. http://dx.doi.org/10.1109/TPC.2007.908733

K. Lyytinen, R. Baskerville, J. Iivari, and D. Te’eni, 2007. “Why the old world cannot publish? Overcoming challenges in publishing high–impact IS research,” European Journal of Information Systems, volume 16, pp. 317–326. http://dx.doi.org/10.1057/palgrave.ejis.3000695

J. Moon and Y. Kim, 2001. “Extending the TAM for a World–Wide–Web context,” Information & Management, volume 38, number 4, pp. 217–230. http://dx.doi.org/10.1016/S0378-7206(00)00061-6

N. Mylonopoulos and V. Theoharakis, 2001. “Global perceptions of IS journals,” Communications of the ACM, volume 44, number 9, pp. 29–33. http://dx.doi.org/10.1145/383694.383701

S. Nerur, R. Sikora, G. Mangalaraj, and V. Balijepally, 2005. “Assessing the relative influence of journals in a citation network,” Communications of the ACM, volume 48, number 11, pp. 71–74. http://dx.doi.org/10.1145/1096000.1096007

J. Nunamaker, A. Dennis, J. Valacich, D. Vogel, and J. George, 1991. “Electronic meeting systems to support group work,” Communications of the ACM, volume 34, number 7, pp. 40–61. http://dx.doi.org/10.1145/105783.105793

C. Ong, J. Lai, and Y. Wang, 2004. “Factors affecting engineer’s acceptance of asynchronous e–learning systems in high–tech companies,” Information & Management, volume 41, number 6, pp. 795–804. http://dx.doi.org/10.1016/j.im.2003.08.012

R. Paul, 2007a. “Challenges to information systems: Time to change,” European Journal of Information Systems, volume 16, pp. 193–195. http://dx.doi.org/10.1057/palgrave.ejis.3000681

R. Paul, 2007b. “Change strikes back,” European Journal of Information Systems, volume 16, pp. 1–2. http://dx.doi.org/10.1057/palgrave.ejis.3000668

R. Paul, 2005. “Editor’s view: An opportunity for editors of IS journals to relate their experiences and offer advice. The editorial view of Ray J. Paul. First in a series,” European Journal of Information Systems, volume 14, pp. 207–212. http://dx.doi.org/10.1057/palgrave.ejis.3000542

P. Pavlou, 2003. “Consumer acceptance of electronic commerce: Integrating trust and risk with the technology acceptance model,” International Journal of Electronic Commerce, volume 7, number 3, pp. 101–134; abstract at http://www.gvsu.edu/business/ijec/.

PC Magazine, 2001. “20th anniversary of the PC survey results,” at http://www.pcmag.com/article2/0,1759,57454,00.asp.

PHER (PEW Higher Education Roundtable), 1998. “To publish or perish,” Policy Perspectives, volume 4, number 4, pp. 1–12.

G. Polites and R. Watson, 2008. “The centrality and prestige of CACM,” Communications of the ACM, volume 51, number 1, pp. 95–100. http://dx.doi.org/10.1145/1327452.1327454

C. Pratley, 2004. “Chris_Pratley’s Office Labs and OneNote Blog,” at http://weblogs.asp.net/chris_pratley/archive/2004/05/05/126888.aspx.

R. Rainer and M. Miller, 2005. “Examining differences across journal rankings,” Communications of the ACM, volume 48, number 2, pp. 91–94. http://dx.doi.org/10.1145/1042091.1042096

E. Raymond, 1997. “The cathedral and the bazaar,” at http://tuxedo.org/~esr/writings/cathedral-bazaar/; also at First Monday, volume 3, number 3 (March 1998), at http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/578/499.

D. Robey and L. Markus, 1998. “Beyond rigor and relevance: Producing consumable research about information systems,” Information Resources Management Journal, volume 11, number 1, pp. 7–15. http://dx.doi.org/10.4018/irmj.1998010101

R. Rosenthal and R. Rosnow, 1991. Essentials of behavioral research: Methods and data analysis. Second edition. New York: McGraw–Hill.

C. Saunders, 2005. “Between a rock and a hard spot,” MIS Quarterly, volume 29, number 4, pp. iii–xii, and at http://www.misq.org/archivist/vol/no29/Issue4/EdComments.pdf.

H. Shih, 2004. “An empirical study on predicting user acceptance of e–shopping on the Web,” Information & Management, volume 41, number 3, pp. 351–368. http://dx.doi.org/10.1016/S0378-7206(03)00079-X

B. Shneiderman, 2008. “Science 2.0,” Science, volume 319, number 5868, pp. 1349–1350. http://dx.doi.org/10.1126/science.1153539

A. Sidorova, N. Evangelopoulos, J. Valacich, and T. Ramakrishnan, 2008. “Uncovering the intellectual core of the information systems discipline,” MIS Quarterly, volume 32, number 3, pp. 467–482.

N. Siggelkow, 2001. “Who reads my paper anyways? A survey of journal readership and reputation,” at http://www-management.wharton.upenn.edu/siggelkow/pdfs/Who%20reads.pdf.

H. Smith, N. Kulatilaka, and N. Venkatramen, 2002. “Developments in IS practice III: Riding the wave: Extracting value from mobile technology,” Communications of the Association for Information Systems, volume 8, number 1, pp. 467–481.

L. Smolin, 2006. The trouble with physics: The rise of string theory, the fall of a science, and what comes next. Boston: Houghton Mifflin.

R. Snodgrass, 2003. “Journal relevance,” ACM SIGMOD Record, volume 32, number 3, pp. 11–15. http://dx.doi.org/10.1145/945721.945723

I. Steiner, 1972. Group process and productivity. New York: Academic Press.

D. Straub, 2007. “Veni, vidi, vici: Breaking the TAM logjam,” Journal of the Association for Information Systems, volume 8, number 4, pp. 223–229.

B. Szajna, 1994. “How much is information systems research addressing key practitioner concerns?” ACM SIGMIS Database, volume 25, number 2, pp. 49–59. http://dx.doi.org/10.1145/190743.190747

S. Taylor and P. Todd, 1995. “Assessing IT usage: The role of prior experience,” MIS Quarterly, volume 19, number 4, pp. 561–570. http://dx.doi.org/10.2307/249633

E. Tenner, 1996. Why things bite back: Technology and the revenge of unintended consequences. New York: Knopf.

A. Toffler, 1980. The third wave. New York: Morrow.

J. Valacich, M. Fuller, C. Schneider, and A. Dennis, 2006. “Publication opportunities in premier business outlets: How level is the playing field?” Information Systems Research, volume 17, number 2, pp. 107–125. http://dx.doi.org/10.1287/isre.1060.0089

V. Venkatesh, F. Davis, and M. Morris, 2007. “Dead or alive? The development, trajectory and future of technology adoption research,” Journal of the Association for Information Systems, volume 8, number 4, pp. 267–286.

V. Venkatesh, M. Morris, G. Davis, and F. Davis, 2003. “User acceptance of information technology: Toward a unified view,” MIS Quarterly, volume 27, number 3, pp. 425–478, and at http://www.misq.org/archivist/vol/no27/issue3/Venki.html.

D. Vogel, 1993. “EDI group process modelling,” In: J. Gricar and J. Novak (editors). Proceedings of the Sixth International EDI Conference (Bled, Slovenia, 7–9 June), pp. 234–243.

R. Weber, 1999. “The journal review process: A manifesto for change,” Communications of the Association for Information Systems, volume 2, number 12, pp. 1–24.

B. Whitworth, 2009a. “A social environment model of socio–technical performance,” at http://brianwhitworth.com/social-environment-model.pdf.

B. Whitworth, 2009b. “The social requirements of technical systems,” In: B. Whitworth and A. de Moor (editors). Handbook of research on socio–technical design and social networking systems. Hershey, Pa.: Information Science Reference; and at http://brianwhitworth.com/STS/STS-chapter1.pdf.

B. Whitworth and A. de Moor (editors), 2009c. Handbook of research on socio–technical design and social networking systems. Hershey, Pa.: Information Science Reference; and at http://brianwhitworth.com/STS.

B. Whitworth, V. Bañuls, C. Sylla, and E. Mahinda, 2008. “Expanding the criteria for evaluating socio–technical software,” IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, volume 38, number 4, pp. 777–790; and at http://brianwhitworth.com/wospahp.pdf.

B. Whitworth, 2005. “Polite computing,” Behaviour & Information Technology, volume 24, number 5, pp. 353–363; and at http://brianwhitworth.com/polite05.pdf. http://dx.doi.org/10.1080/01449290512331333700

J. Willinsky, 2000. “Proposing a knowledge exchange model for scholarly publishing,” Current Issues in Education, volume 3, number 6, at http://cie.asu.edu/volume3/number6/.

R. Wright, 2000. NonZero: The logic of human destiny. New York: Pantheon.

J. Yu, I. Ha, M. Choi, and J. Rho, 2005. “Extending the TAM for a t–commerce,” Information & Management, volume 42, number 7, pp. 965–976. http://dx.doi.org/10.1016/j.im.2004.11.001

 


Editorial history

Paper received 20 December 2008; revised 25 April 2009; revised 14 July 2009; accepted 20 July 2009.


Creative Commons License
This paper is licensed under a Creative Commons Attribution–Noncommercial–Share Alike 3.0 United States License.

Reinventing academic publishing online. Part I: Rigor, relevance and practice
by Brian Whitworth and Rob Friedman.
First Monday, Volume 14, Number 8 - 3 August 2009
http://journals.uic.edu/ojs/index.php/fm/article/viewArticle/2609/2248




