While current computing practice abounds with innovations like online auctions, blogs, wikis, Twitter, social networks and online social games, few if any genuinely new theories have taken root in the corresponding “top” academic journals. Those creating computing progress increasingly see these journals as unreadable, outdated and irrelevant. Yet even as technology practice creates, technology theory is if anything becoming more conformist and less relevant.
We attribute this to the erroneous assumption that research rigor is excellence, a myth contradicted by the scientific method itself. Excess rigor supports the demands of appointment, grant and promotion committees, but is drying up the wells of academic inspiration.
Part I of this paper chronicles the inevitable limits of what can only be called a feudal academic knowledge exchange system, with trends like exclusivity, slowness, narrowness, conservatism, self-involvement and inaccessibility.
We predict an upcoming social upheaval in academic publishing as it shifts from a feudal to democratic form, from knowledge managed by the few to knowledge managed by the many.
The technology trigger is socio–technical advance. The driver will be that only democratic knowledge exchange can scale up to support the breadth, speed and flexibility that modern cross–disciplinary research needs. Part II suggests the sort of socio–technical design needed to bring this transformation about.
Contents
- The role of academic knowledge exchange
- Feudal knowledge exchange trends
- Cross–disciplinary research
- Conclusions
Introduction
Caveat lector: Previous iterations of what you’re about to read have been dismissed by information systems (IS) editors and reviewers since a first draft written in 1999 after an ISWorld rigor/relevance discussion. Many years of rejection confirm it as unpublishable in IS. This seems partly because high–level papers always have faults, and partly because suggesting to his tailors that the emperor of academic publishing is wearing only the fig leaf of rigor is unwise. If you find the academic publishing system “excellently attired” please read no further, as here we argue it has serious problems that need addressing. Yet our target is not the many good authors, wise reviewers and supportive editors in our field, many of whom are personal friends.
Our target is the feudal knowledge exchange system they currently work under. While academia covers many disciplines, our evidential case is the field of technology use; the reader must judge for themselves in their own field. Yet as much the same case has been made in the field of quantum physics (Smolin, 2006), our conclusions may benefit others.
Part I argues that the current gate–keeping model of academic publishing is performing poorly as knowledge expands and interacts, and that academic publishing must reinvent itself to be inclusive and democratic rather than exclusive and plutocratic. Part II suggests a design to do this using already successful socio–technical tools.
Knowledge exchange systems
[snip]
While science may once have consisted of amateurs cultivating private knowledge gardens, today it is organized into specialist fiefdoms that defend themselves vigorously. Academics are now gate–keepers of feudal knowledge castles, not humble knowledge gardeners. They have for over a century successfully organized, specialized and built walls against error. However the problem with castles, whether physical or intellectual, is that they dominate the landscape, they make the majority subservient and apathetic, and battles for their power reduce productivity. As research grows, knowledge feudalism, like its physical counterpart, is a social advance that has had its day.
The theory–practice divide
[snip]
Each case had a grain of truth, but for technology use the predictive power of theory has been low and the gap between theory and practice is widening. In Eric Raymond’s (1997) analogy, the bazaar of technology practice is booming while the cathedral of technology theory is declining, because one is open and one is closed.
Bridging the divide
[snip]
Yet creating a new online global society is a socio–technical system as complex as any space program, as socio–technical systems need both social and technical performance to succeed (Whitworth and Moor, 2009c). We cannot expect to progress by trial and error alone. If theory and practice are the two legs of scientific progress, a crippled theory leg is a serious problem. We now suggest the main cause of this is unbalanced rigor.
The rigor problem
[snip]
We believe in rigor, but see system performance as a mix of many criteria (Whitworth, et al., 2008), which “bite back” if one criterion is exclusively pursued at the expense of others (Tenner, 1996). The better model of knowledge exchange performance is an efficient frontier: a line of many points, each defining the best rigor one can achieve for a given level of relevance (Keeney and Raiffa, 1976). Pursuing rigor alone produces rigor mortis in the theory leg of scientific progress.
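The efficient-frontier idea can be put in a short sketch. This is only an illustration, not a model from the paper: the scores and function names are invented, and a submission is kept on the frontier if no other submission beats it on both rigor and relevance at once.

```python
# Sketch: an efficient frontier over two criteria, rigor and relevance.
# A point is "dominated" if another point scores at least as well on both
# criteria and is not identical; the frontier is whatever remains.

def dominates(a, b):
    """True if point a dominates point b on (rigor, relevance)."""
    return a[0] >= b[0] and a[1] >= b[1] and a != b

def efficient_frontier(points):
    """Return the non-dominated (rigor, relevance) points, in input order."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (rigor, relevance) scores for five submissions.
papers = [(9, 2), (7, 6), (4, 8), (2, 9), (3, 3)]

print(efficient_frontier(papers))
# -> [(9, 2), (7, 6), (4, 8), (2, 9)]
```

The point of the sketch is that maximizing rigor alone always picks the single corner point (9, 2), discarding every balanced rigor–relevance trade-off that also sits on the frontier.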
The role of research
If excess rigor reduces innovation and causes theory to lag behind practice, in IS at least, why not change the strategy? Surely academics prefer to ride the technology wave rather than struggle along behind it?
[snip]
When a system becomes the mechanism for power, profit and control, idealized goals like the search for truth can easily take a back seat. Authors may not personally want their work locked away in expensive journals that only endowed western universities can afford, but business exclusivity requires it. [snip]
[snip]
One can justify distributing rare economic resources to the few, as there is not enough to go around, but one cannot justify distributing knowledge this way, as giving knowledge away does not diminish it. While physical resources distribute by a zero–sum model, information resources follow a non–zero–sum model (Wright, 2000), where the more one gives the more synergy is created (Whitworth, 2009a). Economic scarcity is no argument for knowledge exclusivity.
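This zero-sum versus non-zero-sum contrast reduces to simple arithmetic. The toy sketch below is ours, not the authors’ (the function names and numbers are invented for illustration): dividing a physical resource shrinks each share, while sharing an information resource leaves every recipient with the whole.

```python
# Toy contrast: splitting a physical resource vs sharing knowledge.

def split_physical(amount, people):
    # Zero-sum: the total is fixed, so each share shrinks as people are added.
    return [amount / people] * people

def share_knowledge(items, people):
    # Non-zero-sum: sharing does not diminish the source; everyone gets it all.
    return [set(items) for _ in range(people)]

print(split_physical(100, 4))                     # -> [25.0, 25.0, 25.0, 25.0]
print(share_knowledge({"theorem", "method"}, 4))  # each of 4 holds both items
```

However many recipients there are, each copy of the knowledge set is complete, while each physical share keeps shrinking: that asymmetry is the whole argument against treating knowledge as a scarce good.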
Conformity training
The modern academic system has become almost a training ground for conformity. PhD students spend three to six years as apprentices under senior direction, then another three to six years seeking the security of a tenured appointment. At both stages, criticizing the establishment is unwise if one wants a career. It is not surprising that six to twelve years of such training produces people who toe the party line.[snip]
[snip]
Due to publishing pressure senior IS leaders explicitly advise new faculty not to innovate if they want a career! As the word “unfortunately” suggests, they take no responsibility for a system that actively drives innovators out to make their breakthroughs in practice, e.g., the movement of automatic indexing from universities to commercial enterprises like Google (Arms, 2008).
Changing the system
Can this system change itself? IS academics traditionally judge journal importance by measures like internal expert perceptions, number of citations and publication numbers (Hamilton and Ives, 1982). These internally generated and self–reinforcing measures all favor the status quo. As an academic publishing review notes: “What gives this enterprise its peculiar cast is the fact that the producers of knowledge are also its primary consumers.” [6]
Current research into journal quality illustrates the contrast between science as a search for gain and science as a search for truth. While accepting that “science can be perceived as a social network which accumulates, distributes and processes new knowledge” [7], they see journal “quality” in terms of stakeholder gains:
- So authors can publish in quality journals (for better career impact);
- So readers can select quality journals (to save time);
- So tenure and promotion committees can choose staff (more easily); and,
- So libraries can more easily choose quality publications [8].
Yet, as argued, equating quality with rigor is an error, as quality needs both rigor and relevance. When academia incestuously rates itself by citation studies and expert ratings it can easily become a self–reinforcing system disconnected from external reality (Katerattanakul, et al., 2003).
The IS case
[snip]
[snip] There was a major strategic failure of vision and leadership in IS, as a growing academic discipline should be a melting pot of new ideas, not a stagnant pool of old ones.
How rigor constricts
Even respected IS journal editors recognize there is a problem: “Research publications in IS do not appear to be publishing the right sort or content of research.” [13] The cause we suggest is social conformity to old theories. [snip]
[snip]
The problem lies not with “old but good” theories but with a system that seems unable to grow new ones around them. Given the enormous changes of the last decade in computing, the lack of matching theoretical innovation over the same period is nothing short of astounding.
[snip]
The reality is that it is hard to publish a new theory in mainstream IS, if “new” means not a tweak of an old theory and “theory” means more than speculative conjecture. Innovation is not a term that comes to mind as one reviews technology use theory, yet in technology use practice precisely the opposite is true. That progress is coming from practice, not theory, suggests that theory has its priorities wrong.
Feudal Knowledge Exchange Trends
We have described a feudal knowledge exchange system run by the few for the few, supported ideologically by the church of rigor, financed by university factories of knowledge, whose goal is to dominate and defend the purity of specialized intellectual fiefdoms. We now outline some inevitable trends of such a system, again for the IS case.
Exclusive
[snip]
The trend is for a few exclusive top journals to dominate the theoretical landscape. The alternative proposed in Part II is a more democratic system.
Outdated
A KES is outdated when its information flows mainly address issues that are no longer current. Lack of timeliness due to publication delay is a Type II opportunity loss. What use is quality that is too late to affect things, when others have either solved or bypassed the problem?
[snip]
The rigor justification, that truly good papers will end up published somewhere so nothing is lost by Type II errors, is simply not true. In the glacial world of academic publishing one rejection can delay publication by two to four years. Of the authors of good papers rejected, some despair, some move to greener pastures, but most just conform to reviewer “suggestions”. And if rejectees do not try again, publishing delayed, like justice delayed, is publishing denied, as some leave academia for good [snip].
Conservative
A KES is conservative if it resists change and innovation. A rising rigor bar means that new theories face a greater burden of proof than old ones (Avison, et al., 2006). That new theories respect the old is reasonable, but when they face critiques that old theories don’t answer either, then those who have climbed the tree of knowledge have pulled the ladder up behind them. New theories rarely rise like Venus from the sea, fully formed and faultless. Usually new ideas begin imperfect and only develop over time with help from others. So if anything, the bias should be the other way. When new theories must be fully proved before they can even be proposed as research questions, then we have got science backwards.
As Einstein is said to have said: “If we knew what we were doing, it wouldn’t be called research, would it?”
[snip]
Authors who innovate risk their careers, as even their successful innovations may not flourish until after their tenure decision. It should not be this way. Innovators are the “whistle blowers” of academia — they challenge false claims of knowledge profits. A system that rejects its own agents of change rejects its own progress.
[snip]
New ideas by definition contradict the agreed norm, so can be expected to polarize reviewers. A proposal that offends no one probably changes nothing; indeed, a hallmark of innovation is that it polarizes people, as some love it and some hate it. Yet in academic hiring one bad reference can kill an appointment [18], and in journal submissions and grant proposals a “perfect” application must get a perfect score: not one reviewer may dislike it. The tick-box scoring system of most grant reviews thus weeds out creativity.
A hundred years ago Einstein invented special relativity working in the Swiss Patent Office because no university would appoint him. Yet he revolutionized physics. Is the academic system today any more inviting to unorthodoxy? [snip]
Part II explores how to change this.
Unread
[snip]
If the democratic KES outlined in Part II lets everyone publish, won’t that worsen the not–reading problem, as there will be more to read? It would — if the motivation didn’t change, but it will. While in a risk–avoiding system more papers are more error to avoid, in a value–seeking system more papers are more potential value. Readers will use electronic tools, like Google Scholar, to do positive searches. While the literature seems huge, a search on a specific research topic may produce only a handful of relevant papers. Even imperfect papers may have good parts or stimulate new ideas. When the motive moves from following normative ideas to finding useful knowledge, more people will read a greater variety of papers.
The opposite of apathy is involvement and participation, and in Part II we suggest that socio–technical tools can turn readers from passive recipients of pre–selected “quality” to active participants in value generation.
Inaccessible
A KES is inaccessible when most of its potential users cannot write to it or read it.
In academia, to contribute one must pass the reviewer firewall. [snip] The rigor trend predicts negatively driven reviewing based on denying faults rather than growing value. In contrast the democratic KES outlined in Part II can report review contributions and still respect anonymity, which increases incentives for quality reviewing.
Specialized
[snip]
As more rigorous and exclusive “specialties” emerge, the expected trend is an academic publishing system that produces more and more about less and less. The alternative proposed in Part II is to tear down the walls to instead allow more and more about more and more.
The end point
Under a rigor trend top journals will be exclusive in participation, innovation averse, few in number, outdated in content, restricted in scope, largely unread and increasingly specialized. Authors will duplicate, imitate and supplicate rather than innovate. They will recycle old theories under catchy new labels, develop minor “tweaks” to gatekeeper theories and never rock the boat of received opinion. Reviewers will deny, critique and oppose author attempts to publish while readers will graze, skim and browse the old ideas in new clothes that get through — if they read them at all.
The feudal answer to more people writing is more rejections and more people not reading. The expected end point will be journals that are more rigorous than relevant, authors more prolific than productive, reviewers denying not inspiring, and readers grazing but not digesting. The reader can decide if this applies to their field.
This final vision of journals as exclusive and isolated castles of specialist knowledge, manned by editor–sovereigns and reviewer–barons, raising the barricade of rigor against a mass assault by peasant–authors seeking tenure knighthoods, is not inspiring.
The worry that opening the gates of the knowledge citadel will let in a flood of error confuses democracy with anarchy. Government by the people does not mean no rules, it just means new rules. It does not destroy hierarchies, just opens them to all by merit. To the academic realists now playing the publishing game, this is “the way it is”, and ideas of knowledge democracy are unreal idealism. Yet the same would have been said of physical democracy in the middle ages. Social change emerges as individuals evolve.
The cracks in the current system are already showing ... . A democratic knowledge economy will outperform its feudal equivalent for the same reason that democratic physical economies outperform feudal ones — that people produce more when control is shared.[snip]
Cross–Disciplinary Research
In multi–disciplinary research academic specialists work side by side on the belief that specialty ideas will cross–fertilize, but increased specialization reduces this likelihood. In contrast cross–disciplinary research uses faculty trained in more than one discipline to merge knowledge across specialties. [snip]
The nexus of technology use
We identify cross–disciplinary research at the nexus of technology use as an area of knowledge expansion. Terms like Web science (Fischetti, 2006), socio–technical systems (Whitworth and Moor, 2009c), information communication technology (ICT), information systems, social computing, information science, informatics and Science 2.0 (Shneiderman, 2008) all point to a nascent “knowledge flower” growing at the crossroads of technology use (Figure 1).
[snip]
The demands of cross–disciplinary research suggest that academia should:
- Replace the myth that rigor is excellence with research as a risk–opportunity mix;
- Reduce business influence on the grounds that academic truth is good business; and,
- Reinvent academic publishing as a democratic open knowledge exchange system.
Socio–technologies like wikis show what is possible when communities activate, but wikis are not the academic answer as they don’t attribute or allocate accountability, nor offer anonymous review. The easy options in academic publishing have already been tried, so Part II of this paper suggests a socio–technical hybrid.
A democratic KES would reaffirm academia’s original goal of publishing knowledge freely for mutual critique and benefit. The search for knowledge should be open not closed, dynamic not static, inclusive not exclusive, current not outdated, affirming not denying, innovative not conservative and most of all, living not dead. To achieve this goal academics must hold to the goal of knowledge growth. If we do our duty as others do theirs, progress will occur naturally. Lest academia forget, its very reason to exist is to grow knowledge, not to guard it, nor to profit from it.
Notes