Monday, November 28, 2005

Leydesdorff on citation impact visualizations

Peter Suber brought this article to my attention (and it's available in pdf AND html :)

Although Leydesdorff's approach is limited by his choice to aggregate citations by journal rather than by individual article, I liked his symbolic manipulation of the shape of each node--stretching it vertically or horizontally--to denote additional information about the journals themselves.

--jon

Leydesdorff, Loet (2005) Visualization of the Citation Impact Environments of Scientific Journals: An online mapping exercise. In Proceedings of the Annual Meeting of the Society for Social Studies of Science (4S), Pasadena, California.

http://dlist.sir.arizona.edu/992/

Abstract

Aggregated journal-journal citation networks based on the Journal Citation Reports 2004 of the Science Citation Index (5968 journals) and the Social Science Citation Index (1712 journals) are made accessible from the perspective of any of these journals. A vector space model is used for normalization, and the results are brought online at http://www.leydesdorff.net/jcr04 as input files for the visualization program Pajek. The user is thus able to analyze the citation environment in terms of links and graphs. Furthermore, the local impact of a journal is defined as its share of the total citations in the specific journal's citation environments; the vertical size of the nodes is varied proportionally to this citation impact. The horizontal size of each node can be used to provide the same information after correction for within-journal (self-)citations. In the "citing" environment, the equivalent of this measure can be considered as a citation activity index which maps how the relevant journal environment is perceived by the collective of authors of a given journal. As a policy application, the mechanism of interdisciplinary developments among the sciences is elaborated for the case of nanotechnology journals.
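
To make the two measures in that abstract concrete, here is a minimal sketch of how they might be computed. The four journals, the citation counts, and the link threshold below are made up for illustration and are not Leydesdorff's data; only the definitions (cosine normalization of citing profiles, and local impact as a journal's share of citations received within its own citation environment, with and without self-citations) follow the abstract.

import numpy as np

# Hypothetical journal-journal citation matrix: C[i, j] = citations from
# journal i to journal j, aggregated the way the JCR data described above
# would be. Journal names and counts are invented for illustration.
journals = ["J_A", "J_B", "J_C", "J_D"]
C = np.array([
    [120.,  30.,   5.,  0.],
    [ 25., 200.,  10.,  2.],
    [  8.,  15.,  90.,  1.],
    [  0.,   3.,   2., 40.],
])

def cosine_normalize(matrix):
    # Vector space model: scale each journal's citing profile to unit length.
    norms = np.linalg.norm(matrix, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    return matrix / norms

def citation_environment(C, seed, threshold=1):
    # Journals linked to the seed journal (citing or cited) by at least
    # `threshold` citations; the threshold value here is an assumption.
    linked = (C[seed] >= threshold) | (C[:, seed] >= threshold)
    linked[seed] = True
    return np.flatnonzero(linked)

def local_impact(C, seed, include_self_citations=True):
    # Seed journal's share of all citations received within its environment.
    env = citation_environment(C, seed)
    sub = C[np.ix_(env, env)].copy()
    if not include_self_citations:
        np.fill_diagonal(sub, 0.0)  # correct for within-journal (self-)citations
    received = sub.sum(axis=0)      # citations received by each journal in the environment
    seed_pos = int(np.flatnonzero(env == seed)[0])
    return received[seed_pos] / received.sum()

profiles = cosine_normalize(C)
print("Cosine-normalized citing profile of", journals[0], ":", np.round(profiles[0], 2))

for i, name in enumerate(journals):
    vertical = local_impact(C, i, include_self_citations=True)     # ~ vertical node size
    horizontal = local_impact(C, i, include_self_citations=False)  # ~ horizontal node size
    print(f"{name}: local impact {vertical:.2f}, excluding self-citations {horizontal:.2f}")

In the actual exercise these values feed the Pajek input files mentioned in the abstract, so the mapping program can size each node; here they are simply printed.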

Tuesday, November 15, 2005

Re: Current Science issue on citation indexing

Dear OACI list members,

Peter Suber invited me to join your list because of my work with Interarchive, a consortium that explores distributed models of publication and recognition at http://newmedia.umaine.edu/interarchive/. (I'm cross-posting this to the Interarchive "recognition-metrics" list.) By way of introduction, I'm an artist, curator at the Guggenheim Museum, and teacher in the University of Maine's New Media Program. My work focuses less on traditional scientific scholarship than on experimental, networked, and fluid paradigms of knowledge gathering.

That said, I was grateful for Stevan Harnad's reference to the Current Science special issue on citation indexing (http://www.ias.ac.in/currsci/nov102005/contents.htm). Having skimmed Jacso, Roth, Lewison, Scharnhorst, Lederberg, and Cronin, I was
struck by the schizophrenic character of much of this journal's reporting on citation indexing (apart from a seemingly universal reverence for citation pioneer Eugene Garfield).

For example, Peter Jacso's comparison of Web of Science, Scopus, and Google Scholar, "As We May Search," begins with an example that serves as a scathing indictment of the narrow range of publications indexed by established systems. Jacso is referring to Vannevar Bush's 1945 essay "As We May Think":

"Bush 60 years ago contemplated - among many other
things - an information workstation, the Memex. A re-
searcher would use it to annotate, organize, link, store,
and retrieve microfilmed documents. He is acknowledged
today as the forefather of the hypertext system, which in
turn, is the backbone of the Internet.

"He outlined his thoughts in an essay published in the
Atlantic Monthly. Maybe because of using a non-
scientific outlet the paper was hardly quoted and cited in
scholarly and professional journals for 30 years." (p. 1537)

Google, of course, knows the impact of this article, as evidenced by its 120,000 returns for "Vannevar Bush" "As We May Think". Yet in his conclusion Jacso excoriates Google Scholar (only a couple of months old) for its buggy handling of Boolean queries and other interface details:

"Riding on the waves of the regular Google software which
is great for processing the unstructured heap of billions of
Web pages, G-S cannot handle even the meticulously
tagged, metadata-enriched few million journal articles
graciously offered to it by many publishers for free." (p. 1546)

I wonder what Jacso would have concluded about the value of the World Wide Web had he launched Mosaic in 1993 and run into a 404 or two.

Similarly, Blaise Cronin in "A Hundred Million Acts of Whimsy?" cites studies in support of citation counts:

"Michael Kurtz
and colleagues19 at the Harvard-Smithsonian Center for
Astrophysics have used the NASA (National Aeronautics
and Space Administration) Astrophysics Data System to
compare the obsolescence function as measured by 'reads'
of records in the system with the obsolescence function
as measured by citations. Their statistical analyses show
that reads and cites 'fundamentally measure the same thing,
the usefulness of an article'." (p. 1506)

Cronin does conclude with a weak endorsement of citation based on "trust," but he sets himself up for criticism in the body of his article:

"In the course of writing The Hand of Science7, to which I
alluded earlier, I cited, amongst many others, Eugene Garfield
(EG) and Elisabeth Davenport (ED), both of whom I have
known personally for two decades. The former is best de-
scribed as a professional friend (for example, I co-edited
a Festschrift in his honor), the latter as a frequent col-
laborator, co-author and close friend. I did not cite them
because of our social ties, but because their ideas were
relevant to the work in hand. At the same time, the odds
on their being cited by me are increased as a result of the
pre-existing personal connections: I know them and their
publications well; I interact with them, exchanging thoughts
and materials. In the case of the latter we have been active,
and occasionally co-located, collaborators for many years. A
consequence of my citing EG and ED (and others with
whom I am personally acquainted) is that it reduces, po-
tentially, the likelihood of others in the citable author
pool from being selected. All other things (the citable
work's topicality, relevance, currency, etc.) being equal,
strong social ties will presumably trump weak or non-
existent ties. Granovetter25,26 has discussed weak and strong
ties. Call it preferential attachment, a statistical fact of
not only scientific but also social life." (p. 1507)

I bring up the ambivalence expressed by these researchers not to suggest they should straitjacket contradictory findings into a single thesis, but to ask why they flag the importance of knowledge as a social construct only to shy away from using social metrics for evaluating its producers.

jon