Green business, green careers and job training
Solutions that build smart communities with green careers and sustainable businesses!


Measuring the Value and Impact of Science Information

Scientific journal impact factor is a very poor measure of article impact.

And, of course, the fact that an article is highly influential by some measure does not necessarily mean it should be. Science and applied science are increasingly important in green and sustainable business and community development. Learning to learn is an important part of the ongoing challenge of doing more with less... and restoring our natural systems. This article provides ideas on learning to learn -- a hallmark of the scientific sector.

Solutions for Scientific Sharing of Information

As Clay Shirky famously said, you can complain about information overload but the only way to deal with it is to build and use better filters. It is no longer sufficient to depend on journals as your only filter; instead, it is time to start evaluating papers on their own merits.

This research overview by Cameron Neylon and Shirley Wu weighs the strengths and weaknesses of the following options for sharing and evaluating scientific articles:

  • Citation counts are an excellent measure of influence and impact but are very slow to collect.
  • Download statistics are rapid to collect but may be misleading.
  • Comments can provide valuable and immediate feedback, but are currently sparse and require a change in the research reward culture to become more widespread and to improve quality.
  • Bookmarking statistics can be both rapid to collect and contain high quality information but are largely untested and require the widespread adoption of unfamiliar tools.
  • Expert ratings by services such as Faculty of 1000 provide curated, high-quality assessments, but cover only a small fraction of the literature.
  • Simple rating schemes are quick and easy to use, but have so far seen little uptake.

Context of Relevance and Value

The fundamental problem of which paper to read can also have different contexts.
  • Which new papers are relevant to you?
  • Which papers should you read if you are going to pursue research question X?
  • Which papers do you need to read before submitting your paper?
  • Are you a funder interested in media coverage of work you have paid for?
  • Are you a textbook writer aiming to assess the most important contributions in a field?

Overview of Article Sharing Options

Peer-reviewed journals have served an important purpose in evaluating submitted papers and readying them for publication. In theory, one could browse the pages of the most relevant journals to stay current with research on a particular topic.

But as the scientific community has grown, so has the number of journals—to the point where over 800,000 new articles appeared in PubMed in 2008.

The sheer number makes it impossible for any scientist to read every paper relevant to their research, and a difficult choice has to be made about which papers to read. Journals help by categorizing papers by subject, but there remain in most fields far too many journals and papers to follow. As a result, we need good filters for quality, importance, and relevance to apply to scientific literature. There are many we could use but the majority of scientists filter by preferentially reading articles from specific journals -- those they view as the highest quality and the most important.

Thomson ISI Journal Impact Factor

The Thomson ISI Journal Impact Factor is widely used as an external and “objective” measure for ranking the impact of specific journals and, by extension, the individual articles within them. Yet the impact factor, which averages the number of citations per eligible article in each journal, is deeply flawed both in principle and in practice as a tool for filtering the literature. For the job of assessing the importance of specific papers, the impact factor -- or any other journal-based metric, for that matter -- faces an even more fundamental problem: it is simply not designed to capture qualities of individual papers.
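The journal-level averaging behind the impact factor can be seen in a minimal sketch. The citation counts below are invented for illustration; the computation simply averages citations over a journal's recent articles, as the impact factor does:

```python
# Sketch of how a journal-level impact factor is computed, and why it
# says little about individual articles. Citation counts are invented.
from statistics import mean, median

# Hypothetical citation counts for a journal's recent eligible articles,
# including one heavily cited "blockbuster" paper.
citations = [0, 0, 1, 1, 2, 2, 3, 4, 5, 180]

impact_factor = mean(citations)  # total citations / citable items
print(f"impact factor:  {impact_factor:.1f}")   # 19.8
print(f"median article: {median(citations)}")   # 2.0
```

A single heavily cited paper dominates the mean, while the median article fares far worse -- which is why a journal-level average says little about the particular paper in front of you.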

Article-level Metrics

If choosing which articles to read on the basis of journal-level metrics is not effective, then we need a measure of importance that tells us about the article. It makes sense that when choosing which of a set of articles to read, we should turn to “article-level metrics,” yet in practice data on individual articles are rarely considered, let alone seriously measured.
  • The biggest problem is the time delay inherent in citations: accurately determining the importance of an article takes years and is very difficult to do objectively.
  • The “gold standard” of article impact is formal citations in the scholarly literature, but citation metrics have their own challenges. One is that citation metrics do not take the sentiment of the citation -- supporting or refuting -- into account; using citation counts without any context can therefore be misleading.

Comment Solutions

A common solution proposed for getting rapid feedback on scientific publications is inspired by the success of many Web-based commenting forums. Sites like Stack Overflow, Wikipedia, and Hacker News each have an expert community that contributes new information and debates its value and accuracy. It is not difficult to imagine translating this dynamic into a scholarly research setting where scientists discuss interesting papers. A spirited, intelligent comment thread can also help raise the profile of an article and engage the broader community in a conversation about the science.

Stack Overflow

Stack Overflow caters to programmers and works in part because its contributors build up “karma” through a points-based system that translates into greater influence within the community. More importantly, high Stack Overflow karma can be taken beyond the site to add credibility to your resume and directly enhance career opportunities. There is currently no analogous credit system for post-publication commenting in the scientific community.

And if there is no reward for quality contribution then people will struggle to justify the time involved in generating high quality comments.

Faculty of 1000

It is interesting in this sense that the one commenting system that does appear to obtain reasonable amounts of researcher input is the Faculty of 1000, where senior researchers are selected to become “Faculty” and contribute their opinions on papers they believe are important. Being able to place “Member: Faculty of 1000” on your CV is incentive enough to encourage contributions of sufficient quantity and quality.

Unfortunately, commenting in the scientific community simply hasn't worked, at least not generally. BioMedCentral, PLoS, and BMJ have all had commenting platforms for several years and while certain papers have extensive discussions these are the exception rather than the rule.

Attempts to apply a “Digg-like” mechanism for voting up or down on the basis of perceived value—on the ArXiv preprints service, for instance—have failed to gain traction.

Another issue is that the majority of people making hiring and granting decisions do not consider commenting a valuable contribution.

Usage and Download Stats

A simple way of measuring interest in a specific paper might be via usage and download statistics; for example, how many times a paper has been viewed or downloaded, how many unique users have shown an interest, or how long they lingered. This method can certainly provide a rapid means of assessing interest in a paper by comparing the trend in downloads and page views against the average.
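One way to operationalize "comparing the trend in downloads and page views against the average" is sketched below. The weekly figures and the 2x threshold are invented assumptions, not a published method:

```python
# Sketch: flag a paper whose downloads consistently outpace the
# journal-wide average for papers of the same age. Numbers are invented.
journal_avg_by_week = [120, 60, 40, 30]    # mean weekly downloads since publication
paper_downloads     = [300, 250, 180, 150] # this paper's weekly downloads

ratios = [p / a for p, a in zip(paper_downloads, journal_avg_by_week)]
trending = all(r > 2 for r in ratios)  # assumption: >2x average every week

print([round(r, 2) for r in ratios])  # [2.5, 4.17, 4.5, 5.0]
print(trending)                       # True
```

Unlike citations, this signal is available within days of publication, which is exactly the rapid-assessment property the paragraph above describes.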

These statistics may not be completely accurate, but they are consistent, comparable, and considered sufficiently immune to cheating to be the basis for a billion-dollar Web advertising industry.

Reference Management Software & Personal Library Entries

A more valuable metric might be the number of people who have actively chosen to include the paper in their own personal library. Endnote, Refworks, and libraries in BibTex format have been the traditional tools for managing personal reference libraries, but there is a growing set of tools with some significant advantages: the tools are free, easy to use, and can help to provide high value article-level metrics without requiring any additional effort on the part of the researchers.

Examples of such tools are Zotero, Citeulike, Connotea, and Mendeley, which all allow the researcher to collect papers into their library while they are browsing on the Web, often in a single click using convenient “bookmarklets.” The user usually has the option of adding tags, comments, or ratings as part of the bookmarking process.

From this point on the tools differ in a variety of ways: Zotero and Mendeley allow formatting citations within manuscripts, whereas Citeulike and Connotea are more focused on using tags and feeds to share information. Importantly, however, they all provide information on how many people have bookmarked a specific paper. Citeulike, Connotea, and Zotero currently go further by providing information on precisely who has bookmarked a specific paper, a piece of information that may be very valuable, although people can choose not to make that information public. Some of these tools may also eventually be able to track the amount of time users spend viewing papers within their interface.
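Aggregating bookmark counts across such services might look like the following sketch. The service names are from the article, but the counts and the idea of a single combined total are illustrative assumptions; real numbers would come from each service's own API:

```python
# Sketch: combine per-service bookmark counts for one paper into a
# simple article-level metric. Counts are invented for illustration.
bookmarks = {"CiteULike": 14, "Connotea": 6, "Zotero": 9, "Mendeley": 22}

total = sum(bookmarks.values())
top_service = max(bookmarks, key=bookmarks.get)

print(f"total bookmarks: {total}")          # 51
print(f"most bookmarks on: {top_service}")  # Mendeley
```

Because each bookmark represents a researcher actively saving the paper to a personal library, even this crude sum carries more information about genuine interest than a raw page-view count.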

Factors for Successful Solutions

Metrics collected by reference management software are especially intriguing because they offer a measure of active interest without requiring researchers to do anything more than what they are already doing. Scientists collect the papers they find interesting, take notes on them, and store the information in a place that is accessible and useful to them. A significant question is why they would share that valuable information with the rest of the world. The data would still be useful even without identities attached, but researchers are more likely to share openly if appropriate incentive structures exist, as in the example of Faculty of 1000.

Part of the solution to encouraging valuable contributions, then, may simply be that the default settings involve sharing and that people rarely change them. A potentially game-changing incentive, however, may be the power to influence peers. By broadcasting which papers they think are important, researchers directly influence the research community's choice of reading and discussion material. This type of influence can be a good thing, providing the kind of recognition that drives career prospects and enhances the quality of contributions, but also potentially a bad one insofar as it concentrates power in the hands of a few. In a sense it is a shift in power from one set of editors, those currently in charge of journals, to a new set of “editors” who curate published papers in a different, but possibly just as useful, way.

It is too early to tell whether any specific tools will last, but they already demonstrate an important principle: a tool that works within the workflow that researchers are already using can more easily capture and aggregate useful information. With researchers constantly pressured for time and attention, approaches that gather information from processes that are already part of the typical research workflow are also much more likely to succeed.

Read the full article at the SOURCE:
Public Library of Science "Article-Level Metrics and the Evolution of Scientific Impact" by Cameron Neylon and Shirley Wu.

Edited by Carolyn Allen, owner/editor of California Green Solutions

