Monday, December 30, 2013

PLoS Medicine > Why Most Published Research Findings Are False


Summary

There is increasing concern that most current published research findings are false. The probability that a research claim is true may depend on study power and bias, the number of other studies on the same question, and, importantly, the ratio of true to no relationships among the relationships probed in each scientific field. In this framework, a research finding is less likely to be true when the studies conducted in a field are smaller; when effect sizes are smaller; when there is a greater number and lesser preselection of tested relationships; where there is greater flexibility in designs, definitions, outcomes, and analytical modes; when there is greater financial and other interest and prejudice; and when more teams are involved in a scientific field in chase of statistical significance. Simulations show that for most study designs and settings, it is more likely for a research claim to be false than true. Moreover, for many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias. In this essay, I discuss the implications of these problems for the conduct and interpretation of research.
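
To make the framework above concrete, the probability that a claimed finding is true (the positive predictive value, PPV) depends on the pre-study odds R that a probed relationship is real, the Type I error rate, the Type II error rate (one minus power), and the level of bias. The minimal Python sketch below illustrates that relationship; the formula follows the framework described in the essay, and the numeric inputs are arbitrary examples rather than values taken from the paper.

# Illustrative calculation of the positive predictive value (PPV) of a
# claimed research finding, following the framework described above.
# R     : pre-study odds that a probed relationship is true
# alpha : Type I error rate (typically 0.05)
# beta  : Type II error rate (1 - statistical power)
# u     : bias, the fraction of analyses that would not otherwise have been
#         significant but get reported as significant anyway
# The example numbers below are arbitrary, for illustration only.

def ppv(R, alpha=0.05, beta=0.2, u=0.0):
    true_positives = (1 - beta) * R + u * beta * R
    false_positives = alpha + u * (1 - alpha)
    return true_positives / (true_positives + false_positives)

if __name__ == "__main__":
    # A well-powered study of a plausible hypothesis, no bias:
    print(round(ppv(R=0.5, alpha=0.05, beta=0.2), 3))
    # A small, underpowered, exploratory study with substantial bias:
    print(round(ppv(R=0.05, alpha=0.05, beta=0.7, u=0.3), 3))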

Source and Full Text Available At:

[http://www.plosmedicine.org/article/info:doi/10.1371/journal.pmed.0020124]

Thursday, December 12, 2013

Canadian Association of Research Libraries Publishes Altmetrics in Context

OTTAWA, December 9, 2013 - The Canadian Association of Research Libraries (CARL) is pleased to announce the publication of Altmetrics in Context. As scholarly communication takes on new forms and moves increasingly to digital and open access venues, new types of metrics are becoming increasingly important for the research community. The topic is generating discussion and, in some camps, heated debate.

Altmetrics report the impact of a wide range of research outputs, including data sets, articles and code. This document, available on the CARL Website, provides a quick introduction to this new field of research impact assessment and encourages researchers to use altmetrics in their work.

Source and Full Text Available At 

[http://us6.campaign-archive1.com/?u=9000187600&id=22afbbc570]

Wednesday, December 11, 2013

A/V Now Available > FREE Webinar > Measuring Impact: Redefining Scholarly Value Through New Data > December 18, 2013 > 3:00 PM - 4:00 PM ET

Measuring Impact: Redefining Scholarly Value Through New Data (DDAL Pt. 3)
Wednesday, December 18, 2013
3:00-4:00 PM ET / 12:00-1:00 PM PT

SPONSORED BY: ProQuest, Library Journal and ER&L

Scholars are looking beyond traditional metrics to show the impact their work can have in the online world, while publishers are looking to demonstrate more value for their content. This has led both groups to examine other sources of data and other ways of gauging value. This webcast will highlight the work scholars and organizations are doing around alternative metrics and article-level use to expand the definition of the impact of scholarly exchange.

Speakers

Gregg Gordon - President and CEO, Social Science Research Network (SSRN)
Jason Priem - Co-founder, ImpactStory
Jennifer Lin - Senior Product Manager, Public Library of Science

Moderator
Bonnie Tijerina - Head of E-Resources and Serials, Harvard Library 

Source and A/V Available At: 

Monday, November 25, 2013

Docear: The Academic Literature Suite

Docear is a unique solution for academic literature management; that is, it helps you organize, create, and discover academic literature. Among other features, Docear offers:

A single-section user-interface that allows the most comprehensive organization of your literature. With Docear, you can sort documents into categories; you can sort annotations (comments, bookmarks, and highlighted text from PDFs) into categories; you can sort annotations within PDFs; and you can view multiple annotations of multiple documents, in multiple categories – at once.

A ‘literature suite concept’ that combines several tools in a single application (PDF management, reference management, mind mapping, …). This allows you to draft your own papers, assignments, theses, etc. directly in Docear and copy annotations and references from your collection directly into your draft.

A recommender system that helps you to discover new literature: Docear recommends papers which are free, in full-text, instantly to download, and tailored to your information needs.

Source and Link Available At:

Altmetric Pilots Help Elsevier Authors Understand the Impact of Their Articles

The colorful donut indicating impact in news and social media is now featured on various journal homepages and ScienceDirect

By Linda Willems | Posted on 25 November 2013


The academic community has traditionally looked to citation analysis to measure the impact of scientific and medical research. But with journal articles increasingly disseminated via online news and social media channels, new measures are coming to the fore.

Alternative metrics – or altmetrics – represent one of the innovative ways the reach of articles is now being assessed, and Elsevier has just launched two pilots featuring the highly recognizable altmetric "donut."

The first pilot will feature donuts for a journal's top three rated articles displayed on the Elsevier.com homepages of 33 Elsevier titles.

This rating is based on a social media traffic score given by Altmetric.com; an article must have received at least one social media mention within the last six months to qualify. By clicking on the "view all" option beneath this list, visitors can review altmetric donuts for the top 10 articles.

An example of the pilot altmetric pod on the Elsevier.com homepage of the Journal of Experimental Social Psychology.

[snip]

Source and Full Text Available At:

[http://www.elsevier.com/connect/altmetric-pilots-help-elsevier-authors-understand-the-impact-of-their-articles]

Wednesday, November 20, 2013

NISO Alternative Assessment Metrics (Altmetrics) Project > Second In-person Meeting - Wednesday, December 11, 2013 > Free Streaming Available

In June 2013, the Alfred P. Sloan Foundation awarded NISO a grant to undertake a two-phase initiative to explore, identify, and advance standards and/or best practices related to a new suite of potential metrics in the community. This initiative was a direct outgrowth of a breakout discussion group during the altmetrics 12 meeting in Chicago, IL. This project is an important step in the development and adoption of new assessment metrics, which include usage-based metrics, social media references, and network behavioral analysis. In addition, this project will explore potential assessment criteria for non-traditional research outputs, such as data sets, visualizations, software, and other applications. After the first phase, which will expose areas for potential standardization, the community will collectively prioritize those potential projects. The second phase will be to advance and develop those standards/best practices prioritized by the community and approved by the membership.

NISO will host the second of three meetings meant to further engage the community in this project. The second in-person meeting in support of this work will take place on Wednesday, December 11, 2013 from 8:30 a.m. - 4:30 p.m. (ET) at the Capitol Hilton, Federal Room A, in Washington, DC. This meeting is made possible by the generous support from the Alfred P. Sloan Foundation, and the objectives of this one-day meeting will include a short opening keynote on the topic of assessment, lightning talks on related projects, brainstorming for identification of topics for discussion, and prioritizing proposed work items.

**The meeting is free for all attendees, but room capacity is limited. Please RSVP here, which will assist in planning and logistics.**

[snip]

FREE LIVESTREAM AVAILABLE: For those interested in this work, but unable to attend in-person, NISO will be live streaming this event. Credentials for login will be provided closer to the event date; please make sure to designate your attendance as "virtual" in the RSVP form so that we may be sure to communicate that information to you.

Source and Links Available At:

Monday, October 7, 2013

NISO Altmetrics Project > First In-person / [Streamed] Meeting > October 9, 2013 > San Francisco, CA

As previously announced, NISO is undertaking—with a grant from the Alfred P. Sloan Foundation—a two-phase initiative to explore, identify, and advance standards and/or best practices related to a new suite of assessment metrics for the scholarly community. The first phase of the project is intended to expose areas for potential standardization and collectively prioritize those potential projects.

The first in-person meeting in support of this work will take place on Wednesday, October 9, 2013 in San Francisco. The objectives of this one-day meeting will include a short opening keynote on the topic of assessment, lightning talks on related projects, brainstorming for identification of topics for discussion, and prioritizing proposed work items.

The meeting is free for all attendees. [snip]. In-person registration for this event closed on Friday, October 4 at 5:00 p.m. (ET). Virtual attendance registration closes on Tuesday, October 8 at 5:00 p.m. (ET).

Virtual Attendance Registration is available via 


Livestream information and link will be added to this page on the day of the event.
AGENDA
Note: All Sessions Are Pacific Time

October 9, 2013
8:30 a.m.
Welcome
Round the Room Introductions
8:45 a.m.
Introduction: Background and What We Hope to Achieve
Todd Carpenter, Executive Director, NISO
9:15 a.m.
Lightning Talks on Related Projects (5 min each)
Speakers currently signed up are:
  • Euan Adie – Uptake of altmetrics in academic publishing environments
  • Michael Habib – Expectations by researchers
  • Stefanie Haustein – Exploring disciplinary differences in the use of social media in scholarly communication
  • Gregg Gordon – Building trust into altmetrics
  • Heather Piwowar – Altmetrics for Alt-products: approaches and challenges
  • Marcus Banks – Moving beyond the PDF: data sets and visualizations as equal partners
  • Carly Strasser – Altmetrics as part of the services of a large university library system
  • William Gunn – The provenance of altmetrics readership
  • Richard Price – The role of altmetrics in Academia.edu
  • Peter Brantley – Deriving altmetrics from annotation activity

10:45 a.m.
Break
11:00 a.m.
Brainstorming: Identification of Topics for Discussion
Participation of all attendees including virtual attendees
Exercise to include noting topics of interest from attendees and posting problems/issues/gaps/challenges/themes on post-it notes, followed by collective grouping and prioritizing of themes. Initial themes will have surfaced in the open Google Doc shared with the group prior to the event.
12:00 p.m.
Lunch
1:00 p.m.
Breakout of Discussion Groups
The groups will have an open discussion of their selected topics and how they play into a future ecosystem. Depending on the topic, this could include identification of related projects, potential solutions, ongoing pilot projects, and gaps in community activity related to the theme. Each group will come up with 3-6 action items related to the topic for prioritization later.
2:00 p.m.
Reporting Out of Discussion Groups & All-Attendee Discussion of Reports
Each group will report on its discussions, highlighting necessary actions, gaps or areas where more information is needed.
3:30 p.m.
Wrap up, Meeting Adjourns
7:00 p.m.
Group Dinner

Source and Links Available At:

Sunday, October 6, 2013

ASIS Bulletin > Altmetrics: What, Why and Where?


Heather Piwowar, Guest Editor

Introduction

Altmetrics is a hot buzzword. What does it mean? What's behind the buzz? What are the risks and benefits of using alternative metrics of research impact – altmetrics – in our discovery and evaluation systems? How are altmetrics being used now, and where is the field going?

This special section of the Bulletin of the Association for Information Science and Technology focuses on these questions. Essays from seven perspectives highlight the role of altmetrics in a wide variety of settings.

The collection begins with its most general article, one I authored with my ImpactStory co-founder Jason Priem, motivating the role of altmetrics for individual scholars through "The Power of Altmetrics on a CV." The next few papers highlight ways that altmetrics may transform scholarly communication itself. Ross Mounce, a doctoral student and Panton Fellow of the Open Knowledge Foundation, explores the relationship between open access and altmetrics in "OA and Altmetrics: Distinct but Complementary." Juan Pablo Alperin, doctoral student and developer with the Public Knowledge Project, encourages us to "Ask Not What Altmetrics Can Do for You, but What Altmetrics Can Do for Developing Countries." Stacy Konkiel and Dave Scherer, librarians at Indiana University and Purdue, respectively, discuss how altmetrics can empower institutional repositories in "New Opportunities for Repositories in the Age of Altmetrics."

Completing the collection are three more perspectives from the builders of hot altmetrics tools. Jennifer Lin and Martin Fenner, both of PLOS, explore patterns in altmetrics data in "The Many Faces of Article-level Metrics." Jean Liu, blogger, and Euan Adie, founder of Altmetric.com, consider "Five Challenges in Altmetrics: A Toolmaker's Perspective." Finally, Mike Buschman and Andrea Michalek, founders of Plum Analytics, wrap up the collection asking, "Are Alternative Metrics Still Alternative?"

[snip]

We might even consider nontraditional applications of citation metrics to be altmetrics – citations to datasets as first-class research objects, for example. Other examples include citation counts filtered by type of citation, like citations by editorials or citations only from review articles or citations made only in the context of experimental replication. All of these are alternative indicators of impact.

Altmetrics offer four potential advantages:

  • A more nuanced understanding of impact, showing us which scholarly products are read, discussed, saved and recommended as well as cited.
  • Often more timely data, showing evidence of impact in days instead of years.
  • A window on the impact of web-native scholarly products like datasets, software, blog posts, videos and more.
  • Indications of impacts on diverse audiences including scholars but also practitioners, clinicians, educators and the general public.

Of course, these indicators may not be “alternative” for long. At that point, hopefully we’ll all just call them metrics.

[snip]

Source and Links Available At:

[http://www.asis.org/Bulletin/Apr-13/AprMay13_Piwowar.html]

Monday, September 23, 2013

Altmetrics: Present and Future (SIG/MET) > ASIS&T 2013 Annual Meeting Montréal, Québec, Canada > November 1-5, 2013 > 1:30 PM (EST)

  • Dr. Cassidy Sugimoto, Indiana University Bloomington
  • Judit Bar-Ilan, Bar-Ilan University
  • William Gunn, Mendeley
  • Stefanie Haustein, Université de Montréal
  • Stacy Konkiel, Indiana University Bloomington
  • Vincent Larivière, Université de Montréal
  • Jennifer Lin, Public Library of Science
Summary

Scholars are increasingly incorporating social media tools like blogs, Twitter and Mendeley into their professional communications. Altmetrics tracks usage of these and similar tools to measure scholarly influence on the social web. Altmetrics researchers and practitioners have amassed a growing body of literature and working tools to gather and analyze altmetrics, and there is growing interest in this emerging subfield of scientometrics. Panelists will present results demonstrating the utility of alternative metrics from a variety of stakeholders: researchers, librarians, publishers and those participating in academic social media sites.

Thanks to Jose Kruse /

Source and Links Available At

[https://www.asis.org/asist2013/abstracts/panels/23.html]

ASIS > SIG MET

SIG/MET is the Special Interest Group for the measurement of information production and use. It encourages the development and networking of all those interested in the measurement of information. It encompasses not only bibliometrics, scientometrics and informetrics, but also measurement of the Web and the Internet, applications running on these platforms, and metrics related to network analysis, visualization, scholarly communication and the design and operation of Information Retrieval Systems. SIG/MET will facilitate activities to encourage the promotion, research and application of metrics topics. Academicians, practitioners, commercial providers, government representatives, and any other interested persons are welcome. 

Source

Sunday, September 22, 2013

A/V + Available > NISO Webinar: Beyond Publish or Perish: Alternative Metrics for Scholarship

NISO How the information world CONNECTS
November 14, 2012 / 1:00 - 2:30 p.m. (Eastern Time)

[snip]

About the Webinar

Increasingly, many aspects of scholarly communication—particularly publication, research data, and peer review—undergo scrutiny by researchers and scholars. Many of these practitioners are engaging in a variety of ways with Alternative Metrics (#altmetrics in the Twitterverse). Alternative Metrics take many forms but often focus on efforts to move beyond proprietary bibliometrics and traditional forms of peer referencing in assessing the quality and scholarly impact of published work. Join NISO for a webinar that will present several emerging aspects of Alternative Metrics.

Agenda

Introduction

Todd Carpenter, Executive Director at NISO

[snip]

Article-Level Metrics at PLOS

Martin Fenner, Technical Lead, PLOS Article-Level Metrics project

Article-Level Metrics have become an exciting new opportunity for publishers, funders, universities, and researchers. The publisher Public Library of Science (PLOS) began collecting and displaying citations, usage data, and social web activity for all of its articles in 2009. The webinar will discuss the opportunities (and challenges) of Article-Level Metrics, from issues in collecting data to interesting results of data analysis.
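
As an illustration of what consuming such article-level metrics data can look like in practice, here is a minimal Python sketch that requests metrics for a single DOI from a JSON web service. The endpoint URL, parameter names, and response handling shown are assumptions for illustration only; consult the PLOS ALM API documentation for the actual interface and for obtaining an API key.

# Minimal sketch: fetching article-level metrics for one DOI.
# NOTE: the endpoint, parameters, and response shape below are assumptions
# for illustration; check the PLOS ALM API documentation for the real interface.
import requests

ALM_ENDPOINT = "http://alm.plos.org/api/v5/articles"  # assumed endpoint
API_KEY = "YOUR_API_KEY"                               # hypothetical key

def fetch_alm(doi):
    """Request metrics (citations, usage, social web activity) for a DOI."""
    params = {"ids": doi, "api_key": API_KEY, "info": "summary"}
    response = requests.get(ALM_ENDPOINT, params=params, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = fetch_alm("10.1371/journal.pmed.0020124")
    print(data)  # inspect the returned sources and counts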

Total-Impact and other altmetrics initiatives

Jason Priem, Ph.D. Student, Co-Principal Investigator, Impact Story

Altmetrics helps us track diverse scholarly impacts by looking in new places for evidence: public places like Wikipedia and Twitter, and scholarly environments like Mendeley and Faculty of 1000. Doing this lets us promote and reward new forms of Web-native scholarship in two ways. Broader measures of impact help us move

  • beyond the article: we can value the increasingly important and powerful new genres of scholarly products like blog posts, software, and datasets, and
  • beyond the impact factor: we can value the observed impact of scholarly products themselves, across lots of different audiences and use types, rather than just awarding the prestige of where they're published.

That said, altmetrics can be tricky to gather and understand. We'll discuss tools and frameworks to help turn rich but dense altmetrics data into data-supported stories that can help inform important conversations about what it means to make a scholarly impact.

Unconventional Scholarly Communications

Aalam Wassef, Founder of Peer Evaluation

Participate in Aalam's survey on social networks at https://www.surveymonkey.com/s/VNZSNRZ 

Scholars are blogging, microblogging, searching, sharing primary data, collaborating, discussing, rating, bookmarking articles in public folders, recommending links over public networks, offering live coverage of events and receiving badges, views, likes or mentions for all they do online and elsewhere. More than ever, scholars are communicating and getting credit for it, with no limitations as to style, format or environment, enjoying high levels of engagement and responsiveness from their peers.

  • How are all other parties concerned (librarians, public funders, policy makers, publishers, universities, research centers) absorbing, supporting or rejecting all of the above?
  • Could “unconventional” communications and alternative metrics eventually be as valued as peer-reviewed articles and proprietary bibliometrics? How many of these altmetrics are truly accessible free of charge, and what would be the alternatives to potential limitations?
  • What is the current perception of direct publishing and open peer review, whether by individuals, groups or institutions? What are the risks and opportunities for the production of high quality research?

Event Q&A

[more]

Source and Links Available At:

NISO > Altmetrics Steering Group

NISO How the information world CONNECTS
  • Wed, 11 Sep 2013 > MP3 recording - Altmetrics Steering Group call - September 10, 2013 (6MB)
  • Thu, 05 Sep 2013 > MP3 recording - Altmetrics Steering Group call - September 4, 2013 (5MB)
  • Tue, 03 Sep 2013 > MP3 recording - Altmetrics Steering Group call - August 30, 2013 (7MB)
  • Thu, 29 Aug 2013 > Draft Agenda NISO Altmetrics Workshop 2013-10-09.docx (102K)
[more]

Source and Links Available At  

Information Standards Quarterly (ISQ) > Summer 2013 > Volume 25, no. 2 > Topic: Altmetrics

Table of Contents

Letter from the Guest Content Editor: Altmetrics Have Come of Age
by Martin Fenner

FEATURES

Consuming Article-Level Metrics: Observations and Lessons
by Scott Chamberlain

Institutional Altmetrics and Academic Libraries
by Robin Chin Roemer and Rachel Borchardt

IN PRACTICE

Altmetrics in Evolution: Defining & Redefining the Ontology of Article-Level Metrics
by Jennifer Lin and Martin Fenner

Exploring the Boundaries: How Altmetrics Can Expand Our Vision of Scholarly Communication and Social Impact
by Mike Taylor

Social Signals Reflect Academic Impact: What it Means When a Scholar Adds a Paper to Mendeley
by William Gunn

[more]

Source and Links to Full Text Available At:

[http://www.niso.org/publications/isq/2013/v25no2/]

NISO to Develop Standards and Recommended Practices for Altmetrics

NISO How the information world CONNECTS

Grant from Sloan Foundation will fund community-informed effort to standardize collection and use of alternative metrics measuring research impact

Baltimore, MD - June 20, 2013 - The National Information Standards Organization (NISO) announces a new two-phase project to study, propose, and develop community-based standards or recommended practices in the field of alternative metrics. Assessment of scholarship is a critical component of the research process, impacting everything from which projects get funded to who gains promotion and tenure to which publications gain prominence. Since Eugene Garfield's pioneering work in the 1960s, much of the work on research assessment has been based upon citations, a valuable measure but one that has failed to keep pace with online reader behavior, network interactions with content, social media, and online content management. Exemplified by innovative new platforms like ImpactStory, a new movement is growing to develop more robust alternative metrics—called altmetrics—that complement traditional citation metrics. NISO will first hold several in-person and virtual meetings to identify critical areas where altmetrics standards or recommended practices are needed and then convene a working group to develop consensus standards and/or recommended practices. The project is funded through a $207,500 grant from the Alfred P. Sloan Foundation.

"Citation analysis lacks ways to measure the newer and more prevalent ways that articles generate impact such as through social networking tools like Twitter, Facebook, or blogs," explains Nettie Lagace, NISO's Associate Director for Programs. "Additionally, new forms of scholarly outputs, such as datasets, software tools, algorithms, or molecular structures are now commonplace, but they are not easily—if at all—assessed by traditional citation metrics. These are two among the many concerns the growing movement around altmetrics is trying to address."

"For altmetrics to move out of its current pilot and proof-of-concept phase, the community must begin coalescing around a suite of commonly understood definitions, calculations, and data sharing practices," states Todd Carpenter, NISO Executive Director. "Organizations and researchers wanting to apply these metrics need to adequately understand them, ensure their consistent application and meaning across the community, and have methods for auditing their accuracy. We must agree on what gets measured, what the criteria are for assessing the quality of the measures, at what granularity these metrics are compiled and analyzed, how long a period the altmetrics should cover, the role of social media in altmetrics, the technical infrastructure necessary to exchange this data, and which new altmetrics will prove most valuable. The creation of altmetrics standards and best practices will facilitate the community trust in altmetrics, which will be a requirement for any broad-based acceptance, and will ensure that these altmetrics can be accurately compared and exchanged across publishers and platforms."

[more]

Source and Full Text Available At

[http://www.niso.org/news/pr/view?item_key=72efc1097d4caf7b7b5bdf9c54a165818399ec86]

Saturday, September 21, 2013

JISC Report > Access to Citation Data: Cost-benefit and Risk Review and Forward Look

1 Introduction
1.1 Aim, scope and focus of the study

Aim

1.1.1 The overarching aim of the report is to explore and suggest practical directions and actions to move toward more cost-effective creation, dissemination and exploitation of citation data in the context of current and potential future usage scenarios. A further aim is to propose the roles that Jisc and others might play in this system in future.

Scope and focus

1.1.2 The general scope of this work is the creation and exploitation of citation data derived from peer-reviewed academic research articles. However, the citation of datasets is specifically excluded. While bibliographic metadata associated with the referencing and referenced outputs is clearly relevant, it is not the focus of the work. Approaches to exploitation of citation data are also not the focus of the review, except insofar as this might increase or change the demand for different types of citation data.

1.2 Study approach

1.2.1 The study approach has been informed by the JISC invitation to tender (JISC Executive, 2012) and was conducted in three phases. In the first phase, desk research and an intensive series of interviews were undertaken with key stakeholders, including publishers, citation data providers, citation data users, and research funders, to understand the strategic drivers. The information gathered was used to identify possible outline usage scenarios and develop a business model framework together with an initial view of the pros and cons of each scenario.

1.2.2 In the second phase, an agreed set of usage scenarios and business models were developed and refined in consultation with users of citation data, publishers and citation data providers. In addition, a DevCSI developers’ workshop or Hack Day was held on 27 September 2012 [6] to explore usage of citation data through short trials or pilots using real citation data. This brought together a group of domain experts, users and developers to explore ideas related to potential real world uses of citation data and to prototype potential solutions. The group investigated aspects of citation data and its use, including the properties of sparse networks of data and new ways to visualise citation data. The event provided some interesting perspectives on the use of citation data, and in particular supported the design of the ‘open’ processes.
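
As a rough sketch of the kind of exploration described above (not part of the JISC study itself), the snippet below loads citing/cited identifier pairs into a directed graph and reports a few simple properties of the resulting sparse citation network. The input file name, its column layout, and the use of the networkx library are all illustrative assumptions.

# Rough sketch: exploring a sparse citation network from a list of
# (citing, cited) identifier pairs. The CSV layout and file name are
# hypothetical; networkx is one common choice for this kind of analysis.
import csv
import networkx as nx

def load_citation_graph(path):
    """Build a directed graph with an edge from each citing item to each cited item."""
    graph = nx.DiGraph()
    with open(path, newline="") as handle:
        for citing, cited in csv.reader(handle):
            graph.add_edge(citing, cited)
    return graph

if __name__ == "__main__":
    g = load_citation_graph("citations.csv")   # hypothetical input file
    print("articles:", g.number_of_nodes())
    print("citation links:", g.number_of_edges())
    print("density:", nx.density(g))           # sparse networks yield small values
    # Most-cited articles (highest in-degree):
    top = sorted(g.in_degree(), key=lambda item: item[1], reverse=True)[:5]
    print("most cited:", top)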

1.2.3 In the final phase, the results of the second phase have been used to develop options for a viable practical direction and set of actions for taking the use of citation data forward. It is planned to seek feedback and agreement from stakeholders at a final stakeholder meeting.

Source and Full Text Available At:

[http://repository.jisc.ac.uk/5371/1/Access-to-Citation-data-report-final.pdf]

Wednesday, September 18, 2013

S&TL > Introduction to Altmetrics for Science, Technology, Engineering, and Mathematics (STEM) Librarians

Science & Technology Libraries

Linda M. Galloway, Janet L. Pease & Anne E. Rauh
Published online: 12 Sep 2013 / DOI:10.1080/0194262X.2013.829762

ABSTRACT

Quantifying scholarly output via citation metrics is the time-honored method to gauge academic success. Altmetrics, or alternative citation metrics, provide researchers and scholars with new ways to track influence across evolving modes of scholarly communication. This article will give librarians an overview of new trends in measuring scholarly influence, introduce them to altmetrics tools, and encourage them to engage with researchers in discussion of these new metrics.

  • INTRODUCTION
  • WHAT ARE ALTMETRICS?
  • TRADITIONAL TOOLS
  • ALTMETRICS TOOLS
  • CONNECTING SCHOLARSHIP WITH...
  • ENGAGING CONSTITUENTS
  • LIMITATIONS
  • CONCLUSION

Paying attention to and collecting alternative metrics about research products will vary according to one's field and scholarly community. Authors should be encouraged to explore and engage with social media tools already in use in their disciplines and be mindful of emerging tools. Scholars are beginning to go “beyond the paper” and engage with their colleagues via Twitter, blogs, and reference managers (Priem 2013). These types of interactions will continue to increase, and those who remain unengaged will likely be left out of important discussions. Increasingly, it is important to not only read the newest journal article, but to follow the chatter about the research in social media platforms. Reluctant social media adopters may be encouraged to engage once they understand that it is perfectly acceptable to simply read or observe, rather than post or tweet.

Awareness of new metric tools and how they relate to social media is important knowledge for producers of scholarly output. These tools complement existing readership, promote work to new readers, and measure outputs in concert with traditional scholarly metrics. As a complement to traditional citation metrics, altmetrics can provide a more rapid assessment and arguably a more complete picture of an individual's scholarly influence. Altmetrics tools can also help illustrate the value of scholarly output beyond publications.

Tracking the relevance and significance of these research products requires knowledge of the practices within a discipline and the foresight to predict what may be important to track in the future. While altmetrics can help researchers by vetting, organizing, and adding value to information products retrieved, it is essential to contextualize these data. Information professionals, with knowledge of both traditional and emerging scholarly metrics, are able to bridge the divide between these forms of scholarly engagement.

>>> Thanks to Lorrie Pellack for the HeadUp ! <<<

Source and Full Text Available At

[http://www.tandfonline.com/doi/full/10.1080/0194262X.2013.829762#.UjoSQsash8E]

Open Access Version Not Currently Available [09-18-13] / Subscribers and Pay-Per-View Only

Tuesday, September 17, 2013

Article-Level Metrics Workshop 2013 / Thursday, October 10, 2013 at 8:30 AM - Friday, October 11, 2013 at 4:00 PM (PDT) San Francisco, CA

Event Details

As article-level metrics (ALM) come of age, the question is no longer whether we need them, but rather how we implement them. Building upon the successful ALM Workshop in November 2012, PLOS invites you to the second annual ALM Workshop 2013 on October 10-12, 2013 in San Francisco.

The preliminary program is [now] ... posted ... . Check out the lineup of speakers and presentations in store for the event, representing researchers, funders, academic administrators, and technology providers.

[http://article-level-metrics.plos.org/alm-workshop-2013-preliminary-program/]

This year, we will move the community conversations beyond the basics to focus on success stories and challenges encountered, latest analyses on the growing data corpus, as well as deep dives into the technical details behind it all. The format in the first two days will consist of a series of talks and panel discussions with ample mingling time to encourage in-depth sharing. For the third day of the workshop, we will organize a data challenge (data hackathon), giving participants the opportunity to do data analysis and data visualization on a variety of ALM datasets from different sources.

*** FREE Registration ***

Please register for the data challenge [at]

[http://almdatachallenge.eventbrite.com/]  
                                    
Source and Links Available At:

[http://almworkshop13.eventbrite.com/]

A/V Now Available > 09-19-13 > Science > Live Chat: Should We Ditch Journal Impact Factor? > September 19, 2013 > 3 PM (ET)

Chat Guests. (L) Sandra Schmid is the Cecil H. Green Distinguished Chair in Cellular and Molecular Biology at the University of Texas Southwestern Medical Center. (C) Heather Piwowar is a postdoctoral researcher at Duke University who works remotely from Vancouver, Canada, and who primarily studies the way bibliometric factors and credit attribution affect scientists. (R) Mike Price is a staff writer for Science and the chat moderator.

[The live video chat will begin at 3 p.m. Eastern on Thursday 19 September. Please leave your questions for the guests in the comment box below and check back just before it begins to join the chat.]

The journal impact factor was designed to help librarians decide which journals to subscribe to and was never intended as a measuring stick for the value of a scientist’s research, as it is sometimes used today. Now, there has been a push to reexamine the importance that tenure committees and journal reviewers assign to journal impact factors.

Earlier this year, a group of concerned scientists and journal publishers signed an open letter known as the San Francisco Declaration on Research Assessment (DORA) to encourage review boards and tenure committees to “eliminate the use of journal-based metrics, such as Journal Impact Factors, in funding, appointment, and promotion considerations,” and to encourage the development of alternative metrics (altmetrics) to measure a scientist’s research contributions.

Join Heather Piwowar of Duke University, an expert in bibliometric factors and credit attribution, and DORA signatory Sandra Schmid of the University of Texas Southwestern Medical Center on Thursday, 19 September, at 3 p.m. EDT ... .

Source, Link, and A/V Available At:

Friday, April 19, 2013

The Article of the Future Is Now Live!



As a result of the Article of the Future project innovations, we are now able to announce the redesigned SciVerse ScienceDirect article page, with a new layout that includes a navigational pane and an optimized middle reading pane.

The Article of the Future project is an ongoing initiative aiming to revolutionize the traditional format of the academic paper with regard to three key elements: presentation, content, and context.

About the Article of the Future

Elsevier invests in platform innovation, bringing together solutions like SciVerse ScienceDirect, SciVerse Scopus and web/third party content into one point of access: SciVerse. Now, through the Article of the Future project, Elsevier is redefining the article and associated article page on SciVerse ScienceDirect to allow for an optimal exchange of formal scientific research between scientists.

  • The Article of the Future project is our never-ending quest to explore better ways to create and deliver the formal published record.
  • The Article of the Future format makes Elsevier journals on SciVerse ScienceDirect the best possible place to expose and explore research
  • Developed with 150 researchers
  • Redesigned article presentation for excellent on-line readability and seamless navigation
  • Discipline-specific content, format, and tools adjusted to the author and user needs and workflow
  • Enriched article content with features such as the Protein Viewer, Genome Viewer and Google Maps
  • Enables authors to put their article in the context of other research such as Genbank and Protein Data Bank

Sources Available At 

[http://www.articleofthefuture.com]

Thursday, April 18, 2013

New SPARC Community Resource on Article-Level Metrics



Greg Tananbaum / April 16, 2013

Today, SPARC released a new community resource, Article-Level Metrics - A SPARC Primer, delving into Article-Level Metrics (ALMs), an emerging hot topic in the scholarly publishing arena. ALMs are rapidly emerging as important tools to quantify how individual articles are being discussed, shared, and used. This new SPARC primer is designed to give campus leaders and other interested parties an overview of what ALMs are, why they matter, how they complement established utilities and metrics, and how they might be considered for use in the tenure and promotion process.

While Article-Level Metrics are not inherently part of the open access movement, they are tools that can be applied in a variety of ways that are of interest to SPARC and its constituents.  The community can develop, distribute, and build upon ALM tools in a manner that opens up impact metrics as never before.   These community efforts are transparent in the methodologies they use to track impact, as well as the technologies behind the processes.  In this manner, ALMs dovetail with not just SPARC's push for open access but various other “open” movements – open science, open data, and open source chief among them.  ALMs that are free to use, modify, and distribute contribute to a world in which information is more easily shared and in which the pace of research and development is accelerated as a consequence.

Source and Link Available At 

Thursday, March 28, 2013

The Future of Publishing > _Nature_ Special Issue



After nearly 400 years in the slow-moving world of print, the scientific publishing industry is suddenly being thrust into a fast-paced online world of cloud computing, crowd sourcing and ubiquitous sharing. Long-established practices are being challenged by new ones – most notably, the open-access, author-pays publishing model. In this special issue, Nature takes a close look at the forces now at work in scientific publishing, and how they may play out over the coming decades.

How scientists share and reuse information is driven by technology but shaped by discipline.
Nature ( 28 March 2013 )

NEWS

Sham journals scam authors
Con artists are stealing the identities of real journals to cheat scientists out of publishing fees.
Nature ( 28 March 2013 )

NEWS FEATURES

The true cost of science publishing
Cheap open-access journals raise questions about the value publishers add for their money.
Nature ( 28 March 2013 )

The library reboot
As scientific publishing moves to embrace open data, libraries and researchers are trying to keep up.
Nature ( 28 March 2013 )

The dark side of publishing
The explosion in open-access publishing has fuelled the rise of questionable operators.
Nature ( 28 March 2013 )

COMMENT

Beyond the paper
The journal and article are being superseded by algorithms that filter, rate and disseminate scholarship as it happens, argues Jason Priem.
Nature ( 28 March 2013 )

A fool's errand
Objections to the Creative Commons attribution licence are straw men raised by parties who want open access to be as closed as possible, warns John Wilbanks.
Nature ( 28 March 2013 )

How to hasten open access
Three advocates for a universally free scholarly literature give their prescriptions for the movement’s next push, from findability to translations.
Nature ( 28 March 2013 )

BOOKS AND ARTS

Q&A: Knowledge liberator
Robert Darnton heads the world's largest collection of academic publications, the Harvard University Library system. He is also a driver behind the new Digital Public Library of America. Ahead of its launch in April, he talks about Google, science journals and the open-access debate.
Nature ( 28 March 2013 )

CAREERS

Open to possibilities
Opting for open access means considering costs, journal prestige and career implications.
Nature ( 28 March 2013 )

Source and Access to Full Text Available At 

Wednesday, January 2, 2013

A/V Now Available > FREE Webcast > Individual and Scholarly Networks > A Two-Part Seminar on Building Networks and Evaluating Network Relationships > January 22, 2013

Collaborative platforms and social networking websites are becoming popular with scientists and researchers around the world: scholars can connect between institutions, countries and disciplines more easily, faster and better than ever before. "The Individual and Scholarly Networks" will explore two aspects of this phenomenon: first, how the connections are forming and how attitudes may change to adapt to the new environment, and, second, how connections can be evaluated, nuanced and measured.

The seminar will take place on Tuesday, January 22, 2013, and will be webcast live from New York, Amsterdam and Oxford. It will be split into two segments:

Part 1: Building Networks | 8:00-10:00 EST / 13:00-15:00 GMT

This session will focus on the ways in which these relationships are formed and maintained, and how they are changing the nature of scholarly relationships.

Part 2: Evaluating Network Relationships | 10:30-12:30 EST / 15:30-17:30 GMT

Altmetrics is one of the most explosive areas of interest in bibliometric analysis and is increasing in importance. This session will explore the related areas of altmetrics, contributorship and the culture of reference.

SPEAKERS

Dr William Gunn, Head of Academic Outreach, Mendeley

http://www.mendeley.com/profiles/william-gunn/ 

Professor Jeremy Frey, Head of Physical Chemistry, Southampton University

http://www.southampton.ac.uk/~jgf/Frey/Home.html

Dr Heather Piwowar, Postdoc at Duke University, ImpactStory

http://www.researchremix.org/wordpress/

Gregg Gordon, President and CEO, Social Science Research Network

http://ssrn.com/

Dr Gudmundur Thorisson, Research Associate, University of Leicester

http://gthorisson.name/

Kelli Barr, Graduate Research Assistant, Center for Study of Interdisciplinarity, University of North Texas

http://www.csid.unt.edu/about/People/barr.html

SEATS ARE LIMITED > REGISTRATION REQUIRED

Link to A/V Available At 

[http://www.researchtrends.com/virtualseminar/]

Comparison Report

How virtual science communities are transforming academic research

[http://elsevierconnect.com/how-virtual-science-communities-are-transforming-academic-research/]