Altmetric
Altmetric, or altmetric.com, is a data science company that tracks where published research is mentioned online and provides tools and services to institutions, publishers, researchers, funders and other organisations to monitor this activity, commonly referred to as altmetrics.[1][2][3] Altmetric was recognized by European Commissioner Máire Geoghegan-Quinn in 2014 as a company challenging traditional reputation systems.[4]
Key Information
Altmetric is a portfolio company of Digital Science,[5][6] which is owned by Holtzbrinck Publishing Group.[7]
History
Altmetric was founded by Euan Adie[8][9] in 2011.[10] Previously a researcher, Adie had already worked on Postgenomic.com, an open-source scientific blog aggregator[11] founded in 2006. In 2011, Adie entered an altmetrics app into Elsevier's Apps for Science competition and won.[12][13] The prize money enabled Altmetric to develop a full version of the Altmetric Explorer, released in February 2012.[14]
In July 2012, Altmetric took on additional investment from Digital Science,[1] before being fully acquired in 2016. Altmetric is still part of the Digital Science group today, with offices in London and in Germany, the United States and Australia.
In 2019 Altmetric and Nature received funding from the Google Digital News Innovation Fund to "build a novel tool for measuring the impact of journalism".[15]
Concept
First coined in the altmetrics manifesto in 2010,[9][16] the term altmetrics (also known as 'alternative metrics') describes metrics developed to give authors and other stakeholders a more comprehensive record of engagement with scholarly work, particularly engagement that takes place beyond the academy among a broader audience. To do this, Altmetric tracks a range of online sites and sources for 'mentions' (links or written references) of scholarly outputs, which include journal articles, blogs, data sets and more.[17] Sources of attention include mainstream media, public policy documents, social and academic networks, post-publication peer-review forums and, more recently, Wikipedia and the Open Syllabus Project.[18]
The data are tracked in real-time and collated in the Altmetric details pages, which provide a clickable summary of all of the online attention relating to a single research output.
The Altmetric Attention Score and Donut Badge
Altmetric employs an algorithm to assign each item an automatically calculated score. Based on the volume and sources of attention an item has received, the score is intended to reflect the reach or popularity of the research output. A multicolored 'donut' visualization is also generated to summarize the sources of attention an item has received (red for news, light blue for Twitter, etc.).[19][20] Altmetric makes the data available via the Altmetric Bookmarklet (a browser plugin), the Explorer platform (a cloud-hosted database), and an API.
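As an illustration of programmatic access, the short Python sketch below queries Altmetric's free Details Page API for a single DOI. It is a minimal sketch only: it assumes the public `https://api.altmetric.com/v1/doi/` endpoint and a JSON response containing `title` and `score` fields, and any production integration should follow the current API documentation and rate limits.

```python
import requests

def fetch_altmetric_summary(doi: str) -> dict | None:
    """Fetch the Altmetric attention summary for a DOI.

    Assumes the public v1 Details Page API; returns None when
    Altmetric has recorded no attention for the output (HTTP 404).
    """
    url = f"https://api.altmetric.com/v1/doi/{doi}"
    response = requests.get(url, timeout=10)
    if response.status_code == 404:
        return None  # no tracked mentions for this DOI
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    # Example DOI taken from the reference list below.
    summary = fetch_altmetric_summary("10.1038/493159a")
    if summary:
        # 'title' and 'score' are fields returned by the v1 API
        # (assumption: field names unchanged in the current version).
        print(summary.get("title"), "- Attention Score:", summary.get("score"))
    else:
        print("No attention data recorded for this DOI.")
```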

The sources include academic citations and academic platforms, patents, posts on social media and web forums, Wikipedia articles, and readers on Mendeley.[21]
Many publishers, including John Wiley & Sons, Taylor & Francis, The JAMA Network and Springer Nature, embed the Altmetric 'Donut' badges into their journal article and book pages to show the Altmetric score for individual items from within the publisher platform.[22][23][24][25] At least one website, OOIR, is built specifically to showcase scientific trends based on Altmetric Attention Scores.[26] Although there is no clear link between altmetric scores and societal impact, the scores can be used to predict future citation impact,[27] and may be a target for manipulation.[28]
See also
References
- ^ a b Piwowar, Heather (9 January 2013). "AOP". Nature. 493 (7431): 159. Bibcode:2013Natur.493..159P. doi:10.1038/493159a. PMID 23302843. S2CID 205075867.
- ^ Shema, Hadas. "Thoughts about altmetrics (an unorganized, overdue post)". Scientific American Blog Network. Retrieved 25 February 2017.
- ^ Clark, Liat (2014-08-13). "How 'Google Science' could transform academic publishing". Wired UK. Retrieved 25 February 2017.
- ^ Geoghegan-Quinn, Máire. "EuroScience Open Forum (ESOF) Keynote Speech: "Science 2.0: Europe can lead the next scientific transformation"". europa.eu. European Commission. Retrieved 25 February 2017.
- ^ "Home - Digital Science". Digital Science. Retrieved 2017-02-25.
- ^ Coghill, Jeffrey G.; Russell, Roger G., eds. (2016). Developing Librarian Competencies for the Digital Age. Rowman & Littlefield. p. 9. ISBN 9781442264458.
- ^ Carpenter, Caroline (2015-05-06). "Completed merger forms 'Springer Nature'". The Bookseller. Retrieved 10 February 2017.
- ^ Stuart, David (2015-06-01). "Research Information". Retrieved 10 February 2017.
- ^ a b Luther, Judy (2012-07-25). "Altmetrics – Trying to Fill the Gap". The Scholarly Kitchen. Society for Scholarly Publishing. Retrieved 10 February 2017.
- ^ Hicks, Diana; Wouters, Paul; Waltman, Ludo; de Rijcke, Sarah; Rafols, Ismael (22 April 2015). "Bibliometrics: The Leiden Manifesto for research metrics". Nature. 520 (7548): 429–431. Bibcode:2015Natur.520..429H. doi:10.1038/520429a. hdl:10261/132304. PMID 25903611.
- ^ McIntosh, Joyce, ed. (2016). Library and Information Science: Parameters and Perspectives. CRC Press. p. 33. ISBN 9781466562028.
- ^ "Altmetric | Apps for Science". Apps for Science. Retrieved 2017-02-09.
- ^ Elsevier. "Elsevier Announces Winners of "Apps for Science" Challenge". www.prnewswire.com (Press release). Retrieved 2017-02-09.
- ^ "Explorer for Publishers". Altmetric. 2015-07-09. Retrieved 2017-02-09.
- ^ "Altmetric and Nature awarded funding from the Google Digital News Innovation Fund – Altmetric". Retrieved 2022-02-21.
- ^ "altmetrics: a manifesto – altmetrics.org". altmetrics.org. Retrieved 2017-02-09.
- ^ "How it works". Altmetric. 2015-07-09. Archived from the original on 2020-02-12. Retrieved 2017-02-09.
- ^ "Our sources". Altmetric. 2015-07-09. Retrieved 2017-02-09.
- ^ "The donut and Altmetric Attention Score". Altmetric. 2015-07-09. Retrieved 2017-02-09.
- ^ Sheffield, University of. "Altmetric donuts - Altmetric - Research Information Systems - Research & Innovation Services - The University of Sheffield". www.sheffield.ac.uk. Archived from the original on 2017-03-08. Retrieved 2017-02-10.
- ^ "Sources of Attention: Altmetric track a unique range of online sources to capture the conversations relating to research outputs". 2015-07-09.
- ^ "Altmetrics". Retrieved 2017-02-10.
- ^ "Author Services Measuring impact with article metrics". authorservices.taylorandfrancis.com. 2015-05-25. Retrieved 2017-02-10.
- ^ Network, The JAMA. "About Altmetrics on The JAMA Network". sites.jamanetwork.com. Retrieved 2017-02-10.
- ^ "Springer now sharing data from Altmetric on SpringerLink". springer.com. Retrieved 2017-02-10.
- ^ "About OOIR". Retrieved 2023-03-14.
- ^ Thelwall, Mike; Nevill, Tamara (2018). "Could scientists use Altmetric.com scores to predict longer term citation counts?". Journal of Informetrics. 12 (1): 237–248. arXiv:1801.10311. doi:10.1016/j.joi.2018.01.008. S2CID 4623584.
- ^ Wien, Charlotte; Deutz, Daniella B. (2019-06-21). "What's in a tweet? Creating Social Media Echo Chambers to inflate 'the donut'". LIBER Quarterly. 29: 3. doi:10.18352/lq.10289.
Background and History
Founding of Altmetric
Altmetric was founded in 2011 by Euan Adie, a former bioinformatics researcher and senior product manager at Nature Publishing Group.[9][10] Prior to launching Altmetric, Adie had developed Postgenomic.com, an open-source platform that aggregated content from scientific blogs to facilitate online discussion in the research community, between 2005 and 2009.[10][11] Adie's motivation for founding Altmetric stemmed from his personal experience as a researcher and author: he observed limitations in traditional metrics like journal impact factors and citation counts, which often failed to capture the broader, real-time influence of scholarly work, particularly in fields less suited to high-impact publications.[12] He sought to address the growing need to track non-traditional impacts of research, such as mentions in online conversations, social media, and other digital platforms, thereby providing a more comprehensive view of scholarly attention beyond citations.[12][13]

In July 2012, Altmetric received investment from Digital Science, a technology company focused on research data solutions, and became part of its portfolio, with the aim of accelerating the development of data-driven insights for the scholarly ecosystem.[13][10] Key early milestones included the release of the full version of the Altmetric Explorer in February 2012, an interactive tool for visualizing research attention, and the launch of the company's first API later that year, enabling publishers and institutions to embed altmetrics data into their workflows.[10] By 2013, Altmetric had achieved significant early adoption through integrations with major publishers, including Nature Publishing Group, allowing attention metrics to be incorporated directly into article pages and enhancing the visibility of online engagement for researchers and readers.[14][9] These developments marked Altmetric's transition from a startup project to a foundational service in tracking the diverse impacts of scholarly outputs.

In 2016, Digital Science fully acquired Altmetric. Adie served as CEO until 2017, after which he left to found Overton, a company focused on tracking policy-related research impacts.[15]

Development of the Altmetrics Concept
The term "altmetrics" was coined in 2010 by Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon in their manifesto, which proposed tracking scholarly impact through social media and online mentions as a complement to traditional citation counts.[16] This document emphasized the potential of platforms like Twitter (now X), blogs, and reference managers to capture diverse forms of influence, such as public engagement and rapid dissemination, that formal publications often overlook.[17] The development of altmetrics emerged as a direct response to the limitations of established bibliometric indicators, including the h-index, which rely heavily on peer-reviewed citations and thus delay assessment of impact by years while ignoring broader societal reach.[16] Proponents argued that citation-based metrics, while valuable for academic validation, fail to account for immediate feedback loops in digital ecosystems, where scholars increasingly share and discuss work on social platforms—over one-third of researchers used Twitter by 2010, for instance.[17] This shift highlighted the need for faster, more inclusive measures to reflect the evolving nature of scholarly communication in a web-connected environment. Early academic contributions further solidified altmetrics' theoretical foundation, notably in a 2013 article by Heather Piwowar and Jason Priem published in the Bulletin of the Association for Information Science and Technology.[18] The paper outlined altmetrics' role in enabling real-time impact evaluation, demonstrating how online signals could supplement CVs and assessments by quantifying attention from non-traditional sources like news outlets and forums, thus providing a more holistic view of research influence.[18] The altmetrics concept quickly transitioned toward practical implementation, inspiring initial prototypes in 2011 that tracked online mentions, such as ReaderMeter, which aggregated references from blogs and social media to scholarly articles.[16] These early tools, developed amid workshops like altmetrics11, paved the way for commercial applications, including the operationalization seen in services like Altmetric.[19]Core Principles
Definition and Scope of Altmetrics
Altmetrics, short for alternative metrics, encompass the measurement of scholarly impact through online activity and engagement beyond traditional academic citations. Coined in 2010 by Jason Priem, the term refers to metrics that track the broad, rapid influence of research outputs across digital platforms, including social media mentions, news coverage, policy citations, and blog discussions.[20] Unlike bibliometrics, which focus primarily on peer-reviewed citations accruing over years, altmetrics capture immediate and diverse forms of attention that reflect real-world dissemination and uptake.[21]

The scope of altmetrics is expansive, applying to various research outputs such as journal articles, datasets, books, software, and non-traditional research objects (NTROs) like performances, digital exhibits, and policy reports.[22][23] This breadth allows for the assessment of both quantitative signals, such as the volume of shares or downloads, and qualitative indicators of societal impact, including how research shapes public opinion, informs policy, or influences non-academic communities.[21] By prioritizing these multifaceted traces, altmetrics provide a more holistic view of research value in an interconnected digital ecosystem.

Central principles guiding altmetrics include timeliness, which facilitates real-time monitoring of engagement as it occurs; diversity, encompassing signals from varied sources like social networks, reference managers, and encyclopedias to avoid overreliance on any single channel; and openness, advocating for transparent, publicly available data to support equitable evaluation practices.[24][21] This framework emerged as a response to the limitations of citation-based systems dominant in the 2000s, amid the explosive growth of scholarly output that overwhelmed traditional filters. Altmetrics gained prominence in the 2010s, propelled by open science initiatives promoting accessibility, data sharing, and inclusive impact assessment.[24][22]

Data Sources and Tracking Methods
Altmetric aggregates attention data from a diverse array of online sources to monitor discussions of scholarly outputs. Primary sources include social media platforms such as X (formerly Twitter), where public posts, quotes, and reposts are tracked; Facebook, limited to public pages; Reddit, focusing on original post titles; and Bluesky, with public posts and reposts added in late 2024.[25][26] Additional sources encompass mainstream media from over 12,000 global news outlets in multiple languages, blogs via more than 16,000 RSS feeds, Wikipedia edits, policy documents from governmental and non-governmental organizations, and patent citations across nine jurisdictions.[25] Emerging integrations include podcast episodes, tracked starting October 15, 2025, to capture audio-based mentions of research.[27]

To link these mentions to specific research outputs, Altmetric employs DOI-based identification alongside other scholarly identifiers like PubMed IDs and ISBNs, ensuring accurate matching even for varied formats such as journal articles, datasets, and reports.[28] Tracking methods vary by source: APIs facilitate real-time collection from platforms including X, Bluesky, Facebook, Reddit, Wikipedia, and YouTube, while web scraping via RSS feeds handles daily updates for news and blogs.[29] For policy documents and clinical guidelines, PDF scanning and text mining extract references, with updates occurring at least monthly; patent data is sourced weekly from JSON feeds.[29] Historical data collection began in October 2011, providing longitudinal coverage for most sources, though some, like policy documents, extend back to 1928 where available.[30]

Data processing involves rigorous steps to enhance reliability, including deduplication by cross-checking identifiers to consolidate mentions across versions of the same output into a single record.[28] Geolocation is applied where possible, such as associating X profiles with countries based on user data.[31] Classification features, including sentiment analysis using AI and large language models to detect positive, negative, or neutral tones in mentions, were introduced as a beta in June 2025 and fully released in September 2025, initially covering publications from 2024 onward.[32][33] Overall, these methods support tracking for over 24 million research outputs and more than 256 million mentions.[28]

Recent expansions underscore Altmetric's focus on practical impacts: clinical guidelines were added in November 2024[34] to monitor recommendations in medical practice, while new policy sources were added in October 2025, contributing to its coverage of over 200 policy sources worldwide, including governmental reports and white papers, to better reflect research influence on decision-making.[35][36] This aggregated data feeds into metrics like the Altmetric Attention Score for quantifying online engagement.[28]
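The identifier-based consolidation described above can be illustrated with a short, hypothetical Python sketch that merges mention records pointing at different identifiers (DOI, PubMed ID, ISBN) of the same output into one canonical record. The data structures and the `canonical_key` helper are invented for this example and do not correspond to Altmetric's internal implementation.

```python
from collections import defaultdict

# Hypothetical mention records, each carrying whichever identifier the
# mentioning page used; not Altmetric's internal data model.
MENTIONS = [
    {"source": "news", "url": "https://example.org/a", "doi": "10.1000/xyz"},
    {"source": "x",    "url": "https://example.org/b", "pmid": "123456"},
    {"source": "blog", "url": "https://example.org/c", "doi": "10.1000/xyz"},
]

# Assumed lookup table mapping alternative identifiers to one canonical
# output ID, e.g. built from publisher or bibliographic metadata.
IDENTIFIER_INDEX = {
    ("doi", "10.1000/xyz"): "output-1",
    ("pmid", "123456"): "output-1",
}

def canonical_key(mention: dict) -> str | None:
    """Resolve whichever identifier a mention carries to a canonical output ID."""
    for id_type in ("doi", "pmid", "isbn"):
        if id_type in mention:
            return IDENTIFIER_INDEX.get((id_type, mention[id_type]))
    return None

def consolidate(mentions: list[dict]) -> dict[str, list[dict]]:
    """Group mentions by canonical output so duplicates across identifiers collapse."""
    grouped: dict[str, list[dict]] = defaultdict(list)
    for mention in mentions:
        key = canonical_key(mention)
        if key is not None:
            grouped[key].append(mention)
    return grouped

if __name__ == "__main__":
    for output_id, hits in consolidate(MENTIONS).items():
        print(output_id, "has", len(hits), "mentions")  # output-1 has 3 mentions
```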
Metrics and Calculation
The Altmetric Attention Score
The Altmetric Attention Score is a single, weighted numerical value designed to quantify the volume and quality of online attention garnered by a research output, such as a journal article, book, or dataset. It serves as a composite metric that captures engagement across multiple platforms, offering researchers and institutions a snapshot of a work's broader societal reach beyond traditional citation counts. By aggregating and weighting mentions from tracked sources, the score emphasizes both the quantity of discussions and the relative influence of those sources, with values ranging from 0 for no tracked attention to over 10,000 for exceptional cases of widespread visibility.[37][38]

At its core, the score is built from mentions detected across a variety of online channels, such as social media platforms (e.g., Twitter, Facebook), mainstream news sites, policy documents, blogs, and Wikipedia. These mentions are not simply tallied but processed through an automated system that updates the score in real time as new attention is identified, ensuring it reflects current online activity. This dynamic nature allows the metric to evolve with emerging discussions, providing an ongoing measure of a research output's resonance in public and professional spheres.[39][37]

Interpreting the score requires the contextual benchmarks provided in Altmetric tools, as absolute values vary by discipline and publication venue; higher scores generally indicate stronger visibility and potential influence, with percentiles showing relative performance against comparable outputs (for example, being in the top 25% means outperforming 75% of peers in the same field or journal). The score is often presented alongside a visual representation like the Donut badge for quick assessment.[40][38]

The metric has been refined over time for clarity and reliability: originally termed the "Altmetric Score", it was renamed the "Attention Score" in 2016 to avoid misconceptions about measuring scholarly impact and to highlight its focus on attention. In 2024, Altmetric designated the Dimensions database as its preferred primary source for research output identifiers and metadata, improving the score's precision and integration with comprehensive scholarly records.[41][42]
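The percentile-based interpretation above can be made concrete with a small sketch: given the scores of comparable outputs (for example, articles from the same journal and publication window), the percentile rank shows how many peers a given score outperforms. The peer scores in this Python example are invented for illustration.

```python
def percentile_rank(score: float, peer_scores: list[float]) -> float:
    """Percentage of peer outputs whose score is below the given score."""
    if not peer_scores:
        return 0.0
    below = sum(1 for s in peer_scores if s < score)
    return 100.0 * below / len(peer_scores)

if __name__ == "__main__":
    # Hypothetical scores of outputs from the same journal and time window.
    peers = [0, 1, 1, 2, 3, 5, 8, 12, 25, 60]
    # 8 of 10 peers score below 25, so this output is roughly in the top 20%.
    print(percentile_rank(25, peers))  # 80.0
```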
Weighting System and Algorithm
The Altmetric Attention Score is computed through an automated algorithm that aggregates a weighted count of unique mentions across tracked online sources, reflecting the volume, source quality, and contextual factors of attention received by a research output. The core formula involves summing contributions from each mention, adjusted by base weights and modifiers: essentially, score = Σ (mentions from source_i × weight_i × modifier factors), where weights prioritize sources with greater perceived influence and reach. This approach ensures that a single mention in a high-impact news outlet contributes more to the score than numerous mentions on lower-reach platforms.[37][43]

Base weights are assigned to source categories based on their relative influence, with mainstream news outlets receiving the highest values due to their broad audience and editorial standards, while user-generated content like social media receives lower weights. For instance, policy documents, clinical guidelines, patents, and Wikipedia citations are weighted at 3 points each. Twitter (now X) original posts contribute a base of 1 point before modifiers, establishing the platform as the reference for social media weighting, whereas other social media mentions (e.g., Facebook, Reddit) start at 0.25, blogs are weighted at 5, and peer-reviewed platforms like F1000 at 1. These weights are tiered within categories: for news, global outlets like The New York Times receive higher multipliers than niche sites, based on estimated audience size and reach. Altmetric also applies modifiers for factors such as author influence (e.g., a 1.1× boost for mentions by high-profile accounts, rounded up) and content type (e.g., 0.85× for retweets), further refining contributions by audience engagement potential.[37][44][43]

| Source Category | Base Weight | Notes on Adjustments |
|---|---|---|
| News | 8 | Tiered by audience reach (e.g., major outlets > niche blogs) |
| Blogs | 5 | Standard across platforms |
| Policy Documents, Wikipedia, Patents, Clinical Guidelines | 3 | Capped per source/output (e.g., max 3 for Wikipedia) |
| Peer Review, Syllabi, F1000 | 1 | Static per mention |
| Twitter/X (original posts) | 1 | Reference for social media; modifiers for influence/promiscuity |
| Other Social Media (e.g., Facebook, Reddit), Q&A, YouTube | 0.25 | Retweets/reposts at 0.85×; rounded up |
| LinkedIn | 0.5 | Per post |
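To make the weighting scheme above concrete, the following Python sketch computes an approximate score from per-source mention counts using the base weights listed in the table. It is an illustrative simplification, not Altmetric's actual implementation: the production algorithm also applies per-source tiering, author-level modifiers, caps, and deduplication that are not reproduced here.

```python
import math

# Base weights per source category, as reported in the table above
# (illustrative values; tiering, caps and per-author modifiers are omitted).
BASE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy": 3.0,
    "wikipedia": 3.0,
    "patent": 3.0,
    "peer_review": 1.0,
    "tweet": 1.0,      # original X/Twitter posts
    "retweet": 0.85,   # reposts counted at 0.85x an original post
    "linkedin": 0.5,
    "facebook": 0.25,
    "reddit": 0.25,
    "youtube": 0.25,
}

def approximate_attention_score(mentions: dict[str, int]) -> int:
    """Sum mention counts weighted by source category.

    `mentions` maps a source key from BASE_WEIGHTS to the number of unique
    mentions from that source. The result is rounded up to a whole number,
    mirroring the integer scores shown on Altmetric details pages.
    """
    total = sum(BASE_WEIGHTS.get(source, 0.0) * count
                for source, count in mentions.items())
    return math.ceil(total)

if __name__ == "__main__":
    example = {"news": 2, "blog": 1, "tweet": 40, "retweet": 60, "facebook": 5}
    # 2*8 + 1*5 + 40*1 + 60*0.85 + 5*0.25 = 113.25, rounded up to 114
    print(approximate_attention_score(example))
```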
