Altmetric
from Wikipedia

Altmetric, or altmetric.com, is a data science company that tracks where published research is mentioned online, and provides tools and services to institutions, publishers, researchers, funders and other organisations to monitor this activity, commonly referred to as altmetrics.[1][2][3] Altmetric was recognized by European Commissioner Máire Geoghegan-Quinn in 2014 as a company challenging traditional reputation systems.[4]

Key Information

Altmetric is a portfolio company of Digital Science,[5][6] which is owned by Holtzbrinck Publishing Group.[7]

History


Altmetric was founded by Euan Adie[8][9] in 2011.[10] Previously a researcher, Adie had already worked on Postgenomic.com, an open source scientific blog aggregator[11] founded in 2006. In 2011, Adie entered an altmetrics app into Elsevier's Apps for Science competition and won.[12][13] The prize money enabled Altmetric to develop a full version of the Altmetric Explorer, released in February 2012.[14]

In July 2012, Altmetric took on additional investment from Digital Science,[1] before being fully acquired in 2016. Altmetric is still a part of the Digital Science group today, with offices in London, Germany, the United States and Australia.

In 2019 Altmetric and Nature received funding from the Google Digital News Innovation Fund to "build a novel tool for measuring the impact of journalism".[15]

Concept


A term first coined in the altmetrics manifesto in 2010,[9][16] altmetrics (also known as 'alternative metrics') were developed to provide authors and other stakeholders a more comprehensive record of engagement with scholarly work, particularly that which takes place beyond the academy amongst a broader audience. In order to do this, Altmetric tracks a range of online sites and sources looking for 'mentions' (links or written references) to scholarly outputs (which include journal articles, blogs, data sets and more).[17] Sources of the attention include the mainstream media, public policy documents, social and academic networks, post-publication peer-review forums and, more recently, Wikipedia and the Open Syllabus Project.[18]

The data are tracked in real-time and collated in the Altmetric details pages, which provide a clickable summary of all of the online attention relating to a single research output.

The Altmetric Attention Score and Donut Badge


Altmetric employs an algorithm to assign each item an automatically calculated score. Based on the volume and source of attention an item has received, the score is intended to reflect the reach or popularity of the research output. A multicolored 'donut' visualization is also generated to provide a summary of the sources of the attention that an item has received (red for news, light blue for Twitter, etc.).[19][20] Altmetric makes the data available via the Altmetric Bookmarklet (a browser plugin), the Explorer platform (a cloud-hosted database), and an API.

Trending papers from peer-reviewed psychology journals based on their Altmetric Attention Scores.

The sources include academic citations and academic platforms, patents, posts on social media and web forums, Wikipedia articles, and readers on Mendeley.[21]

Many publishers, including John Wiley & Sons, Taylor and Francis, The JAMA Network and Springer Nature embed the Altmetric 'Donut' Badges into their journal article and book pages to show the Altmetric score for individual items from within the publisher platform.[22][23][24][25] At least one website, OOIR, is specifically built around the showcase of scientific trends based on Altmetric Attention Scores.[26] Even though there is no clear link between altmetric scores and societal impact, they can be used to predict future citation impact,[27] and may be a target for manipulation.[28]


from Grokipedia
Altmetric is a London-based data analytics company specializing in altmetrics, which quantify the online attention and engagement with scholarly research outputs—such as journal articles, datasets, and software—through sources including social media, news outlets, policy documents, blogs, and Wikipedia, complementing traditional citation metrics. Founded in 2011 by Euan Adie, a former bioinformatics researcher, the company emerged in response to the growing need for broader impact assessment tools amid the rise of social media platforms, building on the altmetrics concept coined by Jason Priem in 2010 as an alternative to citation-centric evaluation. As part of the Digital Science portfolio since its early investment, Altmetric has grown to serve academic institutions, publishers, funders, and corporations worldwide by aggregating and analyzing millions of research mentions.

The company's core offering, the Altmetric Attention Score, is a weighted metric that aggregates attention data from diverse online sources, assigning varying weights based on source influence (e.g., higher for mainstream news than for social media) to provide a snapshot of visibility and societal impact. Altmetric's tools include embeddable badges for articles, customizable dashboards for tracking institutional performance, and APIs for integration into platforms like Dimensions and Figshare, enabling users to benchmark outputs, identify emerging trends, and demonstrate non-academic influence in funding applications. Altmetric serves numerous organizations globally and has contributed to standards efforts like the National Information Standards Organization (NISO) Altmetrics Initiative, emphasizing ethical use of, and transparency about, its methodologies.

Background and History

Founding of Altmetric

Altmetric was founded in 2011 by Euan Adie, a former bioinformatics researcher and senior product manager at Nature Publishing Group. Prior to launching Altmetric, Adie had developed Postgenomic.com between 2005 and 2009, an open-source platform that aggregated content from scientific blogs to facilitate online discussion in the research community. Adie's motivation for founding Altmetric stemmed from his personal experience as a researcher and author, where he observed limitations in traditional metrics like journal impact factors and citation counts, which often failed to capture the broader, real-time influence of scholarly work, particularly in fields less suited to high-impact publications. He sought to address the growing need to track non-traditional impacts of research, such as mentions in online conversations, social media, and other digital platforms, thereby providing a more comprehensive view of scholarly attention beyond citations.

Early milestones included the release of the full version of the Altmetric Explorer in February 2012, an interactive tool for visualizing attention, and the launch of the company's first embeddable badges later that year, enabling publishers and institutions to incorporate attention data into their workflows. In July 2012, Altmetric received investment from Digital Science, a technology company focused on software for research, which added Altmetric to its portfolio to accelerate the development of data-driven insights for the scholarly ecosystem. By 2013, Altmetric had achieved significant early adoption through integrations with major publishers, including Nature Publishing Group, allowing seamless incorporation of attention metrics directly into article pages and enhancing the visibility of online engagement for researchers and readers. These developments marked Altmetric's transition from a startup project to a foundational service in tracking the diverse impacts of scholarly outputs. In 2016, Digital Science fully acquired Altmetric.
Adie served as CEO until 2017, after which he left to found Overton, a company focused on tracking policy-related research impacts.

Development of the Altmetrics Concept

The term "altmetrics" was coined in 2010 by Jason Priem, Dario Taraborelli, Paul Groth, and Cameron Neylon in their altmetrics manifesto, which proposed tracking scholarly impact through social media activity and online mentions as a complement to traditional citation counts. This document emphasized the potential of platforms like Twitter (now X), blogs, and reference managers to capture diverse forms of influence, such as public engagement and rapid dissemination, that formal publication metrics often overlook. The development of altmetrics emerged as a direct response to the limitations of established bibliometric indicators, including the journal impact factor, which rely heavily on peer-reviewed citations and thus delay assessment of impact by years while ignoring broader societal reach. Proponents argued that citation-based metrics, while valuable for academic validation, fail to account for immediate feedback loops in digital ecosystems, where scholars increasingly share and discuss work on social platforms—over one-third of researchers reported using social media by 2010, for instance. This shift highlighted the need for faster, more inclusive measures to reflect the evolving nature of scholarly communication in a web-connected environment.

Early academic contributions further solidified altmetrics' theoretical foundation, notably a 2013 article by Heather Piwowar and Jason Priem published in the Bulletin of the Association for Information Science and Technology. The paper outlined altmetrics' role in enabling real-time impact assessment, demonstrating how online signals could supplement CVs and assessments by quantifying attention from non-traditional sources like news outlets and forums, thus providing a more holistic view of research influence. The concept quickly transitioned toward practical implementation, inspiring initial prototypes in 2011 that tracked online attention, such as ReaderMeter, which aggregated readership signals from reference managers for scholarly articles. These early tools, developed amid workshops like altmetrics11, paved the way for commercial applications, including the operationalization seen in services like Altmetric.

Core Principles

Definition and Scope of Altmetrics

Altmetrics, short for alternative metrics, encompass the measurement of scholarly impact through online activity and engagement beyond traditional academic citations. Coined in 2010 by Jason Priem, the term refers to metrics that track the broad, rapid influence of research outputs across digital platforms, including social media mentions, news coverage, policy citations, and online discussions. Unlike traditional bibliometrics, which focus primarily on peer-reviewed citations accruing over years, altmetrics capture immediate and diverse forms of attention that reflect real-world dissemination and uptake.

The scope of altmetrics is expansive, applying to various research outputs such as journal articles, datasets, software, and non-traditional research objects (NTROs) like digital exhibits and reports. This breadth allows for the assessment of both quantitative signals—such as the volume of shares or downloads—and qualitative indicators of societal impact, including how research shapes public discourse, informs policy, or influences non-academic communities. By prioritizing these multifaceted traces, altmetrics provide a more holistic view of research value in an interconnected digital environment.

Central principles guiding altmetrics include timeliness, which facilitates real-time monitoring of engagement as it occurs; diversity, encompassing signals from varied sources like social networks, reference managers, and encyclopedias to avoid overreliance on any single channel; and openness, advocating for transparent, publicly available data to support equitable evaluation practices. This framework emerged as a response to the limitations of citation-based systems, amid the explosive growth of scholarly output that overwhelmed traditional filters. Altmetrics gained prominence in the 2010s, propelled by initiatives promoting open access, open science, and inclusive impact assessment.

Data Sources and Tracking Methods

Altmetric aggregates attention data from a diverse array of online sources to monitor discussions of scholarly outputs. Primary sources include social media platforms such as X (formerly Twitter), where public posts, quotes, and reposts are tracked; Facebook, limited to public pages; Reddit, focusing on original post titles; and Bluesky, with public posts and reposts added in late 2024. Additional sources encompass mainstream media from over 12,000 global news outlets in multiple languages, blogs via more than 16,000 RSS feeds, Wikipedia edits, policy documents from governmental and non-governmental organizations, and patent citations across nine jurisdictions. Emerging integrations include podcast episodes, tracked starting October 15, 2025, to capture audio-based mentions of research. To link these mentions to specific research outputs, Altmetric employs DOI-based identification alongside other scholarly identifiers like PubMed IDs and ISBNs, ensuring accurate matching even for varied formats such as journal articles, datasets, and reports. Tracking methods vary by source: APIs facilitate real-time collection from platforms including X, Bluesky, Facebook, Reddit, Wikipedia, and YouTube, while web scraping via RSS feeds handles daily updates for news and blogs. For policy documents and clinical guidelines, PDF scanning and text mining extract references, with updates occurring at least monthly; patent data is sourced weekly from JSON feeds. Historical data collection began in October 2011, providing longitudinal coverage for most sources, though some like policy documents extend back to 1928 where available. Data processing involves rigorous steps to enhance reliability, including deduplication by cross-checking identifiers to consolidate mentions across versions of the same output into a single record. Geolocation is applied where possible, such as associating X profiles with countries based on user data. 
Sentiment classification, which uses AI and large language models to detect positive, negative, or neutral tones in mentions, was introduced as a beta in June 2025 and fully released in September 2025, initially covering publications from 2024 onward. Overall, these methods support tracking for over 24 million research outputs and more than 256 million mentions. Recent expansions underscore Altmetric's focus on practical impacts: clinical guidelines were added in November 2024 to monitor recommendations in medical practice, while new policy sources were added in October 2025, contributing to coverage of over 200 policy sources worldwide, including governmental reports and white papers, to better reflect research influence on public policy. This aggregated data feeds into metrics like the Altmetric Attention Score for quantifying online attention.
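As a rough illustration of the deduplication step described above, the Python sketch below consolidates mentions that reference the same output under different identifiers (DOI, PubMed ID) into a single record. The alias table, identifiers, and mention data are invented for illustration; this is not Altmetric's actual pipeline.

```python
# Hypothetical sketch of identifier-based deduplication: mentions citing
# different identifiers (DOI, PubMed ID) of the same output are merged
# into one record, mirroring the consolidation step described above.

# Invented alias table: every known identifier maps to one canonical DOI.
ALIASES = {
    "10.1000/example.1": "10.1000/example.1",  # DOI maps to itself
    "pmid:12345678": "10.1000/example.1",      # PubMed ID for the same paper
}

def consolidate(mentions):
    """Group raw mentions by canonical identifier, dropping exact duplicates."""
    records = {}
    for mention in mentions:
        canonical = ALIASES.get(mention["id"], mention["id"])
        record = records.setdefault(canonical, {"id": canonical, "mentions": set()})
        # The same (source, url) pair seen twice counts as duplicate content.
        record["mentions"].add((mention["source"], mention["url"]))
    return records

raw = [
    {"id": "10.1000/example.1", "source": "news", "url": "https://news.example/a"},
    {"id": "pmid:12345678", "source": "blog", "url": "https://blog.example/b"},
    {"id": "pmid:12345678", "source": "blog", "url": "https://blog.example/b"},  # duplicate
]
merged = consolidate(raw)
# Both identifiers collapse into one record holding two unique mentions.
```

In practice the alias table would be populated from metadata sources such as Crossref, but the grouping logic is the same.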

Metrics and Calculation

The Altmetric Attention Score

The Altmetric Attention Score is a single, weighted numerical value designed to quantify the volume and quality of online attention garnered by a research output, such as a journal article, dataset, or preprint. It serves as a composite metric that captures attention across multiple platforms, offering researchers and institutions a snapshot of a work's broader societal reach beyond traditional citation counts. By aggregating and weighting mentions from tracked sources, the score emphasizes both the quantity of discussions and the relative influence of those sources, with values ranging from 0 for no tracked attention to over 10,000 for exceptional cases of widespread visibility.

At its core, the Attention Score's components include mentions detected from a variety of online channels, such as social media platforms, mainstream news sites, policy documents, blogs, and Wikipedia. These elements are not simply tallied but processed through an automated system that updates the score in real time as new attention is identified, ensuring it reflects current online activity. This dynamic nature allows the metric to evolve with emerging discussions, providing an ongoing measure of a research output's visibility in public and professional spheres.

Interpreting the score requires contextual benchmarks provided in Altmetric's tools, as absolute values vary by discipline and venue; higher scores generally indicate stronger engagement and potential influence, with percentiles showing relative performance against comparable outputs (e.g., a top-25% score means outperforming 75% of peers in the same field or journal). The score is often presented alongside a visual representation like the Donut Badge for quick assessment. The metric's evolution underscores its refinement for clarity and reliability: originally termed the "Altmetric Score," it was renamed the "Attention Score" in 2016 to avoid misconceptions about measuring scholarly impact and to highlight its focus on attention.
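The percentile comparison described above can be sketched in a few lines; the field scores below are invented, and this is an illustrative calculation, not Altmetric's benchmarking code.

```python
def percentile_rank(score, field_scores):
    """Share of comparable outputs this score outperforms, as 0-100."""
    if not field_scores:
        return 0.0
    beaten = sum(1 for s in field_scores if s < score)
    return 100.0 * beaten / len(field_scores)

# Invented Attention Scores for outputs in the same field and time window.
field = [0, 1, 2, 3, 5, 8, 13, 40, 90, 250]

rank = percentile_rank(40, field)
# A score of 40 outperforms 7 of the 10 peer outputs -> 70.0
```

A score in the top 25% of such a comparison set is what the text above calls "outperforming 75% of peers".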
In 2024, Altmetric designated the Dimensions database as its preferred primary source for research output identifiers and metadata, improving the score's precision and integration with comprehensive scholarly records.

Weighting System and Algorithm

The Altmetric Attention Score is computed through an automated algorithm that aggregates a weighted count of unique mentions across tracked sources, reflecting the volume, source quality, and contextual factors of attention received by a research output. The core calculation involves summing contributions from each mention, adjusted by base weights and modifiers: essentially, score = Σ (mentions from source_i × weight_i × modifier factors), where weights prioritize sources with greater perceived influence and reach. This approach ensures that a single mention in a high-impact outlet contributes more to the score than numerous mentions on lower-reach platforms.

Base weights are assigned to source categories based on their relative influence, with mainstream news outlets receiving the highest values due to their broad audience and editorial standards, while social media receives lower weights. For instance, policy documents, clinical guidelines, patents, and Wikipedia citations are weighted at 3 points each. Twitter (now X) original posts contribute a base of 1 point before modifiers, establishing the platform as the reference for weighting, whereas other social media mentions (e.g., Facebook, Reddit) start at 0.25, blogs are weighted at 5, and peer-review platforms like F1000 at 1. These weights are tiered within categories—for news, major global outlets receive higher multipliers than niche sites based on estimated audience size and reach. Altmetric also applies modifiers for factors such as author influence (e.g., a 1.1× boost for mentions by high-profile accounts, rounded up) and content type (e.g., 0.85× for retweets), further refining contributions by audience engagement potential.
Source Category | Base Weight | Notes on Adjustments
News | 8 | Tiered by audience reach (e.g., major outlets > niche blogs)
Blogs | 5 | Standard across platforms
Policy Documents, Wikipedia, Patents, Clinical Guidelines | 3 | Capped per source/output (e.g., max 3 for Wikipedia)
Peer Review, Syllabi, F1000 | 1 | Static per mention
Twitter/X (original posts) | 1 | Reference for social media; modifiers for influence/promiscuity
Other Social Media (e.g., Facebook, Reddit), Q&A | 0.25 | Retweets/reposts at 0.85×; rounded up
LinkedIn | 0.5 | Per post
To maintain integrity, the algorithm excludes self-mentions by authors or affiliated institutions, duplicate content from the same source, and spam or automated posts, preventing artificial inflation. Scores are rounded to whole numbers, with low-weight mentions (e.g., multiple shares) sometimes consolidated to avoid overcounting minor attention. While the Attention Score itself is not directly normalized for disciplinary differences—baseline attention is higher in some fields (mean scores often exceeding 90 for top journals) than in others—Altmetric provides field-specific benchmarks to contextualize scores against comparable outputs, accounting for varying engagement norms across fields. Altmetric maintains transparency by publicly documenting base weights and key modifiers on its support site, allowing researchers to understand general contributions, but the precise algorithm remains proprietary to deter manipulation and adapt to evolving online behaviors. Recent developments, such as the sentiment analysis feature introduced in beta in June 2025 and fully launched in September 2025, analyze the tone of mentions (positive, negative, neutral) using AI but currently serve as a supplementary tool rather than a direct modifier to weights; future updates could integrate sentiment to refine scoring for more nuanced attention measurement.
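A minimal sketch of the weighted-sum calculation described above, using the published base weights and the 0.85× repost modifier. The production algorithm is proprietary and includes further modifiers and caps, so treat this purely as an illustration of the summation, not as Altmetric's actual formula.

```python
import math

# Published base weights per source category (illustrative subset).
BASE_WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy": 3.0,
    "wikipedia": 3.0,
    "patent": 3.0,
    "twitter_post": 1.0,
    "f1000": 1.0,
    "facebook": 0.25,
    "reddit": 0.25,
    "linkedin": 0.5,
}

RETWEET_MODIFIER = 0.85  # reposts count for less than original posts

def attention_score(mentions):
    """Weighted sum over mentions, rounded up to a whole number.

    `mentions` is a list of (source, count, is_repost) tuples.
    """
    total = 0.0
    for source, count, is_repost in mentions:
        weight = BASE_WEIGHTS[source]
        if is_repost:
            weight *= RETWEET_MODIFIER
        total += weight * count
    return math.ceil(total) if total > 0 else 0

mentions = [
    ("news", 2, False),          # 2 news stories      -> 16.0
    ("twitter_post", 3, False),  # 3 original posts    ->  3.0
    ("twitter_post", 4, True),   # 4 reposts at 0.85x  ->  3.4
    ("wikipedia", 1, False),     # 1 Wikipedia citation ->  3.0
]
score = attention_score(mentions)  # ceil(25.4) = 26
```

Note how the two news stories alone contribute more than seven social-media posts combined, which is the "source quality over volume" behavior the weighting is designed to produce.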

Visualization and Tools

The Donut Badge

The Donut Badge is Altmetric's signature visualization tool, presenting a compact circular graphic that encapsulates the Altmetric Attention Score and the distribution of online attention for a research output. At its core, the badge features a central numerical display of the Attention Score, encircled by a segmented "donut" in which each colored arc represents the relative contribution from a different source of attention, such as red for news outlets, light blue for mentions on X, and grey for policy documents. This design allows users to quickly grasp both the overall level of engagement and the diversity of platforms involved, without delving into raw data counts. Functionally, the Donut Badge is interactive: hovering over or clicking the graphic reveals a detailed breakdown of the sources, including hyperlinks to the original mentions for further exploration. Introduced in 2012, it is embeddable across various platforms, including publisher websites, researcher profiles, and institutional repositories, enabling seamless integration to highlight research impact directly alongside scholarly content. The badge's responsive design ensures adaptability to mobile devices, maintaining clarity and usability on smaller screens, while its compatibility with Altmetric's APIs facilitates dynamic, real-time updates in embedded environments. Usage guidelines for the Donut Badge emphasize its interpretive role, aiding quick visual triage in research discovery tools. These features position the Donut Badge as a user-friendly gateway to understanding the broader societal reach of research, distinct from traditional citation metrics.
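Conceptually, the colored arcs are proportional segments per attention source. The sketch below turns per-source mention counts into arc fractions; the red/light-blue/grey mapping follows the scheme described above, while the remaining colors and the exact rendering are assumptions for illustration.

```python
# Illustrative sketch: convert per-source mention counts into donut arc
# fractions. Red for news, light blue for X/Twitter, and grey for policy
# follow the scheme in the text; other colors are assumed.
SOURCE_COLORS = {
    "news": "red",
    "twitter": "lightblue",
    "policy": "grey",
    "blogs": "yellow",  # assumed color, for illustration only
}

def donut_segments(counts):
    """Return (source, color, fraction-of-circle) for each nonzero source."""
    total = sum(counts.values())
    if total == 0:
        return []
    return [
        (source, SOURCE_COLORS.get(source, "gray"), counts[source] / total)
        for source in sorted(counts)
        if counts[source] > 0
    ]

segments = donut_segments({"news": 5, "twitter": 15, "policy": 0})
# news covers a quarter of the circle, twitter three quarters
```

A real badge renders these fractions as SVG arcs with the Attention Score in the center; the proportional logic is the same.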

Altmetric Products and Integrations

Altmetric offers a suite of products designed to track and analyze attention to research outputs, including the Altmetric Explorer for Institutions, which serves as a dashboard for institutions to monitor portfolios of publications and other scholarly works. Launched in 2015, this tool provides search filters, benchmarking capabilities, customizable reports, and export options to facilitate comprehensive analysis of research impact beyond traditional citations. The Badges API enables publishers and platforms to embed dynamic Altmetric badges directly into journal articles and websites, displaying real-time attention scores and source breakdowns with color-coded visualizations. Complementing this, the Details Page API delivers machine-readable data on individual research items via DOIs or journal IDs, supporting visualization and integration needs; as of November 10, 2025, all requests to this API require an authentication key to ensure secure access.

Altmetric integrates with several key platforms to enhance data flow and coverage, including Dimensions as its preferred metadata source since 2024, ORCID for researcher identification, Crossref for DOI resolution, and publisher platforms for seamless embedding of badges and analytics. Additionally, it supports non-traditional research outputs (NTROs) through initiatives like the 2025 Top 25 NTRO Repositories report (based on 2024 data), which highlights repositories excelling in online attention tracking. Core features across these products include custom alerts for new mentions, benchmarking reports to compare institutional performance, and export tools for data sharing in formats like CSV or API feeds. Recent enhancements encompass podcast tracking, introduced in 2025 to capture audio-based discussions of research, and a sentiment analysis beta launched in 2025, which uses AI to classify mentions as positive, negative, or neutral for deeper contextual insights.
For accessibility, Altmetric provides free tools such as the Altmetric Bookmarklet, allowing individual researchers to quickly retrieve attention data for any publication directly from its webpage, while advanced analytics like the Explorer require institutional subscriptions. These offerings form an ecosystem in which the Donut Badge serves as a key visualization component within broader reporting and dashboard functionalities.
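As an illustration of consuming the machine-readable details data described above, the sketch below parses a payload shaped like a details-page JSON response. The sample payload is invented, the field names (`score`, `cited_by_posts_count`) are assumptions based on Altmetric's public documentation, and real requests now require an authentication key as noted above.

```python
import json

# Invented sample payload shaped like a details-page JSON response.
# Field names are assumed from the public docs; values are made up.
SAMPLE_RESPONSE = json.dumps({
    "doi": "10.1000/example.1",
    "title": "An example paper",
    "score": 57.3,
    "cited_by_posts_count": 42,
})

def summarize(raw_json):
    """Pull the headline numbers out of a details-page payload."""
    data = json.loads(raw_json)
    return {
        "doi": data["doi"],
        "attention_score": round(data["score"]),
        "mentions": data.get("cited_by_posts_count", 0),
    }

summary = summarize(SAMPLE_RESPONSE)
# {'doi': '10.1000/example.1', 'attention_score': 57, 'mentions': 42}
```

In an integration, the JSON would come from an authenticated HTTP request keyed by DOI rather than a hard-coded string.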

Applications and Criticisms

Use in Research Evaluation

Altmetric serves as a supplementary metric in academic evaluations, particularly within tenure and promotion dossiers, where it demonstrates broader engagement beyond traditional citations. For instance, researchers can include Altmetric Attention Scores to highlight online discussions of their work on social media and in news outlets, providing evidence of societal impact that complements citation counts. A growing number of universities incorporate altmetrics into their tenure and promotion guidelines, recognizing their role in assessing public engagement and interdisciplinary influence. In funding applications, some funders encourage the use of altmetrics to illustrate the reach and engagement of funded research, aiding in the demonstration of real-world applicability. Institutional dashboards powered by Altmetric enable departments to benchmark performance against peer organizations, tracking trends in online attention to research outputs over time.

Beyond academia, Altmetric supports public engagement tracking in grant reports, allowing funders to monitor how supported research resonates with diverse audiences, including practitioners, advocacy groups, and the general public. Publishers leverage Altmetric metrics for article promotion, integrating badges and scores into journal platforms to showcase engagement and attract readers by highlighting articles with high online visibility. In 2025, Altmetric expanded its tracking of policy mentions, capturing references to research in documents worldwide to demonstrate influence on guidelines, practices, and public policy.

Notable case studies illustrate Altmetric's application in high-impact scenarios. During the 2020 COVID-19 pandemic, papers such as "The proximal origin of SARS-CoV-2" achieved Altmetric Attention Scores exceeding 33,000, reflecting widespread discussion in media, policy, and social platforms that underscored their rapid societal relevance.
In the 2025 Altmetric Top 25 NTRO Repositories report, non-traditional research outputs (NTROs) like datasets and software from repositories such as the Stanford Digital Repository and Zenodo were highlighted for their online attention, emphasizing the growing role of altmetrics in evaluating diverse scholarly contributions beyond journal articles. Altmetric benefits research evaluation by complementing citation-based metrics with insights into societal reach, enabling a more holistic assessment of impact that includes public discourse and policy uptake. The National Information Standards Organization (NISO) published guidelines in 2016 for the ethical use of altmetrics, recommending transparent reporting of data sources and caveats to ensure responsible interpretation in evaluations.

Limitations and Debates

One major limitation of Altmetric is its vulnerability to gaming and manipulation, such as through bot-generated mentions on social media platforms, which can artificially inflate scores without reflecting genuine engagement. Additionally, Altmetric exhibits biases toward English-language content and disciplines whose researchers are more active online, potentially underrepresenting non-English scholarship and fields with lower online visibility. Furthermore, there is no established causal relationship between Altmetric scores and research quality or long-term impact, as high attention does not necessarily indicate scholarly value.

Debates surrounding Altmetric often center on its emphasis on the quantity of mentions over their quality, with bibliometric studies demonstrating low to moderate correlations between Altmetric Attention Scores and traditional citations, suggesting limited validity as a proxy for academic influence. Privacy concerns arise from the extensive tracking of online mentions, raising questions about data collection practices and user consent in aggregating online activity. The proprietary nature of Altmetric's algorithm also fuels criticism for its opacity, as undisclosed changes—such as a 2021 reduction in Twitter's weighting—hinder reproducibility and transparency in scoring. Recent developments have intensified discussions, particularly around the 2025 beta launch of Altmetric's AI-powered sentiment analysis feature, which assigns sentiment labels to posts but has faced scrutiny for moderate accuracy, with a proof-of-concept model achieving only an F1-score of 0.577 in aligning with human judgments. Altmetric's reliance on DOIs for tracking excludes non-DOI outputs like some books or reports, and while it captures public mentions, paywalled content may receive less visibility in aggregated data due to reduced sharing. Amid source expansions in 2024-2025, there have been calls for greater standardization in altmetrics data to address inconsistencies across providers and improve reliability.
Looking ahead, proposals for deeper AI integration in 2025 aim to provide contextual analysis beyond raw counts, potentially mitigating some biases through advanced natural language processing. Advocacy for open altmetrics data has grown to counter this opacity, promoting accessible datasets that enable independent verification and reduce dependence on proprietary platforms like Altmetric.
