

Like and subscribe: Measuring impact of non-traditional research outputs

Published by Ioana Liuta

This blog post was written by Jillian Wertzberger, SFU Library co-op student

Non-traditional scholarly outputs need non-traditional metrics. Bonnet and Méndez-Brady (2017) define these alternative outputs as non-journal outputs such as podcasts and videos, and note that their impact has so far been gauged with a combination of traditional impact metrics and altmetrics. However, I would argue that because these outputs are not in print, these metrics are insufficient for gauging their impact. Personally, I know that if I read something interesting in a newspaper, I’d send the article to my friend, but if I listened to something interesting on a podcast, I’d sooner text my friend a three-sentence summary.

Is it possible to gauge impact?

This brings us to a more important question in the field of scholarly communications: can we ever really gauge impact? While Bonnet and Méndez-Brady consider altmetrics suitable for measuring the impact of alternative outputs, I maintain that these outputs, and the information they contain, are disseminated in ways that neither traditional nor alternative metrics can capture. If my friend got me to stop drinking coffee by telling me about a podcast she listened to that said caffeine is bad for you, that is impact, but it is not quantifiable in clicks or downloads. If a professor shows a scholarly video in class, it registers as a single view even though hundreds of students may have watched it. Instead, I believe a more holistic, qualitative approach is required, one that matches the innovation and flexibility of alternative outputs themselves.

Is engagement the same thing as impact?

So, what are altmetrics, and why am I being so hard on them? Following the definition from the National Information Standards Organization (2013), altmetrics is a “broad term that encapsulates the collection of multiple digital indicators related to scholarly work” (p. 12). These indicators include instances of an output being shared on social media sites, blogs, news outlets, and citation managers such as Mendeley and Zotero (Thelwall et al., 2013, as cited in Ojo, 2023). Altmetrics are meant to work in tandem with traditional metrics, such as citations, to produce a cohesive picture of an article’s reach (Ruan et al., 2018). As a previous Radical Access post has asserted, altmetrics do provide insights into digital engagement (Ojo, 2023).

However, I think it would be a mistake to conflate these quantifiable metrics with impact. In Sarah Morton and Ailsa Cook’s How Do You Know if You Are Making a Difference? (2023), the authors assert that a quantifiable approach to gauging impact leads to researchers becoming entrenched in “what can be measured, rather than what is most important” (p. 34). In other words, measuring engagement is not the same as measuring impact. In the earlier example, we can only measure that my friend downloaded the podcast about caffeine habits; we cannot measure whether she actually followed its advice or told a friend about it. It is clear, then, that the rapidly evolving nature of research outputs necessitates a different approach to impact evaluation.
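To make those “digital indicators” concrete, here is a minimal Python sketch of how one might pull engagement counts for a single output from the public Altmetric API. It assumes the free v1 endpoint and the JSON field names it has returned in the past, plus the third-party requests library, so treat it as an illustrative sketch rather than a supported recipe.

```python
# Illustrative sketch: fetch a few altmetric indicators for one output.
# Assumes the public Altmetric v1 endpoint and its historical JSON field
# names; both may change, so this is a sketch, not a supported recipe.
import requests

def engagement_counts(doi: str) -> dict:
    """Return a few engagement indicators for a DOI, or {} if untracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:  # Altmetric has no record of this output
        return {}
    resp.raise_for_status()
    data = resp.json()
    return {
        "news_stories": data.get("cited_by_msm_count", 0),
        "blog_posts": data.get("cited_by_feeds_count", 0),
        "social_posts": data.get("cited_by_tweeters_count", 0),
        # Reader counts arrive as strings in this API, hence the int().
        "mendeley_readers": int(data.get("readers", {}).get("mendeley", 0) or 0),
    }

if __name__ == "__main__":
    # Hypothetical usage: the Bonnet & Méndez-Brady article cited above.
    print(engagement_counts("10.1108/DLP-01-2017-0002"))
```

Note what every one of these numbers has in common: each counts an act of sharing or saving, not a change in anyone’s behaviour, which is exactly the gap the rest of this post is about.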


New research outputs require new ways of conceptualizing impact. As traditional journals face saturated markets (Anderson & Moore, 2013), scholars are finding new publishing avenues, such as videos, graphics, and software, to share their work (Borie, 2015). Currently, researchers still use a combination of traditional metrics, such as citations, and alternative metrics to measure the impact of these outputs (Bonnet & Méndez-Brady, 2017). In practice, for example, the peer-reviewed podcast Secret Feminist Agenda (from SFU’s own Dr. Hannah McGregor!) measured its success in subscribers, downloads, and citations (Perkins Deneault, 2020). But that leads us to a bigger question: is that impact?

But are we making a difference?

Following Mayne’s (2012) elaboration on contribution analysis, I argue that impact should be measured in terms of causes and changes, not clicks. A view or download reveals little about how the research affected its audience. Instead, Morton and Cook (2023) argue that specific outcomes must be analyzed in order to build a larger “contribution story” (p. 128). To achieve this, they posit, a more qualitative approach to data is necessary.

In practice, Na and Ye (2017) examined Facebook posts that cited academic articles in the field of psychology. The authors analyzed the content of a random sample of comments, revealing a myriad of motivations for engagement, from author self-promotion to individuals vowing to change habits and attitudes based on research findings. Applied to a single output, this method would get closer to measuring specific outcomes than clicks alone. To apply this approach to an alternative output, researchers could use social media to survey podcast listeners with questions such as: “How likely would you be to reduce your caffeine intake after listening to this podcast?” or “Would you, or have you, recommended reducing caffeine intake to a friend after listening to this episode?” While traditional metrics and altmetrics can tell us how many citations or downloads a video or podcast has, it would be misleading to equate this with impact. Instead, I believe that we should look towards this more qualitative, holistic means of measuring impact as we innovate and expand the limits of research outputs.
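To show what that kind of outcome-oriented coding might look like in miniature (this is not Na and Ye’s actual instrument), here is a toy Python sketch that assigns invented outcome codes to hypothetical listener responses and tallies them. The categories, keyword rules, and responses are all made up for illustration; real content analysis would rely on trained human coders and a validated coding frame.

```python
# Toy sketch of outcome-oriented coding, loosely inspired by the content
# analysis in Na and Ye (2017). Categories, keyword cues, and responses
# are invented for illustration only.
from collections import Counter

# Hypothetical survey/comment responses about the caffeine podcast episode.
responses = [
    "I'm cutting back to one coffee a day after this episode.",
    "Sent this to my sister, she drinks way too much coffee.",
    "Great episode, subscribed!",
    "Told a friend to listen, and I'm switching to decaf.",
]

# Naive keyword rules standing in for a real coding frame.
CODES = {
    "behaviour_change": ("cutting back", "switching", "quit", "decaf"),
    "recommended_to_others": ("sent this", "told a friend", "sister"),
    "general_praise": ("great episode", "subscribed"),
}

def code_response(text: str) -> list[str]:
    """Assign every matching code to a response (a response can carry several)."""
    text = text.lower()
    return [code for code, cues in CODES.items()
            if any(cue in text for cue in cues)]

tally = Counter(code for r in responses for code in code_response(r))
print(tally)  # e.g. Counter({'behaviour_change': 2, 'recommended_to_others': 2, ...})
```

Even this crude tally surfaces what no download count can: two of the four hypothetical responses report an intended behaviour change, which is precisely the kind of outcome a contribution story is built from.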

Further resources

You can find out more about altmetrics and traditional impact metrics on our Scholarly Publishing webpages.

Or visit these pages:
On altmetrics, going beyond citations to measure research impact
Peer-reviewed podcasts: Amplify Podcast Network produces podcasts as scholarly communication
 


References

Anderson, R., & Moore, K. B. (2013). Is the journal dead? Possible futures for serial scholarship. The Serials Librarian, 64(1–4), 67–79. https://doi.org/10.1080/0361526X.2013.759877

Bonnet, J. L., & Méndez-Brady, M. (2017). Making the mission visible: Altmetrics and nontraditional publishing. Digital Library Perspectives, 33(4), 294–304. https://doi.org/10.1108/DLP-01-2017-0002

Borie, J. (2015). New forms of scholarship and a serials (r)evolution. Serials Review, 41(3), 176–179. https://doi.org/10.1080/00987913.2015.1069782

Carpenter, T. A., & Lagace, N. M. (2017). Defining community recommended practice for altmetrics: The NISO alternative metrics project completes its work. Performance Measurement and Metrics, 18(1), 9–15. https://doi.org/10.1108/PMM-09-2016-0039

Grimshaw, J. M., Eccles, M. P., Lavis, J. N., Hill, S. J., & Squires, J. E. (2012). Knowledge translation of research findings. Implementation Science, 7(1), 50. https://doi.org/10.1186/1748-5908-7-50

Mayne, J. (2012). Contribution analysis: Coming of age? Evaluation, 18(3), 270–280. https://doi.org/10.1177/1356389012451663

Morton, S., & Cook, A. (2023). What data and evidence do you need to see what difference you are making? In How do you know if you are making a difference? (pp. 30–42). Bristol University Press.

Na, J.-C., & Ye, Y. E. (2017). Content analysis of scholarly discussions of psychological academic articles on Facebook. Online Information Review, 41(3), 337–353. https://doi.org/10.1108/OIR-02-2016-0058

Ojo, O. (2023, April 25). On altmetrics, going beyond citations to measure research impact. Radical Access. https://www.lib.sfu.ca/help/publish/scholarly-publishing/radical-access/impact-metrics-altmetrics 

Perkins Deneault, T. (2020, July 28). Peer-reviewed podcasts: Amplify Podcast Network produces podcasts as scholarly communication. SFU News. https://www.sfu.ca/sfunews/stories/2020/07/peer-reviewed-podcasts--amplify-podcast-neEtwork-produces-podcast.html

Rachal, K. C., Daigle, S., & Rachal, W. S. (2007). Learning problems reported by college students: Are they using learning strategies? Journal of Instructional Psychology, 34(4), 191.

Ruan, Q. Z., Chen, A. D., Cohen, J. B., Singhal, D., Lin, S. J., & Lee, B. T. (2018). Alternative metrics of scholarly output: The relationship among Altmetric score, Mendeley reader score, citations, and downloads in Plastic and Reconstructive Surgery. Plastic and Reconstructive Surgery, 141(3), 801–809. https://doi.org/10.1097/PRS.0000000000004128

Zahedi, Z., Costas, R., & Wouters, P. (2014). How well developed are altmetrics? A cross-disciplinary analysis of the presence of ‘alternative metrics’ in scientific publications. Scientometrics, 101(2), 1491–1513. https://doi.org/10.1007/s11192-014-1264-0 
