New Faculty Research Information Series-Session #5: Beyond Citations: Rethinking Research Impact

Part of leading successful research activities is successfully conveying their significance. This session discusses various impact metrics for research/scholarly activities.

Okay, it’s 2:00. I’m going to get started here. Thank you very much for joining this semester’s final session of the New Faculty Research Information Series – Beyond Citations: Rethinking Research Impact. The series will start again in the spring semester, so watch for that. My name is Cheryl Dykstra-Aiello and I’m a research development specialist in Advancement and Strategy within the Office of Research.

Please note that this session will be recorded. Everyone who RSVPed to the session will receive an email with the link to the recording and these slides as an attachment. And then within a few weeks, the recording will be uploaded with the transcript to our website under the Faculty Training tab at orap.wsu.edu. I’ll be happy to answer any questions if you would just hold them until the end of the session.

Thank you very much.

So as we go through today’s information session, I want you to consider how you might determine what counts for demonstrating your impact. In doing so, take some time to consider these questions. Why does your work matter? How can you effectively communicate this? And how much is your work valued by you, by your peers, by those in a position to evaluate you?

But don’t forget to consider the work that you think is important but exclude from your CV because you think it’s maybe not important for evaluation. Think about how you would describe the impact of this work, what success indicators you would look for, and how you would capture and report them systematically.

I want to first define impact, and the online Merriam-Webster dictionary provides two definitions. Today, when I talk about impact, we’re obviously looking at the second definition. So impact is a significant or major effect. And we’re talking about the impact or significant impact of your research, scholarly and or artistic activities. So not just on or across academic disciplines through the advancement of knowledge, but also on society, by influencing things like public policy or improving quality of life.

And that begins with a journey. So, as you consider starting your project, whether it’s a research, scholarly, or artistic project, you’re really beginning your impact journey. And that’s even as you determine what you need to perform the activities you plan to do and how you plan to do them. You’re going to have outputs in mind, such as publications, presentations, exhibitions, and maybe beyond that, inventions, new methods, tools, prototypes, artifacts, data sets,

and that all means that you should also be considering the need for patents, copyrights, trademarks, licensing, and the potential for new startups. So the beginning of your impact journey, the start of everything, really is a great time to reach out to the Commercialization unit in the Office of Research. And if you don’t reach out to them at the very beginning, then you should really think about doing it at least before your first publication, your first conference presentation, or your first exhibition, because you want to talk to them about the activities and their outputs

because if you publicize them, it can have implications on copyrights, etc. You can contact them through the link on this slide: commercialization.wsu.edu. Whatever your outputs, in order to have impact, they need to be utilized. And that may be through publications, citations, by inspiring new or expanded research directions, through implementation of new tools or methods you created, and/or development of new businesses.

These outcomes can then lead to broader impacts or consequences on this journey, and you need to be able to effectively convey the significant impacts those outcomes have on communities, education, the economy, politics, technology, etc.

Three concepts are central to understanding and demonstrating research impact: significance, reach, and attribution, or the causality with evidence. “Significance” relates to the depth and the importance of the change brought about by your activity or activities. So a transformative effect. So basically you’re answering the question, what is the importance of the impact to each beneficiary? And one example of the significance that research can have is that your research saved a life. “Reach” emphasizes the breadth and scale of dissemination, encompassing how widely the activity output is recognized, accessed, and applied among and across various audiences.

So how widespread is the impact? An example of impact that has both significance and reach is that the research saved millions of lives. Finally, it’s necessary to describe both significance and reach when demonstrating impact. But it’s also imperative that you provide evidence that links the demonstrated impact back to your activities, outputs, and their use. So you need to describe how your activities and their outputs contributed to the impact that you’re conveying.

Going back to reach, there are three key aspects to consider in demonstrating impact reach. You need to think about audience engagement, dissemination channels, and accessibility. Audience engagement means identifying and connecting with the groups that stand to benefit from your outputs. If you tailor communication to the needs and knowledge levels of each intended audience, you will have better audience engagement.

Dissemination channels: choose the channels that are going to boost the likelihood that findings will be heard and understood by the intended audience. So think about things like the publications where you’re going to publish, the conferences that you’re going to attend, how you’re going to use social media and which platforms, whether you’re going to do community workshops, and mainstream media coverage. Those are all some of the dissemination channels that you might choose. And then regarding accessibility, open access platforms, summary briefs, and translations can significantly expand reach.

Ease of access is central. So the more barriers there are, for instance paywalls, or if you use a lot of technical jargon or you limit the distribution, that really decreases the potential reach that your research might have. Finally, remember that reach is ultimately determined by whether and how the findings are used.

To be utilized and therefore have impact, the outputs of your activity must be recognized, and measuring this recognition involves examining quantitative and qualitative influence markers across various impact dimensions including academic impact, social and societal impact, economic impact, and practice impact. Publication downloads, reads, and view metrics can demonstrate how many times your publication has been accessed. Citations can indicate if and how the publication is informing other experts within and beyond your discipline.

And I’m going to talk about citation measures on the next slide. The quality and reach of publications is often reflected in journal impact factors. Those all fall under academic impact. Under societal impact, think about news or blog mentions that reflect mainstream interest and potential to reach nonacademic audiences. Engagement on social media through shares, likes, and comments may indicate public understanding and resonance with your work.

Participation in workshops and/or conferences gauges direct interest from a particular audience. Collaboration requests signal deeper involvement, with an interest in applying your outputs. Citations in policy documents indicate that your outputs are shaping and guiding political debate, and adoption of your recommendations demonstrates real-world integration of your insights. Those all fall under social and societal impact.

Under economic impact, you have financial benefits. So through generation of revenue, cost savings for public services, or creating jobs, those all have financial benefits that impact the economy. Under practice impact, when attributed to your outputs, surveys on awareness and case studies measure direct use of your activity outputs, and collaborative joint projects and formal collaborative agreements really underscore sustained commitment to adoption and implementation of your outputs.

Academic impact is traditionally determined through citation analysis, whereby the quality of an article, author, or institution is judged by the number of citations.

But why use citation analysis in determining quality and impact? It can show the impact of a publication through its use by other authors, for example when they use it as a basis for their own work or cite it in their own publications. You can discover more about a field or topic by reading the papers that cite the seminal work in your discipline.

And a particular author’s impact can be determined by seeing how often their work has been cited by others. One of the metrics derived from citation analysis is the h-index, which is used to measure both productivity and citation impact for a researcher’s publications. It’s calculated by listing all of the publications by the researcher and counting the number of citations each publication has received;

the publications are ranked in descending order by citation count, and then the index is the highest number h such that the researcher has h papers each cited at least h times. So, for example, if a researcher has ten papers that were cited 25, 20, 15, ten, eight, five, four, three, two, and one times, then in that list the fifth paper has eight citations (at least five), but the sixth paper has only five citations (fewer than six), which gives an h-index of five.
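The h-index calculation just described can be sketched in a few lines of Python. This is just an illustration of the ranking rule, not part of any official tool; the function name and the sample citation list are made up for this example.

```python
def h_index(citations):
    """Return the largest h such that the researcher has h papers
    each cited at least h times."""
    ranked = sorted(citations, reverse=True)  # rank papers by citation count, descending
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:   # this paper still has at least as many citations as its rank
            h = rank
        else:
            break           # every later paper has even fewer citations
    return h

# The ten-paper example from the talk: the 5th paper has 8 citations (>= 5),
# but the 6th has only 5 (< 6), so the h-index is 5.
print(h_index([25, 20, 15, 10, 8, 5, 4, 3, 2, 1]))  # prints 5
```

Note that the result depends entirely on the citation counts you feed in, which is why, as discussed below, the same researcher can get different h-index scores from different data sources.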

Supposedly this balances quantity, the number of papers, with quality, the citations. And it’s less affected by a few highly cited papers. It’s commonly used in academia to evaluate research impact. But it’s contested and problematic.

So although it’s seemingly unaffected by a few highly cited papers, it remains problematic because the score depends on which data source is used for calculation, and it excludes scholars whose publications aren’t indexed in the data source used for the calculation. So some disciplines might not be part of the data that is collected. Further, it puts early career scholars at a disadvantage because they have fewer publications and a shorter period for those publications to accrue citations.

Google Scholar and Web of Science h-index scores are compared on this slide, and you can see that the Google Scholar score is higher than the Web of Science score. That’s because Google Scholar includes data for more material types than just peer-reviewed articles. These two comparisons also represent a short publication career, an early career researcher: that’s me.

Thus the scores are lower than someone with a much longer career. For instance, my former PI, who has a very long career, has a Google Scholar h-index of 100. So early career scholars are at a disadvantage for disclosing their impact through an h-index score.

The quote on this slide reflects the problem with using only citation analyses, like the h-index, to demonstrate impact. Diverse impacts are impossible to capture through traditional academic metrics such as publications and citations, and cannot be captured by focusing solely on end results of a given research project such as changes in policy or practice. The infographic and this quote are from an article and a follow-up blog post, both linked here for you to investigate later if you’re interested.

And they both focused on how to evaluate and demonstrate impact beyond citations: impact that occurs over a long period of time, such as in the environmental sciences, and impact that’s often hard to track. The authors investigated various ways of evaluating and demonstrating impact, and they synthesized what they found into four rules of thumb for how to evaluate and demonstrate impact.

They included: defining what impact means for your activity or discipline, and being very clear about how your activities are intended to achieve that impact; attempting to measure below the tip of the iceberg, that is, knowledge contributions to impact such as changes to understanding and/or problem framing; having open-ended evaluation criteria in addition to clearly defined expectations, so you can capture any unexpected outcomes; and understanding that aggregated numerical scores may overshadow conceptual or subtle changes, and that simple evaluations can inadvertently provide incentives to tailor research to meet indicators.

So careful consideration should be given to balancing evaluations to truly capture real and impactful changes. There’s been a movement over the past decade or so to move away from using publication and citation metrics to evaluate and demonstrate impact. Over the next few slides, I want to bring your attention to three key documents, each with its own set of principles that outline a more responsible use of metrics by researchers and institutions.

They are the DORA or the San Francisco Declaration on Research Assessment from 2012, the Leiden Manifesto and the Metric Tide Report, both from 2015.

DORA was developed in 2012 at the meeting of the American Society for Cell Biology, and it was for all disciplines. It has 18 total recommendations, with four for researchers, which I’ve included on this slide, along with a link to the document should you wish to read it. I bring your attention to the bolded text: that decisions should be made by committees based on scientific content rather than publication metrics, that a range of article metrics and indicators on personal or supporting statements should be used as impact evidence, and that research assessment practices that rely on journal impact factors are inappropriate and should be challenged.

The Leiden Manifesto for Research Metrics, also linked here, was originally published as a comment in the April 2015 issue of Nature. Ten principles were proposed for measuring research performance, and they emphasized openness, transparency, honoring local or regional excellence, and institutional missions. It is increasingly becoming a requirement for proposals to align research goals with those of the institution, as well as with those of the funding agency.

And I bring your attention to the sixth principle, the only one I’ve listed on the slide. It proposes that accounting for variation by field in publication and citation practices is important. And that’s because not all disciplines require publications, and I already mentioned that as a disadvantage of the h-index. So therefore, if citation measures are being used and you’re in a discipline that doesn’t really require publications to indicate impact, then researchers in that discipline are at a disadvantage.

The Metric Tide Report consisted of the published results of a 2014 investigation into the roles that quantitative indicators play in the assessing and managing of research. Its recommendations for responsible metrics include robustness, or using the best possible data; humility, quantitative evaluation that supports qualitative expert assessment; and transparency, meaning those being evaluated must be able to test and verify the results.

They also include diversity: accounting for variation by field and supporting a plurality of research and researcher career paths across the system. Again, think back to the h-index, where early career researchers are at a bit of a disadvantage in reporting h-index scores compared to someone who has had a longer time to accumulate citations on their publications and to publish more articles. And finally reflexivity: recognition and anticipation of the systemic and potential effects of indicators, and updating them in response.

So that all leads us to altmetrics, short for alternative metrics. These are modern indicators used to measure the impact and engagement of scholarly research across various online platforms, providing insights beyond traditional citation metrics. Altmetrics are designed to capture the attention and engagement that research outputs receive in today’s digital age. Unlike traditional metrics such as citation counts and journal impact factors, which often take years to accumulate, altmetrics provide a more immediate and comprehensive view of a research output’s influence.

They track mentions and discussions across social media, news articles, blogs, policy documents, and online reference managers, often offering a broader understanding of how research resonates with diverse audiences, including policymakers and the general public.

Altmetrics represent a significant shift in how research impact is measured. They move beyond traditional citation metrics to encompass a wider array of online interactions. As shown in the previous slide, they gather data from a wide range of online platforms and provide real-time assessment of research impact, so it doesn’t take quite as long as citation measures.

This allows researchers and institutions to gauge the immediate reception of their work. They also highlight the societal relevance of research, illustrating the real-world effects of findings on government policy, professional fields, and community involvement. And by focusing on real-time data and diverse sources, altmetrics offer a richer understanding of how research contributes to both academic fields and society at large, and they are increasingly being used by researchers, institutions, and funders to evaluate the reach, visibility, and influence of scholarly work.

There are many altmetrics tools, some of which I’ve identified here and provided links for you to explore. They all track impact across various platforms: social media, news, blogs, etc. You’re probably familiar with Altmetric and its donut, with its layers color coded by the type of source that provided the data, and the weighted score that may appear in the hole of the donut.

It’s used by various publications. And here I’ve listed each tool and its purpose. They all want to measure impact in a different way than mere citations, and they all have different categories. They look at things like downloads and citations; they look at social media comments; they look at various platforms, so Twitter, Facebook, blogs. They look at mentions in the news and in policy documents, and some of them also still look at citations.

OpenAlex, for instance, looks at citations, affiliations, and the type of document. So they’re all, you know, multidimensional. Many of them are open access and provide a visual output. Altmetric is one that really comes to mind, but there are others that show graphs. And you should probably play around with these to see what they do, and think about what they are looking at as you’re determining the impact of your own research.

So with that, how can you increase the visibility of your work and your outputs so that they’re captured by the altmetrics tools? I’m going to go through the next few slides on how to increase your visibility. The first one is to claim your ORCID, which stands for Open Researcher and Contributor ID, and you can claim it by going to orcid.org/register.

Use your WSU email. This ID travels with you wherever you go. So if you move institutions, it follows you; you just need to update your contact email address. And if you give permission, ORCID will import all of your works from other sources. So you can have it pull in your work, or you can add your scholarly works yourself.

You can click an option in your ORCID profile to set up automatic updates, and you can delegate account management to an assistant if you want to. I would suggest that you add your ORCID to your email signature, to your personal web page, and anywhere else that you post your name, your affiliation, and your contact information.

Include that ORCID. It just makes you findable. The more you use it, the more useful it becomes for connecting your scholarship across the web. Another way to increase your visibility and that of your academic activities is to claim your Google Scholar profile. And also, don’t forget to associate this profile with your ORCID that I just mentioned.

You can sign in at scholar.google.com using your WSU credentials and click on My Profile to get started. Use your WSU email on your profile for verification; that authorizes Google Scholar to display your WSU affiliation as verified. Review Google’s suggested articles. They come up with a whole list of articles attributed to you.

Make sure that they actually belong to you. Delete the ones that don’t. And if there are articles that are listed there more than once, you can merge them. You can set up article updates so that they’re applied automatically, or you can have them sent to your email for you to review and approve.

And then you want to consider whether or not you want your profile to be public. If you’re public, then you become searchable by others. If you’ve got a lot of publications and you don’t have time to review them all in one sitting, then probably keep your profile private until you can make sure that everything on that profile is actually attributable to you.

And public or private can be changed at any time. So just because you make it public doesn’t mean it has to stay public forever.

Another way to increase your visibility is to use a digital repository. If you’ve ever shared your preprints or other work with your colleagues via email or social media, or posted them on your personal web page or… [Unintelligible]

A digital repository is stable, which means that you’re not going to have broken or migrated links to worry about. You can use that URL no matter where your career lands you, so no reposting to a new web page if you move to a new institution, or if your website moves to a new platform. WSU has a digital repository: the Research Exchange.

WSU faculty, staff, and students can deposit works such as articles, book chapters, working papers, technical reports, conference presentations, posters, images, media, data sets, and educational resources. Whatever you share, store, or preserve on WSU’s digital repository, the Research Exchange, becomes accessible to other WSU faculty, staff, students, and the general public unless the material has been temporarily embargoed. You can archive data,

and you can use the platform as a curricular tool by inviting students to showcase their research. If you’re considering using WSU’s digital repository, you can access it through the library guides; I’ve linked it here for you. You’ll need a faculty profile, and to get started, contact libraries.research@wsu.edu.

They can help pre-populate your profile with citations and information collected from Pivot and other sources.

You can increase your visibility through your use of social media.

If you use social media to share your work, take stock of all the platforms that you use. Verify that your contact information is correct. Make sure that your profile is associated with your WSU email address, and look to see if your scholarly identity is connected to that platform if you use it for networking, etc.

Consider if you’ve reviewed the privacy settings recently. Ask yourself if you even really like that particular platform for sharing your work, and really consider carefully where you want to focus your attention and efforts going forward. As you contemplate this, ask yourself if the platform is practical and useful in your discipline. Does it enable connections with new colleagues?

Is it a platform for communicating your work to a wider audience? As always, remember that whether you engage on a social media platform or not is a personal and professional decision, and only you can decide the most meaningful ways in which to engage with these tools. Finally, always keep self-care and care for your colleagues in mind when deciding where to best focus your time and energy and how to share your work,

because there are inherent risks associated with social media engagement. And don’t forget to review and understand WSU’s social media policy about employees’ use of social media on the job. And then you can also check to see if your college or campus has its own guidelines; some do. I’ve listed them here. And there is a best practices information site for using social media.

And you can check that out. The link is there. Social media WSU.

If you are interested in learning more about conveying the significance of your work, I encourage you to take the research impact challenge. We did this challenge earlier this year. It’s a series of challenges with ten activities that are designed to help you better understand and manage your online scholarly research presence and measure the success and impact of your work… [Unintelligible]

There are four self-paced sessions if you go to that site. The first five challenges fall under the online scholarly presence topic, and the last five challenges focus on various ways of measuring the success and impact of your scholarly work and strategies for making a compelling case for the value of your work.

I have several resources for you to investigate on research metrics. And as a reminder, these slides and a link to the recording of today’s session will be emailed to you in the next few days, and within a few weeks will be available on orap.wsu.edu.