Published on March 15, 2017

Editorial: Rethinking our indicators of impact and excellence

As epidemiologists we constantly think about indicators and metrics. This issue of the SACEMA Quarterly is a good illustration of that. For instance, Gert van Zyl discusses how the health impact of early antiretroviral treatment (ART) emerges from its ability to decelerate disease progression in HIV-infected people, to prevent onward HIV transmission, and potentially to contribute to a functional cure. Cara Brook describes how, in her study of the transmission dynamics of zoonotic viruses, she needed to combine field work, mathematical models and laboratory techniques to arrive at a more complete picture of the complex interactions between bats, viruses and humans. What is clear from these studies is that any comprehensive, nuanced analysis considers multiple metrics of the effects of X on Y because, when X and Y are complex multidimensional systems, each metric in isolation captures only partial effects. A related observation is that the result of a comparative analysis (e.g. does X1 have a more positive effect on Y than X2?) depends on the metrics used.

Given these well-known limitations of simplifying complex dependencies to one-dimensional indicators, isn’t it surprising that many academics have bought into the practice of measuring the quality and impact of their work by a handful of metrics? While our efforts may span a decade or more of developing, testing, and communicating new ideas with hundreds of people (usually students, colleagues, and people responsible for policy making and programme implementation), we seem to accept – perhaps reluctantly but still – that our publication list and H-index* are sufficient summary statistics of what we have done and, worse, useful indicators of how excellent and impactful we are (or aren’t).

While books have been written about the need for more and better indicators of impact and excellence in academia, surprisingly little attention is given to the challenge and value of being engaged and excelling in non-academic activities. I can’t think of any institutionalised incentives or tokens of recognition for people who aim to do good science and train for a sports event, learn a musical instrument or language, be a part-time stay-at-home parent, or prepare home-cooked meals. Not that we need university approval or a promised promotion to set such goals, but I do believe we need to start an open conversation about our joint perception of what it means to be impactful and excel.

Working late nights and weekends, nose to the grindstone (or rather eyes glued to the computer screen), to keep up with internalised publication pressure and high demand for student supervision, is often perceived as a sign of praiseworthy dedication and commitment. By contrast, taking time out for sport, healthy eating and sleep (!) may mean that one produces fewer papers per year and can supervise fewer postgraduate students. However, I would argue that this alternative choice is a more aspirational goal than being single-mindedly focused on doing good science while neglecting other areas of life. Why more aspirational? Because it requires extra discipline, of a different kind than what is required to become an overcommitted workaholic, and because going against the current is a greater social challenge with a more uncertain outcome.

I contend that scientific discovery through epidemiological modelling and analysis will accelerate and its societal impact will grow if we succeed in becoming healthier, more socially engaged, and more balanced modellers and analysts. Firstly, it is a truism that a healthy lifestyle increases efficiency and creativity at work. Secondly, for scientists to make meaningful, high-impact contributions to today’s conundrums in health policy and practice, mastering “soft” skills is essential. Without empathy, a groundedness in the daily realities of others, and the ability to connect and communicate with people from all segments of society, producing science that changes people’s lives for the better is at best a lucky coincidence. It is precisely through participation in non-professional activities – be it in the sports club, church, book club, or garage band – that this social awareness and connectivity in society is nurtured.

Lastly, our collective ability to make science-driven advances in health is strongly dependent on our ability to recruit, train, and retain a critical mass of talented and inspired epidemiologists and data analysts. In this regard, we have a big challenge ahead of us, given that we are competing with the insurance and banking sectors for data- and maths-savvy graduate students. We can’t outbid our competitors financially, but I believe we can create not only more attractive work environments, but also more appealing career-and-life paths, if we live up to academia’s core values of purpose, mastery, and creative autonomy. In South Africa we are in a unique position to be an example to the rest of the world. We have the climate on our side, plenty of outdoors to be admired and enjoyed, and a plethora of social causes offering an endless stream of opportunities to “give back” and engage with others less fortunate but equally deserving of a fair chance at life. However, none of these attractions matter if we fail to protect young (and not-so-young) academics against external and internalised pressure to pursue academic excellence at the cost of their health, families and personal lives.

As a practical way forward, let us consider a few additions to the standard metrics of impact and excellence by which individuals and institutions are assessed. “Altmetrics” are already in use by a growing number of entities in academia. Typically, these metrics count the number of times a published paper is viewed (i.e. HTML views and PDF downloads), discussed (via journal comments, science blogs, Wikipedia, Twitter, Facebook and other social media), saved (via Mendeley, CiteULike and other social bookmarks), cited (as tracked by Web of Science, Scopus, CrossRef and other databases), and recommended (by F1000Prime, for example). While Altmetrics, too, can be gamed and abused, they go some way towards adding dimensions to the evaluation of how influential a piece of scientific output is. Slightly more adventurous are instruments like Gallup’s Q12 questionnaire, which measures, amongst other things, how well staff rate their workplace with respect to receiving recognition for work well done, feeling socially connected to colleagues, and being offered opportunities to learn and grow. And probably most adventurous would be an active effort from the unit to show that it values social interaction and personal growth beyond the narrow boundaries of the typical scope of the work. Not that having a hobby should be a requirement for being recruited or retained in a job, but the workplace may offer to support one or a few initiatives that come from the employees, preferably with a majority of them behind it, like the contributions to charities and Mandela Day that SACEMA has supported in the past. Other examples might include registering for a sports event as a group, organising a friendly bake-off competition, or starting a fiction book club. Many of these initiatives are already taking place spontaneously, but the important thing is to “own” them, for example by proudly including them in annual reports to funders and other stakeholders.

As individuals and as institutions, we have the power to influence society’s thinking around what it means to engage and excel. Let’s use it.

* A researcher’s H-index is the largest number x such that x of their papers have each been cited at least x times.
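The footnote’s definition lends itself to a short computation: sort the citation counts in descending order and find the largest rank at which the count still matches or exceeds the rank. A minimal sketch in Python (the function name and sample citation counts are purely illustrative, not drawn from any real researcher’s record):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank  # this paper still has enough citations for its rank
        else:
            break  # counts are sorted, so no later paper can qualify
    return h

# Example: five papers with these citation counts give an H-index of 4,
# because four papers have at least four citations each.
print(h_index([10, 8, 5, 4, 3]))  # → 4
```

The single number, of course, illustrates the editorial’s point: two very different bodies of work can collapse to the same value.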