
Citing sources isn't good enough for science anymore

The best research needs ideas from a wider range of people and publications, writes Anita Schjøll Brede.

Next time you read a scientific paper, check how many times the author cites their own previous work, work by co-authors, or work in the same journal. We're not saying this sort of self-citation is never justified, but it might be taking self-promotion a bit far when 445 of a paper's 490 references point to the same journal1. Maybe, just maybe, the citation system could use some peer review of its own.
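If you're curious, that check is easy to automate. Here is a minimal sketch in Python, assuming you already have the reference list as plain-text strings; the author name, journal title, and references below are all invented for illustration:

```python
# Minimal sketch: count how often a given surname or journal title
# appears in a paper's reference list. Assumes the references are
# already plain-text strings; real reference parsing is messier.

def count_matches(references, term):
    """Count the references that contain `term`, case-insensitively."""
    term = term.lower()
    return sum(1 for ref in references if term in ref.lower())

# Hypothetical reference list, invented for this example.
references = [
    "Smith, J. et al. (2019). Journal of Example Studies, 12, 34-56.",
    "Smith, J. (2017). Journal of Example Studies, 9, 1-20.",
    "Jones, A. (2018). Another Journal, 3, 100-110.",
]

print(count_matches(references, "Smith, J"))            # possible self-citations: 2
print(count_matches(references, "Journal of Example"))  # same-journal references: 2
```

A ratio anywhere near 445 out of 490 should raise an eyebrow.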

We like to imagine science is more than just a cartel. One of the things that sets science apart from magic is documenting the chains of evidence. This person ran this experiment with result X. That group observed phenomenon Y. So, I wondered about Z and tested it.

It's the so-called citation system, in which scientists nod to research that enabled their own work or point out studies they think are flawed in some way. We all learned a version of it in school, when our teachers made us cite our sources. And no, Wikipedia still doesn't count as a definitive source.

But as the total body of knowledge grows, keeping track of work that's related to yours gets harder and harder. Pretty soon, scientists become like the apocryphal person who looks for their lost glasses under a streetlight, not because they lost them there but because that's where they can see.

Most researchers keep track of a few prominent journals and research groups, set up some saved searches on sites in their field, and talk about new papers with colleagues they bump into at conferences. But it's hard to shake the feeling that there's a whole world of work out there which you could explore if you had just a little more time.

Instead, we end up with citation biases. Even without organised citation cartels, researchers cite their own work, that of their immediate colleagues, and the top researchers in the field. It creates a narrow pyramid of citation in each discipline, in which most people pay tribute in the footnotes to their higher-ups. Women get cited less than men, even by other women.

This is no way to do good science. Useful ideas can come from a neighbouring pyramid of knowledge, for one thing. They can come from scientists working in other languages. And they can definitely come from women.

In physics, a field with one of the most open publishing systems in the scientific world, an Iris client found an obscure Croatian journal article that helped them improve their LED design. The faster good minds can find good ideas, the faster researchers like that can save energy – and our planet.

How we communicate results in science is always changing. Digitisation over the last couple of decades has put things like that Croatian paper at the fingertips of many more people than ever before.

But other barriers distort the kind of science we actually read, act on, and cite. This creates self-perpetuating biases of all kinds. For one thing, online search engines have not just sped up how we discover things; they may also have narrowed the range of what we find2.

A conventional online search is a bit like taking a taxi instead of a bicycle: you tell the driver where you're going and zone out. You'll probably get there, but you won't notice much along the way. Back when researchers had to walk the library stacks to find the journals they were looking for, they encountered other journals on the shelves. Some were just in the way, but others led to serendipitous, exciting results. Digital librarians and others have wrestled with how best to replicate that, with approaches ranging from The Bohemian Bookshelf to the Iris spatial visualisation of journal search results.

At least some researchers argue that interdisciplinary reading and unexpected encounters, like the kind you stumble into with a wrong turn in the library stacks or down an unfamiliar street, can lead to useful discoveries3,4. Part of their argument is that if you're trying to make new discoveries, you can't really know ahead of time what to look for. In those cases, a fuzzier, less-targeted search might be more useful.

So on top of the bias of being overwhelmed, online searching introduces biases that have to do with how you search. A recent study has shown that scientific disciplines use vocabulary differently5, so if you search using the vocabulary of your own field, you're probably missing interesting results in adjacent fields.
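To make that concrete, here is a toy sketch of cross-field query expansion. The synonym table is entirely invented for illustration; a real tool would have to learn these mappings from the literature rather than hard-coding them:

```python
# Toy sketch of cross-field query expansion. The synonym table below is
# invented for illustration; a real system would learn these mappings
# from the literature itself.

FIELD_SYNONYMS = {
    "machine learning": ["statistical learning", "pattern recognition"],
    "fatigue": ["cyclic loading damage", "material degradation"],
}

def expand_query(query):
    """Return the original query plus equivalent terms other fields use."""
    return [query] + FIELD_SYNONYMS.get(query.lower(), [])

print(expand_query("machine learning"))
# ['machine learning', 'statistical learning', 'pattern recognition']
```

Searching with all three phrases, rather than just the one your own field favours, is a crude way of peeking into the neighbouring pyramids.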

That's really just another version of the so-called filter bubble we all live in online. We tend to seek out media we like, that we agree with, and that reinforce things we already know. It works well for more-of-the-same entertainment, such as the recommendations Netflix and Spotify serve up, but it rarely challenges us to be creative.

A couple of years ago, researchers at two European institutes were exploring the kind of problem you can't solve with more-of-the-same thinking: could they make reusable rockets from composite materials? In an exercise we call a SciThon, two teams took competing approaches. One used the literature search site of its choice, while the other used our AI scientific literature search tool, Iris.

By the end of the day, an expert panel gave its highest ratings to papers found by the Iris-equipped team, which hinted at useful applications of silicon-based nanoparticles and at adapting a health-monitoring system to liquid rocket engines. The search teams may not have expected those findings, but they turned out to be relevant to the problem6. Perhaps because they were not rocket science, they had new things to contribute to, well, rocket science.

Science needs lots of ways of beating its own biases. We need people to be able to peer over the cubicle wall at what their colleagues are doing and ask a quick question or make a suggestion. But we also need them to be able to do the same thing with people living on the other side of the world, speaking a different language, and working in a different field altogether.

We can't keep talking past each other, and we can't keep leaving out important voices. Citation as we know it doesn't go far enough to connect papers with busy scientists. We need to use all the new tools available, from artificial intelligence to the latest in data visualisation, and we probably need to keep inventing new ones if we want our science to outgrow our own citation biases.

Anita Schjøll Brede is CEO of Iris.ai
