Google Scholar reveals its most influential papers for 2020
Artificial intelligence papers amass more citations than those on any other research topic.
13 July 2020
Google Scholar has released its annual ranking of most highly cited publications. Artificial intelligence (AI) research dominates once again, accumulating huge numbers of citations over the past year.
Computer vision research in particular attracts a high number of citations over a short period of time. Many of the most highly cited papers in this ranking are centred on object detection and image recognition – research that is crucial for technologies such as self-driving cars and surveillance.
The high citation numbers for AI-related papers mirror the increasing importance governments around the world are placing on the technologies they underpin.
In February, the United States government announced its commitment to double research and development spending in non-defense AI and quantum information science by 2022.
In April, the European Commission announced that it is increasing its annual investments in AI by 70% under the research and innovation programme, Horizon 2020.
Google Scholar is the world’s largest database of its kind, tracking citation information for almost 400 million academic papers and other scholarly literature.
The 2020 Google Scholar Metrics ranking, which is freely accessible online, tracks papers published between 2015 and 2019, and includes citations from all articles that were indexed in Google Scholar as of June 2020.
The most highly-cited paper of all, "Deep Residual Learning for Image Recognition", published in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, was written by a team from Microsoft in 2016. It has made a huge leap from 25,256 citations in 2019 to 49,301 citations in 2020.
“Deep learning”, a seminal review of the potential of AI technologies published in Nature in 2015, saw its citations climb from 16,750 in 2019 to 27,375 in 2020.
It is the most highly cited paper in the listing for Nature, which Google Scholar ranks as the most influential journal based on a measure called the h5-index: the h-index for articles published in the past five years.
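To make that measure concrete, here is a small Python sketch of the h-index calculation (illustrative only: the citation counts are invented, and this is not Google Scholar’s own code):

```python
def h_index(citations):
    """The h-index: the largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation counts for a venue's articles; restricting the input
# to articles published in the last five years gives the h5-index.
print(h_index([100, 50, 30, 8, 5, 4, 3]))  # -> 5
```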
Three of the top five papers listed by Google Scholar for Nature are related to AI. The other two are genetics papers, and citation counts for the AI papers are significantly higher.
For example, “Deep learning”, the most-cited AI paper in Nature, has 27,375 citations, while “Analysis of protein-coding genetic variation in 60,706 humans”, the highest-ranked non-AI paper in the journal, has 6,387.
Of the 100 top-ranked journals in 2020, six are AI conference publications. Their papers tend to amass citations much faster than papers in influential journals such as The New England Journal of Medicine, Nature, and Science.
Such rapid accumulation of citations may be explained in part by the fact that these annual conferences, which can attract thousands of attendees from around the world, are where new software – often open source – is shared and later built upon by the community.
Below is our 2020 selection of Google Scholar’s most highly cited articles published by the world’s most influential journals.
See our 2019 coverage for a selection that includes the high-performers mentioned above.
1. “Adam: A Method for Stochastic Optimization” (2015)
International Conference on Learning Representations
47,774 citations
Adam is a popular optimization algorithm for deep learning – a subset of machine learning that uses artificial neural networks inspired by the human brain to imitate how the brain develops certain types of knowledge.
Adam was introduced in this paper at the 2015 International Conference on Learning Representations (ICLR) by Diederik P. Kingma, today a machine learning researcher at Google, and Jimmy Ba from the Machine Learning Group at the University of Toronto, Canada. Adam has since been widely used in deep learning applications in computer vision and natural language processing.
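The update rule at the heart of the paper is compact enough to sketch in a few lines. Below is a minimal, illustrative NumPy version of a single Adam step (a hypothetical helper, not the authors’ reference implementation; the default hyperparameters follow the paper):

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: adapt the step size per parameter using running
    estimates of the gradient's first and second moments."""
    m = beta1 * m + (1 - beta1) * grad        # biased first-moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # biased second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction for step t
    v_hat = v / (1 - beta2 ** t)
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v

# Toy usage: minimise f(x) = x^2, whose gradient is 2x.
theta, m, v = 5.0, 0.0, 0.0
for t in range(1, 501):
    theta, m, v = adam_step(theta, 2 * theta, m, v, t, lr=0.1)
print(round(theta, 4))  # close to 0, the minimum
```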
ICLR, one of the most prestigious machine learning conferences, is an important platform for researchers whose papers are accepted. In May 2020, the conference drew 5,600 participants from nearly 90 countries to its virtual sessions – more than double the 2,700 in-person attendees in 2019.
2. “Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks” (2015)
Neural Information Processing Systems
19,507 citations
Presented at the 2015 Neural Information Processing Systems annual meeting in Canada, this paper describes what has now become the most widely-used version of an object detection algorithm called R-CNN.
Object detection is a major part of computer vision research, used to identify objects such as humans, cars, and buildings in images and videos.
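To give a feel for what this looks like in practice, here is a short sketch that runs a pretrained Faster R-CNN detector via torchvision, which ships an implementation of the architecture (assuming PyTorch and torchvision are installed; the weights argument name varies across torchvision versions, and the random tensor stands in for a real photograph):

```python
import torch
from torchvision.models.detection import fasterrcnn_resnet50_fpn

# Load torchvision's Faster R-CNN with weights pretrained on the COCO dataset.
model = fasterrcnn_resnet50_fpn(pretrained=True)
model.eval()

# A random tensor stands in for a 480x640 RGB photograph.
image = torch.rand(3, 480, 640)
with torch.no_grad():
    prediction = model([image])[0]

# Each detection is a bounding box, a COCO class label, and a confidence score.
for box, label, score in zip(prediction["boxes"], prediction["labels"], prediction["scores"]):
    if score > 0.8:
        print(int(label), [round(c, 1) for c in box.tolist()], round(float(score), 2))
```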
The lead author, Shaoqing Ren, is also a co-author of Google Scholar’s most-cited paper for 2020, "Deep Residual Learning for Image Recognition", which has amassed almost 50,000 citations.
The Faster R-CNN paper was also co-authored by Ross Girshick, one of the inventors of R-CNN and now a research scientist at Facebook AI.
In the same week that “Faster R-CNN” was presented by Ren and his colleagues, Girshick presented a paper on “Fast R-CNN”, another version of R-CNN, at a different conference. That paper, presented at the 2015 IEEE International Conference on Computer Vision in Chile, has amassed more than 10,000 citations.
3. “Human-level control through deep reinforcement learning” (2015)
Nature
10,394 citations
After “Deep learning” (mentioned above), which is Nature’s most highly cited paper in the Google Scholar Metrics ranking, this paper is the journal’s second-most cited paper for 2020.
It centres on reinforcement learning – how machine learning models are trained to make a series of decisions by interacting with their environments.
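In its simplest, tabular form, the core of reinforcement learning fits in a few lines. The toy sketch below shows the classic Q-learning update (the state and action sizes are invented, and the paper’s deep Q-network replaces this lookup table with a neural network that reads raw screen pixels):

```python
import numpy as np

n_states, n_actions = 5, 2            # hypothetical toy environment
Q = np.zeros((n_states, n_actions))   # table of state-action value estimates
alpha, gamma = 0.1, 0.99              # learning rate and discount factor

def q_update(state, action, reward, next_state):
    # Nudge Q(s, a) towards the observed reward plus the discounted
    # value of the best action available in the next state.
    target = reward + gamma * Q[next_state].max()
    Q[state, action] += alpha * (target - Q[state, action])

# One hypothetical transition: taking action 1 in state 0 earns reward 1.
q_update(state=0, action=1, reward=1.0, next_state=2)
print(Q[0, 1])  # 0.1 after a single update
```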
The paper was authored by a team from Google DeepMind, a London-based organization, acquired by Google in 2014, that has developed AI technologies for diagnosing eye diseases, conserving energy, and predicting the complex 3D structures of proteins.
4. “Attention Is All You Need” (2017)
Neural Information Processing Systems
9,885 citations
Authored by researchers at Google Brain and Google Research, this paper proposed a new deep learning model called the Transformer.
Designed to process sequential data such as natural language, the Transformer is used in translation, text summarization, and voice recognition technologies, as well as other applications involving sequence analysis, such as DNA, RNA, and peptide sequencing. It’s been used, for example, to generate entire Wikipedia articles.
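The Transformer’s central operation is scaled dot-product attention, defined in the paper as Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. Here is a minimal NumPy sketch of that formula (illustrative only, with made-up toy dimensions):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V, as in the paper."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # how strongly each query matches each key
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V                # each output is a weighted mix of values

# Toy example: 3 query positions attending over 4 key/value positions, d_k = 8.
rng = np.random.default_rng(0)
Q, K, V = rng.normal(size=(3, 8)), rng.normal(size=(4, 8)), rng.normal(size=(4, 8))
print(scaled_dot_product_attention(Q, K, V).shape)  # (3, 8)
```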
Earlier this year, researchers at Google predicted that Transformer could be used for applications beyond text, including to generate music and images.
The paper was part of the proceedings from the 2017 Neural Information Processing Systems conference held in Long Beach, California.
5. “The Third International Consensus Definitions for Sepsis and Septic Shock (Sepsis-3)” (2016)
JAMA
8,576 citations
The first formal revision of the definitions of sepsis and septic shock in 15 years, this paper describes a condition that’s estimated to affect more than 30 million people worldwide every year.
Led by the European Society of Intensive Care Medicine and the Society of Critical Care Medicine, the study convened a task force of 19 critical care, infectious disease, surgical, and pulmonary specialists in 2014 to provide a more consistent and reproducible picture of sepsis incidence and patient outcomes.
The paper, led by Mervyn Singer, professor of intensive care medicine at University College London, is by far the most highly cited paper in JAMA. The second-most highly cited paper, on opioids, has 3,679 citations, according to Google Scholar.
6. “limma powers differential expression analyses for RNA-sequencing and microarray studies” (2015)
Nucleic Acids Research
8,328 citations
limma is a widely used, open-source analysis tool for gene expression experiments, and has been available for more than a decade. A large part of its appeal is the ease with which new functionality and refinements can be added as new applications arise.
This paper, led by Matthew Ritchie from the Molecular Medicine Division of the Walter and Eliza Hall Institute of Medical Research in Melbourne, Australia, is presented as a review of the “philosophy and design” of the limma package, looking at its recent and historical features and enhancements.
The journal, Nucleic Acids Research, while ranked outside the top 10 of Google Scholar’s most influential journals, has more papers with 3,000+ citations each than The Lancet (ranked 4th).
7. “Mastering the game of Go with deep neural networks and tree search” (2016)
Nature
8,209 citations
Viewed as one of the most challenging classic games to master, Go is a 2,500-year-old game that will put any player – living or otherwise – through their paces.
In 2016, a computer program called AlphaGo defeated the world Go champion, Lee Sedol, in what would be hailed as a major milestone for AI technology. AlphaGo was the brainchild of computer scientist David Silver, whose work on computer Go began when he was a PhD student at the University of Alberta in Canada.
This paper, co-led by David Silver and Aja Huang, today both research scientists at Google DeepMind, describes the technology that underpins AlphaGo. It is the third-most highly cited in Nature, according to Google Scholar.
In 2017, the team introduced AlphaGo Zero, which improves on previous iterations by using a single neural network, rather than two, to evaluate which sequence of moves is the most likely to win. That paper is the eighth-most cited in Nature.