Prem Devanbu, Collaborators Receive 2022 ICSE Most Influential Paper Award
In 2012, UC Davis Computer Science Distinguished Professor Prem Devanbu, his then-colleague Zhendong Su, postdoctoral scholars Abram Hindle and Earl Barr, and their collaborator and former student Mark Gabel changed the field of software engineering with their paper, “On the Naturalness of Software.” Ten years later, the paper’s legacy has been recognized with a Most Influential Paper Award from the 2022 International Conference on Software Engineering (ICSE).
“It is not often that research is conducted that changes the course of a field,” said the University of British Columbia’s Gail Murphy in a 2016 technical perspective on the paper. “The demonstration by the authors that software is natural and that statistical language models apply fundamentally opens up new approaches to creating scalable, useful software development tools.”
Just as English has phrases like “bread and butter” that are predictable and almost always written the same way, code has commonly used, repetitive phrases like i = i + 1. This observation—first made in a paper by Gabel and his advisor, Su—led to a friendly argument between Su and Devanbu about what they could do with it. They eventually landed on the idea of applying natural language models, which analyze, predict and generate text, to do the same for code.
“These [repetitive phrases] are common enough that you can capture them in a mathematical, computational model and use it to help programmers,” said Devanbu. “It turns out these models are really good at figuring out what usually goes together and then they can help programmers write the code.”
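The intuition behind capturing repetitive code in a statistical model can be illustrated with a toy sketch. This is not the paper’s actual implementation (the authors trained n-gram models on large code corpora); it is a minimal bigram model over a handful of hand-tokenized lines, showing how counting which tokens follow which lets a model predict what a programmer is likely to type next.

```python
from collections import Counter, defaultdict

# Toy corpus of tokenized code lines (illustrative only; the real
# work used large corpora of real projects, not three lines).
corpus = [
    ["for", "(", "i", "=", "0", ";", "i", "<", "n", ";", "i", "=", "i", "+", "1", ")"],
    ["i", "=", "i", "+", "1", ";"],
    ["j", "=", "j", "+", "1", ";"],
]

# Count bigram frequencies: how often token b follows token a.
bigrams = defaultdict(Counter)
for tokens in corpus:
    for a, b in zip(tokens, tokens[1:]):
        bigrams[a][b] += 1

def predict_next(token):
    """Return the most frequent successor of `token` in the corpus."""
    counts = bigrams.get(token)
    return counts.most_common(1)[0][0] if counts else None

print(predict_next("+"))  # the corpus most often continues "+" with "1"
```

Real code-completion models are vastly more sophisticated, but the principle is the same: because software is repetitive, even simple frequency counts carry useful predictive signal.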
The paper was the first to suggest this idea, and it outlined a vision of the different ways language models could help programmers. In doing so, it opened up a whole new field of research. Industry leaders like Google, Microsoft and Amazon now have groups working in this area and employ several UC Davis alumni. Today, these models can hold their own in coding competitions against skilled human programmers, generate code from a prompt and translate code from one programming language to another. Devanbu also notes how impactful deep learning has been in this area.
“Since we did this work, the whole deep learning revolution has come in and has essentially turbocharged the idea and made it really powerful,” he said. “Almost every paper that cites our 2012 paper today is about deep learning.”
The authors have all had success in the paper’s wake. Devanbu and Su, now at ETH Zurich, continued their distinguished research careers in the area and have been well-funded by the National Science Foundation and industry. Barr is now a professor at University College London and Hindle is an associate professor at the University of Alberta. The authors keep in touch, and Devanbu, Barr and Hindle reunited virtually at ICSE 2022 to talk about their paper and its legacy.
“It’s the paper I feel happiest to have been involved in in my career,” said Devanbu. “It is definitely the most cited paper, and it was also the most fun paper to write. I was also very happy to work with the people I worked with.”