Rasmus Kyng officially joined the Department of Computer Science at ETH Zurich at the beginning of September 2019 as Tenure Track Assistant Professor of Theoretical Computer Science. Get to know him in this short interview.

Professor Rasmus Kyng joined the Department of Computer Science in September 2019.

Professor Kyng, welcome to ETH! What are your current research interests?
I’m interested in some of the foundational questions concerning fast algorithms and optimisation. One of my areas of focus is numerical linear algebra, which, in turn, is used a lot in scientific computing, machine learning and data science. It is a very exciting time to be working in this area because we might be on the cusp of more theoretical breakthroughs, and, at the same time, some maturing research directions are having a significant practical impact.

Zooming in, linear equation solvers are one of the central tools of numerical linear algebra, and we’ve been making great strides in developing their theory over the past 15 years. If we can continue expanding this theory and apply it to more types of linear equations, it could be useful in a wide range of fields. In one sense, these algorithms make only a quantitative difference: hopefully, you will be able to solve much bigger problems and solve them much more quickly. But ultimately, for practitioners, this can turn into a qualitative difference if it means they can now model systems that were previously intractable or required huge supercomputing clusters. In the next five years, I hope this will be the case in scientific computing and in applied operations research.
In the longer term, if we can get excellent linear equation solvers for sufficiently broad classes of matrices, then we could potentially make it viable to replace first-order optimisation (i.e. making progress based on gradient descent) with second-order optimisation (i.e. making progress based on solving linear equations) in many domains.
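As a rough, editorially added illustration of this distinction (a minimal sketch, not Professor Kyng’s own methods), the snippet below contrasts a single gradient-descent step with a single Newton step on a small quadratic objective. The Newton step requires solving a linear system in the Hessian, which is exactly where fast linear equation solvers would come in.

```python
# Minimal sketch: first-order vs. second-order steps on the quadratic
# f(x) = 0.5 * x^T A x - b^T x, whose gradient is A x - b.
import numpy as np

rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # symmetric positive definite Hessian
b = rng.standard_normal(5)
x = np.zeros(5)
grad = A @ x - b                   # gradient of f at x

# First-order (gradient descent) step: needs only the gradient.
x_gd = x - 0.01 * grad

# Second-order (Newton) step: needs a linear system solve in the Hessian.
# On a quadratic, a single Newton step lands exactly on the minimiser A^{-1} b.
x_newton = x - np.linalg.solve(A, grad)

print(np.linalg.norm(A @ x_gd - b))       # still far from zero after one step
print(np.linalg.norm(A @ x_newton - b))   # essentially zero
```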

I’m also very excited about algorithms and optimisation topics outside of numerical linear algebra. Historically, continuous optimisation research and discrete algorithm design have been carried out in separate communities, but over the past decade, there has been a lot of activity at the boundary between the two, and this exchange has been very fruitful. This recent progress makes it natural to revisit some of the core problems in optimisation and ask: are our algorithms optimal, or can we do even better? On the one hand, it looks increasingly plausible that we could develop better algorithms for important classes of problems, such as solving linear programs, and I’m looking forward to spending time on this. On the other hand, we’re also getting better at proving (conditional) lower bounds in fine-grained complexity theory. So, I hope that over the next few years, we’ll get a better understanding of both what is and isn’t possible in optimisation and fast algorithms.

What is the impact of your research on society?
Ultimately, I’m trying to discover what algorithms should look like for many of the basic problems that practitioners in science, engineering and data analysis are hoping to solve.

Over the past few years, my colleagues and I have been working on solving something called Laplacian linear equations. Right now, I’m trying to show that this research can lead to practical ways of numerically solving difficult instances of elliptic partial differential equations. I hope this will make it possible to solve very ill-conditioned diffusion problems, which come up in all sorts of scientific work. Many scientists, ranging from climate researchers to chemical engineers, rely on current state-of-the-art software to try to solve these equations in their daily work. But the software has trouble coping with the most challenging instances. I’ve been meeting practitioners to understand exactly what these instances are like and to see if we can help them do better. There are also several other challenging classes of linear problems practitioners currently struggle to solve that I think we’ll soon make much more tractable.
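For readers unfamiliar with the term, a Laplacian linear equation is a system L x = b in which L is the Laplacian matrix of a graph, L = D - A (degree matrix minus adjacency matrix). The toy sketch below is an editorial illustration that builds such a system for a small path graph and solves it with a generic SciPy routine, not with the specialised Laplacian solvers developed in this line of research.

```python
# Minimal sketch of a Laplacian linear system L x = b on a path graph,
# solved with SciPy's generic conjugate gradient method for illustration.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 6
edges = [(i, i + 1) for i in range(n - 1)]   # path graph on n vertices

# Graph Laplacian L = D - A: +1 on the diagonal per incident edge, -1 off-diagonal.
rows, cols, vals = [], [], []
for u, v in edges:
    rows += [u, v, u, v]
    cols += [v, u, u, v]
    vals += [-1.0, -1.0, 1.0, 1.0]
L = sp.csr_matrix((vals, (rows, cols)), shape=(n, n))  # duplicate entries are summed

# L is singular (L @ ones = 0), so b must sum to zero for the system to be solvable.
b = np.zeros(n)
b[0], b[-1] = 1.0, -1.0

x, info = spla.cg(L, b, atol=1e-10)          # conjugate gradient solve
print(info, np.linalg.norm(L @ x - b))       # info == 0 and a tiny residual
```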
More generally, I try to keep an eye out for applied areas where practitioners are working to solve the kinds of problems I might be able to help with. Another example of that is the issue of routing electricity through transmission grids, which seems like a good fit for the kinds of ideas we have been working on in algorithmic spectral graph theory. I hope to spend more time on that at some point and see if my toolbox can improve the way in which this routing is carried out.

Where were you working before you came to ETH?
I just finished a postdoc at Harvard’s Theory of Computation research group. Before that, I spent a semester as a postdoctoral fellow at the Simons Institute for the Theory of Computing at UC Berkeley after finishing my doctorate at Yale in the summer of 2017.

Which courses will you be teaching at ETH?
I’m not teaching in the autumn, although I might host a reading group. I’m still deciding what I’ll teach in the spring. It will be a Master’s level course, most likely on recent topics in fast algorithms and optimisation from a theoretical computer science perspective. I’d like to acquaint students with some of the exciting activity in these fields and give them tools that will be useful for continuing research in these areas.

Name an interesting fact about your research.
Like most researchers in mathematical fields, I like working through small examples by hand to develop an understanding of a problem. But to go beyond the small examples, I often rely on coding up numerical experiments to handle complex questions and analyse larger instances. It’s both fun and challenging trying to come up with useful experimental hypotheses to test. And it’s still an underappreciated skill that I think one can get a lot of mileage out of. There’s a nice quote by the mathematician V.I. Arnold along those lines. He said that ‘mathematics is the part of physics where experiments are cheap’.

What advice would you give to students who are just starting out in computer science?
I think that the most important things to learn are processes: how to learn things well, how to approach a new subject and how to conduct research.

I’m mostly focused on the science aspect of computer science. Other parts of the discipline are really a matter of engineering. I think of the latter as a distinct but equally important aspect. Aspiring scientists and engineers should think carefully about what questions excite them and what kind of answers they find satisfying.

Finally, students should realise that computer scientists have developed an incredible set of tools for thinking about computation, and this is now impacting all areas of society. But computer science is not everything, and there are many other important ways to view the world.
