Prof. Niklaus Wirth, one of the great pioneers of computer technology, is renowned as a designer of programming languages, compilers, operating systems and even hardware, and his research has always been guided by a constant search for simplicity, clarity and efficiency. The following interview sought not only to highlight Niklaus Wirth’s contemporary views on the latest technological developments, but also to capture, to some degree, the personality of a computer scientist through and through.

Prof. Niklaus Wirth, one of the great pioneers of computer technology, said in an interview in 1997: "A good designer must rely on experience, on precise, logic thinking, and on pedantic exactness. No magic will do." As a university professor, he inspired many young people not only to understand science and technology but, even more importantly, to create meaningful and enduring technologies themselves.

At first a fire-breathing revolutionary, later an extraordinary statesman, Nelson Mandela was universally adored and idolized. He is widely quoted as saying "it always seems impossible until it’s done." Accomplishing great things and releasing breakthrough innovations that transform sectors and societies will always involve opposition and setbacks. What inspired you to start a career in science, and what is your proudest personal achievement as a researcher?
In primary school, I first wanted to become a steam-engine driver, later a pilot. I never aspired to become a scientist but rather an engineer who understands nature and does something useful with this knowledge. My urge to construct things led me from designing and building model airplanes, to setting up a chemistry lab in the basement, and finally to electronics, building radios, transmitters, measuring equipment and television sets. What drove me? Curiosity. In hindsight, the proudest moments of my career were becoming an assistant professor at Stanford in 1963 (before any computer science departments existed), then becoming a professor at the renowned ETH in 1968, and finally receiving the prestigious Turing Award from the ACM in 1984.

Every January, tens of thousands flock to Las Vegas for the International Consumer Electronics Show, one of the largest technology trade shows in the world. In a nutshell, the industry is trying to move on to “the next big thing” and is always looking to uncover the next generation of computing, be it in your car, on your TV or on your wrist. Are there any particular (commercial) innovations that have caught your attention?
I have always been primarily interested in technical achievements rather than commercial "innovations" and profits, the latter often amounting to relatively useless gadgets and gizmos. It is mainly the tremendous advances in semiconductor technology (thanks to physics and chemistry), rather than informatics, that open the door to new applications. Improvements in speed and storage capacity by factors of thousands represent quantum leaps, and it is difficult to predict their effects. The last such leap was the marriage of computing and communication, causing everybody to grab their mobile phone as soon as they sit down in a tram, a train or an airplane. Perhaps the next leap will be to implant these gadgets in one's brain. Progress?

Our department strongly supports the IT-dreamjobs campaign, which strives to fuel young people's passion for computing. However, given Edward Snowden's recent revelations about the National Security Agency's broad collection of information and technological spying, one may have doubts about the freedoms promised by the internet. Let's assume the White House invites you for an off-the-record dinner. What advice would you, as a software engineer, give President Obama on striking a reasonable balance between the threats posed by global terrorism and the right to privacy?
I am grateful for not being consulted by the White House, and I frankly would not know how to advise President Obama. He has enough advisors! But I am surprised by the naivety of people (and of many politicians). Even if one is not informed about the latest state of spy technology, one must assume that it is extensively used to gather information everywhere. People spread their personal joys and grievances freely into the air on telephones, on the internet in blogs, as if there were nothing more interesting than their banal "news". – Intelligence agencies will continue to use the latest tricks, because that is their task. As long as they are not caught, nobody will object.

What made you select software and languages as your research topics?
My first contact with what was later to be called "software" came at Berkeley in 1961. There I discovered a small group of young people fiddling with a program that translated a language into binary code for the IBM 704 computer. This program was written in the very language it processed. I found this rather intriguing, but understanding the program was a sheer nightmare, and only one person, a woman, seemed to possess some knowledge of it. I decided that bringing order and some scientific principles into this mess might be a worthwhile topic for a dissertation. The publication of the Algol 60 Report promised to lift the witchcraft of compiling onto a scientific level. The results were the principle of operator precedence parsing and the Euler language, created in 1963. However, it remained largely an academic experiment: suitable for obtaining a degree, but not a useful product in practice.

I therefore continued to work towards orderly, well-defined and practical programming languages, always implementing them and writing compilers. The first was Algol W (1966). Coming back to Switzerland, I found that no available language was suitable for teaching at a satisfactory level, so I continued development, creating Pascal in 1970. From then on, my goal was to create a language that was scientifically clean, i.e. defined not in terms of a mechanism (or even a specific computer), but in terms of a mathematical structure of axioms and derivation rules. This turned out to be an elusive goal. Some new languages became overly complex under the demands of practitioners and collapsed under their own complexity; others became too restrictive for practical use because of their academic straitjackets. I created Modula-2 (1979), which was Pascal extended with a module concept for system development in teams, but it became unduly complicated. Only a rigorous stripping of unnecessary features and a cleaning up of "bells and whistles" brought relief with the language Oberon (1988).

The driving force throughout these years was to be able to teach programming in a scientific, mathematical manner, devoid of computer jargon and idiosyncrasies and based on proper abstractions. Oberon (in 2007) has reached this goal to a remarkable degree. My message is this: programs must not (only) be coded to "run" on obedient computers, but must (also) be written to be read and understood by humans. In summary, my research has always been influenced by teaching, and I followed the time-honored credo at ETH to value the symbiosis between research and teaching.
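
Operator precedence parsing, mentioned above as one result of that dissertation work, rests on assigning each operator a numeric binding strength and letting those numbers decide how an expression is grouped. The short Python fragment below is a purely illustrative sketch of that general idea (often called precedence climbing); it is not Wirth's Euler compiler, and its tiny tokenizer and precedence table are assumptions made only for this example.

    # Illustrative sketch of operator precedence parsing (not Wirth's code).
    # Each operator carries a binding strength; higher numbers bind tighter.
    PRECEDENCE = {'+': 1, '-': 1, '*': 2, '/': 2}

    def tokenize(text):
        # Naive tokenizer for the example: integers, + - * / and parentheses.
        return text.replace('(', ' ( ').replace(')', ' ) ').split()

    def parse_primary(tokens):
        # A primary is an integer or a parenthesized sub-expression.
        token = tokens.pop(0)
        if token == '(':
            inner = parse_expression(tokens)
            tokens.pop(0)  # discard the closing ')'
            return inner
        return int(token)

    def parse_expression(tokens, min_prec=1):
        # Build a nested tuple; tighter-binding operators are grouped first.
        left = parse_primary(tokens)
        while tokens and tokens[0] in PRECEDENCE and PRECEDENCE[tokens[0]] >= min_prec:
            op = tokens.pop(0)
            right = parse_expression(tokens, PRECEDENCE[op] + 1)
            left = (op, left, right)
        return left

    print(parse_expression(tokenize('1 + 2 * 3')))    # ('+', 1, ('*', 2, 3))
    print(parse_expression(tokenize('(1 + 2) * 3')))  # ('*', ('+', 1, 2), 3)

The appeal of the technique is that grouping decisions follow directly from the precedence table rather than from a separate grammar rule per operator, which is part of what made it attractive for early compilers.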

For several decades, you have maintained close contact with Russian academics. In December last year, President Putin granted amnesty to a formidable list of (celebrity) prisoners. Everyone understands that the pardons were a one-off event and not a systemic reform. However, there is an amorphous group of middle-class workers, small business owners, intellectuals, artists, writers and political outsiders who do not accept the regime’s rules. As someone knowledgeable about Russia, Prof. Wirth, what makes you optimistic that the liberties of this important minority will last longer than the Winter Olympics in Sochi?
My first visit to Russia was in 1990, and it was an eye-opener. I met many scientists and experienced hospitality, generosity and warmth like hardly anywhere else. The people I met had evidently enjoyed an excellent education and were knowledgeable in matters ranging from Pushkin to Wilhelm Tell to quantum physics. Western journalists too often paint a dark picture, caught as they are in prejudices and traditional beliefs. It is unfair to make direct comparisons between Switzerland and Russia, as their histories, cultures, people and sizes are incomparable. About the near future in Russia (and elsewhere), and in particular about the fate of Russian science and the academy, I am not optimistic.