Barbara Liskov Wins Turing Award
ACM cites 'foundational innovations' in programming language design
Dr. Dobb's Journal
Mar 10, 2009
Barbara Liskov has won the Association for Computing Machinery's A.M. Turing Award, one of the highest honors in science and engineering, for her pioneering work in the design of computer programming languages. Liskov's achievements underpin virtually every modern computing-related convenience in people's daily lives.
Liskov, the first U.S. woman to earn a PhD in computer science, was recognized for helping make software more reliable, consistent and resistant to errors and hacking. She is only the second woman to receive the honor, which carries a $250,000 purse and is often described as the "Nobel Prize in computing."
Liskov heads the Programming Methodology Group in the Computer Science and Artificial Intelligence Laboratory at MIT, where she has conducted research since 1972. Last year, she was named an Institute Professor, the highest honor awarded to an MIT faculty member.
Liskov's early innovations in software design have been the basis of every important programming language since 1975, including Ada, C++, Java and C#.
Liskov's most significant impact stems from her influential contributions to the use of data abstraction, a valuable method for organizing complex programs. She was a leader in demonstrating how data abstraction could be used to make software easier to construct, modify and maintain. Many of these ideas were derived from her experience at MIT in building the VENUS operating system, a small timesharing system that dramatically lowered the cost of providing computing and made it more interactive.
In another contribution, Liskov designed CLU, an object-oriented programming language incorporating clusters to provide coherent, systematic handling of abstract data types. She and her colleagues at MIT subsequently developed efficient CLU compiler implementations on several different machines, an important step in demonstrating the practicality of her ideas. Data abstraction is now a generally accepted fundamental method of software engineering that focuses on data rather than processes.
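The core idea behind CLU's clusters — an abstract data type whose operations are public while its representation stays hidden — can be sketched in a modern language. The following is an illustrative example in Python, not CLU itself; the `IntSet` type and its operations are hypothetical, chosen only to show the principle:

```python
# A sketch of an abstract data type in the spirit of CLU's clusters:
# clients call only the public operations; the representation (here,
# a plain list) is hidden and can be swapped out without breaking callers.
class IntSet:
    def __init__(self):
        self._elems = []          # hidden representation

    def insert(self, x):
        if x not in self._elems:  # preserve the "no duplicates" invariant
            self._elems.append(x)

    def member(self, x):
        return x in self._elems

    def size(self):
        return len(self._elems)

s = IntSet()
s.insert(3)
s.insert(3)   # duplicate; the invariant keeps the set unchanged
s.insert(7)
print(s.size())     # 2
print(s.member(3))  # True
```

Because callers never touch `_elems` directly, the list could be replaced by a balanced tree or hash set — the separation of interface from implementation is exactly what makes such software easier to modify and maintain.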
Building on CLU concepts, Liskov followed with Argus, a distributed programming language. Its novel features led to further developments in distributed system design that could scale to systems connected by a network. This achievement laid the groundwork for modern search engines, which are used by thousands of programmers and hundreds of millions of users every day and which face the challenges of concurrent operation, failure and continually growing scale.
Her most recent research focuses on techniques that enable a system to continue operating properly in the event of the failure of some of its components. Her work on practical Byzantine fault tolerance demonstrated that there were more efficient ways of dealing with arbitrary (Byzantine) failures than had been previously known. Her insights have helped build robust, fault-tolerant distributed systems that are resistant to errors and hacking. This research is likely to change the way distributed system designers think about providing reliable service on today's modern, vulnerable Internet.
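A key bound underlying this line of work is that tolerating f arbitrary (Byzantine) failures requires n = 3f + 1 replicas, with quorums of 2f + 1 matching replies. The following quick calculation illustrates that bound only — it is not the PBFT protocol itself, and the function name is hypothetical:

```python
def pbft_sizes(f):
    """Replica and quorum counts needed to tolerate f Byzantine faults,
    per the classic n = 3f + 1 bound used by practical Byzantine
    fault tolerance (PBFT)."""
    n = 3 * f + 1        # total replicas required
    quorum = 2 * f + 1   # matching replies a client must collect
    return n, quorum

for f in range(1, 4):
    n, q = pbft_sizes(f)
    print(f"f={f}: {n} replicas, quorum of {q}")
```

For example, surviving a single Byzantine replica takes four replicas and agreement among three of them, which is why even small fault-tolerant clusters come in these characteristic sizes.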
On the occasion of her winning the Turing Award, MIT Institute Professor Barbara Liskov discussed her role in shaping the past, present and future of computer science.
Q: When you began your career in computer science, it was still a relatively young field. How have you seen this discipline evolve over time -- at MIT and elsewhere?
A: The change has been tremendous. When I started, most of the field was unexplored and there were obvious problems everywhere -- lots of low-hanging fruit, but also very fundamental issues that were poorly understood and very confusing. Today the field is on a very sound foundation. There are still many problems to work on, but now this work happens in the context of all that has gone before. When I started, this context was missing, so you just struck out on your own.
Q: Looking back at your career, what is the single accomplishment of which you are most proud?
A: Probably the development of the concept of data abstraction and the CLU programming language. This work was done at MIT in the 1970s.
Q: Where do you plan to focus your research going forward?
A: Today I am working primarily on distributed systems -- systems that run on many computers connected by a network like the Internet. My focus recently has been on the security of online storage. I believe that more and more users will store their information online, but the storage they use needs to be implemented so that they don't lose their information, their information is available when they need it, and they can be confident that their confidential information will not be leaked.
Q: As the first U.S. woman to earn a PhD in computer science, what advice would you give to other women who are considering going into this field?
A: I have found computer science to be a wonderful field to work in. I think the main reason is that the kind of thinking and problem-solving it requires matches my abilities. I believe that finding work to do that you like and are good at is the most important way to find a satisfying career. Young women (and young men) who find that computer science is a match for them should pursue it. There is lots of interesting work remaining to be done.
Q: When you began studying computer science at Stanford, computers were big mainframes and the Internet was still in the distant future. Today, computers fit in the palm of our hands -- many are much smaller -- and the Internet is ubiquitous. Given that you have watched these transformations over the last five decades from a front-row seat, what do you think the next half-century will hold?
A: I don't have a crystal ball! It seems obvious that computers and the Internet will continue to be very important to individuals, companies and society. But I don't know the exact form this will take.
Copyright © 2006 CMP Media LLC