An interview with Brian Kernighan, co-developer of AWK and AMPL

Computerworld's series on the most popular programming languages continues as we chat to Brian Kernighan

Computerworld is undertaking a series of investigations into interesting programming languages. In the past we have spoken to Larry Wall, creator of the Perl programming language, Don Syme, senior researcher at Microsoft Research Cambridge, who developed F#, Simon Peyton-Jones on the development of Haskell, Alfred V. Aho of AWK fame, S. Tucker Taft on the Ada 1995 and 2005 revisions, Microsoft about its server-side script engine ASP, Chet Ramey about his experiences maintaining Bash, Bjarne Stroustrup of C++ fame and Charles H. Moore about the design and development of Forth.

We’ve also had a chat with the irreverent Don Woods about the development and uses of INTERCAL, as well as Stephen C. Johnson on YACC, Steve Bourne on Bourne shell, Falcon creator Giancarlo Niccolai, Luca Cardelli on Modula-3, Walter Bright on D, Brendan Eich on JavaScript, Anders Hejlsberg on C#, Guido van Rossum on Python, Prof. Roberto Ierusalimschy on Lua, John Ousterhout on Tcl, Joe Armstrong on Erlang and Rich Hickey on Clojure. We recently spoke to Martin Odersky about the darling of Web 2.0 start-ups and big corporates alike, Scala.

More recently, we heard from Groovy Project Manager, Guillaume Laforge. He told us the development story behind the language and why he thinks it is grooving its way into enterprises around the world.

This time we spoke with Brian Kernighan — a figure who helped popularise C with his book, co-written with the creator Dennis Ritchie, The C Programming Language and contributed to the development of AWK and AMPL.

Want to see a programming icon interviewed? Email Computerworld or follow @computerworldau on Twitter and let us know.

You maintain you had no part in the birth of C, but do you think the language would have been as successful as it has been without the book?

Brian Kernighan: The word is not ‘maintained’; it's ‘stated accurately’. C is entirely Dennis Ritchie's work. C would have done just fine on its own, since as a language it achieved a perfect balance among efficiency, expressiveness, and power. The book probably helped, though I think more in spreading the language early on than in its ultimate acceptance. Of course, it helped enormously to have Dennis as co-author, for his expertise and his writing.

In the ten years since you launched The Practice of Programming, a separate book written with Rob Pike, has the way programmers operate changed enough for you to consider amending any parts of the publication?

Programming today depends more and more on combining large building blocks and less on detailed logic of little things, though there's certainly enough of that as well. A typical programmer today spends a lot of time just trying to figure out what methods to call from some giant package and probably needs some kind of IDE like Eclipse or XCode to fill in the gaps. There are more languages in regular use and programs are often distributed combinations of multiple languages. All of these facts complicate life, though it's possible to build quite amazing systems quickly when everything goes right. I think that the advice on detailed topics in The Practice of Programming is sound and will always be — one has to find the right algorithms and data structures, one has to test and debug and worry about performance, and there are general issues like good notation that will always make life much better. But it's not clear to me or to Rob that we have enough new good ideas for a new book, at least at the moment.

What advice do you have for young programmers starting out? Would you recommend a grounding in Cobol like you had, for example?

Every language teaches you something, so learning a language is never wasted, especially if it's different in more than just syntactic trivia. One of Alan Perlis's many wise and witty epigrams says, "A language that doesn't affect the way you think about programming is not worth knowing". On the other hand, I would not suggest Cobol as a primary focus for most people today — I learned it as part of a summer job and long ago, not because it taught me something new (though it did that as well). No matter what, the way to learn to program is to write code, and rewrite it, and see it used, and rewrite again. Reading other people's code is invaluable as well. Of course all of these assume that the code is good; I don't see a lot of benefit in reading a lot of bad code, other than to learn what to avoid, and one should, of course, not write bad code oneself. That's easier said than done, which is why I stress rewriting.





Lovely to see his humility.

Daniel Butler


I recommend that Mr. Kernighan add Ruby to his list sooner rather than later. While it is true that "a language that doesn't affect the way you think about programming is not worth knowing," nowadays, it's also true that "a language that can't be affected by the way you think about programming is not worth knowing." Ruby's power comes from its LISP-like flexibility, its expressiveness, and the power to make DSLs which cleanly map to the way humans should be thinking about programming in specific domains.

That power, and the joy of reading beautifully crafted code (which often requires little additional commentary), make Ruby an aesthetic pleasure to work with, unlike the awkward hacks that similarly capable Perl or Python programs force you to craft.

And anything you've heard about Ruby's slowness in the past is no longer the case with the latest optimized interpreters.



Perl/Ruby/Python are very similar. If you know one of them, it would be smarter to learn something different. I'd recommend something functional, since that seems likely to become more popular in the future (my next language will be Haskell or Erlang).

Devon Jones


I'll offer an opposing point of view. As a Python and Ruby developer, I would say that there is no point in picking up Ruby if you already know Python. While Ruby offers 'prettier' code, Python has focused more on ensuring that its parts are strong and work well for edge cases. Ruby has focused more on what I call "happy path" development, where things work well so long as you stick to the parts that the community widely uses. As an example, if you are using MySQL, everything is hunky dory, but if you use Postgres, the drivers suffer from neglect.

After spending a few years working on both daily, I think of Python and Ruby in terms of Beauty and the Beast. Ruby, which is pretty, holds the role of Fair Is Foul; Python, which is more functional but less pretty, holds the role of Foul Is Fair.

Skeptical Technologist


Who needs security, anyway? The C language has certainly touched all of our lives in important ways... perhaps most of all by making it so difficult to write reliable secure code and therefore facilitating the loss of billions of dollars to security flaws caused by buffer overflows and similar coding errors... not to mention the billions spent on security "enhancement" software that wouldn't have been necessary if the underlying systems had not been riddled with C-borne defects. The concept of a language that has subscripts, but neither arrays nor strings, might have been groundbreaking, but that part couldn't be called brilliant. It made the language easier to implement and more efficient, to be sure, but oh, what a price we've paid since then. A moment of convenience, a lifetime of regret.



Security is overrated.


Ease of use and performance trump all.

Let me know when you manage to build an OS/application/system worth a damn using anything other than C or a variant thereof.

The only problem with C is that the world is full of programmers who are not fit to use a pocket calculator, let alone program a computer.

Get rid of the chaff, and the rest of us can manage to use C quite well, thank you.

And btw, "Technologist"? Go away kid!



There are some great C programmers out there. You're obviously not one of them.



@perl/ruby/python are similar statement.

No. Let's leave Python out of it forever: who, since Stuart Feldman's make utility, has ever considered whitespace operable? Unbelievable. Not worthy.

So, that aside, how can you compare Ruby and Perl? By their literal names? I can't stand Perl syntax, but it truly is an art. I'm still not sure how Ruby had Perl in its rearview mirror with that syntax, and I find them grossly different. Ruby is bloated in solution context.

(real) C programmer


> Skeptical Technologist

> "loss of billions of dollars to security flaws and C-borne defects"

The word of an "expert" who has never heard of memory pools, dynamic buffers, garbage collectors... all coded in C.

1. If Java and scripting languages can use those toys, why can't C?

2. Ruby, Perl, Python, etc. are crippled by core security flaws. Not C.

3. The only "lifetime of regret" we might have is to see idiots talking without knowing.

People then say: give me a live example of all those tools used in C.
Fair enough.

The (free) TrustLeap G-WAN ANSI C scripts:

- are 120x faster than Apache/PHP;
- use memory pools and dynamic buffers (no overflows);
- take 50 KB of code (not 50 MB);
- have been under constant attack since they shipped;
- have exposed no vulnerability (another world record).

Sure, "Skeptical Technologist", with all his science, would "not call it brilliant".



"Let me know when you manage to have a OS/Application/System worth a damn using anything else than C or a variant thereof."

Yes: Unisys MCP systems. For HLL implementation they predate C and Unix by at least five years, going back to the Burroughs B5000. All system software is written in Algol-like languages, not assembler (there is no assembler). Security is ensured at a low level, since all bounds and buffer overruns and much more are checked and crash the program straight away, producing a very readable stack dump, rather than carrying on and wreaking havoc. They have true virtual memory, built into the machine, not something kludged on top.

Many of the concepts from the B5000 have gone into today's software: systems programming in HLLs, virtual memory, OO programming, virtual machines, multiple processors, removal of the von Neumann bottleneck, and so on. But nothing was ever quite done as well, resulting in many of the current headaches of the computer industry, which Burroughs had largely solved way back in 1965. C is really pretty flimsy, a poor variant of ALGOL for systems programming.



Sir, I hope you continue to write good books like The Practice of Programming and to guide students like me.
I really enjoyed reading your books.


Dharmaraj Iyer


Great questions! You might also be interested in this interview with Brian Kernighan: he covers his personal history and the creation of the C book, and he offers advice to students, graduates, and retirees.



I have read the entire blog. It is really very nice. Many good ideas and experiences of eminent personalities have been shared here. Thanks for sharing your ideas.




Hmm, better late than never.
I began using grep, then AWK, in the mid-80s, applied them to RE3 (legal software reverse engineering), then bumped into S/P: Smalltalk-Prolog. I thought I had it made. In the meantime all have been trashed by "improvisers". How can one improve on perfection? Java is a relic of Smalltalk.
At UTS (Sydney) in 1989, Profs Debenham and Montgomery demonstrated how to design maintainable RDBMSs for S/P.
Unfortunately they employed the traditional B and R modelling technique.
It was like watching Mr Bean pack a suitcase: what doesn't fit, he cuts off with scissors. Mind-boggling. The most memorable (UTS) statement was that OO will not succeed without a problem taxonomy. It is impossible employing the pragmatic method.
With all due respect, the methods expressed in 2009 are still wrong in 2010.
I am happy to share my discoveries with anyone who is still in the software industry and not wasting their lives in IT.
