Preface

Let me open with a disclaimer. What follows is an unabashedly opinionated diatribe certain to offend many programmers. Craftsmen inevitably grow defensive of their favourite tools and practices. Rather than respect this natural sensitivity, I will exploit it, shamelessly intensifying the vitriol for attention. I care little whether one applauds or deplores my views; I mainly want my audience to feel compelled to read on.

However, I try to be informative as well as incendiary. Whether one finds my statements profound or unsound, whether one abhors or adores my examples, and whether one is amused or bemused by my observations, I hope one leaves wiser.

Thanks!

Thanks to Dan Henry, Asim Jalis, Harold Lee, Gazsó Attila, Tim MacEachern, Boyko Bantchev, Kevin Easton, Dillon Shook, Samy Bahra, Ed Catmur, Daniel Griffith, and Doug McIlroy for corrections and suggestions.

A C Odyssey

I began my journey with BASIC, when I was easily amused: for example, I liked being able to change the colour of the screen with a simple command.

A few years later, I heard about C. I followed a 14-part tutorial of dubious quality, one part a day. I hated it: there was this extra compilation step, which often produced cryptic error messages. It seemed unnecessarily fussy. I had to declare my variables. Semicolons, braces and parentheses had to be just right. I went back to BASIC.

A few more years later, I learned x86 assembly language. Now this was cool! I had direct control over everything. More importantly, I knew what my PC was doing when it compiled and ran a program. From this point I was able to quickly learn the innards of DOS: the boot sequence, the inner workings of executables, how files were laid out in FAT filesystems, how devices were controlled, how multitasking could be simulated.

By the way, I feel all programmers benefit from learning assembly. There are times it pays off to know what the CPU is doing. In any event, if one is interested in computers, how can one not be curious about how they work? It leads one to question instruction set design, another fascinating and explosive topic.

At this stage, though I did use C and assembler from time to time, and dabbled in other languages such as C++ and FORTH, I was most comfortable with BASIC. But I sensed it had no geek cred, and I abandoned it upon entering university.

Object-oriented programming was the latest fashion. Its highly contagious ideas spread through the computer science department, soon infecting me. I turned up my nose at Java because it compromised object-oriented principles. Instead, I was enchanted by Eiffel, a language so clean that it’s sterile. I became a rabid Eiffel zealot, even trying to prove its superiority in a programming contest. Long time no C.

Then during grad school, I ported one of my pet projects from Eiffel to C. I forget why. Normally I’d mix C and Eiffel to talk to the graphics library, but perhaps I had tired of writing glue code.

I slowly awoke from a dream, or more accurately, a mild nightmare. My C code worked just as well, except it was faster. It was more concise, which in turn made it easier to maintain. It was… better. So why was I bothering with all this object-oriented nonsense?

Since rediscovering C, I have felt annoyed that I ever bought into object-oriented programming. An easy mistake, as both industry and academia tirelessly promoted objects, and still do today. The constant and pervasive assault irks me, though I am cheered whenever I read articles from dissenters. It’s time I added my voice.

Update

Perhaps it’s confirmation bias, but I believe the world is changing for the better. Programmers have lost their taste for object hierarchy spaghetti, implementation inheritance is now viewed with suspicion, and the perpetrators of the two languages that I despise the most, C++ and Java, have at last seen the light and introduced lambda expressions and type inference.

My new pet hate is dynamic typing. I can understand why it became popular: nobody wanted to write C++ and Java boilerplate, so they thought the solution was to get rid of static typing. The right solution is to get rid of clumsy type annotations and use type inference, so we can still have static type checks.

Most amazingly, I found a language that dethrones C at last: Haskell.
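
A small sketch of what I mean, nothing deep, just my own illustration: the Haskell below contains no type annotations at all, yet the compiler infers a static type for every binding and rejects misuse before the program ever runs.

    -- No type annotations anywhere, yet everything is statically typed:
    -- the compiler infers average :: Fractional a => [a] -> a and checks each call.
    average xs = sum xs / fromIntegral (length xs)

    main = do
      let scores = [90, 85, 72]      -- inferred as a list of numbers (defaulted to Double)
      print (average scores)
      -- print (average "oops")      -- a compile-time type error, not a runtime surprise

No boilerplate, and all the safety of static typing.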

A tale told by an idiot

Let me close with more disclaimers. Empty vessels make the most noise. I have yet to create a substantial popular project, while authors possessing far more intelligence, experience, and politeness have already written excellent books on programming. Rather than marvel at my sound and fury, serious readers should quit right now and start with Kernighan and Pike’s The Practice of Programming.

This classic shows why the design of data structures is the true heart of programming, and why language choice is almost irrelevant. In hindsight, this is obvious once we consider how we program humans: textbooks can be written in any language; the tricky part is presenting the content clearly.

Therefore, programming language proselytizers like me should be viewed with suspicion. If I truly wanted to talk about improving code quality, I would instead be discussing the rules in Appendix A of their book.


Ben Lynn blynn@cs.stanford.edu