I ran across Joel on Software's latest post this evening. Mostly, it's full of good advice for college students considering a career in software development - learn to write well, for instance. I'll heartily second and third that one. I can scarcely believe how much barely literate email I get on a daily basis. Spelling mistakes, poor grammar - it doesn't matter how good your arguments are; if you can't express yourself clearly, no one will ever hear them.
The only thing I questioned was his advice on learning C. Now, I don't dispute the value of knowing C - the simple fact is, the low-level APIs of Windows, Unix, and Linux are all in C. If you ever intend to drop down from a high-level language (Smalltalk, Java, what have you), it's going to be to C. Having said that, here's what Joel says:
Part two: C. Notice I didn't say C++. Although C is becoming increasingly rare, it is still the lingua franca of working programmers. It is the language they use to communicate with one another, and, more importantly, it is much closer to the machine than "modern" languages that you'll be taught in college like ML, Java, Python, whatever trendy junk they teach these days. You need to spend at least a semester getting close to the machine or you'll never be able to create efficient code in higher level languages. You'll never be able to work on compilers and operating systems, which are some of the best programming jobs around. You'll never be trusted to create architectures for large scale projects. I don't care how much you know about continuations and closures and exception handling: if you can't explain why while (*s++ = *t++); copies a string, or if that isn't the most natural thing in the world to you, well, you're programming based on superstition, as far as I'm concerned: a medical doctor that doesn't know basic anatomy, passing out prescriptions based on what the pharma sales babe said would work.
Here's the first thing - most developers simply aren't going to be writing compilers or JITs; they're going to be writing stock business applications. As such, I'd argue that knowing about exceptions is going to be a lot more relevant than being intimate with low-level C details. Still, that isn't the reason I linked to this - it's the linked article that concerned me:
That opens another whole can of worms: memory allocators. Do you know how malloc works? The nature of malloc is that it has a long linked list of available blocks of memory called the free chain. When you call malloc, it walks the linked list looking for a block of memory that is big enough for your request. Then it cuts that block into two blocks -- one the size you asked for, the other with the extra bytes, and gives you the block you asked for, and puts the leftover block (if any) back into the linked list. When you call free, it adds the block you freed onto the free chain. Eventually, the free chain gets chopped up into little pieces and you ask for a big piece and there are no big pieces available the size you want. So malloc calls a timeout and starts rummaging around the free chain, sorting things out, and merging adjacent small free blocks into larger blocks. This takes 3 1/2 days. The end result of all this mess is that the performance characteristic of malloc is that it's never very fast (it always walks the free chain), and sometimes, unpredictably, it's shockingly slow while it cleans up. (This is, incidentally, the same performance characteristic of garbage collected systems, surprise surprise, so all the claims people make about how garbage collection imposes a performance penalty are not entirely true, since typical malloc implementations had the same kind of performance penalty, albeit milder.)
Hmm - now I'm no expert on garbage collection, but I spent plenty of time working in C a decade or so ago, and I've spent all the time since in Smalltalk. Modern GC systems do not have the problems that malloc has - moreover, they remove memory management from the error-ridden hands of the application developer and put it instead in the hands of a framework with global knowledge of the application. That's a huge win for all concerned - better performance for all but the smallest proportion of applications (such as those living in embedded systems with constrained resources). Joel is spreading bad knowledge here - not unlike his complete lack of cluefulness on exception handling.
I think Joel is getting to the point where he just doesn't know what he doesn't know...