This article was spurred by ``There's No Such Thing as Knowing your Computer 'All the Way to the Bottom'''.
The author of the article being rebutted claims three decades of experience, and his initial advice is sound: one needn't learn C or C++, write programs which directly interface with the hardware, or use a language solely for its reputation for speed. Yet I disagree with his claim that learning a machine ``all the way to the bottom'' to better understand programming is misguided.
As an aside, trivial research reveals that mainframes had the concept of a thread before C was created, so the author's claim that the C language is older than threads is false. I agree that C is poor for teaching how a machine works, but the claim that programmers have no better tool is also false. The Ada programming language is more capable of taking advantage of a particular machine than C, if only because it doesn't require particular machine representations unless explicitly commanded. The author seems to believe that a higher language is somehow less amenable to specialized optimization than a lower language. In truth, it's the higher languages, through basic abstraction, which permit the most efficient use of machine resources, by avoiding unnecessary constraints and other such things which inhibit using specialized facilities implicitly. Learning a machine code, also advised against therein, will teach one how a machine works at a more fundamental level, contrary to the claim. Becoming familiar with a machine code teaches which operations and arrangements of data are more fundamental to the machine and more easily done; the implementation of a higher language necessarily requires some understanding of the machine code, and one will learn how subroutines and other such things are implemented in the process.
The second main point argues against writing rather low-level software, such as an operating system; I don't entirely disagree. Firstly, the author seems to be under the impression that any such systems software must be written in lower languages. Virtual memory, scheduling, and the like can be learned without any practical application, in the same way sorting algorithms can be, but that doesn't make implementing them meritless; relevant advice would be to write such software for a simplified virtual machine. Secondly, the author focuses on systems software which largely lacks relevance to other programming, but an interface to TCP or UDP, or an abstraction for an I/O device, is still systems software and also has wider relevance. The author misses the opportunity to recommend writing a compiler to learn more of systems software; any manner of compiler would do, such as one which compiles a command language to system requests.
The third main point advises against learning an efficient programming language. The confusion here lies in the idea that a more efficient programming language is also lower-level or more difficult to program satisfactorily in. One can use a programming language which is more efficient in memory and speed than others whilst also being easier to program in; that's to say, the author fails to notice he can have all of these qualities at once; APL is such a language when the programming involves numerical tasks. He properly notices that lower languages make more efficient yet more complex algorithms more difficult to write, and he then cites the typical premature-optimization argument. I'd recommend writing every aspect of a program to be efficient, perhaps leaving more complex optimizations as an option, and only profiling and optimizing selectively if the program is then still too slow; in general, programmers should prefer algorithms with good time complexity, a choice which requires no profiling to make, over code which merely happens to be efficient.
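The distinction between choosing good time complexity and profiling-driven micro-optimization can be illustrated with a hypothetical duplicate-detection task, counting comparisons rather than measuring time; both functions below give the same answer, but the quadratic one does far more work, and seeing this requires no profiler:

```c
#include <stdlib.h>
#include <string.h>

/* Comparison counters for the two strategies. */
long quad_ops, sort_ops;

/* O(n^2): compare every pair of elements. */
int has_dup_quadratic(const int *a, int n) {
    for (int i = 0; i < n; i++)
        for (int j = i + 1; j < n; j++) {
            quad_ops++;
            if (a[i] == a[j]) return 1;
        }
    return 0;
}

int cmp_int(const void *p, const void *q) {
    sort_ops++;
    int x = *(const int *)p, y = *(const int *)q;
    return (x > y) - (x < y);
}

/* O(n log n): sort a copy; any duplicates become adjacent. */
int has_dup_sorted(const int *a, int n) {
    int *copy = malloc(n * sizeof *copy);
    memcpy(copy, a, n * sizeof *copy);
    qsort(copy, n, sizeof *copy, cmp_int);
    int dup = 0;
    for (int i = 1; i < n && !dup; i++) {
        sort_ops++;
        if (copy[i - 1] == copy[i]) dup = 1;
    }
    free(copy);
    return dup;
}
```

For a few hundred distinct elements the quadratic routine performs tens of thousands of comparisons against the sorting routine's few thousand; that gap only widens with the input, which is why the asymptotic choice should precede any selective profiling.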
The author closes by arguing over what people mean by ``all the way to the bottom'' and, while it perhaps seems convincing, this is sophistry in my eyes. For the programmer, there is a clear bottom to the machine: the machine code, the lowest language of the machine or the lowest one is generally permitted to program in. A machine with user-programmable microcode would have microcode as its lowest level. To argue that one needs to understand the hardware itself to know the machine's bottom is sophistry. A proper machine's behavior is its hardware description, and one need only work around a flaw if one seeks to use hardware with that imperfection.
The author then cites ``Joel's Law of Leaky Abstractions'', which is sophistry that treats performance characteristics as evidence of leaks in everything from TCP to an array. To argue arrays are a leaking abstraction purely because different access patterns may perform differently is a nice way to excuse one's incompetence in writing a good abstraction: simply claim the problem is actually impossible and then continue with one's incompetent ways. I don't believe this Joel considers writing an abstraction proven to have no ``leaks''; he already thinks this is impossible, without proof, and so I can understand why he doesn't bother.
In closing, I can agree that using a higher language is generally better for learning programming; I believe the author may merely be ignorant of languages which are both high-level and low-level at once. An automatic computer can be understood at its lowest level, from a programming viewpoint, and this will likely make other tasks using the machine easier to comprehend in some way; a properly abstract and high programming language is free of many of the tangential considerations listed, and most any such language is suitable for writing systems software.
I may continue to reflect on this and revise the article soon in turn, particularly after asking the author of the rebutted article for his thoughts.