Okay, so it's a good time to review what happened to programming historically.
The Fundamental Theorem of Programming (if there ever could be one) goes:
The angels never came down out of the clouds and told men how to program a computer.
This Fundamental Theorem should always be remembered, as it reverberates through everything about programming, programming languages, and operating systems.
Because the angels never handed us stone tablets detailing the "right way" to build a computer and program it, we have inherited a large body of
CONVENTIONS for how to do this. Furthermore, we humans invented and adopted all of these conventions. There was never a divine intervention. We created all of this.
I'm going to go over a little of the history to hit the major paradigms. Then I will discuss a little of the history of the "Personal Computer" (PC), as it really exhibits the Fundamental Theorem.
Fundamental Ontologies

The idea of storing instructions in RAM, reading those instructions as a list, and performing them one after the other is called the "von Neumann architecture", laid out by John von Neumann in 1945. This ontology is now called the Programmable Computer. The particular paper in which von Neumann described how this could be done with a machine (the "First Draft of a Report on the EDVAC") is variously considered the "Bible of Computer Science" or "the beginning of Computer Science".
1. Programmable Computer
(already discussed above)
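To make the stored-program idea concrete, here is a minimal sketch of the fetch-decode-execute loop as a toy interpreter in Scala. The opcodes and the little program are my own invention, nothing from von Neumann's paper; the point is only that one flat memory array holds both the instructions and the data they operate on.

```scala
// Toy von Neumann machine: one memory array holds both code and data.
// Hypothetical opcodes: 0 = HALT, 1 = LOAD addr into acc, 2 = ADD addr to acc, 3 = STORE acc to addr.
object ToyVonNeumann {
  def run(memory: Array[Int]): Array[Int] = {
    var pc     = 0      // program counter
    var acc    = 0      // accumulator
    var halted = false
    while (!halted) {
      val opcode  = memory(pc)       // fetch the instruction...
      val operand = memory(pc + 1)   // ...and its operand, straight out of RAM
      opcode match {                 // decode and execute
        case 1 => acc = memory(operand)
        case 2 => acc += memory(operand)
        case 3 => memory(operand) = acc
        case _ => halted = true      // 0 (or anything unknown): stop
      }
      pc += 2                        // step to the next instruction in the list
    }
    memory
  }

  def main(args: Array[String]): Unit = {
    // Program: load mem(8), add mem(9), store the result into mem(10), halt.
    val mem = Array(1, 8, 2, 9, 3, 10, 0, 0, /* data: */ 40, 2, 0)
    println(run(mem).mkString(", "))   // mem(10) ends up as 42
  }
}
```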
2. Procedural Programming
This ontology of program structure is centered around subroutines, functions, and "methods". Top-to-bottom execution is assumed, and grad students often call this imperative. Procedural programming is the method du jour for assembly language.
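Here is a minimal sketch of the procedural style, written in Scala rather than assembly purely for readability: a handful of subroutines acting on shared state, called one after another from top to bottom. The payroll scenario and its numbers are invented for illustration.

```scala
// Procedural style: shared mutable state plus subroutines that act on it.
object PayrollProcedural {
  // Shared state the subroutines operate on (hypothetical example data).
  val hoursWorked = scala.collection.mutable.Map("alice" -> 40.0, "bob" -> 35.5)
  val hourlyRate  = 25.0

  // Subroutine: compute one employee's pay from the shared state.
  def computePay(name: String): Double = hoursWorked(name) * hourlyRate

  // Subroutine: print a pay stub.
  def printStub(name: String): Unit =
    println(f"$name%-6s $$${computePay(name)}%.2f")

  // The "main program": call the subroutines in order, top to bottom.
  def main(args: Array[String]): Unit =
    for (name <- hoursWorked.keys.toSeq.sorted) printStub(name)
}
```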
3. Object-Oriented Programming (OOP)
The inventor of C++ realized early in the 1980s that OOP had to be integrated into the C language. The conversion (as messy as it was) of C into "C with classes" haunts and reverberates to the present day, in all the arguments that surround modern languages. The fundamental ontology of OOP is given by the Liskov Substitution Principle. I was once watching a symposium on modern programming languages; the speaker was talking and the crowd was mostly quiet and passive. The speaker then described the fundamental ontology of OOP by saying:
When in a situation in which software systems must interact, you program to an interface, not to an implementation.
Upon hearing this, a single person in the audience hooted and clapped. That single nugget of wisdom neatly encapsulates the entire paradigm and "approach" of OOP. There are many more important aspects to it, such as "high cohesion" and "low coupling", which I will not expand upon for the time being. In any case, OOP came to be understood as superior to Procedural in almost every conceivable way.
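To see what "program to an interface, not an implementation" looks like in code, here is a minimal Scala sketch (a toy example of my own, not from the talk): the caller depends only on a trait, so any conforming implementation can be substituted behind it.

```scala
// The interface: callers depend on this trait, never on a concrete class.
trait MessageStore {
  def save(msg: String): Unit
  def all(): Seq[String]
}

// One implementation: keeps messages in memory.
class InMemoryStore extends MessageStore {
  private val buf = scala.collection.mutable.ArrayBuffer.empty[String]
  def save(msg: String): Unit = buf += msg
  def all(): Seq[String]      = buf.toSeq
}

// The caller is programmed against the interface (MessageStore), so any
// conforming implementation may be substituted -- which is Liskov's point.
class ChatService(store: MessageStore) {
  def post(msg: String): Unit = store.save(msg)
  def history(): Seq[String]  = store.all()
}

object Demo {
  def main(args: Array[String]): Unit = {
    val service = new ChatService(new InMemoryStore) // swap implementations freely
    service.post("hello")
    println(service.history())
  }
}
```

Swapping InMemoryStore for, say, a disk- or database-backed store requires no change to ChatService at all; that is the low coupling the paradigm is chasing.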
4. AGILE
Outside of software development, laymen are unaware of the nature of programming computers. Every arrogant 19-year-old thinks that he is "smart" and can "do anything". Programming computers has shown us, in a violent way, that being smart is not good enough. Conventions must be adopted. It is excruciatingly difficult to program robust, fault-tolerant software.
OOP began to dominate software development from the late 1980s through the 1990s. AGILE is a paradigm that arose to push beyond the mere "correctness" of software. Once correctness-of-execution is (mostly) taken care of, we want software that is extensible and open to modification at short notice. Even with fully correct code bases, we still have the problem that a single change to one part can destroy the entire software system. Such systems are said to be "brittle" to change.
I like to tell the story about the young, arrogant 19-year-old software dev. He is given some assignment involving multithreading. In the naivete and arrogance of his youth, he thinks he can "knock it out" in three days. He writes multithreading code that is absolutely horrible. Perhaps worse, he has no idea how horrible it is, because, generally speaking, multithreaded code cannot be meaningfully tested. Any developer can fall into the trap: he tests the multithreaded code base, it passes the tests, so he ships it. "Welp. It works. Ship it to the customer." Only weeks later do the crashes begin, caused by unseen race conditions and deadlocks that occur only once in a blue moon.
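To make the trap concrete, here is a minimal sketch (a toy example, invented for illustration) of exactly the kind of bug that slips past a casual test: an unsynchronized counter that usually "passes" and occasionally loses updates.

```scala
// A racy counter: two threads increment shared state with no synchronization.
object RacyCounter {
  var count = 0   // shared mutable state, no locking, not even volatile

  def worker(): Thread = new Thread(() => {
    for (_ <- 1 to 100000) count += 1   // read-modify-write: NOT atomic
  })

  def main(args: Array[String]): Unit = {
    val t1 = worker(); val t2 = worker()
    t1.start(); t2.start()
    t1.join();  t2.join()
    // Expected 200000. On some runs you will see it; on others, increments vanish.
    // A test run that happens to print 200000 proves nothing about the next run.
    println(s"count = $count (expected 200000)")
  }
}
```

Deadlocks have the same character: the unlucky schedule may not show up in a thousand test runs, and then appears during the first week in production.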
The very same young software developer is seen at a programming conference six years later. In his presentation, the very first PowerPoint slide says:
"Software is Hard."
This hearkens back to the Fundamental Theorem of Programming. We were never given a guarantee "from out of the clouds", as it were, that the human brain is suitable for or capable of programming a computer. We cobble together our conventions, abide by them, and lurch into the future, learning from our mistakes as we go.
5. Functional Programming
Functional programming has emerged as an alternative ontology to OOP. Some of the more excitable geeks have even suggested at conferences that Functional Programming will "supplant" or "replace" Object-Oriented approaches. While academia is certainly ready to take the plunge into F.P., industry and business have had only a lukewarm reaction to it. I know, for example, that Twitter runs its servers using Scala (a language I happen to personally love). Scala, however, is not a so-called "pure" functional language in the way Haskell is. Scala is some kind of hybrid imperative/functional beast. Despite it being a hybrid, I am prepared to defend Scala "to the death" -- if someone wants to take it there.
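For contrast with the procedural sketch earlier, here is a minimal sketch of the functional approach in Scala: no mutation, no shared state, just pure functions composed over immutable data. Again, the payroll scenario and numbers are invented for illustration.

```scala
// Functional style: immutable data, pure functions, composition.
object PayrollFunctional {
  final case class Employee(name: String, hours: Double)

  // Pure function: same inputs always give the same output, no side effects.
  def pay(rate: Double)(e: Employee): Double = e.hours * rate

  // Transform + fold instead of a loop over mutable state.
  def totalPayroll(rate: Double, staff: List[Employee]): Double =
    staff.map(pay(rate)).sum

  def main(args: Array[String]): Unit = {
    val staff = List(Employee("alice", 40.0), Employee("bob", 35.5))
    println(totalPayroll(25.0, staff))   // 1887.5
  }
}
```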
The Personal Computer

We rewind our historical clocks to a time between 1979 and 1983 or so. This is when the PC is going to be born, and its birth is revealing and eye-opening in equal measure. At the end of this period an operating systems developer (named Bill Gates) is reputed to have declared:
Nobody will ever need more than 640 kilobytes of RAM.
Let's put this horrible blunder in perspective. At the time it was uttered, PCs were monochrome, green-screened little office trinkets. The economy models sported a whopping 32 kilobytes of RAM total, and the more expensive luxury PCs had 64 kilobytes. Thus, at the time, 640 KB was literally 10 times the amount of RAM seen in an industrial-grade PC.
But this was not the only blunder in the messy distorted and unbelievable birth of the PC.
So, ISA is an acronym for Instruction Set Architecture. Modern PCs, on paper anyway, encode their instructions using an ISA called x86-64. Let's go back to 1979 and 1980 and discuss where this "x86 ISA" came from. The x86 instruction set was, for all intents and purposes, designed by four guys in a garage in 1979.
All the men involved in its initial formulation were under the impression that they would ship a PC with that microchip in it, it would sell for three years, they would make their money, and then the business would go belly-up. No harm. No foul. That was the plan. So four men got together in a garage and laid down the ISA for x86, and did so in less than two months of thought and effort.
Little did they know that in a mere 15 years, the x86 ISA would be the de facto standard in all Personal Computers on the market. Instead of making some chump change for three years and then "going belly up", the PC industry exploded like a nuclear blast through society. Offices and downtown high-rise buildings suddenly needed a supply of 200 PCs so they could transform, modernize, and streamline their work.
A horrendous, cobbled-together Instruction Set Architecture (x86) was still infesting all the computers in homes and offices by the late 1990s. A competing architecture, the result of decades of sweat, tears, and academic publications, was called RISC, for "Reduced Instruction Set Computer". RISC was superior to the x86 ISA in every conceivable way. But PCs were not using it.
Chip designers at Intel eventually came to realize how fubared the x86 ISA is. Somewhere around 1997 to 1999, Intel processor designers introduced a few gadgets called the Pentium II and the Pentium III. They fit into the motherboard in a slot (if you recall). The interesting thing about the Pentium II is that the processor reads in x86 instructions from memory and then literally converts them into equivalent RISC-like micro-operations to be performed in the core of the CPU. As you can imagine, this adds an enormous amount of circuitry overhead to the processor core. Nevertheless, the benefits of RISC in terms of speed, simplicity, caching, and pipelining actually made the CPUs better in real performance. That should give the reader an idea of how terrible the x86 ISA was. I mean, we are, after all, talking about an architecture that was developed by four guys in a garage.
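As a rough illustration of what that front-end translation does (a toy sketch of the idea only, not Intel's actual micro-op encoding): a single memory-operand x86 instruction, such as an add directly into memory, becomes a few simple load/compute/store operations that a RISC-style core can schedule and pipeline independently.

```scala
// Toy illustration: decoding one "CISC-style" instruction into RISC-like micro-ops.
object MicroOpDecode {
  sealed trait MicroOp
  case class Load(dst: String, addr: Int)           extends MicroOp // read memory into a temp register
  case class Add(dst: String, a: String, b: String) extends MicroOp // register-to-register arithmetic
  case class Store(addr: Int, src: String)          extends MicroOp // write a register back to memory

  // A read-modify-write "add [addr], reg" becomes three simple operations.
  def decodeAddMem(addr: Int, reg: String): List[MicroOp] =
    List(Load("tmp0", addr), Add("tmp0", "tmp0", reg), Store(addr, "tmp0"))

  def main(args: Array[String]): Unit =
    decodeAddMem(0x1000, "eax").foreach(println)
}
```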
I am a huge personal advocate for the adoption of RISC-V. I am ready to defend RISC-V "to the death" if anyone wants to take it there.
But back to the core of this thread: programming ontologies. This comes down to a question of whether we believe that Functional Programming is "the future" of how humans program computers, or whether it is a fad.
Your thoughts?