The 3 Level Model of Programming Languages

Post by loman » Wed Mar 24, 2021 2:23 pm

I'm going to describe my favorite way of thinking about programming languages. It's helped clarify things for me, and hopefully will be adopted by the larger programming community.

People need to rate programming languages.
And the rating scale that normally gets applied is based on attributes such as whether the language is fast vs. slow, compiled vs. interpreted, object-oriented vs. functional, etc. These types of classifications miss the point. What we want is an integrated view of why differences between programming languages make our particular problems easier or harder to solve.

The old dichotomies are breaking down, and we should be looking for a new way to understand programming languages. In the past, judging code speed was simple: if a language was compiled, it was faster than if it was interpreted. These days, many interpreted-language modules hand their complex tasks off to compiled libraries, so they are just as fast as, and often faster than, your hand-rolled compiled code. In the past, object-oriented was the way to go, but functional programming is seeing a resurgence now that it's clear that object-oriented code can be rather slow and difficult to maintain. This blurring of the old boundaries seems to be universal.
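
To make the speed point concrete, here's a rough sketch in Python (purely illustrative, not a real benchmark): the hand-rolled loop runs as interpreted bytecode, while the built-in sum() is a single call into compiled C code.

Code:

import timeit

data = list(range(1_000_000))

def hand_rolled():
    # every iteration runs as interpreted bytecode
    total = 0
    for x in data:
        total += x
    return total

def compiled_helper():
    # one call that does all the work in compiled C code
    return sum(data)

print("interpreted loop :", timeit.timeit(hand_rolled, number=10))
print("compiled built-in:", timeit.timeit(compiled_helper, number=10))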

I see a continuum of languages as they relate to Thinking Like A Machine vs. Dealing With Complex Interacting Objects.

A coding style that forces you to Think Like A Machine is one where the processes used by the computational hardware are built into the language, forcing you to write in idioms that translate well to the machine you're running on. Examples of this would be Forth (forces you to think in terms of stacks), C/C++ (forces you to think using indirection), and Lisp (forces you to think in recursion, which leads to better optimization of the code). All of these languages have a healthy dose of mathematical operators and explicit size/type allocations for variables.
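
There's no code in this thread yet, so I'll keep all my sketches in Python. Even there you can get a feel for what I mean by indirection and size/type allocations by dropping down to the ctypes layer (this is only an illustration of the idioms, not something you'd normally write):

Code:

import ctypes

x = ctypes.c_int32(41)   # explicit size/type allocation: a 32-bit integer
p = ctypes.pointer(x)    # indirection: a pointer to that storage

p.contents.value += 1    # write through the pointer
print(x.value)           # prints 42 -- the machine-level view leaks right through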

Conversely, a coding style built around Dealing With Complex Interacting Objects is one where the objects themselves are represented and all trace of the computational mechanism is eliminated. Examples of this would be LabVIEW (the control objects and their underlying data flow and signals are represented graphically) and the bash shell (it deals with devices, directories, and assorted commands that manipulate those objects and pipe data between commands and/or objects). In such ideal languages, the mathematics of the underlying computer should disappear. It's not that you can't perform arithmetical calculations, it's simply that the arithmetical operations of the underlying computer are of no concern.
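
bash is the natural example here, but to keep the sketches in one language, here's roughly the same flavor using Python's pathlib. The directory path is just a made-up stand-in; the point is that everything is a file or directory object and the machine's arithmetic never comes up:

Code:

from pathlib import Path

logdir = Path("/tmp/demo_logs")           # a directory, handled as an object
logdir.mkdir(exist_ok=True)
(logdir / "run1.log").write_text("ok\n")  # a file object; no buffers, no byte counting

for logfile in sorted(logdir.glob("*.log")):
    print(logfile.name, logfile.stat().st_size, "bytes")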

Thinking like a machine is the low level approach. Thinking with objects is the high level approach. The middle approach is to insulate the programmer from both polar extremes. Examples of such languages are Basic, Perl, Python, TCL, and every macro or scripting language ever invented. You can still do mathematics, but it's much easier because you don't have to worry about how the machine is going to behave, which makes these languages perfect for new students of programming. Also, there are no huge objects built into the language. The language may in fact be using objects, but they aren't complex and the user doesn't need to know much about them. There are many ways to insulate a programmer from the reality of the situation, but the one constant is the feeling that Things Just Work.
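
Here's what Things Just Work looks like in practice, again sketched in Python (the interest rate is made up): arithmetic with no declared types, no overflow, and no thought for the machine underneath.

Code:

n = 2 ** 200              # quietly promoted to a big integer; no overflow to manage
print(n)

balance = 10_000.0
rate = 0.037              # made-up interest rate
for year in range(5):
    balance *= 1 + rate   # no declared types, no word sizes, no allocation
print(round(balance, 2))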

The impetus for writing up my observations is the proliferation of various language cults: Python, Rust, and Matlab are prime examples. These languages have been very successful thanks to a large number of developers, and I would say in spite of their obvious drawbacks.

Python has already split its ecosystem, and I've had no end of problems building Python packages. The first problem is due to carelessness, and the second is structural (Python isn't a build system or a repository system, yet it is now invading that space). Rust is being oversold as solving the memory allocation and garbage collection problem; other languages solve the same problem with libraries and classes while preserving maximum flexibility. This attempt to insulate the programmer from the machine is pushing Rust toward being a second tier language, and the cargo system is clearly a third tier task. It's a mistake to think that one size fits all situations, and it's a mistake to think that losing options in a low level language represents progress. Matlab is a classic example of trying to convert a second tier language into a third tier one. Its design is bolt-on centric: lots of additional modules that do the work, added to the language somewhat at random. For a second tier language that's not a problem... it's an absolute benefit. But trying to make it take on complex mathematics as well as complex presentation is a step too far.

What I'm seeing is that languages with a natural place in this continuum are now trying to do everything. As a result, their codebases become increasingly difficult to maintain. There is high demand for Python programmers at the moment, and I believe it's because Python becomes increasingly difficult to work with when you try to take over the world using just that one language. Layout languages such as HTML/CSS, PDF, and LaTeX are stuck somewhere between second tier and third tier, and they too have become increasingly taxing to work with.

To be clear, the extremes of the continuum (first tier and third tier) are where most of the profit in the working world gets made. First tier languages solve things quickly and take on the most algorithmically intense tasks, but they are difficult to make work because of all the gritty details that go into the code. Third tier languages can be used to design and implement mechanisms of commercial value, because commercial value stems from machinery of some sort, modeled as interacting objects by the language and output as machine controls for manufacturing. The biggest problem sits with the second tier languages (which I know from personal experience) and comes from trying to use your favorite second tier language to do everything. I have to say that Visual Basic does a pretty good job at this, but I think most people will agree that writing a physics simulation or a factory automation program strictly in Visual Basic is not going to be as efficient as choosing another language. Conversely, I think that embedded C/C++ is mostly a waste of time unless your controller is really low power or really low ability. Most device manufacturers should buy a better controller and use a third tier language. They wouldn't be scrounging for programmers, and the code would be more maintainable.

I hope these somewhat controversial views will eventually become dominant. It took me a while to conceptualize what is going on in the modern world. It's just the worst to hear that your favorite second tier language is a throw-away for most purposes. But I think someone needs to point out that you can't stay in school forever.
