All the programs that could ever be expressed by a language


Postby Terry on February 18th, 2019, 2:19 am 

Does the language designer determine the fundamental ontology of a language, from which everything else is derived? What is that fundamental ontology, then?
So is what programmers do just the combination, parameterization, and organization of a language's fundamental ontology?
For a finite, fixed compiler or interpreter crafted by the language designer, what is the size of the set of all programs that could ever be expressed in the language? Is it infinite, and if so, what sort of infinity is it?
When considering such an infinity, what factors need to be considered to determine its place in the hierarchy of infinities? Is it the degrees of freedom in the system?

I am not sure whether I am asking the right questions. Feel free to share your views on these related topics.


Re: All the programs that could ever be expressed by a langu

Postby PaulN on February 18th, 2019, 4:42 pm 

I could be mistaken, but I think the guiding principle of new threads is to pick one specific topic and offer your take on it, in order to stimulate thought and feedback from others. May I suggest a starting point? Define what you mean, in plain English, by "fundamental ontology of a language."


Re: All the programs that could ever be expressed by a langu

Postby Terry on February 19th, 2019, 3:58 am 

I think the page below explains it in more detail.

What's Ontology of Programing Language?
http://xahlee.info/comp/programing_ontology.html

If you study philosophy, computer science, and mathematics, these questions will linger.


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 19th, 2019, 6:22 pm 

Terry » February 18th, 2019, 12:19 am wrote:When considering such an infinity, what factors need to be considered to determine its place in the hierarchy of infinities? Is it the degrees of freedom in the system?


There are only countably many Turing machines. That's the upper limit for any programming language.

Now if you are considering different models of computation, such as oracle machines, that could in theory be a different story. But if you're talking about conventional programming languages, there are at most countably many programs in any such language.

And there are at least countably many programs if you allow the infinite sequence of programs:

x = 1
x = 2
x = 3
...

with each line being a distinct program. Of course this is a theoretical result, since any actual physical implementation will have an upper limit on the size of the value on the right.
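
To make the counting argument concrete, here is a rough sketch in Scala (my own toy illustration; the five-character alphabet is an arbitrary stand-in for any real language's character set). It lists every finite string in order of length, then in a fixed order within each length, pairing each string with a natural number; the legal programs of any language are a subset of such a list, hence at most countable.

object EnumeratePrograms {
  // Toy alphabet chosen for illustration; any finite alphabet works the same way.
  val alphabet: Seq[Char] = Seq('x', '=', ' ', '0', '1')

  // All strings of exactly length n over the alphabet.
  def stringsOfLength(n: Int): Iterator[String] =
    if (n == 0) Iterator("")
    else stringsOfLength(n - 1).flatMap(s => alphabet.iterator.map(s + _))

  // A single countable enumeration: every string of length 0, then 1, then 2, ...
  def allStrings: Iterator[String] = Iterator.from(0).flatMap(stringsOfLength)

  def main(args: Array[String]): Unit =
    allStrings.zipWithIndex.take(10).foreach { case (s, i) => println(s"$i -> [$s]") }
}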


Re: All the programs that could ever be expressed by a langu

Postby hyksos on February 20th, 2019, 2:15 am 

Okay so it's a good time to review what happened to programming historically.

The Fundamental Theorem of Programming (if there ever could be one) goes

The angels never came down out of the clouds and told men how to program a computer.

This Fundamental Theorem should always be remembered, as it reverberates through everything about programming, programming languages, and operating systems.

Because the angels never handed us stone tablets detailing the "right way" to make a computer and program it, we have inherited a large body of CONVENTIONS for how to do this. Furthermore, we humans invented and adopted all of these conventions ourselves. There was never a divine intervention. We created all of this.

I'm going to go over a little of the history to hit the major paradigms. Then I will discuss some of the history of the "Personal Computer" (PC), as it really exhibits the Fundamental Theorem.

Fundamental Ontologies
The idea of storing instructions in RAM, reading those instructions as a list, and performing them one after the other is called the "von Neumann architecture"; it was described by John von Neumann in his 1945 "First Draft of a Report on the EDVAC". This ontology is now called the Programmable Computer. The paper in which von Neumann described how this could be done with a machine is considered variously the "Bible of Computer Science" or "the beginning of Computer Science".

1. Programmable Computer
(already discussed above)

2. Procedural Programming
This ontology of program structure is centered on subroutines, functions, and "methods". Top-to-bottom execution is assumed, and this style is often called imperative by grad students. Procedural programming is the method du jour for assembly language.

3. Object-Oriented Programming . OOP
The inventor of C++ realized early in the 1980s that OOP had to be integrated into the C language. The conversion of C into "C with Classes", as messy as it was, haunts and reverberates to the present day in all the arguments that surround modern languages. The fundamental ontology of OOP is given by the Liskov Substitution Principle. I was watching a symposium on modern programming languages; the crowd was mostly quiet and passive. The speaker then described the fundamental ontology of OOP by saying

When in a situation in which software systems must interact, you program to an interface, not to an implementation.

Upon hearing this, a single person in the audience hooted and clapped. That single nugget of wisdom neatly encapsulates the entire paradigm and "approach" of OOP. There are many more important aspects to it, such as "high cohesion" and "low coupling", which I will not expand on for the time being. In any case, OOP was understood as being superior to procedural programming in almost every conceivable way.
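
To illustrate with a sketch of my own (the names are made up, not anything from the talk): the report code below depends only on the Storage trait, so any implementation can be swapped in without touching it. That is the "program to an interface" idea in miniature.

import scala.collection.mutable

// The interface the rest of the program is written against.
trait Storage {
  def save(key: String, value: String): Unit
  def load(key: String): Option[String]
}

// One possible implementation; callers neither know nor care which one they get.
final class InMemoryStorage extends Storage {
  private val data = mutable.Map.empty[String, String]
  def save(key: String, value: String): Unit = data(key) = value
  def load(key: String): Option[String] = data.get(key)
}

object Report {
  // Written against Storage, never against InMemoryStorage.
  def lastRun(storage: Storage): String = storage.load("last-run").getOrElse("never")
}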


4. AGILE

Outside of software development, laymen are unaware of the nature of programming computers. Every arrogant 19-year-old thinks that he is "smart" and can "do anything". Programming computers has shown us, in a violent way, that being smart is not good enough. Conventions must be adopted. It is excruciatingly difficult to write robust, fault-tolerant software.

OOP began to dominate software development in the late 1980s and throughout the 1990s. AGILE is a paradigm that arose to go beyond the mere "correctness" of software. Once correctness of execution is (mostly) taken care of, we want software that is extensible and open to modification at short notice. Even with fully correct code bases, we still have the problem that a single change to one part can break the entire software system. Such systems are said to be "brittle" to change.

I like to tell the story of the young, arrogant 19-year-old software dev. He is given some assignment involving multithreading. In the naivete and arrogance of his youth, he thinks he can "knock it out" in three days. He writes multithreading code that is absolutely horrible. Perhaps worse, he has no idea how horrible it is, because, generally speaking, multithreaded code cannot be adequately tested. Any developer can fall into the trap: he tests the multithreaded code base, it passes, and he ships it. "Welp. It works. Ship it to the customer." Only weeks later do the crashes begin, caused by unseen race conditions and thread deadlocks that occur only once in a blue moon.

The very same young software developer is seen at a conference on programming six years later. In his presentation, the very first PowerPoint slide says:

"Software is Hard."

This harkens back to the Fundamental Theorem of Programming. We were never given a guarantee "from out of the clouds", as it were, that the human brain is suitable or capable of programming a computer. We cobble together our conventions, abide by them, and lurch into the future learning from our mistakes as we go.
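
To make the race-condition trap from the story above concrete, here is a minimal sketch (mine, not from any real code base): two threads bump a shared counter with no synchronization. It can easily pass a quick test run and still silently lose updates, because "read, add one, write back" is not atomic.

object RacyCounter {
  var count = 0 // shared mutable state, deliberately left unsynchronized

  def main(args: Array[String]): Unit = {
    val workers = (1 to 2).map { _ =>
      new Thread(() => for (_ <- 1 to 100000) count += 1)
    }
    workers.foreach(_.start())
    workers.foreach(_.join())
    // Expected 200000; lost updates usually make it smaller, but not reliably so.
    println(s"count = $count")
  }
}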

5. Functional Programming

Functional programming has emerged as an alternative ontology to OOP. Some of the more excited geeks have even suggested at conferences that functional programming will "supplant" or "replace" object-oriented approaches. While academia is certainly ready to take the plunge into F.P., industry and business have had only a lukewarm reaction to it. I know, for example, that Twitter runs its servers on Scala (a language I happen to personally love). Scala, however, is not a so-called "pure" functional language in the way Haskell is; it is some kind of hybrid imperative/functional beast. Despite that, I am prepared to defend Scala "to the death" if someone wants to take it there.

The Personal Computer
We rewind our historical clocks to somewhere between 1979 and 1983 or so. This is when the PC is going to be born, and its birth is revealing and eye-opening in equal measure. At the end of this period, an operating systems developer (named Bill Gates) reportedly declares,
Nobody will ever need more than 640 kilobytes of RAM.

Let's put this horrible blunder in perspective. At the time it was uttered, PCs were monochrome, green-screened little office trinkets. The economy models sported a whopping 32 kilobytes of RAM in total, and the more expensive luxury PCs had 64 kilobytes. Thus, at that time, 640 KB was literally 10 times the amount of RAM seen in an industrial-grade PC.

But this was not the only blunder in the messy, distorted, and unbelievable birth of the PC.

ISA is an acronym for Instruction Set Architecture. Modern PCs, on paper anyway, encode their instructions using an ISA called x86-64. Let's go back to 1979 and 1980 and discuss where this "x86 ISA" came from. The x86 instruction set was, for all intents and purposes, designed by four guys in a garage in 1979.

All the men involved in its initial formulation were under the impression that they would ship a PC with that microchip in it, it would sell for three years, they would make their money, and then the business would go belly-up. No harm, no foul. That was the plan. So four men got together in a garage and laid down the ISA for x86, and did so with less than two months of thought and effort.

Little did they know that in a mere 15 years, the x86 ISA would be the de facto standard in every Personal Computer on the market. Instead of the company making some chump change for three years and then "going belly-up", the PC industry exploded like a nuclear blast through society. Offices and downtown high-rise office buildings suddenly needed a supply of 200 PCs so they could transform, modernize, and streamline their work.

A horrendous, cobbled-together Instruction Set Architecture (x86) was still infesting all the computers in homes and offices by the late 1990s. A competing architecture, the result of decades of sweat, tears, and academic publications, was called RISC, for "Reduced Instruction Set Computer". RISC was superior to the x86 ISA in every conceivable way. But PCs were not using it.

Chip designers at Intel eventually came to realize how fubared the x86 ISA is. In the late 1990s, Intel's processor designers introduced a few gadgets called the Pentium II and the Pentium III. They fit into the motherboard like a slot (if you recall). The interesting thing about the Pentium II is that the processor reads in x86 instructions from memory and then literally converts them into equivalent RISC-style micro-operations to be performed in the core of the CPU. As you can imagine, this adds an enormous amount of circuitry "overhead" to the processor core. Nevertheless, the benefits of RISC in terms of speed, simplicity, caching, and pipelining actually made the CPUs better in real performance. That should give the reader an idea of how terrible the x86 ISA was. We are, after all, talking about an architecture that was developed by four guys in a garage.

I am a huge personal advocate for the adoption of RISC-V. I am ready to defend RISC-V "to the death" if anyone wants to take it there.


But back to the core of this thread: programming ontologies. This comes down to the question of whether we believe that functional programming is "the future" of how humans program computers, or whether it is a fad.

Your thoughts?


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 20th, 2019, 4:11 pm 



For the sake of discussion, computing has not changed since the invention of the stored-program computer, the von Neumann architecture. Everything else is minor detail. Procedural, object-oriented, and functional programming are ways of organizing programs; they aren't substantially different ontologies. Nor has the notion of what is computable changed since Turing's paper of 1936. So I would argue that programming hasn't changed at all in 80 years. Sure, the technology changes, but not the underlying ontology.

When some clever future genius figures out how to build a physical oracle machine that goes beyond the Turing machine, THAT will be a revolution. Absent that, all you have is evolution.


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 21st, 2019, 1:42 am 

hyksos » February 20th, 2019, 12:15 am wrote:I know, for example, that Twitter runs its servers on Scala (a language I happen to personally love).


Wow. You know how you don't hear about something then you hear about it twice and it clicks with something else? I read your post earlier, then I was surfing Youtube on my Roku (I am a cordcutter, no more cable!) and I watched a lecture where a guy was explaining Category theory to a room full of Scala programmers.

I know category theory from back in the day. It's one of the high-water marks of my mathematical education. It's this great and strange leap away from doing math with equations, where this thing is equal to that thing which is equal to some other thing. Instead, you start drawing diagrams with dots and arrows and "chasing the diagram", as they say. It's profoundly different from whatever anyone thought of mathematics before they see it.

It turns out that functional programming is category theory. It's a very strange development. Category theory was some super-abstract math invented in the 1940s that has been taking over large parts of math, but not all of it. For example, analysis isn't categorical. But modern algebra, geometry, and even logic are now all seen as living within the categorical framework; that's the language of these fields these days. But it's slippery and very mysterious to get hold of at first. It's ironically referred to by mathematicians as "abstract nonsense." That's because it doesn't appear to say anything important at all, until you suddenly see your entire field of math reconceptualized in this new framework.

In fact that's a good word for it. In software, a framework is a set of concepts and APIs (application programmer interfaces) in which you can design a related set of programs. In that sense, you can think of set theory -- the old Cantorian two-step that revolutionized the foundations of math -- as a framework; and Category theory as a new framework for math.

It's interesting that Category theory has now started to interest programmers and programming languages. I have to admit that even as an experienced programmer, I can't actually relate to a program as a category. I'm more familiar with mathematical categories, like sets, topological spaces, and so forth.

So maybe I'll look into Scala a bit. Thanks for the pointer!


Re: All the programs that could ever be expressed by a langu

Postby Terry on February 21st, 2019, 8:58 pm 

Thanks, guys, for sharing your ideas.

someguy1 wrote:There are only countably many Turing machines. That's the upper limit for any programming language.


So, what can go beyond countable infinity?

Suppose we take the ontology down to the bit level and treat programs as bit streams, so that every bit stream could possibly be a program. Here we have the bit stream itself, and the bit stream as a representation with semantics imposed on it. If we take the binary representation of real numbers and biject bit streams with the reals, is the cardinality of the set of bit streams the same as that of the real numbers?


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 21st, 2019, 9:33 pm 

Terry » February 21st, 2019, 6:58 pm wrote:Thanks, guys, for sharing your ideas.

someguy1 wrote:There are only countably many Turing machines. That's the upper limit for any programming language.


So, what can go beyond countable infinity?

Suppose we take the ontology down to the bit level and treat programs as bit streams, so that every bit stream could possibly be a program. Here we have the bit stream itself, and the bit stream as a representation with semantics imposed on it. If we take the binary representation of real numbers and biject bit streams with the reals, is the cardinality of the set of bit streams the same as that of the real numbers?



Programs are finite strings of symbols by definition.

Of course, if you allow infinitely long programs, then there are as many programs as real numbers. But that's not how programs are defined in CS. I'm sure some academics study infinitely long programs, but (as far as science currently knows) no such model is realizable in the physical universe.
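
For the record, the two cardinalities in play (standard set theory, nothing specific to any programming language) are:

% Finite strings over a finite alphabet Sigma are countable;
% infinite bit streams have the cardinality of the continuum.
\[
  |\Sigma^{*}| \;=\; \Big|\bigcup_{n \ge 0} \Sigma^{n}\Big| \;=\; \aleph_0,
  \qquad
  |\{0,1\}^{\mathbb{N}}| \;=\; 2^{\aleph_0} \;=\; |\mathbb{R}|.
\]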


Re: All the programs that could ever be expressed by a langu

Postby hyksos on February 22nd, 2019, 1:59 pm 

someguy1 » February 21st, 2019, 9:42 am wrote:Category theory was some super-abstract math invented in the 1940s that has been taking over large parts of math, but not all of it.
. . .
It's ironically referred to by mathematicians as "abstract nonsense." That's because it doesn't appear to say anything important at all, until you suddenly see your entire field of math reconceptualized in this new framework.

Category Theory comes across as some kind of big inside joke that clever people are playing on the rest of us.


Re: All the programs that could ever be expressed by a langu

Postby TheVat on February 22nd, 2019, 2:28 pm 

Tononi's "phi" or Integrated Information Theory has an interesting ontology of information systems. He and Christof Koch have been working on this for a little over a decade now. I wish I had time to expound on it, but you can find them pretty easily.


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 24th, 2019, 8:58 pm 

hyksos » February 22nd, 2019, 11:59 am wrote:Category Theory comes across as some kind of big inside joke that clever people are playing on the rest of us.


I have in mind to write an article explaining category theory to people who have maybe taken Discrete math or a year or two of calculus and would like to know what all the fuss is about.

I have a thesis that I can do this by building up one example from first principles, from how it would look in calculus class to what the same idea looks like in category theory.

I'm limited by the fact that I barely understood all this several decades ago. But the Internet has allowed me to take a fresh look. And given that category theory has by now "infected" a lot of areas where I wouldn't have imagined it could have any application ... like computer science, but also economics, quantum physics, linguistics, logic ... there's a market for an article that lets people bridge the gap. All the elementary expositions I've seen are terrible.

At the moment I'm stuck on my example. I am trying to show how the familiar Cartesian product of two sets is expressed in category theory. I have to prove that it's unique up to isomorphism; that is, if I had some "other" object or thingie that behaves like the Cartesian product, it must be isomorphic to the Cartesian product. It sounds a little crazy, right? But there is some philosophy behind it. Rather than defining the Cartesian product by what it is, we define it by what it does.
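
In programming terms, and in keeping with the Scala flavor of this thread, the "defined by what it does" idea looks roughly like the sketch below (my own notation, nothing official): anything that maps into both A and B factors through the product in exactly one way that the projections respect.

object ProductProperty {
  // Projections out of the product A x B.
  def fst[A, B](p: (A, B)): A = p._1
  def snd[A, B](p: (A, B)): B = p._2

  // The mediating map: from any f: X => A and g: X => B we get a map X => (A, B).
  def pair[X, A, B](f: X => A, g: X => B): X => (A, B) = x => (f(x), g(x))

  // Defining equations: fst(pair(f, g)(x)) == f(x) and snd(pair(f, g)(x)) == g(x),
  // and pair(f, g) is the only function satisfying them, which is what the
  // "unique up to isomorphism" argument ultimately rests on.
  def main(args: Array[String]): Unit = {
    val f: Int => String = _.toString
    val g: Int => Boolean = _ % 2 == 0
    val h = pair(f, g)
    println(fst(h(7)) == f(7) && snd(h(7)) == g(7)) // true
  }
}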

That's the kind of thinking that goes on. Here's an analogy. In high school we learn algebra, how to manipulate equations. Math majors take a subject called abstract algebra, which consists of studying algebraic structures like groups, rings, and fields. These are sets with operations defined on them that we call plus and times, but that are not necessarily the familiar plus and times from our youth. Now when people hit graduate school, they reconceptualize abstract algebra using diagrams instead of equations. Category theory may be thought of as abstract abstract algebra.


Re: All the programs that could ever be expressed by a langu

Postby hyksos on February 27th, 2019, 1:29 am 

someguy1 wrote:Now when people hit graduate school, they reconceptualize abstract algebra using diagrams instead of equations. Category theory may be thought of as abstract abstract algebra.


Functions going from a domain Set to a codomain Set, are replaced by "natural transformations" going from a category to a category.


someguy1 wrote:category theory has by now "infected" a lot of areas where I wouldn't have imagined it could have any application ... like computer science,

There are structures in Scala literally referred to as monads. There are likely many more (monoids) of such examples in Haskell and F#. However, I don't know a lick about either language.
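
For anyone wondering what those Scala "monads" look like in day-to-day code, here is a minimal standard-library example (my own, nothing exotic): Option supplies a constructor (Some) and flatMap, and the for-comprehension below is just sugar for nested flatMap calls.

import scala.util.Try

object OptionMonadDemo {
  def parseInt(s: String): Option[Int] = Try(s.toInt).toOption

  def divide(a: Int, b: Int): Option[Int] =
    if (b == 0) None else Some(a / b)

  def main(args: Array[String]): Unit = {
    // Desugars to parseInt("84").flatMap(a => parseInt("2").flatMap(b => divide(a, b).map(q => q)))
    val result: Option[Int] = for {
      a <- parseInt("84")
      b <- parseInt("2")
      q <- divide(a, b)
    } yield q
    println(result) // Some(42)
  }
}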


Re: All the programs that could ever be expressed by a langu

Postby Terry on February 27th, 2019, 4:53 am 

Binary strings by themselves are only countably infinite. If we treat binary as a universal representation, can it be arbitrarily infinite?

A bit stream maps to machine code, bytecode, and source code via a representation cascade, with a one-to-many correspondence in different contexts. The ontology, or fundamental building blocks, will differ at different levels of abstraction as well. We can assign meaning to a bit stream such that it represents something in one language and something else, or nonsense, in another. The infinite bit stream provides slots for the allocation of meaning.

We can imagine things coming from nothing, with details filling in at different levels of abstraction or category until the final specific objects take shape and are instantiated, as when we are writing programs. On the other hand, we can start with all the possible finite details and reduce them to reach specificity, like the specialization of cells from the full genome.


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on February 27th, 2019, 9:50 pm 

hyksos » February 26th, 2019, 11:29 pm wrote:Functions going from a domain Set to a codomain Set, are replaced by "natural transformations" going from a category to a category.


I haven't yet figured out what a natural transformation is. I think that what you described is a functor. A functor is an operation that maps objects in one category to objects in another; and also arrows (aka morphisms) in the first category to the second. Now CERTAIN functors are natural, but most aren't.

This is what I think I know about natural transformations. You know how in linear algebra you have a vector space V and a "dual space" V*; and that in the finite-dimensional case, V and V* are isomorphic. But the isomorphism is awkward. You have to pick a particular basis, and use that to construct the isomorphism. Pick a different basis and you get a different isomorphism. Mathematicians hate to have to pick a basis.

Now V* is itself a vector space, and so it has a dual, V**. Therefore V and V** are isomorphic, by transitivity of isomorphism. But there's more! The isomorphism from V to V** is "natural" in the sense that we don't have to pick a basis. I'm leaving out all the technical details here, of course.

As I understand it, a natural transformation in category theory is a way of giving a precise mathematical definition to this idea of naturality. The isomorphism from V to V** satisfies some technical condition that the iso from V to V* doesn't. I aspire to understand this.
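
One programming-flavored example that I find helpful, offered tentatively and in the Scala spirit of this thread: headOption behaves like a natural transformation from the List functor to the Option functor, because it commutes with map. That commuting square is exactly the kind of technical condition being gestured at.

object NaturalityDemo {
  // A transformation between functors: one component for every element type A.
  def head[A](xs: List[A]): Option[A] = xs.headOption

  def main(args: Array[String]): Unit = {
    val xs = List(1, 2, 3)
    val f: Int => String = n => "n=" + n
    // Naturality square: map then head equals head then map, for any xs and any f.
    println(head(xs.map(f)) == head(xs).map(f)) // true
  }
}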

hyksos » February 26th, 2019, 11:29 pm wrote:There are structures in Scala literally referred to as monads. There are likely many more (monoids) of such examples in Haskell and F#. However, I don't know a lick about either language.


Yes, somehow "functional programming is category theory." A lot of computer people are coming to category theory from this direction. It's just fascinating to me, having seen a little category theory so long ago. I never imagined it could possibly have any practical use whatsoever.

Monads aren't monoids; that can be confusing. In the presentations I've seen, monads are one of the LAST things you learn in basic category theory. There are objects and arrows, then functors, then natural transformations, then adjoints, and THEN finally they introduce monads. I don't know if I'm up to that yet. But this is all very interesting.


Re: All the programs that could ever be expressed by a langu

Postby Terry on March 14th, 2019, 7:25 am 

A CPU With Just One Instruction
https://www.youtube.com/watch?v=jRZDnetjGuo

How can it be proven Turing complete? Every function ever conceivable can be derived from subtraction?


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on March 14th, 2019, 5:19 pm 

Terry » March 14th, 2019, 5:25 am wrote:A CPU With Just One Instruction
https://www.youtube.com/watch?v=jRZDnetjGuo

How can it be proven Turing complete? Every function ever conceivable can be derived from subtraction?


Can we get a quick summary in case we don't want to sit through an ad for veganism followed by a video?


Re: All the programs that could ever be expressed by a langu

Postby hyksos on March 15th, 2019, 3:43 pm 

someguy1 » March 15th, 2019, 1:19 am wrote:
Terry » March 14th, 2019, 5:25 am wrote:A CPU With Just One Instruction
https://www.youtube.com/watch?v=jRZDnetjGuo

How can it be proven Turing complete? Every function ever conceivable can be derived from subtraction?


Can we get a quick summary in case we don't want to sit through an ad for veganism followed by a video?

You can emulate an entire RISC-style instruction set using clever sequences of a single machine instruction called SUBLEQ: "SUBtract and Branch if Less than or EQual."

The host of the video then compiles a C program down to assembly language, and then assembles that into a giant sequence of SUBLEQs. The program sorts an array of integers. Technically, you could have a computer whose processor has an ISA made up of a single instruction. It could compute anything (given enough time.)
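
For the curious, a SUBLEQ machine is small enough to sketch in a few lines of Scala (my own toy version; the memory layout and the halt-on-negative-target convention are assumptions, not details from the video). Each instruction is three cells a, b, c, meaning: subtract mem(a) from mem(b), then jump to c if the result is less than or equal to zero.

object Subleq {
  // Run a SUBLEQ program in place; a negative branch target halts the machine.
  def run(mem: Array[Int]): Array[Int] = {
    var pc = 0
    while (pc >= 0 && pc + 2 < mem.length) {
      val (a, b, c) = (mem(pc), mem(pc + 1), mem(pc + 2))
      mem(b) = mem(b) - mem(a)            // subtract
      pc = if (mem(b) <= 0) c else pc + 3 // branch if less than or equal to zero
    }
    mem
  }

  def main(args: Array[String]): Unit = {
    // Toy program: one instruction at cells 0..2, data at cells 3 and 4.
    // "subleq 3, 4, -1" computes mem(4) -= mem(3); the result is negative,
    // so it branches to -1 and halts.
    val mem = Array(3, 4, -1, 30, 25)
    println(run(mem).mkString(", ")) // 3, 4, -1, 30, -5
  }
}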


Re: All the programs that could ever be expressed by a langu

Postby someguy1 on March 15th, 2019, 4:18 pm 

hyksos » March 15th, 2019, 1:43 pm wrote:The host of the video then compiles a C program down to assembly language, and then assembles that into a giant sequence of SUBLEQs. The program sorts an array of integers. Technically, you could have a computer whose processor has an ISA made up of a single instruction. It could compute anything (given enough time.)


Thanks much. Text is so much quicker than video.

For the OP: But what is the point of the exercise? I recall that all the operators of propositional logic can be reduced to the Sheffer stroke, but what of it? It's a cute technical exercise but it has no deep meaning as far as I know. Likewise a one-instruction computer. Also, every actual computer instruction set has a no-op instruction that does nothing, to resolve timing issues. Sometimes you want to do nothing during a machine cycle. So at best you could build a computer out of two instructions. But that doesn't have any deeper meaning. Does it?
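
Just to spell the Sheffer-stroke reduction out (a quick sketch of my own, in the same spirit as the one-instruction CPU): every connective falls out of NAND alone.

object Sheffer {
  def nand(p: Boolean, q: Boolean): Boolean = !(p && q)

  def not(p: Boolean): Boolean                 = nand(p, p)           // p NAND p
  def and(p: Boolean, q: Boolean): Boolean     = not(nand(p, q))      // negate the NAND
  def or(p: Boolean, q: Boolean): Boolean      = nand(not(p), not(q)) // De Morgan
  def implies(p: Boolean, q: Boolean): Boolean = or(not(p), q)

  def main(args: Array[String]): Unit = {
    val bools = Seq(true, false)
    // Check the derived connectives against the built-ins on all inputs.
    val ok = bools.forall(p => not(p) == !p) &&
      bools.forall(p => bools.forall(q => and(p, q) == (p && q) && or(p, q) == (p || q)))
    println(s"NAND-derived connectives agree with the built-ins: $ok")
  }
}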


Re: All the programs that could ever be expressed by a langu

Postby hyksos on March 16th, 2019, 1:38 am 

someguy1 wrote:But that doesn't have any deeper meaning. Does it?

(erase link so I can edit under a deadline)


Re: All the programs that could ever be expressed by a langu

Postby hyksos on March 16th, 2019, 2:17 am 

someguy1 wrote:But that doesn't have any deeper meaning. Does it?

Turing completeness comes up in stranger places.
It is possible that while "most" things can be computed, many other things cannot be. Computability, in the Church-Turing sense, may be as weak as these toy systems make it seem.


