The Myth of The Next Big Language
Language-oriented programming is not a new idea, but it has taken a long time to become a significant influence on the software industry's technology standards. In my mind, there are two main long-wave reasons for this:
- The finicky and complicated nature of learning metaprogramming and templates in C++, and the total absence of such constructs from Java.
- The widespread lack of knowledge and understanding of lexical analysis and formal language theory amongst "ordinary" programmers.
Boost contains all the tricks you could possibly need to support all kinds of functional idioms and algorithmic tasks. But while you can do functional programming, emulate Lisp, or almost anything else in C++, how many people actually do?
Javascript, the world's most popular programming language (if the prevalence of cut-n-paste web coding, and the copy of IE shipped with every copy of Windows around the world, is to be taken seriously), gives even beginning programmers the power to hack the basic object structures of the language. There's no pre-processor, no compiler, no makefile; it's just "File open..." in a web browser. If you can use MS Word and passed elementary arithmetic, you can probably learn to program in Javascript.
When you've learned programming like this, it is much, much more difficult to make the transition to languages that require a Babel-like thought process, moving between higher and lower levels of abstraction. In C++, programmers are considering the arrangement of control flow descending from main(), memory allocation, the abstract type system, and how it all translates to the executable the compiler produces. Not to mention a much more complex syntax. If you're not considering all of these things and more, it's likely that your program won't work. In Javascript, of course, what works is limited by the rather massive constraints of the web browser.
So if C++ was (or is) the big language, and Java usurped it in the realm of web servers and, lately, with the JVM as a language runtime, Javascript is something very difficult to compare in a satisfactory manner. Regardless of what you consider to be the Next Big Language, these "contenders" are so intertwined that it becomes difficult to see how they could be considered wholly separate. But if we did want to separate them, what if we considered the situation the same way we perceive the globalized economy? C++ has grown enormously since the 1980s and continues to be refined and improved. Java is growing, and groaning under its own massive weight; despite recent hype, it is by no means out of the running in the web server market. Perl practically built the world wide web - that surely means it deserves to be considered a Big Language. Python has seen steady growth and wide distribution, Ruby and Lua are exploding in popularity, and alternatives like Haskell are growing more and more popular too. All the languages are getting bigger, some bigger than others, but by and large, most of them are intimately and intricately tied together. They build each other and copulate with each other. They don't exist in isolation! How many times does this need to be repeated?
One thing these languages have in common is optimization for a certain type of processor architecture. Mostly, that's splatting instructions at x86, though there are all kinds of similar and not-so-similar platforms. A drastic oversimplification, I'm sorry, but I'm in a hurry today... The key thing to consider is that imperative and functional languages still compile down to the same basic instructions in the vast majority of applications. Getting underneath that dominant interface is what it will take to radically alter the future of computer technology.
New computer architectures are going to require new strategies for getting from the language to the physical computation. Not just parallel processing, but completely different models of computation that are theoretically Turing equivalent, yet whose actual physical workings will require a very different approach to program design than what we are used to.
One of the most significant and beautiful results in computer science is the elegant symmetry of equivalence between regular grammars, finite automata, and regular expressions. If you are really serious about understanding how a compiler works, you will need a deep appreciation of this. It is like the harmonious unfurling of a limitless flower connecting the borders of mathematics, logic, and linguistics. Engineering a modern compiler is made possible by the subtleties of these formal language abstractions.
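The equivalence can be made concrete in a few lines. Here's a small sketch (the example language, binary strings with an even number of 1s, is my own illustrative choice): the same regular language described once as a regular expression and once as a hand-built DFA, with a brute-force check that the two recognizers agree.

```python
import re
from itertools import product

# Two presentations of the same regular language:
# binary strings containing an even number of 1s.

# 1) As a regular expression.
EVEN_ONES_RE = re.compile(r"0*(10*10*)*")

# 2) As a deterministic finite automaton: two states tracking the
#    parity of 1s seen so far; "even" is both start and accept state.
DFA = {("even", "0"): "even", ("even", "1"): "odd",
       ("odd", "0"): "odd", ("odd", "1"): "even"}

def dfa_accepts(s: str) -> bool:
    state = "even"                  # start state
    for ch in s:
        state = DFA[(state, ch)]    # one transition per input symbol
    return state == "even"          # accept iff parity of 1s is even

# The two recognizers agree on every string up to length 8.
for n in range(9):
    for bits in product("01", repeat=n):
        s = "".join(bits)
        assert (EVEN_ONES_RE.fullmatch(s) is not None) == dfa_accepts(s)
print("regex and DFA recognize the same language")
```

The exhaustive check is of course no proof, but the construction behind it - compiling a regex into an automaton - is exactly the machinery a lexer generator runs on.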
Let's say we took an alphabet (a finite set of symbols), and then made a mapping from the symbols to a particular arrangement of nested sets. Let's say these nested sets are called "membranes". Each membrane is labeled with a symbol from the alphabet. The membrane configuration is the particular topology of the nested sets. It moves from a start configuration to a halt configuration. It is possible to show, via the mapping through formal language theory, that the interactions of membranes using passthrough and dissolve rules to change the boundary configuration are Turing complete. A dynamically evolving set of membranes is performing effective computation. If we could find a way of encoding data in biomembranes and algorithmic logic in their reactive transformations, we could chemically compute using living cells. This is an amazingly hard physics and engineering problem, but the theory and possibility are there. We can imagine a future where computers can be grown from cells, and evolved into particular configurations to suit specific computational tasks. But in that case, how will we pass a grammar over the sets to control the configuration? My intuition is that membrane computing will require a radical visual approach to programming that is unanticipated by text based idioms.
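To make the idea less abstract, here is a toy simulation of one membrane step. Everything about the encoding is my own illustrative choice, not standard P-system notation: membranes are nested dicts holding multisets of symbols, each rule rewrites a symbol into products (applied to every copy at once, mimicking maximal parallelism), and a dissolve flag releases a membrane's contents into its parent.

```python
from collections import Counter

def step(membrane):
    """Apply this membrane's rules once, in parallel, to every object.
    Returns True if the membrane dissolves after this step."""
    new_objects = Counter()
    dissolve = False
    for sym, count in membrane["objects"].items():
        rule = membrane["rules"].get(sym)
        if rule is None:
            new_objects[sym] += count       # no rule: object is inert
            continue
        products, delta = rule
        for p in products:
            new_objects[p] += count         # every copy rewrites at once
        dissolve = dissolve or (delta and count > 0)
    membrane["objects"] = new_objects
    # Recurse into children; absorb the contents of any that dissolved.
    for child in list(membrane["children"]):
        if step(child):
            membrane["objects"] += child["objects"]
            membrane["children"].remove(child)
    return dissolve

# A skin membrane containing one inner membrane holding 'a','a','b'.
# Inner rules: a -> aa (double each step); b -> c, then dissolve.
inner = {"objects": Counter("aab"), "children": [],
         "rules": {"a": ("aa", False), "b": ("c", True)}}
skin = {"objects": Counter(), "children": [inner], "rules": {}}

step(skin)   # the inner membrane doubles its a's, then dissolves
print(skin["objects"])   # the skin has absorbed the inner contents
```

After one step the skin holds four a's and one c, and the inner membrane is gone: the topology itself changed as a side effect of the rewriting, which is exactly the behavior that text-oriented languages have no natural notation for.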
And I didn't even mention quantum computing, or the challenges of designing languages to get the most out of quantum logic.
The defining problem for the next generation of languages is to provide a more complete model for adapting to the kinds of topological configurations evident in recent advances in physics and molecular biochemistry. Soon, fabrics, gels, surfaces, and even particles in the air itself may become computationally aware, but it's fairly clear that current mainstream programming languages lack the abstractions and tools to adequately deal with radically new models of I/O and of computation itself. I'd like to see more language designers and philosophers of software thinking and exploring beyond the easily-conceivable next 3-5 years of technology. Project these concepts, and tear the field wide apart!