What is Missing from Modern Programming Languages?
The programming community is full of yawn-inducing debates over static vs dynamic typing, object-oriented style, or mine-is-better-than-yours. Unfortunately, there's still a lot of important dialogue about the standard features of modern programming languages that hasn't happened yet.
Where are the graphs?
NoSQL discussions are everywhere you look in the web community these days, and it seems people are finally starting to grok the idea of graph-oriented databases. But why should we even need to delegate graph operations to a separate data layer?
One of my ongoing frustrations is the lack of abstract data types and native language support for graph structures in popular programming languages. Graphs are quite possibly the central object of computer science, yet if you look for methods to manipulate them directly in the standard library of any popular language, you'll be left scratching your head. Instead, lists and arrays have ascended the throne, and many programmers believe they need nothing more than hash tables to achieve world domination.
If we could take the low-level implementation of a Set ADT and add a few extensions to model edges joining nodes, we might start to get somewhere.
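To make that concrete, here is a minimal sketch (in Python) of what a graph type layered on plain set operations might look like. The Graph class and its method names are my own illustration, not part of any standard library:

```python
# A minimal sketch of a Graph ADT built on top of set operations.
# The names (Graph, add_edge, neighbours) are illustrative only.

class Graph:
    def __init__(self):
        self.nodes = set()
        self.edges = set()   # each edge stored as a frozenset of two nodes

    def add_node(self, node):
        self.nodes.add(node)

    def add_edge(self, a, b):
        # Adding an edge implicitly adds its endpoints.
        self.nodes.update((a, b))
        self.edges.add(frozenset((a, b)))

    def neighbours(self, node):
        # Every node that shares an edge with the given node.
        return {n for edge in self.edges if node in edge for n in edge - {node}}


g = Graph()
g.add_edge("a", "b")
g.add_edge("b", "c")
print(g.neighbours("b"))   # {'a', 'c'}
```

Everything here is just sets under the hood, which is exactly why a small number of extensions to an existing Set ADT would go a long way.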
That's why I am excited by Maglev. If you thought this was just another method of storage for Ruby domain models, you missed the point. Being able to seamlessly persist in-memory objects opens up graph-oriented data manipulation in the native language, without special extensions or the need to move data structures in and out of a separate serialization format. If Ruby had a native Graph or Tree class that augmented the built-in Set class, the possibilities would be limitless.
Where are the pipelines?
Pipes are a fundamental aspect of Unix programming, and the pipe-and-filter architecture is as fundamental to computing as the request-response or read-write pattern.
But this style of programming is something you simply don't see in many modern languages. It's rare to find dataflow tools as language primitives rather than framework constructions, so users of these languages are generally left to define their own abstractions where they're needed.
The key feature of pipeline processing is that filters can begin processing data incrementally, before the pipeline as a whole has returned. Data moves through multiple filters and pipes while the initiating stage continues to stream data. This is very different from simply chaining functions, where the input to each function must always be the final result of the preceding one.
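Python generators give a rough approximation of this behaviour: each stage yields results as soon as they are available, so downstream filters run before the source is exhausted. A small illustrative sketch:

```python
# A rough sketch of pipe-and-filter style using Python generators.
# Each stage yields items as they arrive rather than waiting for the
# whole upstream result, so data streams through the pipeline.

def source():
    for i in range(5):
        print(f"source produced {i}")
        yield i

def double(items):
    for i in items:
        yield i * 2

def only_big(items):
    for i in items:
        if i > 4:
            yield i

# The filters start consuming values before source() has finished;
# contrast with only_big(double(list(source()))), where the source
# must run to completion before anything downstream happens.
for value in only_big(double(source())):
    print(f"pipeline emitted {value}")
```

Running this interleaves the "source produced" and "pipeline emitted" lines, which is precisely the streaming property that simple function chaining lacks.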
Streamlined list processing (a.k.a. pipes in Python) by Anh Hai Trinh totally nails this concept, presenting a sleek implementation that overloads the >> operator and should make non-Python programmers immensely jealous.
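For a feel of how that kind of operator overloading can be wired up, here is a toy sketch of my own; it is not the actual implementation or API of that library, just the general idea:

```python
# A toy sketch of overloading >> to build pipelines.
# Illustrative only; not the API of Anh Hai Trinh's library.

class Filter:
    def __init__(self, func):
        self.func = func

    def __rrshift__(self, upstream):
        # Invoked for `iterable >> Filter(...)`: wrap the upstream iterable.
        return self.func(upstream)

doubled = Filter(lambda items: (i * 2 for i in items))
big     = Filter(lambda items: (i for i in items if i > 10))

print(list(range(10) >> doubled >> big))   # [12, 14, 16, 18]
```

Because the filters return lazy generator expressions, the data still streams through each stage rather than being materialised at every step.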
Worn Debates
The key theme of the current era of programming is the rise of multi-paradigm languages and polyglot programming. Yet most of the time, developers are still wedged between the rock of object-oriented imperative languages and the hard place of a functional style and its algebra of programs.
Sometimes it's easy to forget that there are many more models of programming than just static vs dynamic, imperative vs functional, or object-oriented vs procedural.
It would be really nice to see this understanding begin to seep into lower-level language implementations. In the meantime, the missing behavior will continue to be filled in by libraries, clever hacks, and separate domain-specific languages.