Programming languages, by their very nature, are quickly created and changed. Every new niche, need, or market demands that new languages be invented to meet its requirements. One document, written in early 1995, listed no fewer than 2,350 different languages. Another page, which catalogs versions of a single program written in various languages, lists more than 500 of them.
Even with such an abundance of languages, it’s easy to see that many of them are just variations on overlapping programming concepts. In fact, as the aforementioned lists show, many are merely variants of a base language with a few new features thrown in to deal with some new demand. If we look at the languages created in the last ten years, we notice little evolution in conceptual terms. Perl, Python, PHP, Java, C, C++, C#, and Delphi exhibit few real differences among themselves. In most cases, those differences amount to nothing more than syntactic sugar, which changes the way certain constructions are written without changing their meaning. To clarify, I’m using evolution here to denote paradigm changes, as opposed to changes driven by aesthetic choices or by pressure for competitive features. Although all the languages mentioned above are imperative, the same pattern can be observed in other language classes; for example, many functional languages are merely rehashing Lisp themes. To compensate for that homogeneity, software development seems to have shifted its focus to the creation of methodologies that simply make better use of existing languages. Extreme Programming and UML are two examples of that trend: both can be used with almost any programming language, but neither changes the languages to which it is applied. Another focus is the creation of class libraries to cope, often poorly, with the limitations of some languages.
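The syntactic-sugar point can be made concrete with a small illustrative Java sketch (the class and method names are mine): the enhanced for loop reads differently from an indexed loop, but the meaning of the program is unchanged.

```java
// Illustrative only: two loops that differ in syntax but not in meaning.
public class SugarDemo {
    // Classic indexed loop.
    static int sumIndexed(int[] xs) {
        int sum = 0;
        for (int i = 0; i < xs.length; i++) {
            sum += xs[i];
        }
        return sum;
    }

    // Enhanced for loop: sugar over the same iteration.
    static int sumEnhanced(int[] xs) {
        int sum = 0;
        for (int x : xs) {
            sum += x;
        }
        return sum;
    }

    public static void main(String[] args) {
        int[] xs = {1, 2, 3};
        // Both loops compute the same result.
        System.out.println(sumIndexed(xs) == sumEnhanced(xs)); // prints true
    }
}
```

Nothing conceptual separates the two forms; a compiler could mechanically rewrite one into the other, which is exactly what "sugar" means here.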
An exception to the pattern above is the slow adoption of concepts like aspect-oriented programming, design by contract, and generics. Although those concepts are not new, their implementation is still limited. While some languages exhibit some of those concepts naturally (Smalltalk is an obvious example) and others are trying to introduce them, many languages seem to gravitate around their original concepts and change only in the most restricted ways. Delphi, a variant of Object Pascal, is one such language. After becoming object-oriented, in 1989, it has remained almost unchanged; most new features were already old when they were implemented, and came only as a result of market pressure. Other languages, like Beta, are trying to introduce new programming paradigms, but they seem to have limited commercial success and remain confined to academia.
It’s worth noting that many of the most widely used languages today have only just outgrown their procedural infancy; PHP is an obvious example. Other languages were born object-oriented but introduced nothing new in their syntactic or semantic domains; Python and Java are typical cases. That means many languages are just repeating history, trapped in an old evolutionary cycle.
In spite of the reasoning above, one can ask: is language evolution a necessary step in solving new kinds of problems? Take Smalltalk, for instance. It has remained essentially unchanged for decades, and even so it has proved remarkably able to cope with the demands of a modern programming world. Lisp is a similar example: it’s one of the oldest programming languages still in use, yet it adapts to new tasks without any significant change to its core. It’s interesting to note that some companies keep their use of those languages a secret, while others say those languages are their competitive advantage over rivals. Is the future of programming a commercial return of those languages? We are talking about languages created more than 30 years ago.
Smalltalk and Lisp share a characteristic that can explain why they remain current: their simplicity. Smalltalk has just five keywords and a few syntactic and semantic rules. Even so, it generally takes fewer lines of code to implement a given task in Smalltalk than in other programming languages. Lisp follows the same pattern: the simple expressions that make up the language core give it incredible flexibility, allowing it to implement whatever constructs a given task requires, again in fewer lines of code. The fact that both languages use the concept of an image holding the whole development and run-time environment also contributes immensely to their efficiency: the edit-compile-run-debug cycle becomes edit-run-debug.
Although language evolution remains seemingly limited, a current trend may have a positive impact on the development of new programming concepts: the growing adoption of virtual machines. The era of inefficient virtual machines is over; new implementations combining compilation and interpretation exhibit excellent performance and make good use of hardware resources. Virtual machines are naturally flexible and provide a good field for experimentation, one not restricted to traditional environments that closely mirror the bare metal. It’s worth noting that both Smalltalk and Lisp are usually implemented on virtual machines. Some new languages try to achieve the flexibility offered by virtual machines by introducing new keywords, which only increases the overall complexity of those languages.
Another concept that has gained wider acceptance in recent years is dynamic typing. Many modern languages have opted for dynamic typing to improve productivity and reduce errors. It’s curious that Java and C#, languages created with market considerations in mind, chose static typing, and are then forced to provide mechanisms that “violate” that concept because of its shortcomings. (Boxing is one such “violation”, which tries to bridge the gap between value and reference types in those languages.)
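A minimal Java sketch can show what boxing papers over (the class and method names are mine): primitives must be wrapped in objects to enter a generic collection, and the wrapper’s reference semantics can then leak through.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal illustration of autoboxing and the value/reference gap it papers over.
public class BoxingDemo {
    // Returns true when two separately boxed copies of v are the same object.
    static boolean sameBoxedObject(int v) {
        Integer a = v;  // autoboxing: int -> Integer
        Integer b = v;
        return a == b;  // reference comparison, not value comparison
    }

    public static void main(String[] args) {
        // Generic collections cannot hold primitives, so ints must be boxed.
        List<Integer> numbers = new ArrayList<>();
        numbers.add(42);          // autoboxing on insertion
        int n = numbers.get(0);   // unboxing on retrieval
        System.out.println(n);    // prints 42

        // Small values share cached Integer objects (-128..127 per the language
        // spec); larger ones usually do not, so == silently changes meaning.
        System.out.println(sameBoxedObject(100));   // true
        System.out.println(sameBoxedObject(1000));  // false on a default JVM
    }
}
```

The programmer never asked for two kinds of integers, yet must constantly keep the distinction in mind; that is the gap between value and reference types the essay refers to.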
So, what evolutionary path will new languages follow? Obviously, current programming needs will not remain constant. Are object-oriented models, such as those that exist today, sufficient to take care of the new requirements? Will languages like Smalltalk and Lisp, which have proved impervious to the passage of time, be able to remain up to date without significant changes to their semantic structure? I don’t even know how to begin answering those questions, but I believe they are crucial to the future of software development.