Introduction to the Evolution of Programming Languages
A computer needs to be given instructions in a programming language that it understands. A programming language is an artificial language that can be used to control the behavior of a computer. Programming languages, like human languages, are defined through the use of syntactic and semantic rules, which determine structure and meaning respectively. Programming languages are used to facilitate communication about the task of organizing and manipulating information, and to express algorithms precisely.
In order to understand why programming languages are as they are today, and to predict how they might develop in the future, we need to know something about how they evolved. The more natural a language is, and the closer it is to the problem domain, the easier it is to get the machine to do what you want.
We can summarize the goal of the assignment as answers to questions such as the following:
- What is a good programming language?
- How should we choose an appropriate language for a particular task?
This assignment discusses the main programming languages that had an impact on programming, traces the evolution of programming languages in chronological order under the headings listed below, and then draws conclusions about that evolution:
- Before 1940
- The 1940s
- The 1950s and 1960s
- 1967-1978: establishing fundamental paradigms
- The 1980s: consolidation, modules, performance
- The 1990s: the Internet age
- Current trends
Before 1940: Early History – The first programmers
The first programming languages predate the modern computer. At first, the languages were codes.
During a nine-month period in 1842–1843, Ada Lovelace translated Italian mathematician Luigi Menabrea’s memoir on Charles Babbage’s newest proposed machine, the Analytical Engine. To the article she appended a set of notes which specified in complete detail a method for calculating Bernoulli numbers with the Engine, recognized by some historians as the world’s first computer program. Some biographers, however, debate the extent of her original contributions versus those of Babbage.
The Jacquard loom, invented in 1801, used holes in punched cards to represent loom arm movements in order to weave decorative patterns automatically.
Herman Hollerith realized that he could encode information on punch cards when he observed that train conductors would encode the appearance of the ticket holders on the train tickets using the position of punched holes on the tickets. Hollerith then proceeded to encode the 1890 census data on punch cards.
The first computer codes were specialized for their applications. In the first decades of the twentieth century, numerical calculations were based on decimal numbers. Eventually it was realized that logic could be represented with numbers, as well as with words. For example, Alonzo Church was able to express the lambda calculus in a formulaic way. The Turing machine was an abstraction of the operation of a tape-marking machine, for example, in use at the telephone companies. However, unlike the lambda calculus, Turing’s code does not serve well as a basis for higher-level languages; its principal use is in rigorous analyses of algorithmic complexity.
Like many “firsts” in history, the first modern programming language is hard to identify. From the start, the restrictions of the hardware defined the language. Punch cards allowed 80 columns, but some of the columns had to be used for a sorting number on each card. Fortran included some keywords which were the same as English words, such as “IF”, “GOTO” (go to) and “CONTINUE”. The use of a magnetic drum for memory meant that computer programs also had to be interleaved with the rotations of the drum. Thus the programs were more hardware dependent than today.
To some people the answer depends on how much power and human-readability is required before the status of “programming language” is granted. Jacquard looms and Charles Babbage’s Difference Engine both had simple, extremely limited languages for describing the actions that these machines should perform. One can even regard the punch holes on a player piano scroll as a limited domain-specific language, albeit not designed for human consumption.
The 1940s: Von Neumann and Zuse
In the 1940s the first recognizably modern, electrically powered computers were created. The limited speed and memory capacity forced programmers to write hand tuned assembly language programs. It was soon discovered that programming in assembly language required a great deal of intellectual effort and was error-prone.
In 1948, Konrad Zuse published a paper about his programming language Plankalkül (plan calculus). However, it was not implemented in his time, and his original contributions were isolated from other developments. Zuse began work on Plankalkül, the first algorithmic programming language, with the aim of creating the theoretical preconditions for the formulation of problems of a general nature. Seven years earlier, Zuse had developed and built the world’s first binary digital computer, the Z1. He completed the first fully functional program-controlled electromechanical digital computer, the Z3, in 1941. Only the Z4, the most sophisticated of his creations, survived World War II.
Konrad Zuse (Plankalkül):
- worked in Germany, in isolation because of the war
- defined Plankalkül (program calculus) circa 1945 but never implemented it
- wrote algorithms in the language, including a program to play chess
- his work was finally published in 1972
- included some advanced data type features such as:
» floating point, using two’s complement and hidden bits
» records (that could be nested)
Von Neumann led a team that built computers with stored programs and a central processor; ENIAC, by contrast, was programmed with patch cords.
Machine Codes (1940s):
- Initial computers were programmed in raw machine codes.
- These were entirely numeric.
- What was wrong with using machine code?
- Poor readability
- Poor modifiability
- Expression coding was tedious
- Inherited the deficiencies of the hardware, e.g., no indexing or floating-point numbers
Some important languages that were developed in this period include:
- 1943 – Plankalkül (Konrad Zuse)
- 1943 – ENIAC coding system
- 1949 – C-10
The 1950s and 1960s: The First Programming Languages
In the 1950s the first three modern programming languages whose descendants are still in widespread use today were designed:
- FORTRAN (1955): the “FORmula TRANslator”, the first high-level programming language, invented by John Backus.
- LISP: the “LISt Processor”, the first language outside the von Neumann model, invented by John McCarthy.
- COBOL: the COmmon Business Oriented Language, the first business-oriented language, created by the Short Range Committee and heavily influenced by Grace Hopper.
FORTRAN was introduced in 1957 at IBM by a team led by John Backus. The “Preliminary Report” describes the goal of the FORTRAN project:
The IBM Mathematical Formula Translation System or briefly, FORTRAN, will comprise a large set of programs to enable the IBM 704 to accept a concise formulation of a problem in terms of a mathematical notation and to produce automatically a high-speed 704 program for the solution of the problem. (Quoted in (Sammet 1969).)
This suggests that the IBM team’s goal was to eliminate programming! The following quotation seems to confirm this:
If it were possible for the 704 to code problems for itself and produce as good programs as human coders (but without the errors), it was clear that large benefits could be achieved. (Backus 1957)
It is interesting to note that, 20 years later, Backus (1978) criticized FORTRAN and similar languages as “lacking useful mathematical properties”. He saw the assignment statement as a source of inefficiency: “the von Neumann bottleneck”. The solution, however, was very similar to the solution he advocated in 1957 — programming must become more like mathematics: “we should be focusing on the form and content of the overall result”.
Although FORTRAN did not eliminate programming, it was a major step towards the elimination of assembly language coding. The designers focused on efficient implementation rather than elegant language design, knowing that acceptance depended on the high performance of compiled programs.
FORTRAN has value semantics. Variable names stand for memory addresses that are determined when the program is loaded.
The major achievements of FORTRAN are:
- efficient compilation
- separate compilation (programs can be presented to the compiler as separate subroutines, but the compiler does not check for consistency between components)
- demonstration that high-level programming, with automatic translation to machine code, is feasible.
The principal limitations of FORTRAN are:
Flat, uniform structure: There is no concept of nesting in FORTRAN. A program consists of a sequence of subroutines and a main program. Variables are either global or local to subroutines. In other words, FORTRAN programs are rather similar to assembly language programs: the main difference is that a typical line of FORTRAN describes evaluating an expression and storing its value in memory, whereas a typical line of assembly language specifies a machine instruction (or a small group of instructions in the case of a macro).
Limited control structures: The control structures of FORTRAN are IF, DO, and GOTO. Since there are no compound statements, labels provide the only indication that a sequence of statements forms a group.
Unsafe memory allocation: FORTRAN borrows the concept of COMMON storage from assembly language programming. This enables different parts of a program to share regions of memory, but the compiler does not check for consistent usage of these regions. One program component might use a region of memory to store an array of integers, and another might assume that the same region contains reals. To conserve precious memory, FORTRAN also provides the EQUIVALENCE statement, which allows variables with different names and types to share a region of memory.
No recursion: FORTRAN allocates all data, including the parameters and local variables of subroutines, statically. Recursion is forbidden because only one instance of a subroutine can be active at a time.
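The danger of EQUIVALENCE-style storage sharing can be sketched in modern terms. The following Python fragment (purely an illustration; the integer value is chosen for the example) simulates two "variables" of different types sharing the same bytes, with no check to prevent the mismatch:

```python
import struct

# Simulating FORTRAN's unchecked storage sharing: one "variable" writes an
# integer into four bytes of storage, another reads the same bytes as a real.
bits = struct.pack("i", 1077936128)     # store an integer in shared storage
as_real = struct.unpack("f", bits)[0]   # read the same bytes as a real
print(as_real)                          # 3.0 -- the bit pattern, misread
```

Nothing in this code signals that the integer and the real disagree; the reinterpreted value is simply wrong, which is exactly the hazard the FORTRAN compiler never detected.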
Functional programming was introduced in 1958 in the form of LISP by John McCarthy. The following account of the development of LISP is based on McCarthy’s (1978) history.
The important early decisions in the design of LISP were:
- to provide list processing (which already existed in languages such as Information Processing Language (IPL) and FORTRAN List Processing Language (FLPL))
- to use the concept of “function” as widely as possible (cons for list construction; car and cdr for extracting list components; cond for conditional, etc.)
- to provide higher order functions and hence a notation for functions (based on Church’s (1941) λ-notation)
- to avoid the need for explicit erasure of unused list structures.
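These decisions can be illustrated with a small sketch, written here in Python purely for illustration, of the three core list primitives (a cons cell is represented as a two-element tuple; this representation is an assumption, not LISP's actual one):

```python
# A sketch of LISP's list primitives: cons builds a cell, car and cdr
# extract its head and tail. NIL marks the end of a list.
def cons(head, tail):
    return (head, tail)

def car(cell):
    return cell[0]

def cdr(cell):
    return cell[1]

NIL = None
lst = cons(1, cons(2, cons(3, NIL)))   # the list (1 2 3)
print(car(lst))                        # 1
print(car(cdr(lst)))                   # 2
```

Everything is expressed as the application of a function to arguments, which is the uniform style McCarthy was aiming for.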
McCarthy (1960) wanted a language with a solid mathematical foundation and decided that recursive function theory was more appropriate for this purpose than the then-popular Turing machine model. He considered it important that LISP expressions should obey the usual mathematical laws allowing replacement of expressions. Another way to show that LISP was neater than Turing machines was to write a universal LISP function and show that it is briefer and more comprehensible than the description of a universal Turing machine. “This was the LISP function eval[e, a], which computes the value of a LISP expression e, the second argument a being a list of assignments of values to variables. . . . Writing eval required inventing a notation for representing LISP functions as LISP data, and such a notation was devised for the purpose of the paper with no thought that it would be used to express LISP programs in practice.” (McCarthy 1978)
After the paper was written, McCarthy’s graduate student S. R. Russell noticed that eval could be used as an interpreter for LISP and hand-coded it, thereby producing the first LISP interpreter. Soon afterwards, Timothy Hart and Michael Levin wrote a LISP compiler in LISP; this was probably the first instance of a compiler written in the language that it compiled.
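The flavour of eval[e, a] can be suggested by a toy evaluator, written here in Python. The operators and the representation (nested lists for expressions, a dict for the environment a) are assumptions made for illustration; this is not McCarthy's definition.

```python
# A minimal sketch in the spirit of eval[e, a]: e is a nested-list
# expression, a is an environment binding variable names to values.
def lisp_eval(e, a):
    if isinstance(e, str):                  # a variable: look it up in a
        return a[e]
    if not isinstance(e, list):             # a literal value
        return e
    op, *args = e
    if op == "quote":                       # return the argument unevaluated
        return args[0]
    if op == "cond":                        # args are (test, result) pairs
        for test, result in args:
            if lisp_eval(test, a):
                return lisp_eval(result, a)
        return None
    if op == "+":                           # a sample built-in function
        return sum(lisp_eval(x, a) for x in args)
    raise ValueError(f"unknown operator: {op}")

print(lisp_eval(["+", "x", 2], {"x": 40}))  # 42
```

The key point is the one McCarthy noticed: because programs are represented as ordinary data, an evaluator for the language fits in a handful of cases.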
It is interesting to note that the close relationship between code and data in LISP mimics the von Neumann architecture at a higher level of abstraction. LISP was the first in a long line of functional programming languages. Its principal contributions are listed below:
Names: In procedural programming languages, a name denotes a storage location (value semantics). In LISP, a name is a reference to an object, not a location (reference semantics); two distinct objects may have equal values but different memory addresses. A consequence of the use of names as references to objects is that eventually there will be objects for which there are no references: these objects are “garbage” and must be automatically reclaimed if the interpreter is not to run out of memory. The alternative, requiring the programmer to explicitly deallocate old cells, would add considerable complexity to the task of writing LISP programs. Nevertheless, the decision to include automatic garbage collection (in 1958!) was courageous and influential.
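Reference semantics, and the garbage it produces, can be seen directly in a language that inherited LISP's model; this Python sketch is illustrative only:

```python
# Names as references: two names can refer to one object, not two copies.
xs = [1, 2, 3]
ys = xs              # ys refers to the same object as xs
ys.append(4)
print(xs)            # [1, 2, 3, 4] -- the change is visible through both names

xs = None
ys = None            # no references remain: the list is now garbage,
                     # and the collector may reclaim its storage automatically
```

When the last reference disappears the programmer does nothing; reclaiming the unreachable object is the garbage collector's job, exactly as in LISP.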
Lambda: LISP uses “lambda expressions”, based on Church’s λ-calculus, to denote functions. However, the lambda expression itself cannot be evaluated. Consequently, LISP had to resort to programming tricks to make higher order functions work.
Dynamic Scoping: Dynamic scoping was an “accidental” feature of LISP. It arose as a side-effect of the implementation of the look-up table for variable values used by the interpreter. A LISP interpreter constructs its environment as it interprets. Although dynamic scoping is natural for an interpreter, it is inefficient for a compiler. Interpreters are slow anyway, and the overhead of searching a linear list for a variable value just makes them slightly slower still. A compiler, however, has more efficient ways of accessing variables, and forcing it to maintain a linear list would be unacceptably inefficient. Consequently, early LISP systems had an unfortunate discrepancy: the interpreters used dynamic scoping and the compilers used static scoping. Some programs gave one answer when interpreted and another answer when compiled!
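The discrepancy can be demonstrated with a sketch in Python, which is lexically (statically) scoped; the dynamic behaviour of early LISP interpreters is simulated here with an explicit environment stack (an illustration, not how any real LISP was implemented):

```python
x = "global"

def show_static():
    return x                      # lexical scope: always the global x

def caller_static():
    x = "local to caller"         # a new local x; show_static never sees it
    return show_static()

print(caller_static())            # "global" -- static scoping

# Dynamic scoping, simulated: the callee sees the most recent binding
# anywhere on the call chain, as in early LISP interpreters.
env = [{"x": "global"}]

def show_dynamic():
    for frame in reversed(env):   # search the dynamic chain, newest first
        if "x" in frame:
            return frame["x"]

def caller_dynamic():
    env.append({"x": "local to caller"})
    try:
        return show_dynamic()
    finally:
        env.pop()                 # restore the environment on exit

print(caller_dynamic())           # "local to caller" -- dynamic scoping
```

The same program text yields two different answers under the two rules, which is precisely the interpreter/compiler mismatch described above.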
Interpretation: LISP was the first major language to be interpreted. Originally, the LISP interpreter behaved as a calculator. It evaluated expressions entered by the user, but its internal state did not change. It was not long before a form for defining functions was introduced to enable users to add their own functions to the list of built-in functions.
A LISP program has no real structure. On paper, a program is a list of function definitions; the functions may invoke one another with either direct or indirect recursion. At run-time, a program is the same list of functions, translated into internal form, added to the interpreter.
COBOL (Sammet 1978) introduced structured data and implicit type conversion. When COBOL was introduced, “programming” was more or less synonymous with “numerical computation”. COBOL introduced “data processing”, where data meant large numbers of characters. The data division of a COBOL program contained descriptions of the data to be processed.
Another important innovation of COBOL was a new approach to data types. The problem of type conversion had not arisen previously because only a small number of types were provided by the programming languages. COBOL introduced many new types, in the sense that data could have various degrees of precision, and different representations as text. The choice made by the designers of COBOL was radical: type conversion should be automatic.
The assignment statement in COBOL has several forms, including
MOVE X TO Y.
If X and Y have different types, the COBOL compiler will attempt to find a conversion from one type to the other. In most programming languages of the time, a single statement translated into a small number of machine instructions. In COBOL, a single statement could generate a large amount of machine code.
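A hypothetical sketch of this idea follows; the function name and the dispatch on the target type are invented for illustration, and COBOL's actual conversion rules are far richer:

```python
# Sketch of COBOL-style automatic conversion on assignment: a MOVE between
# fields of different types triggers a conversion chosen by the "compiler"
# (here, a small dispatch on the target type).
def move(value, target_type):
    if target_type is str:
        return str(value)                     # numeric-to-text conversion
    if target_type is int:
        return int(str(value).strip() or 0)   # text-to-numeric conversion
    raise TypeError("no conversion available")

print(move(42, str))      # "42"  -- MOVE X TO Y where Y is textual
print(move(" 7 ", int))   # 7     -- MOVE X TO Y where Y is numeric
```

A single MOVE can therefore expand into a whole conversion routine, which is why one COBOL statement could generate a large amount of machine code.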
Another milestone in the late 1950s was the publication, by a committee of American and European computer scientists, of “a new language for algorithms”: ALGOL (the “ALGOrithmic Language”).
Environment of development:
- FORTRAN had (barely) arrived for IBM 70x
- Many other languages were being developed, all for specific machines
- No portable language; all were machine dependent
- No universal language for communicating algorithms.
The ACM and GAMM met for four days to design the language.
Goals of the language:
- Close to mathematical notation
- Good for describing algorithms
- Must be translatable to machine code
Algol 60:
This report consolidated many ideas circulating at the time and featured two key language innovations:
- arbitrarily nested block structure: meaningful chunks of code could be grouped into statement blocks without having to be turned into separate, explicitly named procedures;
- lexical scoping: a block could have its own variables that code outside the chunk cannot access, let alone manipulate.
Another innovation, related to this, was in how the language was described:
- a mathematically exact notation, Backus–Naur Form, was used to describe the language’s syntax. Nearly all subsequent programming languages have used a variant of Backus-Naur Form to describe the context-free portion of their syntax.
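As an illustration of the notation (a toy grammar invented here, not taken from the Algol 60 report), a fragment of expression syntax in BNF might read:

```
<expr>   ::= <term> | <expr> "+" <term>
<term>   ::= <factor> | <term> "*" <factor>
<factor> ::= <number> | "(" <expr> ")"
```

Each rule defines a syntactic category in terms of others, and the recursion (e.g. <expr> appearing on both sides of its own rule) expresses arbitrarily deep nesting in a finite description.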
Algol 60 was particularly influential in the design of later languages, some of which soon became more popular. The Burroughs large systems were designed to be programmed in an extended subset of Algol.
Algol 68:
Algol’s key ideas were continued, producing ALGOL 68:
- syntax and semantics became even more orthogonal, with anonymous routines, a recursive typing system with higher-order functions, etc.;
- not only the context-free part, but the full language syntax and semantics were defined formally, in terms of Van Wijngaarden grammar, a formalism designed specifically for this purpose.
Algol 68’s many little-used language features (e.g. concurrent and parallel blocks) and its complex system of syntactic shortcuts and automatic type coercions made it unpopular with implementers and gained it a reputation of being difficult. Niklaus Wirth actually walked out of the design committee to create the simpler Pascal language.
Some important languages that were developed in this period include:
- 1951 – Regional Assembly Language
- 1952 – Autocode
- 1954 – FORTRAN
- 1955 – FLOW-MATIC (forerunner to COBOL)
- 1957 – COMTRAN (forerunner to COBOL)
- 1958 – LISP
- 1958 – ALGOL 58
- 1959 – FACT (forerunner to COBOL)
- 1959 – COBOL
- 1962 – APL
- 1962 – Simula
- 1964 – BASIC
- 1964 – PL/I
1967-1978: Establishing fundamental paradigms
The period from the late 1960s to the late 1970s brought a major flowering of programming languages. Most of the major language paradigms now in use were invented in this period:
- Simula, invented in the late 1960s by Nygaard and Dahl as a superset of Algol 60, was the first language designed to support object-oriented programming.
- C, an early systems programming language, was developed by Dennis Ritchie and Ken Thompson at Bell Labs between 1969 and 1973.
- Smalltalk (mid 1970s) provided a complete ground-up design of an object-oriented language.
- Prolog, designed in 1972 by Colmerauer, Roussel, and Kowalski, was the first logic programming language.
- ML built a polymorphic type system (invented by Robin Milner in 1973) on top of Lisp, pioneering statically typed functional programming languages.
Each of these languages spawned an entire family of descendants, and most modern languages count at least one of them in their ancestry.
Many programs are computer simulations of the real world or a conceptual world. Writing such programs is easier if there is a correspondence between objects in the world and components of the program. Simula originated in the Norwegian Computing Centre in 1962. Kristen Nygaard proposed a language for simulation to be developed by himself and Ole-Johan Dahl (1978). Key insights were developed in 1965 following experience with Simula I:
- the purpose of the language was to model systems,
- a system is a collection of interacting processes,
- a process can be represented during program execution by multiple procedures, each with its own Algol-style stack.
The main lessons of Simula I were:
- the distinction between a program text and its execution,
- the fact that data and operations belong together and that most useful programming constructs contain both.
Simula 67 was a general purpose programming language that incorporated the ideas of Simula I but put them into a more general context. The basic concept of Simula 67 was to be “classes of objects”. The major innovation was “block prefixing” (Nygaard and Dahl 1978).
Prefixing emerged from the study of queues, which are central to discrete-event simulations. The queue mechanism (a list linked by pointers) can be split off from the elements of the queue (objects such as trucks, buses, people, and so on, depending on the system being simulated). Once recognized, the prefix concept could be used in any context where common features of a collection of classes could be abstracted in a prefixed block or, as we would say today, a superclass.
Dahl (1978, page 489) has provided his own analysis of the role of blocks in Simula.
- Deleting the procedure definitions and final statement from an Algol block gives a pure data record.
- Deleting the final statement from an Algol block, leaving procedure definitions and data declarations, gives an abstract data object.
- Adding coroutine constructs to Algol blocks provides quasi-parallel programming capabilities.
- Adding a prefix mechanism to Algol blocks provides an abstraction mechanism (the class hierarchy).
The principal features of Simula 67 are:
- coroutines that permit the simulation of concurrent processes,
- multiple stacks, needed to support coroutines,
- classes that combine data and a collection of functions that operate on the data,
- prefixing (now known as inheritance) that allows a specialized class to be derived from a general class without unnecessary code duplication,
- a garbage collector that frees the programmer from the responsibility of deallocating storage.
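Prefixing corresponds to what we now call inheritance: the common features of the queue live in a superclass, and simulation-specific classes are "prefixed" by it. A minimal sketch in Python (class names invented for illustration):

```python
# The abstracted queue mechanism: a link to the next element. In Simula
# terms, this block is the prefix shared by all queueable classes.
class Linkage:
    def __init__(self):
        self.next = None

# A simulation-specific class "prefixed" by Linkage: it inherits the
# queue behaviour without duplicating it.
class Truck(Linkage):
    def __init__(self, cargo):
        super().__init__()
        self.cargo = cargo

head = Truck("gravel")
head.next = Truck("sand")     # queueing comes entirely from the superclass
print(head.next.cargo)        # sand
```

Any number of classes (buses, people, and so on) can reuse the same prefix, which is exactly the abstraction Nygaard and Dahl extracted from their simulations.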
Simula itself had an uneven history. It was used more in Europe than in North America, but it never achieved the recognition that it deserved. This was partly because there were few Simula compilers and the good compilers were expensive. On the other hand, the legacy that Simula left is considerable: a new paradigm of programming.
C is a very pragmatic programming language. Ritchie (1996) designed it for a particular task, systems programming, for which it has been widely used. The enormous success of C is partly accidental. UNIX, after Bell Labs released it to universities, became popular, with good reason. Since UNIX depended heavily on C, the spread of UNIX inevitably led to the spread of C.
C is based on a small number of primitive concepts. For example, arrays are defined in terms of pointers and pointer arithmetic. This is both the strength and weakness of C. The number of concepts is small, but C does not provide real support for arrays, strings, or boolean operations.
C is a low-level language by comparison with the other programming languages discussed in this section. It is designed to be easy to compile and to produce efficient object code. The compiler is assumed to be rather unsophisticated (a reasonable assumption for a compiler running on a PDP-11 in the late 1960s) and in need of hints such as register. C is notable for its concise syntax. Some syntactic features are inherited from Algol 68 (for example, += and other assignment operators) and others are unique to C and C++ (for example, postfix and prefix ++ and --).
Smalltalk originated with Alan Kay’s reflections on the future of computers and programming in the late 1960s. Kay (1996) was influenced by LISP, especially by its one-page metacircular interpreter, and by Simula. It is interesting to note that Kay gave a talk about his ideas at MIT in November 1971; the talk inspired Carl Hewitt’s work on his Actor model, an early attempt at formalizing objects. In turn, Sussman and Steele wrote an interpreter for Actors that eventually became Scheme. The cross-fertilization between object-oriented and functional programming languages that occurred during these early days has, sadly, not continued.
The first version of Smalltalk was implemented by Dan Ingalls in 1972, using BASIC (!) as the implementation language. Smalltalk was inspired by Simula and LISP; it was based on six principles:
- Everything is an object.
- Objects communicate by sending and receiving messages (in terms of objects).
- Objects have their own memory (in terms of objects).
- Every object is an instance of a class (which must be an object).
- The class holds the shared behaviour for its instances (in the form of objects in a program list).
- To evaluate a program list, control is passed to the first object and the remainder is treated as its message.
Principles 1–3 provide an “external” view and remained stable as Smalltalk evolved. Principles 4–6 provide an “internal” view and were revised following implementation experience. Principle 6 reveals Kay’s use of LISP as a model for Smalltalk: McCarthy had described LISP with a one-page meta-circular interpreter, and one of Kay’s goals was to do the same for Smalltalk.
Smalltalk was also strongly influenced by Simula. However, it differs from Simula in several ways:
- Simula distinguishes primitive types, such as integer and real, from class types. In Smalltalk, “everything is an object”.
- In particular, classes are objects in Smalltalk. To create a new instance of a class, you send a message to it. Since a class object must belong to a class, Smalltalk requires metaclasses.
- Smalltalk effectively eliminates passive data. Since objects are “active” in the sense that they have methods, and everything is an object, there are no data primitives.
- Smalltalk is a complete environment, not just a compiler. You can edit, compile, execute, and debug Smalltalk programs without ever leaving the Smalltalk environment.
The “block” is an interesting innovation of Smalltalk. A block is a sequence of statements that can be passed as a parameter to various control structures and behaves rather like an object. In the Smalltalk statement 10 timesRepeat: [Transcript nextPutAll: 'Hi!'] the receiver is 10, the message is timesRepeat:, and [Transcript nextPutAll: 'Hi!'] is the parameter of the message.
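A rough Python analogue of that statement may make the roles concrete; the names here are invented for illustration and this is not Smalltalk syntax:

```python
# The receiver (10) gets the message times_repeat with a block (here a
# Python function) as its parameter; the block is run once per repetition.
def times_repeat(receiver, block):
    for _ in range(receiver):
        block()

transcript = []
times_repeat(3, lambda: transcript.append("Hi!"))
print(transcript)     # ['Hi!', 'Hi!', 'Hi!']
```

The block is an ordinary value: it can be stored, passed around, and invoked later, which is what lets Smalltalk build its control structures out of message sends.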
The first practical version of Smalltalk was developed in 1976 at Xerox Palo Alto Research Center (PARC). The important features of Smalltalk are:
- everything is an object
- an object has private data and public functions
- objects collaborate by exchanging “messages”
- every object is a member of a class
- there is an inheritance hierarchy (actually a tree) with the class Object as its root
- all classes inherit directly or indirectly from the class Object
- garbage collection.
The computational model of a logic programming language is some form of mathematical logic. Propositional calculus is too weak because it does not have variables. Predicate calculus is too strong because it is undecidable. (This means that there is no effective procedure that determines whether a given predicate is true.) Practical logic programming languages are based on a restricted form of predicate calculus that has a decision procedure.
The first logic programming language was Prolog, introduced by Colmerauer (1973) and Kowalski (1974). The discussion in this section is based on (Bratko 1990).
Prolog is interactive: it prompts with ?- to indicate that it is ready to accept a query. Prolog responds to simple queries about facts that it has been told with “yes” and “no”. If the query contains a logical variable — an identifier that starts with an upper case letter — Prolog attempts to find a value of the variable that makes the query true.
Prolog syntax corresponds to a restricted form of first-order predicate calculus called clausal form logic. It is not practical to use the full predicate calculus as a basis for a programming language because it is undecidable. Clausal form logic is semi-decidable: there is an algorithm that will find a proof for any formula that is true in the logic. If a formula is false, the algorithm may fail after a finite time or may loop forever. The proof technique, called SLD resolution, was introduced by Robinson (1965). SLD stands for “Selecting a literal, using a Linear strategy, restricted to Definite clauses”.
The proof of validity of SLD resolution assumes that unification is implemented with the occurs check. Unfortunately, the occurs check is expensive to implement and most Prolog systems omit it. Consequently, most Prolog systems are technically unsound, although problems are rare in practice.
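What the occurs check does, and why omitting it is unsound, can be sketched with a toy unifier written in Python. The term representation (capitalised strings for variables, tuples like ("f", "X") for compound terms) and the single-step dereferencing are simplifying assumptions for illustration:

```python
def is_var(t):
    # Variables are capitalised strings, as in Prolog source text.
    return isinstance(t, str) and t[:1].isupper()

def occurs(var, term, subst):
    # Does var occur inside term? This is the occurs check.
    term = subst.get(term, term)      # single-step dereference (a sketch)
    if term == var:
        return True
    if isinstance(term, tuple):
        return any(occurs(var, t, subst) for t in term[1:])
    return False

def unify(a, b, subst):
    # Return an extended substitution, or None if unification fails.
    a, b = subst.get(a, a), subst.get(b, b)
    if a == b:
        return subst
    if is_var(a):
        if occurs(a, b, subst):       # X = f(X) must fail...
            return None               # ...so a sound unifier rejects it
        return {**subst, a: b}
    if is_var(b):
        return unify(b, a, subst)
    if isinstance(a, tuple) and isinstance(b, tuple) \
            and len(a) == len(b) and a[0] == b[0]:
        for x, y in zip(a[1:], b[1:]):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify("X", ("f", "a"), {}))     # {'X': ('f', 'a')}
print(unify("X", ("f", "X"), {}))     # None -- rejected by the occurs check
```

A system that skips the occurs check would happily bind X to f(X), producing a circular term, which is the source of the unsoundness mentioned above.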
A Prolog program apparently has a straightforward interpretation as a statement in logic, but the interpretation is slightly misleading. For example, since Prolog works through the rules in the order in which they are written, the order is significant. Since the logical interpretation of a rule sequence is disjunction, it follows that disjunction in Prolog does not commute.
The important features of Prolog include:
- Prolog is based on a mathematical model (clausal form logic with SLD resolution).
- “Pure” Prolog programs, which fully respect the logical model, can be written but are often inefficient.
- In order to write efficient programs, a Prolog programmer must be able to read programs both declaratively (to check the logic) and procedurally (to ensure efficiency).
- Prolog implementations introduce various optimizations (the cut, omitting the occurs check) that improve the performance of programs but compromise the mathematical model.
- Prolog has garbage collection.
The 1960s and 1970s also saw considerable debate over the merits of “structured programming”, which essentially meant programming without the use of GOTO. This debate was closely related to language design: some languages did not include GOTO, which forced structured programming on the programmer. Although the debate raged hotly at the time, nearly all programmers now agree that, even in languages that provide GOTO, it is bad programming style to use it except in rare circumstances. As a result, later generations of language designers have found the structured programming debate tedious and even bewildering.
Some important languages that were developed in this period include:
- 1968 – Simula
- 1970 – Pascal
- 1970 – Forth
- 1972 – C
- 1972 – Smalltalk
- 1972 – Prolog
- 1973 – ML
- 1978 – SQL (initially only a query language, later extended with programming constructs)
The 1980s: consolidation, modules, performance
The 1980s were years of relative consolidation. C++ combined object-oriented and systems programming. The United States government standardized Ada, a systems programming language intended for use by defense contractors. In Japan and elsewhere, vast sums were spent investigating so-called fifth-generation programming languages that incorporated logic programming constructs. The functional languages community moved to standardize ML and Lisp. Rather than inventing new paradigms, all of these movements elaborated upon the ideas invented in the previous decade.
However, one important new trend in language design was an increased focus on programming for large-scale systems through the use of modules, or large-scale organizational units of code. Modula, Ada, and ML all developed notable module systems in the 1980s. Module systems were often wedded to generic programming constructs—generics being, in essence, parameterized modules (see also polymorphism in object-oriented programming).
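The idea that a generic is a parameterized module can be sketched with Python's typing generics, standing in here for the Ada/ML-style mechanisms of the period; the names are illustrative:

```python
from typing import Generic, TypeVar

T = TypeVar("T")

# The "module", parameterized by the element type T: one definition
# yields a stack of integers, a stack of strings, and so on.
class Stack(Generic[T]):
    def __init__(self):
        self._items: list[T] = []

    def push(self, item: T) -> None:
        self._items.append(item)

    def pop(self) -> T:
        return self._items.pop()

ints: Stack[int] = Stack()    # one instantiation of the generic
ints.push(1)
ints.push(2)
print(ints.pop())             # 2
```

The code inside the module is written once against the parameter T; each instantiation fixes T, which is exactly the relationship between generics and modules described above.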
Although major new paradigms for programming languages did not appear, many researchers expanded on the ideas of prior languages and adapted them to new contexts. For example, the languages of the Argus and Emerald systems adapted object-oriented programming to distributed systems.
Ada (Whitaker 1996) represents the last major effort in procedural language design. It is a large and complex language that combines then-known programming features with little attempt at consolidation. It was the first widely-used language to provide full support for concurrency, with interactions checked by the compiler, but this aspect of the language proved hard to implement.
Ada provides templates for procedures, record types, generic packages, and task types. The corresponding objects are: blocks and records (representable in the language); and packages and tasks (not representable in the language). It is not clear why four distinct mechanisms are required (Gelernter and Jagannathan 1990). The syntactic differences suggest that the designers did not look for similarities between these constructs.
The parameters of a record type are optional. If present, they have a different form from the parameters of procedures.
Of course, programmers hardly notice syntactic differences of this kind: they learn the correct incantation and recite it without thinking. But it is disturbing that the language designers apparently did not consider possible relationships between these four kinds of declaration. Changing the syntax would be a minor improvement, but uncovering deep semantic similarities might have a significant impact on the language as a whole, just as the identity declaration of Algol 68 suggested new and interesting possibilities.
Wirth (1982) followed Pascal with Modula–2, which inherits Pascal’s strengths and, to some extent, removes Pascal’s weaknesses. The important contribution of Modula–2 was, of course, the introduction of modules. (Wirth’s first design, Modula, was never completed. Modula–2 was the product of a sabbatical year in California, where Wirth worked with the designers of Mesa, another early modular language.)
A module in Modula–2 has an interface and an implementation. The interface provides information about the use of the module to both the programmer and the compiler. The implementation contains the “secret” information about the module. This design has the unfortunate consequence that some information that should be secret must be put into the interface. For example, the compiler must know the size of the object in order to declare an instance of it. This implies that the size must be deducible from the interface which implies, in turn, that the interface must contain the representation of the object. (The same problem appears again in C++.)
Modula–2 provides a limited escape from this dilemma: a programmer can define an “opaque” type with a hidden representation. In this case, the interface contains only a pointer to the instance and the representation can be placed in the implementation module.
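The opaque-type idea survives in many later languages. A rough Java analogue (names illustrative, not Modula-2 syntax) hides the representation behind an interface, so that clients hold only a reference and never see the layout:

```java
// "Interface module": clients see only the operations.
interface Counter {
    void increment();
    int value();
}

// "Implementation module": the representation is a private field of a
// class that clients never name directly.
class CounterImpl implements Counter {
    private int count = 0;               // the hidden representation
    public void increment() { count++; }
    public int value() { return count; }
}

public class OpaqueDemo {
    // The factory plays the role of the opaque pointer: callers obtain
    // a Counter without learning its size or layout.
    static Counter newCounter() { return new CounterImpl(); }

    public static void main(String[] args) {
        Counter c = newCounter();
        c.increment();
        c.increment();
        System.out.println(c.value());   // prints 2
    }
}
```

Because clients manipulate only a reference, the implementation can change its representation without any change to the interface, which is exactly what Modula-2's opaque types were designed to permit.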
The important features of Modula–2 are:
- Modules with separated interface and implementation descriptions (based on Mesa).
The 1980s also brought advances in programming language implementation. The RISC movement in computer architecture postulated that hardware should be designed for compilers rather than for human assembly programmers. Aided by processor speed improvements that enabled increasingly aggressive compilation techniques, the RISC movement sparked greater interest in compilation technology for high-level languages.
Language technology continued along these lines well into the 1990s.
Some important languages that were developed in this period include:
- 1983 – Ada
- 1983 – C++
- 1985 – Eiffel
- 1987 – Perl
- 1989 – FL (Backus)
The 1990s: the Internet age
The 1990s saw little fundamental novelty, but much recombination and maturation of old ideas. A major driving philosophy was programmer productivity. Many “rapid application development” (RAD) languages emerged; these usually came with an IDE and garbage collection, were descendants of older languages, and were object-oriented. They included Object Pascal, Visual Basic, and C#. Java, a more conservative language that also featured garbage collection, received much attention. More radical and innovative than the RAD languages were the new scripting languages, which did not descend directly from other languages and featured new syntaxes and a more liberal incorporation of features. Many consider the scripting languages more productive than even the RAD languages, though often because of choices that make small programs simpler but large programs more difficult to write and maintain. Nevertheless, scripting languages came to be the most prominent languages used in connection with the Web.
Pascal was designed by Wirth (1996) as a reaction to the complexity of Algol 68, PL/I, and other languages that were becoming popular in the late 1960s. Wirth made extensive use of the ideas of Dijkstra and Hoare (later published as (Dahl, Dijkstra, and Hoare 1972)), especially Hoare’s ideas of data structuring. The important contributions of Pascal included the following:
- Pascal demonstrated that a programming language could be simple yet powerful.
- The type system of Pascal was based on primitives (integer, real, boolean, ...) and mechanisms for building structured types (array, record, file, set, ...). Thus data types in Pascal form a recursive hierarchy just as blocks do in Algol 60.
- Pascal provides no implicit type conversions other than subrange to integer and integer to real. All other type conversions are explicit (even when no action is required) and the compiler checks type correctness.
- Pascal was designed to match Wirth’s (1971) ideas of program development by stepwise refinement. Pascal is a kind of “fill in the blanks” language in which all programs have a similar structure, determined by the relatively strict syntax. Programmers are expected to start with a complete but skeletal “program” and flesh it out in a series of refinement steps, each of which makes certain decisions and adds new details. The monolithic structure that this idea imposes on programs is a drawback of Pascal because it prevents independent compilation of components.
Pascal was a failure because it was too simple. Because of the perceived missing features, supersets were developed and, inevitably, these became incompatible. The first version of “Standard Pascal” was almost useless as a practical programming language and the Revised Standard described a usable language but appeared only after most people had lost interest in Pascal.
Like Algol 60, Pascal missed important opportunities. The record type was a useful innovation (although very similar to the Algol 68 struct) but allowed data only. Allowing functions in a record declaration would have paved the way to modular and even object-oriented programming.
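The step described above, a record whose members may be functions as well as data, is exactly what a class provides. A minimal sketch in Java (names illustrative):

```java
// A Pascal record could hold only data, e.g.:
//   record x, y: real end
// Attaching a function to the record yields a class -- the step from
// records to objects that Pascal stopped short of.
public class Point {
    private final double x, y;           // the "record" part: data fields

    Point(double x, double y) { this.x = x; this.y = y; }

    // The "function in the record" part: an operation on the data.
    double distanceFromOrigin() { return Math.sqrt(x * x + y * y); }

    public static void main(String[] args) {
        System.out.println(new Point(3, 4).distanceFromOrigin()); // prints 5.0
    }
}
```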
Nevertheless, Pascal had a strong influence on many later languages. Its most important innovations were probably the combination of simplicity, data type declarations, and static type checking.
Java (Arnold and Gosling 1998) is an object oriented programming language introduced by Sun Microsystems. Its syntax bears some relationship to that of C++, but Java is simpler in many ways than C++. Key features of Java include the following:
- Java is compiled to byte codes that are interpreted. Since any computer that has a Java byte code interpreter can execute Java programs, Java is highly portable.
- The portability of Java is exploited in network programming: Java byte codes can be transmitted across a network and executed by any processor with an interpreter.
- Java offers security. The byte codes are checked by the interpreter and have limited functionality. Consequently, Java byte codes do not have the potential to penetrate system security in the way that a binary executable (or even a MS-Word macro) can.
- Java has a class hierarchy with class Object at the root and provides single inheritance of classes.
- In addition to classes, Java provides interfaces with multiple inheritance.
- Java has an exception handling mechanism.
- Java provides concurrency in the form of threads.
- Primitive values, such as int, are not objects in Java. However, Java provides wrapper classes, such as Integer, for each primitive type.
- A variable name in Java is a reference to an object.
- Java provides garbage collection.
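Several of the features listed above can be seen together in a short example (the class name is illustrative):

```java
public class JavaFeatures {
    public static void main(String[] args) {
        // Primitive int vs. wrapper class Integer: the wrapper makes
        // the value usable where an object is required.
        int primitive = 42;
        Integer boxed = Integer.valueOf(primitive);
        System.out.println(boxed.equals(42));         // prints true

        // Exception handling: runtime errors are caught, not fatal.
        try {
            Object obj = null;
            obj.toString();                           // throws NullPointerException
        } catch (NullPointerException e) {
            System.out.println("caught: " + e.getClass().getSimpleName());
        }

        // Every class descends from Object, the root of the hierarchy.
        System.out.println("abc" instanceof Object);  // prints true
    }
}
```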
Some important languages that were developed in this period include:
- 1990 – Haskell
- 1991 – Python
- 1993 – Ruby
- 1993 – Lua
- 1994 – ANSI Common Lisp
- 1995 – Java
- 1995 – PHP
- 2000 – C#
- 2008 – JavaFX Script
Programming language evolution continues, in both industry and research. Some of the current trends include:
- Mechanisms for adding security and reliability verification to the language: extended static checking, information flow control, static thread safety.
- Alternative mechanisms for modularity: mixins, delegates, aspects.
- Component-oriented software development.
- Metaprogramming, reflection, or access to the abstract syntax tree.
- Increased emphasis on distribution and mobility.
- Integration with databases, including XML and relational databases.
- Support for Unicode so that source code (program text) is not restricted to those characters contained in the ASCII character set; allowing, for example, use of non-Latin-based scripts or extended punctuation.
- XML for graphical interface (XUL, XAML).
- Mathematically oriented views of programming favour values. Simulation oriented views favour objects. Both views are useful for particular applications. A programming language that provides both in a consistent and concise way might be simple, expressive, and powerful. The “logical variable” is a third kind of entity, after values and objects. It might be fruitful to design a programming language with objects, logical variables, unification, and backtracking. Such a language would combine the advantages of object-oriented programming and logic programming.
- It is not hard to design a large and complex programming language by throwing in numerous features. The hard part of design is to find simple, powerful abstractions to achieve more with less. In this respect, programming language design resembles mathematics. The significant advances in mathematics are often simplifications that occur when structures that once seemed distinct are united in a common abstraction. Similar simplifications have occurred in the evolution of programming languages: for example, Simula. But programs are not mathematical objects. A rigorously mathematical approach can lead all too rapidly to the “Turing tar pit”.
- The evolution of programming languages shows that, most of the time, practice leads theory. Designers and implementors introduce new ideas, then theoreticians attempt to explain what they did and how they could have done it better. There are a number of programming languages that have been based on purely theoretical principles but few, if any, are in widespread use. Theory-based programming languages are important because they provide useful insights. Ideally, they serve as testing environments for ideas that may eventually be incorporated into new mainstream programming languages. Unfortunately, it is often the case that they show retroactively how something should have been done when it is already too late to improve it.
- Programming languages have evolved into several strands: procedural programming, now more or less subsumed by object-oriented programming; functional programming; and logic programming and its successor, constraint programming. Each paradigm performs well on a particular class of applications. Since people are always searching for “universal” solutions, it is inevitable that some will try to build a “universal” programming language by combining paradigms. Such attempts will be successful to the extent that they achieve overall simplification. It is not clear that any future universal language will be accepted by the programming community; it seems more likely that specialized languages will dominate in the future.