I'm a bit surprised not to see any papers by Simon Peyton-Jones (of Haskell fame) on the list. I guess most of the papers on this list were old classics and many of the authors have already passed away or retired.
SPJ is one of the most relevant programming language researchers of the past 20 years, but maybe his work is "too new" for this list. Reading his books and papers sparked my interest in programming languages.
I think it lacks the property of being "a paper", but I'll happily nominate The Implementation of Functional Programming Languages, Simon Peyton Jones, Prentice Hall, 1987.
Those are papers on the theory of programming languages, not implementation. As far as I know, none of the works of Peyton-Jones fit into that category.
> As far as I know, none of the works of Peyton-Jones fit into that category.
This is incorrect. Looking at his list of publications [1], a good deal of them are about programming language theory and rather fewer are about practical implementations. Some of the theory papers are quite Haskell-centric (or use Haskell as an example language), but others are more general.
Guy Lewis Steele Jr. RABBIT: A compiler for SCHEME ftp://publications.ai.mit.edu/ai-publications/pdf/AITR-474.pdf
OK, I think the first two sections are plenty. You can search for the rest yourself on Google Scholar (I found most of these as the first hit on Google Scholar; one or two took looking through a few hits or doing a Google search as well).
The problem with paywalled journals is that they devalue old but good papers by making them harder to get hold of. Anyway, the really good ones will find their way onto the open web somehow; the merely good ones, maybe not.
'Can Programming Be Liberated from the von Neumann Style?' by John Backus is another great work very worth studying:
'Conventional programming languages are growing ever more enormous, but not stronger. Inherent defects at the most basic level cause them to be both fat and weak: their primitive word-at-a-time style of programming inherited from their common ancestor—the von Neumann computer, their close coupling of semantics to state transitions, their division of programming into a world of expressions and a world of statements, their inability to effectively use powerful combining forms for building new programs from existing ones, and their lack of useful mathematical properties for reasoning about programs.'
Thankfully, 30+ years later these problems are widely fixed ;)
The most recent paper in the first section was written in 1978.
The majority of the ones in the first two sections were written before I was born; only three were written in the 1990s.
The only one from the 21st century (published in 2000) is in the least-prestigious third section.
Does it take time for good ideas to accumulate the recognition needed to be included in this list? Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?
Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Or is language implementation, almost without exception, a series of engineering refinements that happen in a decades-long process after theoretical foundations are first published?
> Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?
Yes, most of the groundbreaking theories were researched and documented decades ago. And some of that work has yet to make its way into mainstream programming languages (e.g. type inference).
> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Apart from Lisp and Haskell, the "modern" languages you mention are rather boring when it comes to programming language design, and there isn't much research behind them. There has been a lot of practical work and research in garbage collection, runtime systems, compiler back ends, just-in-time compilation and virtual machines. But when it comes to language design, the mainstream programming languages are really conservative.
The relationship between theoretical innovations and practical innovations is particularly indistinct in computer science.
A language like JavaScript might be interesting because it is simple and ubiquitous and embedded in a powerful application platform - but that does not mean it is of much interest from a theoretical CS perspective.
> that does not mean it is of much interest from a theoretical CS perspective
JavaScript theoretically uninteresting? Are you kidding?
What about its unusual prototypal inheritance? JSON-style literals? Its notation for anonymous functions? The approach modern JS frameworks take to single-threaded asynchronous execution?
IMHO, JS is not the best language for getting work done [1], but it has lots of innovations which are worthy of study.
[1] If you need to do Web frontend stuff, obviously JavaScript is the only game in town, unless you're into Flash, Java applets, or compilers with JS backends.
It can certainly be interesting, but most of the interesting research being done is related to the intersection of languages and security. There do exist formalisations, but they are derivative works based on the GREAT papers.
Notation is for the most part boring. The way you write function literals has no effect on the semantics of the language.
Prototype inheritance is also boring, the small-step semantics being two inference rules [1].
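Boring or not, the whole mechanism is small enough to sketch in a few lines of plain JavaScript (the object names here are illustrative, not from any paper or spec):

```javascript
// Prototype-based inheritance: an object delegates property lookups
// to its prototype, walking up the chain until a match is found.
const animal = {
  describe() {
    return `${this.name} makes a ${this.sound} sound`;
  },
};

// Object.create makes a new object whose prototype is `animal`.
// `dog` has no own `describe`, so lookup delegates up the chain,
// with `this` still bound to `dog`.
const dog = Object.create(animal);
dog.name = "Rex";
dog.sound = "woof";

console.log(dog.describe());                        // "Rex makes a woof sound"
console.log(Object.getPrototypeOf(dog) === animal); // true
```

The semantics really is just "look in the object, else recurse into its prototype" — which is the point being made above.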
JSON-style literals are what. A way of writing a POD. Who cares. It is boring.
Asynchronous execution is also boring. It was novel in the 1960s, it is not novel now.
There is interesting stuff about notation (in my opinion to do with parsing, but perhaps that's my bias speaking), but people being "picky" about notation is theoretically dull.
If you're interested, "Programming Language Syntax and Semantics" by D. A. Watt and "Syntax of Programming Languages: Theory and Practice" by R. C. Backhouse are probably useful in answering some of those questions.
In reality, notation is in the eye of the beholder. Some people like brackets and braces to split up the flow, whereas some prefer whitespace as a delimiter. There are papers out there on this (which I can't find right now).
It was the language Self (as far as I know) that introduced prototype-based programming. The initial definition of Self happened in the mid-1980s, if I remember right, and there were a ton of papers formalizing various aspects of the language, including theories for prototype inheritance.
Must say that I don't consider notation as theoretically interesting. It is practically interesting obviously. (And things like anonymous functions have existed since Lisp)
> single-threaded asynchronous execution
There is an incredible amount of research in concurrency models. It is a little hard to point to a single source, but the theories of CSP and the actor model have been around for a long time, and they have been incorporated into various languages in more or less the Node.js style.
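As a concrete illustration of that Node-style model, here is a minimal sketch of the run-to-completion ordering (synchronous code first, then microtasks, then timer callbacks; the variable names are mine):

```javascript
// Single-threaded asynchronous execution: the event loop runs
// queued callbacks one at a time, never interrupting running code.
const order = [];

setTimeout(() => order.push("timer"), 0);              // timer (macrotask) queue
Promise.resolve().then(() => order.push("microtask")); // microtask queue
order.push("sync");                                    // runs immediately

setTimeout(() => {
  // By now the queued callbacks have all run, in this order:
  console.log(order.join(" -> ")); // "sync -> microtask -> timer"
}, 10);
```

Nothing here preempts anything else; concurrency is expressed purely as interleaved callbacks, which is why the older CSP/actor theory maps onto it so directly.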
Don't get me wrong. Javascript has many interesting things to it practically. But the theory has been around for a long time.
"Interesting" and even "practically useful" do not imply "theoretically interesting from an academic perspective".
Of course, the measure for academics is "can I get something published on the topic somewhere respectable" - it (mostly) doesn't have much to do with whether something is cool or useful.
> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?
Those "modern" languages all predate 2000. Some of the papers cited in this list are indeed related to Haskell and Lisp (do you really consider Lisp modern?). And compiling a list of works that will have a lasting impact takes time: you need some distance to judge the importance of new work.
This list is essentially centered on logic in programming languages (lambda calculus, type theory, semantics, monads), hence the absence of Python, Ruby and JavaScript: those languages picked up a lot from this theory but did not bring much novel material.
Ugh, if you look at the bottom of the "Pretty Great Works" section you will see Guy Steele's master's thesis, where he defined the Scheme programming language and its compiler.
I know when I was writing my thesis I was pretty disgusted that he ruined it for all of us mere mortals. ;)
I think Claude Shannon's (of information theory fame) master's thesis beats out all others in the field of computer science.
"A Symbolic Analysis of Relay and Switching Circuits", 1938, proves that you can use electrical switches to do boolean algebra! He basically invented the digital computer, as a master's thesis.
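The correspondence is easy to restate in modern terms: a switch is a boolean, switches in series compute AND, and switches in parallel compute OR (a toy model in JavaScript, not notation from the thesis):

```javascript
// A closed switch lets current flow (true); an open one doesn't (false).
// Series composition conducts only if both switches conduct: AND.
const series = (a, b) => a && b;
// Parallel composition conducts if either branch conducts: OR.
const parallel = (a, b) => a || b;

// More complex relay circuits compose from these, e.g. exclusive-or
// built from two series branches with inverted switches:
const xor = (a, b) => parallel(series(a, !b), series(!a, b));

console.log(series(true, true));     // true
console.log(parallel(false, false)); // false
console.log(xor(true, false));       // true
```

That an entire algebra of circuits falls out of two wiring rules is exactly what made the thesis so striking.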
That is wonderful. As a physicist whose first language was FORTRAN, I particularly like "It is a syntax error to write FORTRAN while not wearing a blue tie."
If anyone doesn't know, Benjamin Pierce is the author of a highly regarded book, Types and Programming Languages. (I've yet to study it; it's supposedly advanced...)
Only one paper by Filinski? I've been picking through 'Declarative Continuations and Categorical Duality', where he shows the duality between values and continuations by generalizing the lambda calculus to a symmetric version. It's incredibly beautiful.
> My second remark is that our intellectual powers are rather geared to master static relations and that our powers to visualize processes evolving in time are relatively poorly developed. For that reason we should do (as wise programmers aware of our limitations) our utmost to shorten the conceptual gap between the static program and the dynamic process, to make the correspondence between the program (spread out in text space) and the process (spread out in time) as trivial as possible.
From Dijkstra's "Go To Statement Considered Harmful"