
The most recent paper in the first section was written in 1978.

The majority of the ones in the first two sections were written before I was born; only three were written in the 1990s.

The only one from the 21st century (published in 2000) is in the least-prestigious third section.

Does it take time for good ideas to accumulate the recognition needed to be included in this list? Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?

Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?

Or is language implementation, almost without exception, a series of engineering refinements that happen in a decades-long process after theoretical foundations are first published?



> Or was all the low-hanging fruit quickly picked in the early days of the computer age, and only hard problems remain on which little progress has been made?

Yes, most of the groundbreaking theories were researched and documented decades ago. And some of that stuff has yet to make its way into mainstream programming languages (e.g. type inference).

> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?

Apart from Lisp and Haskell, the "modern" languages you mention are rather boring when it comes to programming language design, and there isn't much research behind them. There has been a lot of practical work and research in garbage collection, runtime systems, compiler back ends, just-in-time compilation and virtual machines. But when it comes to language design, the mainstream programming languages are really conservative.


Also, Lisp was first described in 1958. See "Pretty Great Works", #8.


The relationship between theoretical innovations and practical innovations is particularly indistinct in computer science.

A language like JavaScript might be interesting because it is simple and ubiquitous and embedded in a powerful application platform - but that does not mean it is of much interest from a theoretical CS perspective.


> that does not mean it is of much interest from a theoretical CS perspective

JavaScript theoretically uninteresting? Are you kidding?

What about its unusual prototypical inheritance? JSON-style literals? Its notation for anonymous functions? The approach modern JS frameworks take to single-threaded asynchronous execution?
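For readers less familiar with JS, the features listed above look roughly like this (a minimal, illustrative sketch; the names are made up):

```javascript
// Prototypal inheritance: objects delegate directly to other objects,
// no classes required.
const animal = {
  describe() { return this.name + " says " + this.sound; }
};
const dog = Object.create(animal); // dog's prototype is animal
dog.name = "Rex";
dog.sound = "woof";
console.log(dog.describe()); // "Rex says woof" via the prototype chain

// JSON-style literal: nested plain data written inline.
const config = { retries: 3, endpoints: ["a", "b"] };

// Anonymous function notation: functions are first-class expressions.
const twice = function (f) { return function (x) { return f(f(x)); }; };
console.log(twice(function (n) { return n + 1; })(0)); // 2
```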

IMHO, JS is not the best language for getting work done [1], but it has lots of innovations which are worthy of study.

[1] If you need to do Web frontend stuff, obviously JavaScript is the only game in town, unless you're into Flash, Java applets, or compilers with JS backends.


It can certainly be interesting, but most of the interesting research being done is related to the intersection of languages and security. There do exist formalisations, but they are derivative works based on the GREAT papers.

Notation is for the most part boring. The way you write function literals has no effect on the semantics of the language.

Prototype inheritance is also boring; its small-step semantics amounts to two inference rules[1].
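Those two rules (return an own property, otherwise delegate to the prototype) can be transcribed as a hypothetical lookup function; this is a sketch, not the engine's actual algorithm:

```javascript
// Rule 1: if the object owns the property, return it.
// Rule 2: otherwise, delegate the lookup to the object's prototype.
function lookup(obj, key) {
  if (obj === null) return undefined; // prototype chain exhausted
  if (Object.prototype.hasOwnProperty.call(obj, key)) return obj[key];
  return lookup(Object.getPrototypeOf(obj), key);
}

const base = { greeting: "hi" };
const derived = Object.create(base); // derived delegates to base
console.log(lookup(derived, "greeting")); // "hi", found on the prototype
```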

JSON-style literals are what, exactly? A way of writing a POD. Who cares; it is boring.

Asynchronous execution is also boring. It was novel in the 1960s; it is not novel now.

[1] - http://www.doc.ic.ac.uk/research/technicalreports/2008/DTR08...


"Notation is for the most part boring."

Unless, of course, your work involves using the notation a lot... then people get really picky about it.


There is interesting stuff about notation (in my opinion to do with parsing, but perhaps that's my bias speaking), but people being "picky" about notation is theoretically dull.


But perhaps that is because nobody has developed a theory to explain why some notations are better than others!


If you're interested, "Programming Language Syntax and Semantics" by D. A. Watt, and "Syntax of Programming Languages: Theory and Practice" by R. C. Backhouse are probably useful in answering some of those questions.

In reality, notation is in the eye of the beholder. Some people like brackets and braces to split up the flow, whereas some prefer whitespace as delimiters. There are papers out there on this (which I can't find right now).


Just wanted to point out a couple of things:

> unusual prototypical inheritance

It was the language Self (as far as I know) that introduced prototype-based programming. The initial definition of Self happened in the mid-80s, if I remember right, and there were a ton of papers formalizing various aspects of the language, including theories for prototype inheritance.

Bib for Self: http://selflanguage.org/documentation/published/index.html

> notation

Must say that I don't consider notation as theoretically interesting. It is practically interesting obviously. (And things like anonymous functions have existed since Lisp)

> single-threaded asynchronous execution

There is an incredible amount of research in concurrency models. It is a little hard to point to a single source, but the theories of CSP and the actor model have been around for a long time, and they have been incorporated into various languages in more or less the Node.js style.
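The Node.js style in question is roughly: one call stack runs to completion, then the event loop dispatches queued callbacks. A minimal sketch (illustrative only):

```javascript
// The synchronous code runs to completion first; the timer callback,
// even with a zero-millisecond delay, only runs once the stack is empty.
const order = [];
order.push("before");
setTimeout(function () {
  order.push("callback");
  console.log(order.join(" -> ")); // before -> after -> callback
}, 0);
order.push("after"); // runs before the callback despite the 0ms delay
```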

Don't get me wrong. JavaScript has many interesting things to it practically. But the theory has been around for a long time.


"Interesting" and even "practically useful" do not imply "theoretically interesting from an academic perspective".

Of course, the measure for academics is "can I get something published on the topic somewhere respectable" - it (mostly) doesn't have much to do with whether something is cool or useful.


Do you have any idea what sort of things PL researchers study?


> Haven't any new ideas been introduced by the innovations of modern languages like Python, Lisp, Haskell and Ruby? Or even C++, Java and JavaScript?

Those modern languages all predate 2000. Some of the papers cited in this list are indeed related to Haskell and Lisp (do you really consider Lisp modern?). And compiling a list of works that will have a lasting impact takes time: one needs distance to reflect on the importance of new works.

This list is essentially centered on logic in programming languages (lambda calculus, type theory, semantics, monads), hence the absence of Python, Ruby and JavaScript: those languages picked a lot from this theory but did not bring much novel material.



