I have just installed some software on my PC. It gives me Lisp, Prolog, ML, and a language that ought to be better known but isn't, Pop-11. It has an editor from which I can compile and run programs in any of these languages. And in this editor, I can read HELP files and TEACH files. Note that: TEACH files. For teaching Poplog, but also AI. How many compilers have built-in AI tutorials? This one does. Poplog.
I like Poplog because I used to teach AI with it. One of my students wrote a poetry generator in Poplog Prolog. The first poem it ever produced went, and I'm not making this up:
Oh poetry,
Thy power is beauty.

I'll talk about this in a few paragraphs' time, but first, let me show you some Pop-11. It's an interactive language, so I can type commands and get back immediate replies. Most of the time, actually, I type expressions and get back their values. Like this:
Sussex Poplog (Version 15.5 18/11/1996 12:12:27)
Copyright (c) 1982-1996 University of Sussex. All rights reserved.
: 1+2*5 =>
** 11
: pi =>
** 3.141593
: exp(1) =>
** 2.718282
: hd( [ 1 2 3 ] ) =>
** 1
: tl( [ 1 2 3 ] ) =>
** [2 3]
: exp(1) -> e;
;;; DECLARING VARIABLE e
: e =>
** 2.718282

(My input is in bold; Pop-11's output, in normal.) The name "pi" is a built-in variable; and it's probably obvious that "exp", "hd" and "tl" are built-in functions. Square brackets enclose a list; "hd" returns its first element; "tl" returns its tail. The arrow => displays a result, while -> assigns it to a variable. In this case, the variable "e" to which I assigned didn't exist, so Poplog obligingly declared it for me.
Now let's play with functions:
: exp -> myfunc;
;;; DECLARING VARIABLE myfunc
: myfunc =>
** <procedure exp>
: myfunc(1) =>
** 2.718282

I've assigned "exp" to a variable "myfunc", and then applied this just as I did "exp".
Just as 1 is an integer constant, 3.141593 is a real constant, and [ 1 2 3 ] is a list constant, so the first line below has a function constant:
: procedure(x); 2*x endprocedure -> myfunc;
: myfunc =>
** <procedure>
: myfunc(5) =>
** 10
: procedure(x); 3+x endprocedure -> myfunc;
: myfunc =>
** <procedure>
: myfunc(5) =>
** 8

The first function constant is a doubling function. The second is an add-3 function. Less verbose than Java, what?
I can operate on functions. The session below will make "f" into a function that does subtraction; test it by calling f(2,3); and then make it into a subtract-1 function by "freezing" its second argument:
: procedure(x,y); x-y endprocedure -> f;
;;; DECLARING VARIABLE f
: f(2,3) =>
** -1
: f(% 1 %) -> sub1;
;;; DECLARING VARIABLE sub1
: sub1(5) =>
** 4
: f(% 1 %)(5) =>
** 4

The freezing gets done with the "decorated brackets" (% and %). The result is a new function, a closure, and I can either apply it outright as in "f(% 1 %)(5)", or store it and then apply it as I do using "sub1".
What this shows is that Pop-11 is a functional programming language. That is, functions are first-class values: they can be assigned and passed around as values, just as can integers, reals, and lists.
Let me tie this in with my students' projects. One student wanted to write a poetry generator; another, a limerick generator. To give Alistair and Tim raw materials that would be satisfyingly interesting, I got hold of an online dictionary. This consisted of at least ten files. One had definitions of all the words in the dictionary, so was in effect a mapping from words to definitions.
This, in fact, is why the poem above was so apposite. Alistair constructed his poems by slotting words into a grammar written in terms of parts of speech. One of his grammar rules was:
<stanza> ::= <interjection> <noun> <determiner> <noun> <verb> <noun>

I should explain — the retired English teacher sitting next to me has just tapped me on the shoulder and insisted — that these things are not HTML tags. No-one learns grammar any more, he grumbles: no-one has heard of nouns or verbs, and the pointy brackets will make people think these are some weird new species of HTML. They are, of course, Backus-Naur Form or BNF, which got to pointy brackets 30 years before HTML.
To tell Alistair's program where to get the words for a new poem from, you seeded it with a theme word. Such as "poetry". The program would then collect the words in the theme-word's definition, and then in their definitions. These, he used as candidates to fit the part-of-speech slots. There's a good chance that the words in a word's definition have something in common with the meaning of the word defined, so this was a way to give his poems semantic, as well as syntactic, coherence.
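To show the shape of that word-collection step, here is a sketch in Python rather than the Prolog my students used; the toy "definitions" mapping and the depth limit are invented purely for illustration.

```python
# Toy stand-in for the dictionary file mapping words to definitions.
definitions = {
    "poetry": ["art", "rhythmical", "composition", "beauty"],
    "beauty": ["quality", "pleasure", "senses"],
    "art":    ["skill", "creation", "beauty"],
}

def candidate_words(theme, depth=2):
    """Collect the words in the theme word's definition, then the
    words in their definitions, down to a fixed depth."""
    seen, frontier = set(), [theme]
    for _ in range(depth):
        next_frontier = []
        for word in frontier:
            for w in definitions.get(word, []):
                if w not in seen:
                    seen.add(w)
                    next_frontier.append(w)
        frontier = next_frontier
    return seen

print(sorted(candidate_words("poetry")))
```

Every word this returns was used to define the theme word, or to define one of its definers, which is what gave the poems their loose semantic coherence.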
Anyway: one of the dictionary files mapped from words to definitions. Another told you how a word broke down into syllables. Two more gave pronunciation and stress. The pronunciation wasn't written in the International Phonetic Alphabet, but was phonetic enough: two words rhymed if they ended with the same phonetic rendering of their final syllable. And the stress data made it possible to filter out imperfect rhymes: words that would rhyme if the accent fell on the rhyme sound, but where it doesn't. The syllable data was rather useful when generating limericks. As, of course, were the rhymes:
There was a young lady called Dawes,
Went out to a dance without gloves;
Her ma said: "Amelia!
Should anyone dance with you,
He'll take you for one of them actresses."
Because Tim and Alistair were writing Prolog, I had to interface the dictionary to that. I started by inverting the pronunciation data, making a file where each entry was the phonetic spelling of a word's final syllable, followed by the word. I converted this — my practicals ran on a VAX — to a VMS indexed file. VMS Fortran can call all the VMS file-handling routines, so some Fortran hacking ensued: I wrote a Fortran function that, given the phonetic spelling of a syllable, would find and return the entry for the first word ending in that syllable.
The next step was to interface VMS Fortran with Pop-11, which I did using Poplog's facilities for linking with external code. So I now had a Pop-11 function that searched for rhymes. Then I had to make this into a Prolog predicate. This was the interesting bit. As I've said, my Fortran function, given the phonetic spelling of a syllable, would return the indexed-file entry for the first word ending in that syllable. From this, my students could get a candidate rhyme word.
But also, VMS allowed one to resume search from that entry, getting the next word ending with the same syllable. I italicise "next" because, as I enthused in my intro, Poplog implements both Pop-11 and Prolog. The Prolog is built on top of Pop-11, using continuations. These are something from programming language semantics that embody the notion of "next computation". So I'm now going to explain how the nextness of the next relevant entry in an indexed file relates to the nextness of programming language semantics, and how both relate to Poplog Prolog.
Continuations were invented in the 1960s in work on the denotational semantics of programming languages. Researchers wanted to translate pieces of programming languages — assignments, loops, expressions — into well-understood mathematical entities that it was (relatively) easy to prove useful things about. If you're interested in the history, I recommend The Discoveries of Continuations by John C. Reynolds.
This, incidentally, was an alternative to operational semantics, which specifies a program's meaning by how it affects some kind of abstract machine. Cliff Jones has a nice paper, Operational Semantics: Concepts and their Expression, that describes the relationship between the two and teases apart the concepts used in operational semantics.
It is fairly straightforward to give a denotational semantics to some programming language fragments. For example, let i and j be integer variables, and let C be the statement "i := i+1". Suppose i and j are the only variables in our program, and suppose also that we have just executed the command "i:=1; j:=2".
Then we can model the state of our program after this command as a function from strings to integers that maps the string "i" to 1 and "j" to 2. Call this a store (see links here) because it stores variables. Then the command C can also be modelled as a function, one that operates on stores. Specifically, a function that takes any store s and returns another store as its result. In this result, i is one greater than it is in s, but j stays the same.
Let's see that in Pop-11. Here is a function representing a store that maps "i" to 1 and "j" to 2. I'll assign it to s, which I'll then test:
: procedure(var); if var="i" then 1 elseif var="j" then 2 endif endprocedure -> s;
;;; DECLARING VARIABLE s
: s("i") =>
** 1
: s("j") =>
** 2
And here is a function C that takes a store st as argument and returns another store. Because stores are functions, my function's result is a function, the inner "procedure(var); ... endprocedure". Note how this gets the value of the "i" variable that st knows about, and adds 1:
: procedure(st); procedure(var); if var="i" then st("i")+1 elseif var="j" then st("j") endif endprocedure endprocedure -> C;
;;; DECLARING VARIABLE C
: C(s)("i") =>
** 2
: C(s)("j") =>
** 2

In the previous paragraph, you saw how store s mapped a variable name to a value. In this paragraph, you see how I apply C to s, and apply the result to the variable names. C has transformed s into a store whose "i" is one greater than that of s.
So I've modelled a simple command "i := i+1" as a function that maps stores to stores. A denotational semantics for this language would be a way of translating any sequence of commands into a function from stores to stores. That is, the semantics is itself a function:
The semantic function S
    takes a sequence of commands
    and returns a function that
        takes a store
        and returns a store
where a store is a function that
    takes a variable name
    and returns its value.

As Gene Fisher says in a semantics handout, probably the hardest thing to "get" about semantic definitions is the pervasive use of functions (of functions (of functions)). But he also says — and I have to admire anybody who makes this a separate bullet point in their notes — now tell the truth, isn't that one of the coolest things you've ever seen?
So you might just as well say, if you're keen on this kind of thing, that the meaning of any command is the function from stores to stores that the semantics translates it into. That is, the "denotation" of any command is that function, which is why one talks about denotational semantics.
Actually, I've oversimplified. We normally distinguish between stores and environments (see links here). An environment maps names to some mathematical equivalent of their memory location, which a store then maps to values. Also, a complete denotational semantics will contain functions that give meanings to other constructs than commands, for example expressions and procedure definitions. But the principle is the same.
The trouble is, this breaks down in the presence of "goto". This is because we want a denotational semantics to be compositional. Suppose C1 and C2 are any commands. Then our semantics is compositional if, to the command (C1 followed by C2), it gives a meaning that depends on the meaning of C1 and the meaning of C2, regardless of what C1 and C2 are. In terms of our semantic function S above:
S( (C1 followed by C2) ) = some function of S(C1) and S(C2).

Actually, we should widen this idea of compositionality to embrace expressions and other constructs. But the idea is the same.
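If you like seeing such things executed, the same point can be made in a few lines of Python (a toy stand-in, reusing the stores-as-functions idea from my Pop-11 session above): the meaning of sequencing two commands is just the composition of their meanings.

```python
# Stores are functions from variable names to values; commands are
# store-to-store functions; sequencing is function composition.
def store(i, j):
    return lambda var: {"i": i, "j": j}[var]

def inc_i(st):
    # the meaning of "i := i+1"
    return lambda var: st("i") + 1 if var == "i" else st(var)

def seq(c1, c2):
    # the meaning of (C1 followed by C2): depends only on the
    # meanings of C1 and C2, which is exactly compositionality
    return lambda st: c2(c1(st))

s = store(1, 2)
print(seq(inc_i, inc_i)(s)("i"))   # i incremented twice: 3
print(seq(inc_i, inc_i)(s)("j"))   # j unchanged: 2
```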
But jumps mess this up, because how can a statement like "goto L" be regarded as a function from stores to stores, or combined with other such functions?
It's to solve this problem that continuations were invented. In my semantics above, the meaning of any command was a function from stores to stores, a mathematical representation of how it updates variables. But in a continuation semantics, the meaning of any command is a different kind of function — a continuation — that represents, not just the command itself, but all the computations following it.
It turns out that, if you define the meaning of commands in this way, the value of a label is the continuation for the commands following it. That is, the value of L in "L: i:=1" is the continuation representing the i:=1 and any commands following it. And the meaning of "goto L" is the value of L, i.e. L's continuation.
I shan't show exactly what continuations are in this posting — maybe in a later one — but the key point is that a label can be modelled as a function that represents the computations that come after it. Jumping to a label is modelled by looking up the label's continuation and applying it in some way that, again, I'm not going to explain in detail here. More generally, any point in a program can be modelled as a function that represents the computations that come after it.
But then, if you have a functional language such as Pop-11, why not actually implement such functions? You could hang on to them until you need the computations they represent, and then use them. But only if you do in fact need those computations.
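Here is the smallest sketch of that idea I can manage, in Python rather than Pop-11: a label's value is a function representing the computation that follows it, and a goto is just a call to that function. The names "run_from_L" and "goto_L" are mine, purely for illustration.

```python
def run_from_L(st):
    # the continuation for label L: "i := i+1" and (in a bigger
    # program) everything that follows it
    return dict(st, i=st["i"] + 1)

# the value of the label L is its continuation
goto_L = run_from_L

st = {"i": 1, "j": 2}
st = run_from_L(st)   # fall into L in the normal flow of control
print(st["i"])        # 2
st = goto_L(st)       # "goto L": apply the label's continuation
print(st["i"])        # 3
```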
One language where this makes a huge amount of sense is Prolog. Let me explain why. Last week, by way of explaining the Prolog lightbulb joke, I showed some samples of Prolog. One was a route-finder:
borders( belgium, france ).
borders( france, spain ).
borders( germany, austria ).
borders( germany, denmark ).
borders( germany, switzerland ).
borders( netherlands, belgium ).
borders( netherlands, germany ).
borders( spain, portugal ).
route( A, B ) :-
borders( A, B ).
route( A, B ) :-
borders( A, Z ),
route( Z, B ).
Logically, the first rule means: there is a route from A to B if A borders B. The second rule means: there is a route from A to B if A borders some Z, and there is a route from that Z to B. But how does Prolog run these rules? Let's assume it's trying to verify whether "route(netherlands,denmark)" is true. And let's assume it's doing so using the second rule, so A=netherlands and B=denmark. These variable values mean Prolog will pose itself this subgoal:
borders( netherlands, Z ),
route( Z, denmark )
Comma means "and", so logically speaking, Prolog must find a Z that makes both halves of the subgoal true. Not being omniscient, it can't do so without using the facts we supply above, and the obvious one to begin with is the first clause for "borders(netherlands,...)". In this, the second argument is "belgium", so let's assume Prolog tentatively sets Z to that. Then it has to verify the second half of the subgoal, which because of Z, becomes "route(belgium,denmark)". But although in the real world there is such a route, Prolog can't prove this from the facts. So, it will have to backtrack and look for a different Z.
But here's where continuations become interesting. Because backtracking and looking for a different Z means seeking another clause for "borders(netherlands,...)" and taking a Z from that. Which is like jumping back into a loop that was scanning through all the "borders(netherlands,...)" clauses, but that we jumped out of when we thought we'd found a suitable Z. Which we can do if we store the rest of this loop and all following computations as a continuation!
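For the curious, here is a toy sketch in Python — not Pop-11, and nothing like the real Poplog machinery — of how storing "the rest of the computation" as a function drives this backtracking: success means calling the continuation k, and backtracking means simply returning into the loop we came from, which is still there waiting for us.

```python
borders = [("belgium", "france"), ("france", "spain"),
           ("germany", "austria"), ("germany", "denmark"),
           ("germany", "switzerland"), ("netherlands", "belgium"),
           ("netherlands", "germany"), ("spain", "portugal")]

def route(a, b, k):
    """Succeed by calling the continuation k; backtrack by returning
    False, which resumes the enclosing loop over clauses."""
    if (a, b) in borders:          # rule 1: A borders B
        if k():
            return True
    for (x, z) in borders:         # rule 2: A borders Z, route(Z, B)
        if x == a and route(z, b, k):
            return True            # this Z worked
        # otherwise fall through: try the next clause for borders(A,...)
    return False

print(route("netherlands", "denmark", lambda: True))   # True
```

Trying Z=belgium fails deep in the recursion, the loop resumes, and Z=germany succeeds, exactly the "jump back into the loop" described above.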
For another example, consider Tim's limerick generator. Suppose it asks Prolog the query:
rhymes_with( 'Riga', Word ),
part_of_speech( Word, noun ).

Read logically, this means: there is a Word such that Word rhymes with "Riga" and Word is a noun. But how could Prolog implement this? Like this: it gets the first possible answer for rhymes_with and returns it in Word. It then prepares to test Word using "part_of_speech( Word, noun )". But because it may have to backtrack and look for another rhyme, it also builds and saves a continuation representing the place that "rhymes_with" may need to backtrack to.
Because you can describe Prolog search in this way with continuations, they can be used as mathematical entities that give it a denotational semantics. But in a functional language like Pop-11, we can also implement Prolog with continuations. Which is how Poplog Prolog is implemented, as Chris Mellish and Steve Hardy explain in their paper Integrating Prolog into the Poplog environment.
And that's what I used to make my Pop-11 rhyme-searching function into a Prolog predicate for Tim and Alistair. I've explained that when searching the file for a word that ends with a given syllable, VMS would give me a pointer to the first entry it found. Later, I could resume search starting from that entry. So I built a continuation that held this pointer, and that called a Pop-11 function to resume search from it. And then I used the "Plog to Pop" procedures to connect this with Prolog. Which is how the nextness of the next rhyme for a word in my rhymes file relates to the nextness of programming language semantics, and how both relate to Poplog Prolog.
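The shape of that trick can be sketched in a few lines of Python. This is a toy stand-in: a list plays the part of the VMS indexed file, and a closure holding a position plays the continuation that resumes the search on backtracking.

```python
# Toy "indexed file": entries of (final-syllable spelling, word).
entries = [("iga", "Riga"), ("iga", "tiger"), ("awes", "Dawes")]

def find_rhyme(syllable, start=0):
    """Return the first word from position start whose entry matches
    the syllable, plus a continuation that resumes the search; or
    None when the entries are exhausted."""
    for pos in range(start, len(entries)):
        if entries[pos][0] == syllable:
            resume = lambda: find_rhyme(syllable, pos + 1)
            return entries[pos][1], resume
    return None

word, resume = find_rhyme("iga")
print(word)          # Riga
word, resume = resume()
print(word)          # tiger
print(resume())      # None: no more rhymes, so Prolog's backtracking fails
```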
My dictionary predicates are one thing. But Poplog has its own rich toolkit of educational software, mainly for Pop-11. Students can view the source of any library file with Poplog's SHOWLIB command, and can call up tutorials with the TEACH command. There are tutorials on Pop-11 in general, on specific Pop-11 commands, and on the Ved editor. There are tutorials on expert systems, parsing, augmented transition networks, planning, search, and theorem proving. Most contain example code which can be run in Ved, as well as exercises and mini projects.
To show new students some of AI's concerns, I often used MSDEMO by Richard Bignell and Aaron Sloman. This is a (very) simplified reconstruction of Terry Winograd's famous Shrdlu, the program that inhabited a virtual world of blocks, spheres and pyramids, and that analysed and acted on commands such as "put the green cone on the red block". MSDEMO lives in the Ved editor, and draws a robot hand hovering over a pile of blocks. If you give it a command, it will parse it, make a plan to execute it, obey the plan, and then redraw the hand and the blocks.
Another demo is based on Joseph Weizenbaum's Eliza, the original chatbot. In Pop-11, it is easy to pattern-match on lists, as this shows:
: [I want a beer badly] matches [??pre beer ??post] =>
;;; DECLARING VARIABLE pre
;;; DECLARING VARIABLE post
** <true>
: pre =>
** [I want a]
: post =>
** [badly]
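For readers who don't know Pop-11's matcher, here is roughly what that pattern does, sketched in Python: find the literal word in the list and bind the segments before and after it. The function name "matches" is mine, echoing the Pop-11 operator.

```python
def matches(words, literal):
    """Match the pattern [??pre literal ??post] against a word list,
    returning the bindings, or None if the literal is absent."""
    if literal not in words:
        return None
    i = words.index(literal)
    return {"pre": words[:i], "post": words[i + 1:]}

bindings = matches("I want a beer badly".split(), "beer")
print(bindings["pre"])    # ['I', 'want', 'a']
print(bindings["post"])   # ['badly']
```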
And this makes it pleasingly uncomplicated for students to write a small chatbot. In the dialogue below, Poplog's output is in capitals, and most of my input in normal mixed case. See whether you can identify some of the transformations that Poplog's built-in Eliza is using:
THIS PROGRAM SIMULATES A NON-DIRECTIVE PSYCHOTHERAPIST.
IT WILL APPEAR TO ENGAGE YOU IN CONVERSATION ABOUT YOUR PROBLEMS.
HOWEVER, IT DOESN'T REALLY UNDERSTAND, AS YOU WILL EVENTUALLY DISCOVER.
WHENEVER THE COMPUTER PROMPTS YOU WITH A QUESTION MARK, THUS:
?
YOU SHOULD TYPE IN A ONE LINE RESPONSE.
TO CORRECT MISTAKES USE THE "DEL" BUTTON.
AT THE END OF EACH OF YOUR RESPONSES, PLEASE PRESS THE RETURN BUTTON.
WHEN YOU HAVE FINISHED (OR ARE CURED?) TYPE BYE AND PRESS THE RETURN BUTTON.
GOOD DAY WHAT IS YOUR PROBLEM
? My light bulb has gone.
DO YOU KNOW ANYONE ELSE WHOSE LIGHT BULB HAS GONE ?
PERHAPS YOU DONT FEEL VERY TALKATIVE TODAY?
COULD YOU BE MORE INFORMATIVE?
ARE YOU PREPARED TO ELABORATE?
WHY ARE YOU SO UNFORTHCOMING?
? I cannot see to type.
EARLIER YOU SAID YOUR LIGHT BULB HAS GONE
? Yes, can you help me change it?
WOULD YOU TRUST A MACHINE TO HELP?
? Of course.
WOULD EVERYONE FIND THAT OBVIOUS?
? No, some people don't like computers.
WHO IN PARTICULAR?
? My friend John's mother doesn't like computers.
TELL ME MORE ABOUT YOUR FAMILY
? We don't have the money to buy a proper computer.
ALL EXCEPT THE VERY RICH HAVE FINANCIAL PROBLEMS NOWADAYS
? Very prescient!
ARE YOU SURE YOU ARE NOT BEING DOGMATIC?
? Could you rephrase that more simply?
FIRST TELL ME WHY YOU HAVE COME HERE?
? Because I need to change a light bulb.
CAN YOU DO WITHOUT?
I DONT THINK YOU REALLY TRUST ME
TO HELP YOU, I NEED MORE INFORMATION
PLEASE DONT GET UPSET, I'M SORRY I SAID THAT
? You are sorry you said "Oh"?
I RECALL YOUR SAYING YOU CANNOT SEE TO TYPE
? It is so dark.
SOMETIMES IT AINT SO DARK
? It is now.
SOMETIMES IT AINT NOW
? I don't think you understand me.
SUPPOSE YOU DID THINK I UNDERSTAND YOU
? Then I would be happy.
CAN MACHINES BE HAPPY?
? Aaron Sloman thinks they can.
WHAT DOES THAT SUGGEST TO YOU?
? Emotions would help a machine choose between goals.
DO YOU HAVE FRIENDS WHO CAN HELP YOU?
? Help me choose between goals?
THAT REMINDS ME,
DIDNT YOU PREVIOUSLY SAY EMOTIONS WOULD HELP A MACHINE CHOOSE BETWEEN
? Yes. An anxious machine would spend more time looking for threats, less absorbed in complex goals.
WOULD AN ANXIOUS MACHINE REALLY SPEND MORE TIME LOOKING FOR THREATS , LESS
ABSORBED IN COMPLEX GOALS
? Oh, indeed.
ARE YOU SURE YOU ARE NOT BEING DOGMATIC?
WHAT IS YOUR REAL PROBLEM?
? I cannot see.
YOUR PROBLEMS MAY BE TOO DIFFICULT FOR ME TO HELP
? Probably, if you can't help me change a light bulb.
YOU DONT SOUND VERY CERTAIN ABOUT THAT
? Maybe I'm not.
CAN YOU ALWAYS EXPECT MAYBE YOU TO BE ?
? You don't understand me. I'm going to the pub.
DRINKING DAMAGES BRAIN CELLS
? I need a drink.
BEWARE OF ADDICTIONS
? I NEED BEER!
MACHINES DONT OFTEN GET DRUNK
? Everyone gets drunk.
WHO IN PARTICULAR GETS DRUNK
? Me, all the time, if I had the money.
HAVE YOU TALKED TO MARGARET THATCHER ABOUT THAT?
Some of Poplog's TEACH files do date back a long way, and I don't think there are many written after 1990. But it's a nice language for experimenting with a wide range of AI techniques, and the tutorials already tell you a lot to get you, or your students, started with symbolic AI. And it is also great for teaching basic computer science: witness my games with first-class functions.
Moreover, Poplog is very very extensible. You can call the compiler at run-time, or your own functions at compile-time. Via continuations, you can fine-tune Prolog's control flow. And you can, as it happens, write code to plant Poplog virtual machine instructions. All this ought to make Poplog excellent for teaching compiler design, and for experimental implementations of languages such as those — some for visual programming, and all worth a look — invented by Wouter van Oortmerssen.
So I'd really like to see somebody get some decent funding for Poplog, add tutorials for other and more modern AI concepts and techniques, port the graphical interface to Windows, and make Poplog work with Unicode. I think this would be really worthwhile for teaching.
Oh, by the way, should you want to see what Eliza was doing to my sentences, its rules are here.
The Free Poplog Portal, School of Computer Science, University of Birmingham. Free versions of Poplog, including Pop-11, Lisp, Prolog, ML, the PopVision library, and the SimAgent toolkit.
HOWTO install Poplog on Windows (Vista), School of Computer Science, University of Birmingham. These are the instructions I followed to install Poplog on Windows. They worked first time, and only took a few minutes to follow.
www.cs.bham.ac.uk/research/projects/poplog/packages/simagent.html — The SimAgent TOOLKIT -- for Philosophers and Engineers (And Some Biologists, Psychologists and Social Scientists), by Aaron Sloman.
dli.iiit.ac.in/ijcai/IJCAI-83-VOL-1/PDF/123.pdf — Integrating Prolog into the Poplog environment, by Chris Mellish and Steve Hardy, 1983.
www.poplog.org/docs/popdocs/pop11/teach/aithemes — TEACH AITHEMES: A PERSONAL VIEW OF ARTIFICIAL INTELLIGENCE, by Aaron Sloman, 1988.
www.cs.bham.ac.uk/research/projects/poplog/lib/elizaprog.p — Source file with Poplog's Eliza rules, mainly by Aaron Sloman, 1978, 1995, 1997.
www.cs.bham.ac.uk/research/projects/poplog/doc/popteach/msdemo — TEACH MSDEMO: USING THE MSBLOCKS DEMONSTRATION PROGRAM, by Richard Bignell and Aaron Sloman, 1987.
www.cs.bham.ac.uk/research/projects/poplog/doc/prologhelp/plogtopop — PLOGHELP PLOGTOPOP: How to call POP-11 from Prolog, by Chris Mellish and Kathryn Seifert, 1983, 1986.
www.cs.bham.ac.uk/research/projects/poplog/doc/popref/vmcode — REF VMCODE: POPLOG VIRTUAL MACHINE INSTRUCTION SET, by John Gibson, 1995.
www.cs.bham.ac.uk/research/projects/cogaff/poplog-learning-environment.html — Poplog as a Learning Environment, by Aaron Sloman, 2006.
www.cs.bham.ac.uk/research/poplog/doc/pop11.at.sussex.ps — The evolution of Poplog and Pop-11 at Sussex University, by Aaron Sloman.
en.wikipedia.org/wiki/Backtracking — Backtracking, Wikipedia.
en.wikipedia.org/wiki/Backus-Naur_form — Backus-Naur Form, Wikipedia.
en.wikipedia.org/wiki/Closure_(computer_science) — Closure, Wikipedia.
en.wikipedia.org/wiki/Compositionality — Principle of compositionality, Wikipedia.
en.wikipedia.org/wiki/Continuation — Continuation, Wikipedia.
citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.40.237 — The Discoveries of Continuations, by John C. Reynolds, 1993. How continuations were discovered, independently and "an extraordinary number of times".
en.wikipedia.org/wiki/Denotational_semantics — Denotational semantics, Wikipedia.
www.csc.liv.ac.uk/~grant/Teaching/COMP317/HTMLNotes/denotational.html — Denotational Semantics, by Grant Malcolm, Liverpool University. Gives denotational semantics for arithmetic expressions, Boolean expressions, and programs. Also explains stores, a.k.a. states.
users.csc.calpoly.edu/~gfisher/classes/530/lectures/7.pdf — More on Tennent-Style Denotational Semantics, by Gene Fisher, Cal Poly University, California. These are notes for use with Robert Tennent's book Semantics of Programming Languages, the ones that ask: now tell the truth, isn't that one of the coolest things you've ever seen?
en.wikipedia.org/wiki/ELIZA — Eliza, Wikipedia.
www.cs.kent.ac.uk/teaching/this_year/modules/CO/6/30/CO630.04.Bindings.ppt — Bindings and Scope. University of Kent lecture notes which explain environments and identifier binding.
eprints.ncl.ac.uk/file_store/trs/806.pdf — Operational Semantics: Concepts and their Expression, by Cliff B. Jones. Explains stores and environments.
www.j-paine.org/dobbs/prolog_lightbulb.html — The Prolog lightbulb Joke, by Jocelyn Ireson-Paine.
hci.stanford.edu/~winograd/shrdlu/ — SHRDLU, by Terry Winograd. Includes downloads of the original and of a reconstruction by Greg Sharp (and others?).
strlen.com/index.html — Wouter van Oortmerssen's site, with the experimental languages, some for visual programming.