[ Jocelyn Ireson-Paine's Home Page ]
Using Java and the Web as a front-end to an agent-based Artificial Intelligence course
These are notes towards a paper.
We describe how Java has been used to front-end a course in practical
artificial intelligence previously run on VT100 terminals, and
to provide a graphical interface. We also provide some motivation
for our decision to use production systems.
The course described here was taught for
the Oxford University Experimental Psychology Department.
(It was terminated in 1999, when they stopped all AI teaching.)
It consisted of
8 half-day sessions in Trinity (Summer) Term,
one or two sessions per week.
Students, mostly 2nd year undergraduate psychologists,
learnt some basic AI techniques
and then did a little project.
The course was intended
to convey the flavour of practical Artificial Intelligence,
complementing the connectionist approach taken in the rest of
the Department with an introduction to classical AI on the one
hand and nouvelle AI and Artificial Life on the other, as well
as teaching some AI history.
An important point is that although some students
study for an
AI Finals paper, many have no knowledge of AI or even of programming.
I shall refer to the latest version of the course (with
Java front-end) as Eden II, and
to the pre-Java version as Eden.
This replaced an older course which taught Prolog, followed by a small
project such as an expert system or poetry generator. That course
used a front-end, the Logic Programming Tutor, to make Prolog
more student-friendly, and
is described in my book The Logic
Programming Tutor. Its source code is downloadable from my
library. Parts of it, adapted to pure Prolog,
are on-line at my
practical notes page.
Reasons for changing to an agent-based course
Why did I change from the Prolog course to an agent-based one?
- Motivation. Agents can be fun, and can be pitted against one another.
These are both good motivations to learn. An
agent-based course also lends itself to appealing graphics, adding to the fun.
- The whole iguana, to borrow a phrase from Rodney Brooks
Putting the output of (for example) a planner to use by other
components enables one to appreciate the shortcomings of
the plan representation used, and of other aspects such as its
representation of actions and how this can be updated to include new knowledge.
- The whole iguana (2). Classical AI tends to separate cognition
along functional lines, so that different functions can be taught in
isolation from one another. This is less true of the nouvelle AI/Artificial
Life approach, which is therefore hard to teach unless you have complete agents.
- In the Prolog course, students learnt basic Prolog
and built programs using a few utility predicates. But the components
used were still low-level. With Eden, students
can be given complete
agents and a kit of useful agent components, enabling them to get
started immediately. There's an analogy with a Meccano add-on kit,
Meccano Elektrikit - if you're
interested in building cranes and cars, you don't want to build your own
electric motors first.
- Psychology students often have little experience of mathematics,
programming, and other formal notations. (That was one reason for using
the Logic Programming Tutor: to simplify Prolog into an English-like notation.)
The time needed to
learn the formal syntax of Prolog puts them off, as do things like
learning editor keys. Again, this time can be reduced here, because
students can start with complete agents and begin just by modifying them.
The Eden Microworld
A microworld is a computer-generated environment which agents
inhabit. "Agents" are computer-generated creatures which
are fed perceptions of the microworld and can react to them, e.g.
by moving about or eating. Their
"brains" may use many different AI techniques.
Eden is a two-dimensional microworld whose design was largely
imposed by the limitations of
VT100 terminals. It was originally designed by Simon
as part of an OU
AI Society project. I worked on that project, and took Eden
over for use in this course, first using it in Trinity 1992.
It is implemented in Poplog, and
can easily be configured to add new kinds of object.
Agent bodies and perceptions could also be configured. Actions
could fail unpredictably, and agent perceptions could be imprecise. Such
possibilities of failure are important, for example, in illustrating
the shortcomings of certain planning techniques. Agents could be
controlled by "brains" written in Prolog or Pop-11, and could also
be controlled manually. The perception-action pairs of any agent could
be saved into a file for later use, e.g. in training neural nets.
Eden ran in discrete time steps. The simulator calculated all agents'
perceptions, and passed them to the agents' brains. It then ran
the brains one by one. Each brain sent back an action specification,
e.g. "move forward 1". The simulator attempted to perform this,
checking for violated laws of physics, updated the agent's state
if necessary (e.g. with simulated pain signals), and then repeated.
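In outline, this loop looks something like the following minimal Java sketch. All class and method names here are my own illustration, not the original Pop-11 code, and the "physics" is reduced to a single legal move:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of Eden's discrete-time simulation loop.
interface Brain {
    String act(String perception);   // given a perception, return an action
}

class Agent {
    final Brain brain;
    int x, y;
    Agent(Brain brain, int x, int y) { this.brain = brain; this.x = x; this.y = y; }
}

class Simulator {
    final List<Agent> agents = new ArrayList<>();

    // Stand-in for Eden's sensory code.
    String perceive(Agent a) {
        return "at " + a.x + "," + a.y;
    }

    // Attempt an action, checking the "laws of physics".
    boolean attempt(Agent a, String action) {
        if (action.equals("move forward 1")) { a.x += 1; return true; }
        return false;                        // unknown or illegal actions fail
    }

    void step() {
        // 1. Compute every agent's perception before any brain runs,
        //    so all brains see the same snapshot of the world.
        List<String> percepts = new ArrayList<>();
        for (Agent a : agents) percepts.add(perceive(a));
        // 2. Run the brains one by one, attempting each returned action.
        for (int i = 0; i < agents.size(); i++) {
            Agent a = agents.get(i);
            attempt(a, a.brain.act(percepts.get(i)));
        }
    }
}
```

Computing all perceptions before running any brain is what makes the time steps genuinely discrete: no agent perceives the effects of another agent's action within the same step.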
The agents' brains are production-system interpreters
written in Poplog Prolog.
Numerous sensory predicates and motor competences
are provided (and, in general, chosen to be useful within the
microworld), so that students don't have to program them from
scratch. [Borrowing from Maes networks.]
Students program the
brains by providing rulebases.
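A production-system brain of this kind follows the standard recognise-act cycle: match rules against short-term memory, resolve conflicts, fire. The sketch below is a Java illustration only; the real interpreter was written in Poplog Prolog, with richer rules than the flat string facts used here:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Illustrative recognise-act cycle of a production system.
class Rule {
    final String name, condition, action;    // flat strings, for brevity
    Rule(String name, String condition, String action) {
        this.name = name; this.condition = condition; this.action = action;
    }
}

class ProductionSystem {
    final List<Rule> rules = new ArrayList<>();
    final Set<String> stm = new HashSet<>();  // short-term memory of facts

    // One cycle: match, resolve conflicts, fire. Returns the fired
    // rule's action, or null if nothing matched.
    String cycle() {
        List<Rule> matched = new ArrayList<>();
        for (Rule r : rules)
            if (stm.contains(r.condition)) matched.add(r);  // match phase
        if (matched.isEmpty()) return null;
        Rule winner = matched.get(0);    // conflict resolution: rule order
        stm.add(winner.action);          // fire: here, just assert a fact
        return winner.action;
    }
}
```

Conflict resolution here is simply textual rule order; the interpreter used on the course filtered the matched rules in a separate resolution phase, as the brain viewer described later displays.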
In this course, the actual architecture chosen for the agents
was secondary. Within some limits, a wide range of architectures
can be used to teach the same fundamental ideas. So I experimented with several.
I chose production systems because:
- They are easy for novices to understand, having a simple notation and semantics.
- They can be adapted to both classic AI and nouvelle AI.
- They can be
used to introduce various useful notions: logic programming,
functional architecture of cognition, etc.
Defects of production systems: lack of explicit control makes it
hard to understand (or write) algorithmic processes. PS's are best
suited to encoding programs expressed as state-space transitions.
However, the PS interpreter was coded in Prolog, and both
conditions and actions could invoke Prolog predicates (written by
me or the students). These could in turn call Pop-11 routines.
- I tried other techniques which didn't work out:
- Maes networks: too hard to tune.
- Reactive Action Packages: very adaptable, but too complex for
novices. Offer many possible ways to program a particular task,
meaning that the students (or I) would need to spend time
developing good style and guidelines for use.
- Classic parser/planner/world-model a la Shrdlu: I couldn't get
a robust planner (had to use a
free copy of Warplan, which
tended to get into infinite loops). However, I did build such an
agent for demonstrating classic AI: see PopBeast.
This, and the accompanying notes, are a nice educational
package, introducing planning, analogical representations,
propositional logic, and symbol grounding, amongst other notions.
- Neural nets: designed an interface, but couldn't find suitable software.
- Genetic algorithms: did try Koza-style genetic programming,
using Prolog trees and type-checking to make sure input types
matched output types. But with the amount of CPU time
OUCS would permit, could not run a large enough population.
Tried hacking this by having the job crash itself and resubmit to queue,
but still failed to evolve any useful behaviours - the population was
probably still far too small, so there was not a sufficiently large pool of
useful part-behaviours that might be needed by later generations.
- Nilsson teleo-reactive systems: promising, but needed too
much development time.
Graphics, Eden, and VT100's
The course suffered greatly from restrictions on graphics.
The course couldn't be run on PC's or Macs, because there were
no Poplog implementations. There were no Departmental Suns or
other workstations that could be used, so we were restricted to
OUCS. It was OUCS policy not to run X-windows on the VAX (even though
DEC did provide a version, DECwindows), and when we started, OUCS's
Unix mainframe did not have Poplog. This restricted us
to the standard terminals, VT100 clones.
Poplog did eventually become available under Unix at OUCS, but the
Department had no X-emulators, many colleges had none either, and there were a
limited number at CTC when we started.
What is a VT100? A text-only terminal, no graphics (although
inverse video and highlighting are available), 80
characters wide by 24 deep. Only one character can be displayed in any
position. A program can move the cursor to any character position.
To make Eden displayable on these, it was designed as a
two-dimensional world divided up into cells. Each
cell normally holds only one item, depicted as a character. Agents
can move from cell to cell and pick up or drop items. Items being held
won't be displayed in the appropriate cell, but will be shown on a
separate status line as part of their owner's "inventory".
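The cell-grid representation can be sketched as follows. The names are illustrative (the real Eden code is Pop-11, not Java), but the constraints are the ones just described: one character-sized item per cell, and held items moved out of the grid into an inventory:

```java
import java.util.Arrays;

// Illustrative sketch of the VT100-friendly cell grid: each cell holds
// at most one item, shown as a single character ('.' means empty).
class Grid {
    final char[][] cells;

    Grid(int w, int h) {
        cells = new char[h][w];
        for (char[] row : cells) Arrays.fill(row, '.');
    }

    // An agent picks up whatever its cell holds; the cell becomes empty
    // and the item goes into the agent's inventory (shown on a status
    // line, not in the grid).
    char pickUp(int x, int y) {
        char item = cells[y][x];
        cells[y][x] = '.';
        return item;
    }

    void drop(int x, int y, char item) { cells[y][x] = item; }
}
```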
We also suffered from problems with incompatible terminals, made
worse by Oxford's decentralisation (the Department; OUCS; CTC;
28 undergraduate colleges).
There were too many kinds of terminal: a range of terminals made
by DEC, such as VT100's and VT320's (which do not have identical keyboards);
OUCS home-built VT100's; PC's
and Macs emulating
VT100's in various ways.
Key layouts differ - e.g. on a real VT100, "Return" and "Enter" keys send
different signals; on a PC, even when running an emulator, they act
identically. [Is this always so?]
The two sets of arrow keys change position between PC's and real VT100's.
This is bad for students. The Eden editor was Poplog's Ved, which
uses a number of control-character commands. The keys that
generate them vary in position between keyboards, so students can't
transfer motor skills.
A possible solution was to supply each student
with software to remap the key
positions to match a real VT100. This could be done either inside the
editor or at the terminal end, e.g. with a Kermit initialisation file.
Reconfiguring the editor requires me to know all the kinds of terminal
students may meet, and for them to know which configuration file to
call up. It also can't deal with irrevocable key mappings, such as
Enter=Return on PCs. Reconfiguring the terminal requires students to
have permission and know how to do it; requires me to know all the kinds
of terminal students may meet; may clobber sessions for subsequent
users. Either may require me to go to the terminal and give advice, which
is not feasible, given the number of colleges.
This would be simpler if the University had forced colleges and
departments to standardise, or - at worst - had provided a list
of all the terminals available in departments and colleges.
Graphical Web browsers - a solution?
The Web seemed to offer a relief from the plethora of terminals.
A browser such as Netscape will run
on almost any machine, is simple to use, has an
interface that does not vary between machines, and does
not require complex
programming to generate nicely formatted text.
Could we use them? Yes, if we ran Eden on
a server machine, and had it send back pages to the browser.
One disadvantage is that HTML
output is limited to text, unless Eden generates
a graphical file depicting its state after each time step (which
would be slow). So
we would gain no visual advantage over a VT100.
Another is that HTML controls are limited to data-entry fields, menus,
buttons, checkboxes, and imagemaps. This would make it difficult for
students to change the microworld or move agents about, though not for
them to step through agent runs. A possible solution here was to display
the world as an HTML form, with each cell depicted as a text-entry field which
students could type into to alter its contents. I did start
trying this using a Pop-11 server under Apache, but it got messy
and didn't seem worth the trouble.
Java - a better solution?
Java provides truly portable graphics, simpler to program and more
widely useable than X. The idea was to run the course's front-end
as a Java applet. There are
various ways of doing this:
- Reprogram everything as an applet, including the microworld and the agents.
- Leave everything in Poplog, but have an applet front-end which
displays the state of the simulation and allows students to edit it.
Eden would run on a server machine, with which the applet would communicate.
- Reprogram the microworld and agent bodies in Java, but have the
agents connect to Poplog brains running on the server.
The first option is tempting, since it transfers load from the
server to the browser. It might also make it easier to do complex
graphics, linking them more closely with the simulation.
Looking through books on Java for games, I came across
Gamelet: a tool for building simple shoot-'em-up games. It is a game
"shell" or "framework" which provides the classes needed for sprite
animation, collision detection, display optimisation, and scoring. There
is a basic Actor class representing a moving sprite, and this can easily
be subclassed and its behaviour modified. This behaviour could include
communicating with a brain server.
It is downloadable from here.
Free Java on the Web
Gamelet is one extremely useful piece of software that was available
free. Others included:
- Some meters
and gauges which I intend to use to indicate agents' energy levels;
- The JFS remote file
system, which I cannibalised to make the file server;
- An editor,
cannibalised to make the editor for production-system
rulebases. It needed
to be redesigned to incorporate the file
server. [The link to the original source
code appears to be defunct, although the author is still at his original address.]
- Numerous little demonstration programs, e.g. to show the use of
checkboxes. I used some of these as skeletons, just because it was
quicker to do so than to find the specifications (from book or Web) and
code from scratch.
- Along the same lines, a useful article in JavaWorld on
user interfaces, demonstrating e.g. the use of insets.
- Also along the same lines,
Gary Jones's curve
editor, which I cannibalised to make the world-editor.
Eden II's architecture
This gives us
Eden II, implemented as a Java applet talking to a Poplog brain server.
It consists of:
In the applet:
- A microworld simulation written in Gamelet.
- A controller and viewer for the microworld. These display
as a control panel, with buttons allowing the user to suspend and
resume the simulation. There is also a button allowing the user to
enter an edit mode in which they can move objects within the world.
- Classes implementing agent bodies. These include code for generating
perceptions, and for carrying out actions upon the world.
- A controller and viewer for each agent. These display
as a control panel, with buttons allowing the user to suspend and
resume the agent's brain (in suspended mode, perceptions are still
generated, but the agent does not pass them to its brain, or expect
to process a resulting action). There are also checkboxes allowing the user
to generate an action manually, and a meter showing the agent's energy level.
- Agent brain clients. These
send the agent's perceptions to the brain server, request the server
to run the brain, and get back an action.
- A controller and viewer for the brain. These display
as a control panel, with windows showing the current STM, rules
being considered for resolution, rules filtered out by resolution, and
the rule being fired. There are buttons to load a new production system, and
to run it.
- A text editor for the production systems.
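For instance, the brain client's round trip might look like the sketch below. The line-based PERCEIVE/RUN protocol is entirely my own invention for illustration; the original wire format is not documented here. Note that applets may only open sockets back to the host they were downloaded from:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.PrintWriter;
import java.net.Socket;

// Illustrative brain client: sends a perception, asks the server to run
// the brain for one cycle, and reads back the resulting action.
class BrainClient {
    // The protocol round trip, separated from socket setup so it can
    // also be exercised with in-memory streams.
    static String requestAction(BufferedReader in, PrintWriter out,
                                String perception) throws IOException {
        out.println("PERCEIVE " + perception);   // hypothetical wire format
        out.println("RUN");
        out.flush();
        return in.readLine();                    // e.g. "move forward 1"
    }

    static String connectAndRun(String host, int port, String perception)
            throws IOException {
        try (Socket s = new Socket(host, port)) {
            BufferedReader in = new BufferedReader(
                new InputStreamReader(s.getInputStream()));
            PrintWriter out = new PrintWriter(s.getOutputStream());
            return requestAction(in, out, perception);
        }
    }
}
```

Keeping `requestAction` free of socket setup means the protocol logic can be tested without a running server.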
On the machine from which the applet was downloaded:
- An agent brain server. This is written in Java and communicates
with the brain clients via sockets.
- Agent brains written in Poplog and running on the brain server.
These are production systems.
- A production-system interpreter, and code for loading production
systems and compiling them to internal form.
- An interface, written in Prolog, between the brains and the Java brain server.
So far, the brains are all production systems, for the reasons
given earlier. All brains run the
same production system interpreter (that used for Eden),
but may have different rulebases.
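On the server side, the per-connection dispatch might look like this sketch. The protocol is the same hypothetical one as above, and `runBrain` is a trivial stand-in for a call into the Poplog production-system interpreter:

```java
// Illustrative per-connection session logic for the brain server: store
// the last perception, then run one brain cycle on request.
class BrainServerSession {
    String lastPerception = "";

    // Stand-in for invoking one cycle of the Poplog interpreter.
    String runBrain() {
        if (lastPerception.contains("food")) return "eat";
        return "move forward 1";
    }

    // Dispatch one protocol line; returns a reply line, or null when
    // no reply is due (PERCEIVE is acknowledged only at RUN time).
    String handle(String line) {
        if (line.startsWith("PERCEIVE ")) {
            lastPerception = line.substring("PERCEIVE ".length());
            return null;
        }
        if (line.equals("RUN")) return runBrain();
        return "ERROR unknown command";
    }
}
```

A real server would wrap one such session around each accepted socket, reading lines and writing non-null replies back to the applet.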
The model-view-controller paradigm
Every displayable object may be accompanied by a View and a Controller.
We use the MVC paradigm - see:
- Applications Programming in Smalltalk-80(TM): How to use Model-View-Controller (MVC)
- An introduction to the Observer interface and Observable class using Model/View/Controller architecture as a guide
[But note inheritance problem.]
Layout is standard, with View and Controller on the same panel, View
to the left of Controller. We use a home-built listener interface (this is
Java 1.0, so we can't use the JDK 1.1 listener interfaces).
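Such a home-built listener interface might look like the following sketch. The names are illustrative, and the syntax is modern Java for brevity; the idea is simply that the model notifies registered views whenever it changes:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative home-built model/view notification scheme, of the kind
// needed under Java 1.0 before the JDK 1.1 event-listener model.
interface ModelListener {
    void modelChanged(String what);
}

class AgentModel {
    private final List<ModelListener> listeners = new ArrayList<>();
    private int energy = 100;

    void addListener(ModelListener l) { listeners.add(l); }

    void setEnergy(int e) {
        energy = e;
        // Tell every registered view to update itself.
        for (ModelListener l : listeners)
            l.modelChanged("energy=" + e);
    }

    int getEnergy() { return energy; }
}
```

Views and controllers register themselves as listeners; the model never needs to know which AWT components are displaying it.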
Most browsers prohibit applets from writing to or reading files
on the browser's machine, making it difficult for students to
save their work or access locally-edited files.
The standard solution is to run a "file server" on the machine
from which the applet was downloaded. The applet talks to this
via sockets, and has its own protocol for requesting and saving files
and inspecting directories.
Eden II therefore also includes:
- A file server, written in Java.
- File clients which are part of the applet and which
send requests for files or directory listings, and files to be saved, to
the file server.
These file clients need to be able to be plugged into editors and
any other means by which students access files. Java has a FileBrowser
class, but that is not directly reusable, since one can't
inherit from it and override its methods. So we define an
interface which can be implemented by either a FileBrowser or
a remote file browser. Editors and other components need to
access files through this interface.
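That interface idea can be sketched like this (all names illustrative; the in-memory implementation stands in for either a local file browser or a remote, socket-based one talking to the file server):

```java
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;

// Illustrative file-access interface: editors depend only on this,
// so local and remote implementations are interchangeable.
interface FileAccess {
    String read(String name) throws IOException;
    void write(String name, String contents) throws IOException;
}

// In-memory stand-in; a real remote implementation would send each
// request over a socket to the file server on the applet's home host.
class MemoryFileAccess implements FileAccess {
    private final Map<String, String> files = new HashMap<>();

    public String read(String name) throws IOException {
        String s = files.get(name);
        if (s == null) throw new IOException("no such file: " + name);
        return s;
    }

    public void write(String name, String contents) {
        files.put(name, contents);
    }
}
```

Because the editor sees only `FileAccess`, the same code runs both under appletviewer (local files) and as a downloaded applet (remote files).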
State of play
I have prototyped the entire system apart from the remote file
server, which is not quite finished. It is still possible to run the system
locally via appletviewer on the same machine as the files being edited.
See the picture below, which
shows the EdenII game window, the EdenII controller, the controllers
for two agents,
the production-system interfaces for these
agents, the text editor with its file
browser, and part
of the debugging output
from the brain server and client (in the
two windows on the right). Clicking on the image will expand it
to full size.
Part of the Gamelet environment is done, but needs finishing. One
problem here is finding someone to do the artwork. The file server also needs finishing.
Some problems remain:
- Until the file server is finished, I need to run the system
locally (on Ermine) rather than as an applet. Unfortunately, DEC's
Java 1.0.2 has a number of bugs, including faults in threading,
delete not working in text-entry fields, and a windowing bug that causes
the cursor to invert visibility every time there's a window event.
DEC do not support this version any more. In principle, we could
upgrade to their latest 1.1 version, but that requires patching the
operating system, and OUCS have so far refused to do this, because
the operating system itself is unstable and has required a lot of
OUCS effort to make it work reliably.
- The user-interface would be much easier to design with a visual
user-interface builder, rather than writing AWT code by hand.
Unfortunately, Lava, the only decent free builder
that I've found, requires
Java 1.1. There are some commercial ones available, but
the Department refuses to pay for them.
We might also mention
"link-rot". The Lava user-interface builder I mention above
no longer exists at its original URL. This is probably
because the author was a student (at Nottingham University) and has
now finished his course. Computing services don't have unlimited
filestore; but it would greatly help the rest of us if they would
realise that they are the repository for some
valuable resources, and try to prevent these from getting lost once the
authors leave. If nothing
else, offer every departing student the opportunity to archive their
software in a national archive such as Leo or Hensa, and to keep
a redirection page at the old URL after the user has left.
Thanks to Aaron Sloman for comments.
It seems possible to use Java as a front-end to make teaching
some topics in AI fun and easy to use. The Web's hypertext
nature helps - as many have pointed out, of course - in providing
somewhere to hang course documentation. [But note that you can't easily
embed HTML inside Java controls.]
The Web helps in another
way, in that there are a lot of free pieces of Java code out there,
both in source and compiled form, which can be cannibalised.
Java does have its limitations - for example, the applet security
restrictions - but these can be circumvented.
The effort has been less successful than it should have been, owing
to lack of support from the Department and from OUCS.
7th February 2000
- Brooks, "The
Whole Iguana" in Robotics Science (1989).
- Dennett, "Why not the whole
iguana?" in Behavioral and Brain Sciences (1978).