[ Jocelyn Ireson-Paine's Home Page ]
AI - The art and science of making computers do interesting things that are not in their nature.
Welcome to our September issue. The main feature this month continues August's AI-in-Python theme with a look at Python for robotics and the Pyro robot-control software. We also have a selection of quotes, and some computer-generated humour. As ever, comments and suggestions are welcome: please mail firstname.lastname@example.org.
I came across a few of the quotes below while looking up references for another article. It's an article that hasn't yet worked out, but I thought it would be fun to use them to trace a path through the past - and perhaps future - development of AI. So here goes:
In this feature, I continue last month's Python for AI by moving on to robotics and the Pyro robot-control software. Pyro's designers devised it to overcome the limitations of Lego Mindstorms for teaching. In their paper Avoiding the Karel-the-Robot paradox: A framework for making sophisticated robotics accessible, they explain who Karel the robot was and why he is to be avoided.
Karel was introduced by Richard Pattis in his book Karel the Robot - A Gentle Introduction to the Art of Programming. His book isn't on the Web, but I did find a Karel-based course for C, by Roland Untch. This introduces us to Karel, who lives in a grid of streets and walls. Scattered throughout this grid are beepers, which Karel can sense, pick up, and put down. Students learn to program by instructing Karel to perform assorted tasks, using commands such as PickBeeper(). This highly imperative style of programming is - I imagine - one that students find easy to get started with. However, the authors of Avoiding the Karel-the-Robot paradox assert that it eventually leads students to a programming dead-end.
Similarly, they say, although inexpensive robots have made introductory AI accessible to a wide range of school and university students, they have led to a similar dead-end.
One problem is portability. There are many robots on sale, but each tends to have its own programming language and development tools, often very different from those of other robots. This makes it difficult for students to transfer not just code, but also programming techniques.
Also, many robot programming systems are restricted in the sensors they support. For example, low-cost robots are often supplied with infrared range sensors only. You might be able to add something more sophisticated such as a sonar or laser range sensor; but even if your educational budget can afford this, you may not be able to access the sensor from the software.
So, widespread use of robots for teaching AI needs not just cheap hardware, but also control software that can be ported to many different robots and make them all look identical to the student. That's Pyro's goal: write-once/run-anywhere robot programs. Then students can concentrate on building robot brains. Also, as they learn, they will be able to gradually move up to more and more sophisticated robots. And such robots, if the software is capable enough - and Pyro should be - will be usable in research as well as teaching.
Pyro is available at pyrorobotics.org/, and supports a wide range of robots: the Pioneer and PeopleBot families, the Khepera and Hemisson families, and Aibo, as well as the RoboCup Soccer, Player/Stage, Gazebo, and Khepera simulators. Pyro can be used with Orocos, the Open Robot Control Software that I mentioned last month.
Pyro runs on Unix and Linux, but according to the Pyro FAQ, may also work with other operating systems. A LiveCD is available; and Zach Dodds has made a Windows implementation, PyroWin.
The Pyro library includes modules for various robot control paradigms, robot learning, robot vision, localization and mapping, and multiagent robotics. The robot control paradigms include modules for direct control, finite state machines, subsumption architecture, fuzzy logic control, and neural network control (feedforward, recurrent, self-organizing maps, and other vector-quantizing algorithms). There are also genetic algorithms and genetic programming. The vision modules provide a library of the most commonly used filters and vision algorithms, enabling students to concentrate on the uses of vision in robot control. All this is open source: it can be modified, and students can learn by looking at the code. (The documentation is also open source, available under a Creative Commons licence.) Modules planned for the future include logic-based reasoning and acting, classical planning, and path planning and navigation.
One Far Side cartoon depicts two amoebae sitting in front of a television. The female amoeba, sporting typical Larson nagging-wife upswept glasses, is glaring at the male amoeba and shouting "Stimulus, response. Stimulus, response. Don't you ever think!". If stimulus-response control is low on the evolutionary ladder, it's also easy to teach: let's start there, with an example that's reprinted in several of the papers about Pyro, including The Pyro toolkit for AI and robotics:
from pyro.brain import Brain

class Avoid(Brain):
    def wander(self, minSide):
        robot = self.getRobot()
        # if approaching an obstacle on the left side, turn right
        if robot.get('range', 'value', 'front-left', 'minval') < minSide:
            robot.move(0, -0.3)
        # if approaching an obstacle on the right side, turn left
        elif robot.get('range', 'value', 'front-right', 'minval') < minSide:
            robot.move(0, 0.3)
        # else go forward
        else:
            robot.move(0.5, 0)

    def step(self):
        self.wander(1)

def INIT(engine):
    return Avoid('Avoid', engine)

Here, we're defining a robot "brain". Brains have to be subclasses of class Brain; this one is class Avoid. In Python, although it might look like some kind of procedure call, the code class X(Y) defines a new class X to be a subclass of Y. Every brain needs a step method, which Pyro executes on every control cycle. The one above makes the robot continually wander, turning as a direct response to its range sensor if it has got too close to an obstacle on either side.
The authors emphasise that this program does not depend on the robot or range sensor. It's also independent of the robot's size, since Pyro translates sensory and motor data into multiples of the robot's length: the brain will avoid obstacles whenever they come within one robot length of the front-left or front-right range sensors, whatever that length happens to be.
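To make the robot-length point concrete, here is a tiny sketch of that kind of normalisation. The helper name and figures are mine for illustration, not Pyro internals:

```python
# Illustrative sketch (not Pyro code): converting a raw range reading
# in millimetres into robot-length units, so the same avoid threshold
# works on robots of very different sizes.

def to_robot_lengths(raw_mm, robot_length_mm):
    """Convert a raw range reading to multiples of the robot's length."""
    return raw_mm / robot_length_mm

# A 0.5 m robot with an obstacle 0.5 m away, and a 0.2 m robot with an
# obstacle 0.2 m away, both read "1.0 robot length" - so minSide=1 in
# the Avoid brain means the same thing on either robot.
assert to_robot_lengths(500, 500) == 1.0
assert to_robot_lengths(200, 200) == 1.0
```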
Let's move on to a robot controlled by a finite-state machine. The robot's job is a bit of simple recycling, picking up and storing cans. The authors use a simulated Pioneer robot with gripper and "blob" camera, discussed in the next section, on vision. Cans are represented as randomly positioned red pucks in a circular environment without obstacles. The robot's goal is to collect all the red cans. Once it has picked up a can, it stores it and looks for more cans.
The finite-state controller has four states: locateCan, approachCan, grabCan, and done. Each state corresponds to a particular behaviour: it is triggered by some condition in the environment, tries to handle the condition, and may then move to another state.
The controller starts in state locateCan. In this state the robot rotates, looking for a blob, which would mean a red can is in sight. If it finds a can, the controller switches to approachCan to move the robot toward the closest visible can. (If the robot loses sight of the can, the controller returns to state locateCan.) Once the robot has its gripper around a can, the controller switches to state grabCan, making the robot pick up and store the can. It then returns to locateCan to search for another can. This state keeps track of how long it searches on each activation. If the robot has done a complete rotation and not seen any cans, the controller switches to done and stops.
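The four-state logic just described can be sketched as a stand-alone finite-state machine in plain Python - a simplification with sensing and acting stubbed out as booleans, not Pyro's actual State classes:

```python
# A minimal stand-alone sketch of the can-collecting controller described
# above. State names follow the article; the sensor inputs are stubbed
# out as booleans, and the search limit (275) matches the Pyro example.

class CanCollector:
    def __init__(self):
        self.state = "locateCan"
        self.searches = 0

    def step(self, blob_visible, can_in_gripper):
        if self.state == "locateCan":
            if blob_visible:
                self.state = "approachCan"   # a red can is in sight
            elif self.searches > 275:
                self.state = "done"          # full rotation, no cans left
            else:
                self.searches += 1           # keep rotating and searching
        elif self.state == "approachCan":
            if not blob_visible:
                self.state = "locateCan"     # lost sight of the can
            elif can_in_gripper:
                self.state = "grabCan"       # gripper is around a can
        elif self.state == "grabCan":
            # pick up and store the can, then resume searching
            self.state = "locateCan"
            self.searches = 0
        return self.state
```

Stepping it by hand shows the transitions: no blob keeps it in locateCan; a blob moves it to approachCan; a gripped can moves it to grabCan, after which it returns to locateCan.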
Here is the locateCan state in Python. As with the direct-control brain, each state has a step method, called on every control cycle. States use the goto method to switch to other states:
class locateCan(State):
    def step(self):
        # get a list of all blobs:
        blobs = self.get("robot/camera/filterResults")
        # check if there are any blobs
        if len(blobs) != 0:
            # stop the robot when a blob is seen
            self.robot.move(0, 0)
            print "found a can!"
            # transfer control to homing behavior:
            self.goto('approachCan')
        # check if the robot has done a complete rotation
        elif self.searches > 275:
            print "found all cans"
            # transfer control to completion behavior:
            self.goto('done')
        # otherwise keep rotating and searching
        else:
            print "searching for a can"
            # update rotation counter:
            self.searches += 1
            # rotate the robot and remain in locate behavior:
            self.robot.move(0, 0.2)
What about vision? As already mentioned, Pyro has camera-interface and image-processing modules. Students can write programs to implement vision algorithms, such as colour histograms, motion detection, object tracking, or edge detection.
For efficiency, the low-level vision library code is written in C++, but students can interactively use it to build layers of filters in Pyro, calling the computationally expensive C++ code while still having the benefits of the high-level, interactive interface of Python.
The authors illustrate with Aibo looking at a ball and applying three filters to the raw image: colour matching, supercolour, and blob segmentation. The colour matching filter marks all pixels in an image that are within a threshold of a given red/green/blue colour triplet. The supercolour filter magnifies the differences between a given colour and the others. For example, the supercolour red filter makes reddish pixels more red, and the others more black. Finally, the blob-segmentation filter connects adjacent pixels of similar colour into regions, computes a box completely surrounding the matching pixels, and returns a list of these bounding boxes. Students can use these filters without needing to worry about the low-level image-processing details - for example, detecting Aibo's ball by finding the largest region matching its colour, then drawing a bounding box around it. It's then easy to program Aibo to move towards this region.
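As a rough illustration of what the first and last of those filters compute, here is a toy colour-match and bounding-box calculation on a tiny hand-made image. The function names and the threshold test are illustrative, not Pyro's vision API (whose filters are implemented in C++):

```python
# Toy sketch of colour matching and blob bounding boxes, on an "image"
# represented as a grid of (r, g, b) triples. Not Pyro's API.

def colour_match(image, target, threshold):
    """Mark pixels whose R, G and B are all within threshold of target."""
    return [[all(abs(c - t) <= threshold for c, t in zip(px, target))
             for px in row] for row in image]

def bounding_box(mask):
    """Smallest (top, left, bottom, right) box enclosing all marked pixels."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, hit in enumerate(row) if hit]
    if not coords:
        return None
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return (min(rows), min(cols), max(rows), max(cols))

red = (255, 0, 0)
image = [[(0, 0, 0), (250, 10, 5), (0, 0, 0)],
         [(0, 0, 0), (240, 0, 0), (0, 0, 0)]]
mask = colour_match(image, red, 20)
print(bounding_box(mask))   # (0, 1, 1, 1): the two reddish pixels
```

The largest such box is exactly the "largest region matching the ball's colour" that the Aibo example tracks.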
If you own an Aibo - surely the most popular of Pyro's robots - why not consider Pyro as an alternative to Sony's Open-R and other development tools? As the examples from Pyro's Using the Sony AIBO Robot page show, commands are not difficult to write:
robot.setPose("mouth", 1.0)
robot.setPose("tail", 0.2, 1.0)
robot.setPose("left leg front knee", 0.5)
robot.getSensor("ir near")
robot.setWalk("TIGER.PRM")

The getSensor call gets data from one of Aibo's infra-red sensors, and the setWalk call loads a gait.
Using the Sony AIBO Robot also mentions that two Aibo "brains" are available: one for following a blob, and one which tries to kick a ball into a goal. This indicates that, as one would expect, Aibo can be used with Pyro's software for camera control and vision.
I suspect the ball-kicking brain is that described in Ioana Butoi's dissertation Find Kick Play: An Innate Behavior for the Aibo Robot. This explains how Pyro was used to build Aibo software for recognising a ball and a goal, and kicking one towards the other. Butoi describes object-recognition algorithms developed for the RoboCup competition, and also how to stop Aibo falling over as it kicked the ball. Butoi had to devise a stance in which Aibo could balance on three legs while kicking with the fourth. A real dog might do that too (though in my experience, it's more likely either to eat the ball or bite the experimenter); but a real dog would be intelligent enough to constantly adjust its stance as its fourth leg swings and kicks. Aibo isn't that clever, so Butoi had to find a specially stable joint configuration for it to balance on.
Tekkotsu is an application development framework developed at CMU for Aibo and other intelligent robots. Like Pyro, it is intended for educational use: how does it compare?
Pyro developer Douglas Blank says in Using the Sony AIBO Robot and in a posting about the Pyro-Tekkotsu relationship that Aibo Pyro actually uses part of Tekkotsu, namely the Monitor - a set of servers running on Aibo via which programs can transfer sensor data, images, and motion commands. That doesn't mean students need to learn Tekkotsu, though. Blank goes on to say in his posting:
The main project of Tekkotsu offers a unique programming environment. If I were going to land an Aibo on the moon, I'd probably use Tekkotsu to control it. But for doing interactive teaching, and high-level scripting and experiments in the lab, I'd use Pyro. To give you an idea of the environments: In Tekkotsu, if you want to change a line of code, you must recompile everything that depends on the code (it is C++ code) using the provided cross-compiler. Then the code is copied to the dog over ftp, the dog shuts down, and starts back up. The whole process (compile + transfer + reboot) lasts at least a minute on our machines. In Pyro, you simply press the "reload brain" button and nearly instantly you are running the new code.
I love the idea of Aibo on the Moon.
pyrorobotics.org/ - Home page for Pyro Python Robotics. Don't confuse this with Python Remote Objects at pyro.sourceforge.net/, also named Pyro.
www.cs.hmc.edu/~dodds/PyroWin/ - PyroWin, Pyro modified to run under Windows, by Zach Dodds. "At some point, the official version of Pyro may run under Windows out-of-the-box, and this page will disappear".
pyrorobotics.org/?page=PyroFAQ - the Pyro FAQ, which answers some questions about how the software works.
emergent.brynmawr.edu/pipermail/pyro-users/2004-September/000050.html - [Pyro-users] Re: Pyro High-Level Conceptual Model.
www.cs.hmc.edu/roboteducation/FinalPapers/Blank.pdf - Avoiding the Karel-the-Robot Paradox: A framework for making sophisticated robotics accessible, by Douglas Blank, Holly Yanco, Deepak Kumar, and Lisa Meeden. Presented at AAAI 2004 Spring Symposium.
www.mtsu.edu/~untch/karel/ - Roland Untch's C course using Karel. Not Pyro-related, but shows who the original Karel was.
dangermouse.brynmawr.edu/~dblank/papers/aimag05.pdf - The Pyro toolkit for AI and robotics, by Douglas Blank, Deepak Kumar, Lisa Meeden, and Holly Yanco. Submitted to AI Magazine.
pyrorobotics.org/?page=PyroCurriculum - The main Pyro Curriculum page. Links to course notes on Pyro for behaviour-based control, neural nets, vision, and other topics. Also links to two slide presentations: the AAAI 2005 overview (10 slides), and the AAAI 2005 tutorial (118 slides). These, particularly the tutorial, contain: examples of Python code, course topics, and student projects; defects of Lego robotics; diagrams of the Pyro architecture; pictures of the robots and simulators.
www.cs.pomona.edu/~marshall/papers/bringing_up_robot.pdf - Bringing up robot: Fundamental mechanisms for creating a self-motivated, self-organizing architecture, by Douglas Blank, Deepak Kumar, Lisa Meeden, and James Marshall. Interesting paper on self-organising maps for a hierarchical control architecture, where each level "chunks" sequences for use by the more abstract level above it.
pyrorobotics.org/?page=Using_20the_20Sony_20AIBO_20Robot - Using the Sony AIBO Robot, on the Pyro site.
cs.brynmawr.edu/Theses/Butoi.pdf - Find Kick Play: An Innate Behavior for the Aibo Robot, by Ioana Butoi, Bryn Mawr, 2005.
emergent.brynmawr.edu/pipermail/pyro-users/2005-February/000087.html - [Pyro-users] Pyro-Tekkotsu relationship ?.
Quite by chance, I found the following in a book bought second-hand from Oxfam some weeks ago:
A few years ago, Dr Graham Ritchie and Dr Kim Binsted created a computer programme that could produce jokes. We were keen to discover if computers were funnier than humans, so entered five of the computer's best jokes into LaughLab. Three of them received some of the lowest Joke Scores in the entire database. Here are those failed puns:

What kind of contest can you drive on? A duel carriageway.
What kind of line has sixteen balls? A pool queue.
What kind of pig can you ignore at a party? A wild bore.

However, two examples of computer comedy were surprisingly successful and beat about 250 human jokes:

What do you call a ferocious nude? A grizzly bare.
What kind of murderer has fibre? A cereal killer.

So, jokes written by a computer are not particularly funny to humans, but perhaps they would be hilarious to other computers.
It's from Laughlab: The Scientific Search for the World's Funniest Joke, by the British Association for the Advancement of Science, and refers to the work linked to below.
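For flavour, here is the word-substitution idea behind such puns, reduced to a toy sketch. This is not JAPE's algorithm - JAPE works from formally specified schemata over lexical relations, as the papers below describe - just the bare homophone swap that makes "cereal killer" work:

```python
# Toy pun-by-substitution sketch (NOT JAPE): swap a word in a familiar
# phrase for a homophone, so sound and spelling clash. The homophone
# table here is a tiny hand-made example.

HOMOPHONES = {
    "serial": "cereal",
    "bear": "bare",
    "queue": "cue",
}

def pun(phrase):
    """Return the phrase with known homophones swapped in, or None."""
    words = phrase.split()
    swapped = [HOMOPHONES.get(w, w) for w in words]
    return " ".join(swapped) if swapped != words else None

print(pun("serial killer"))   # cereal killer
print(pun("hello world"))     # None - no homophone to swap
```

The hard part, of course, is everything this sketch leaves out: generating the riddle question, and knowing which swaps are funny rather than merely wrong.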
www.laughlab.co.uk/ - LaughLab, created by Richard Wiseman, University of Hertfordshire, in collaboration with the British Association for the Advancement of Science.
www.dcs.gla.ac.uk/~kimb/dai_version/dai_version.html - A symbolic description of punning riddles and its computer implementation, by Kim Binsted and Graeme Ritchie, 1994. Early paper, explaining the theory behind such riddles as "What do you give an elephant that's exhausted? Trunkquillizers", and its embodiment in the first version of JAPE.
www.inf.ed.ac.uk/publications/online/0158.pdf - The JAPE riddle generator: technical specification by Graeme Ritchie, 2003. The paper contains formal definitions of JAPE-3's data structures, rules and procedures: "the aim is to set out a formally precise, implementation-independent account of how JAPE generates punning riddles. The reason for doing this is that experimental AI programs are usually under-documented, making it difficult for other researchers to replicate the work, or to know what theoretical claims are actually embodied in the implementation."
doc.utwente.nl/fid/1183 - Humour Research: State of the Art, by Matthijs Mulder and Anton Nijholt, Twente. A recent survey of humour theory and of joke generators such as JAPE, the Light Bulb Joke Generator, and Elmo, the Natural Language Robot. Includes a section on resources such as WordNet.
groups.inf.ed.ac.uk/standup/papers/thepsychologist_0203omara.pdf - What do you get when you cross a communication aid with a riddle?, by Dave O’Mara and Annalu Waller. Also in The Psychologist, volume 16, 2003, this paper is published by the STANDUP project (System To Augment Non-speakers Dialogue Using Puns), which seeks to use humour to help language-impaired children communicate.
www.aaai.org/AITopics/html/toons.html - IAMAI's AI-toons. This AAAI cartoon page includes news on STANDUP and other humour research. It explains that "Kim Binsted had always had a love for making people laugh and was part of the improvisational comedy team at school. When her interest in physics and maths took her into artificial intelligence she fell back on her comedy background to help her work on a few problems in computers. Now, having created a programme where computers can generate their own puns, she works on a system that uses comedy to help children learn a new language, whilst still trying to fit a little improv in, in her spare time."