
Weak and strong AI

Taken to its limit, functionalism implies that if we can replicate in a computer the abstract relations that define a mental state, then we have replicated the state itself, including the behaviour it causes and the subjective awareness that goes with it. This stance is known as strong AI.

Neural functionalism would do this at the neural level. It doesn't matter whether we replicate the input-output behaviour and connectivity in real neurons, in silicon equivalents, or (the ``Chinese brain'' scenario) in people simulating them. Either way, we'll have built a system which generates the same behaviour, and has the same subjective awareness, as the brain it mimics. There's an amusing dialogue about this in A Conversation with Einstein's Brain by Hofstadter, from The Mind's I edited by Hofstadter and Dennett (PSY AA:H 067; Philosophy; Hooke).
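The substrate-independence point above can be sketched in code. Below is a minimal illustration (not from the notes; the function names and the choice of an AND-gate unit are assumptions for the example): two quite different "realisations" of the same abstract input-output relation, one computing a weighted sum as a silicon neuron might, the other a bare lookup table such as a roomful of rule-following people could evaluate by hand. Functionalism's claim is that, since both realise the same mapping, they count as the same functional state.

```python
def threshold_neuron(inputs, weights, threshold):
    """A 'silicon' neuron: fires (returns 1) iff the weighted
    sum of its inputs reaches the threshold."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0


def table_neuron(inputs):
    """The same abstract relation realised as a lookup table --
    something people simulating a neuron could compute by hand."""
    table = {
        (0, 0): 0, (0, 1): 0,
        (1, 0): 0, (1, 1): 1,   # behaves as an AND gate
    }
    return table[tuple(inputs)]


# The two substrates agree on every input pattern, so at this level
# of description they are functionally identical.
for pattern in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    assert threshold_neuron(pattern, weights=(1, 1), threshold=2) == table_neuron(pattern)
```

The same point scales up, in principle, to whole networks: connect enough such units with the right connectivity and, on the functionalist view, the substrate drops out of the story.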

Symbol-level functionalism would do this at the symbol level: the level of classical AI.

Weak AI backs off from such extreme claims, and says merely that the computer is a valuable tool, e.g. for ``running'' theories so as to test them more rigorously.





Jocelyn Ireson-Paine
Wed Feb 14 23:46:11 GMT 1996