Limits on speed - complexity theory

Next: Sorting
Up: No Title
Previous: Limits on space
Back: to main list of student notes

There are also results on speed - how fast a computer can perform certain tasks. Any computer, including a connectionist one, will have its own speed limits, but I'll switch now to conventional serial machines. The idea is that, given a task such as sorting numbers, we can write a number of different algorithms (programs) for it. The time taken by each program will increase with the size of its input: it takes longer to sort 40 numbers than to sort 20. (Note that this has to be a statistical measure. If the 20 numbers are exactly in reverse order, and the 40 are exactly in the right order, some algorithms might process the 40 more quickly. But, on average, the time will increase with the size of input.)
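The parenthetical point can be checked directly. Here is a minimal sketch (my own illustration, not from the notes), using the number of element comparisons as a stand-in for running time: insertion sort given 20 numbers in exactly reverse order does more work than when given 40 numbers already in the right order, even though on average bigger inputs take longer.

```python
def insertion_sort(xs):
    """Sort a list, counting element comparisons as a proxy for time."""
    xs = list(xs)
    comparisons = 0
    for i in range(1, len(xs)):
        j = i
        # Shift xs[i] left until it is in place.
        while j > 0:
            comparisons += 1
            if xs[j - 1] > xs[j]:
                xs[j - 1], xs[j] = xs[j], xs[j - 1]
                j -= 1
            else:
                break
    return xs, comparisons

# 20 numbers in exactly reverse order vs 40 already in order:
_, worst_20 = insertion_sort(list(range(20, 0, -1)))
_, best_40 = insertion_sort(list(range(40)))
print(worst_20, best_40)  # 190 vs 39: the sorted 40 need far fewer comparisons
```

So a particular unlucky 20-element input can cost more than a lucky 40-element one; only averaged over inputs does cost rise steadily with size.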

What complexity theory studies is:

• Given a specific algorithm, how does its time vary with the size of its input? Does it grow as the square of the size, or the cube, or perhaps exponentially? And what are the worst- and best-case possibilities?

• How good can the best algorithm for a task be? Is it possible, for instance, to write a sorting program whose time increases more slowly than the square of the input size?
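For sorting, the answer to that last question is yes. The sketch below (my own illustration, again counting comparisons as a stand-in for time) uses merge sort, whose cost grows roughly as n log n rather than as n squared: doubling the input size only slightly more than doubles the work, instead of quadrupling it.

```python
import random

def merge_sort(xs):
    """Sort xs, returning (sorted list, number of comparisons)."""
    if len(xs) <= 1:
        return list(xs), 0
    mid = len(xs) // 2
    left, cl = merge_sort(xs[:mid])
    right, cr = merge_sort(xs[mid:])
    # Merge the two sorted halves, counting each comparison.
    merged, comps = [], cl + cr
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps

random.seed(0)
for n in (100, 200, 400):
    data = [random.random() for _ in range(n)]
    _, c = merge_sort(data)
    print(n, c)  # comparison counts grow roughly as n log n, not n squared
```

Comparing the printed counts as n doubles shows the sub-quadratic growth: each doubling multiplies the count by a little over two, whereas a quadratic-time sort such as the insertion sort above would multiply it by about four.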


Jocelyn Ireson-Paine
Wed Feb 14 23:47:23 GMT 1996