Multiple Layers
   - When you have multiple layers of perceptrons, you have
       a multi-layer perceptron (or MLP).
 
   - It turns out that almost any function (in particular, any
        continuous function on a bounded domain) can be approximated
        to an arbitrary degree of precision with a two-layer MLP
        (given enough perceptrons); this is the universal
        approximation theorem.
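 
   - A minimal NumPy sketch of the idea (the construction and all
        names here are illustrative, not a standard recipe): steep
        sigmoid units in the hidden layer act like step functions,
        and the output layer sums weighted steps into a staircase
        fit whose error shrinks as units are added.

        import numpy as np

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        # Approximate f(x) = x**2 on [0, 1] with one hidden layer of
        # steep sigmoids: unit i switches on as x passes edges[i].
        def mlp_approx(x, n_units=50, steepness=500.0):
            edges = np.linspace(0.0, 1.0, n_units + 1)
            # Hidden layer: one near-step function per unit.
            hidden = sigmoid(steepness * (x[:, None] - edges[None, :-1]))
            # Output layer: weight each unit by the target's increment.
            increments = np.diff(np.concatenate([[0.0], edges[1:] ** 2]))
            return hidden @ increments

        x = np.linspace(0.0, 1.0, 1000)
        err = np.max(np.abs(mlp_approx(x) - x ** 2))
        # Staircase error; shrinks as n_units grows (keep steepness
        # large relative to the step spacing).
        print(err)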
 
   - There is also work on networks with three or more layers.
 
   - These deeper networks are not the same thing as deep belief
        nets, but the two are related.
 
   - The layers work in sequence, but there is parallelism within a
        layer (see the sketch below).
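 
   - For example (a minimal NumPy sketch; the sizes and the tanh
        transfer function are arbitrary choices), computing each unit
        in a layer one at a time gives exactly the same result as one
        matrix-vector product, which is how the parallelism is
        usually exploited:

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.standard_normal(4)         # one input vector, 4 features
        W = rng.standard_normal((8, 4))    # 8 units, each with 4 weights
        b = rng.standard_normal(8)         # one bias per unit

        # Sequential view: each unit computes independently of the rest...
        seq = np.array([np.tanh(W[i] @ x + b[i]) for i in range(8)])

        # ...so the whole layer is just one matrix-vector product.
        vec = np.tanh(W @ x + b)

        assert np.allclose(seq, vec)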
 
   - So, the first layer receives the inputs.  Every perceptron in
        the layer sees the same inputs, but each may weight them
        differently.
 
   - Each applies its transfer function (typically the same one for
        every unit) to produce an output.
 
   - These outputs then become the inputs to the second layer (a
        sketch of the whole pass follows).
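 
   - Putting the last three points together, a minimal sketch of a
        two-layer forward pass (the layer sizes and the tanh transfer
        function are illustrative assumptions, not fixed by the
        notes):

        import numpy as np

        def layer(x, W, b):
            # Every unit sees the same input x, weighted by its own row of W.
            return np.tanh(W @ x + b)

        rng = np.random.default_rng(0)
        x = rng.standard_normal(3)                       # the shared inputs
        W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
        W2, b2 = rng.standard_normal((2, 5)), rng.standard_normal(2)

        h = layer(x, W1, b1)    # first-layer outputs...
        y = layer(h, W2, b2)    # ...are the second layer's inputs
        print(y)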
 
   - Since applying the weights is just multiplication (plus a sum),
        and the transfer functions are generally simple, this whole
        system is really fast; each layer amounts to one matrix
        multiply (see the timing sketch below).
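 
   - To make the speed claim concrete (a rough sketch; the sizes are
        illustrative), pushing a large batch through both layers is
        just two matrix multiplies plus elementwise transfer
        functions:

        import time
        import numpy as np

        rng = np.random.default_rng(0)
        W1, b1 = rng.standard_normal((256, 64)), rng.standard_normal(256)
        W2, b2 = rng.standard_normal((10, 256)), rng.standard_normal(10)
        X = rng.standard_normal((10_000, 64))    # 10,000 inputs at once

        start = time.perf_counter()
        H = np.tanh(X @ W1.T + b1)    # layer 1: matrix multiply + tanh
        Y = np.tanh(H @ W2.T + b2)    # layer 2: the same again
        ms = (time.perf_counter() - start) * 1000.0
        print(f"{X.shape[0]} inputs through both layers in {ms:.1f} ms")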