Structured Neural Networks?

Backpropagation networks often initialize their weights randomly. What effect do different random initializations have on the network's ultimate behavior? Further, would it be more efficient to start with structured weights, based on the desired behavior from the start? Is this similar to the setup of the brain, where connections are initially laid down by the growth of nerves? And is the growth of these nerves (during the forming of the fetus) led by the inputs, or predetermined?
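The random-vs-structured question can be poked at empirically. Here is a minimal toy sketch (my own illustration, nothing from BB's book or the predictor): train the same tiny backprop network on XOR from several random seeds, then once more from a hand-wired "structured" initialization where the hidden units already compute OR and AND. All names, sizes, and learning settings here are made up for the sketch.

```python
import numpy as np

def sig(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR truth table as the training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)

def train(W1, b1, W2, b2, lr=1.0, epochs=2000):
    """Plain batch backprop with squared error; returns final mean squared error."""
    for _ in range(epochs):
        h = sig(X @ W1 + b1)                 # forward pass, hidden layer
        out = sig(h @ W2 + b2)               # forward pass, output layer
        d2 = (out - y) * out * (1 - out)     # output-layer delta
        d1 = (d2 @ W2.T) * h * (1 - h)       # hidden-layer delta
        W2 -= lr * h.T @ d2;  b2 -= lr * d2.sum(0)
        W1 -= lr * X.T @ d1;  b1 -= lr * d1.sum(0)
    out = sig(sig(X @ W1 + b1) @ W2 + b2)
    return float(np.mean((out - y) ** 2))

# Random initializations: different seeds can settle in different minima.
random_losses = []
for seed in range(5):
    rng = np.random.default_rng(seed)
    random_losses.append(train(rng.normal(0, 1, (2, 4)), np.zeros(4),
                               rng.normal(0, 1, (4, 1)), np.zeros(1)))

# "Structured" initialization: hidden unit 0 is wired as OR, hidden unit 1
# as AND, and the output as OR-and-not-AND -- i.e. the net starts at XOR.
W1 = np.array([[6.0, 6.0], [6.0, 6.0]]); b1 = np.array([-3.0, -9.0])
W2 = np.array([[6.0], [-6.0]]);          b2 = np.array([-3.0])
structured_loss = train(W1, b1, W2, b2)
```

The structured net begins essentially at a solution, so training only polishes it, which is one way of reading the question about nerve growth pre-wiring the brain before experience tunes it.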

Damn you ADD!

1. Code a predictor in Flash

2. Realize that making a predictor might be redundant and I should really create the complete architecture.

3. Write List of Components in BB’s Book

4. Write Abstract for list

5. Write about BB being ahead of his time with Neural Networks

6. Find a reference for the first time-dependent (meaning it takes time to activate a neuron) Neural Network design.

7. Finds a list of NN implementations and starts reading

8. Realizes that he is now completely lost and needs to speak to someone. Being alone, decides to blog it.

9. Decides to rename his blog to “A PhD with ADD”, which sounds like a famous book title, and hopes he will become famous for it.

10. Realizes now WHY he felt inclined to read about NNs: he just found that backpropagation NNs try to do just about the same thing as his predictor algorithm.

11. Adds the Neural Network summaries from … to his learning schedule.