
Thursday, November 29

Notes on AI

I was watching/exploring an online AI course over at https://class.coursera.org/neuralnets-2012-00

Here are some notes that I don't want to forget:

Types of Neurons (single points of contact in a 'brain'; a quick Python sketch of these follows the list below)

Linear Neuron - Collects all input values, applies a weight to each, and outputs the weighted sum. (aka Linear Filter)
  • y = b + sum(xi * wi)
Binary Threshold Neuron - Collects all input values and outputs 1 when the weighted sum hits a "threshold"
  • if b + sum(xi * wi) >= 0 output 1 else output 0
Rectified Linear Neuron - Collects all input values; below the "threshold" it outputs 0, above it the output grows with the input
  • In simpler terms, it decides like a Binary Threshold but outputs like a Linear Neuron: output = z if z > 0, else 0 (where z = b + sum(xi * wi))
Sigmoid Neuron - "Smooths" the output to something between 0 and 1
  • Most commonly used
  • Output = 1 / (1 + e^-(b + sum(xi * wi)))
  • Leads to smooth derivatives
Stochastic Binary Neuron - Take a Sigmoid Neuron and randomize whether it actually fires or not.
  • Follow up: Not sure what the randomization is based on... (I think the sigmoid output is treated as the probability of firing)
  • Follow up: Why is this useful?
  • Poisson Rate for Spikes (huh??)
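
To keep these straight, here is a quick Python sketch of the activation functions above. This is my own toy code, not from the course, and the example inputs, weights, and bias are made up for illustration:

import math
import random

# z is the weighted input used by all of the neuron types: z = b + sum(xi * wi)
def weighted_input(xs, ws, b):
    return b + sum(x * w for x, w in zip(xs, ws))

def linear(z):
    return z                           # Linear Neuron: output is just the weighted sum

def binary_threshold(z):
    return 1 if z >= 0 else 0          # Binary Threshold Neuron: fires or it doesn't

def rectified_linear(z):
    return z if z > 0 else 0           # Rectified Linear Neuron: thresholded, then linear

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))  # Sigmoid Neuron: squashed to a value between 0 and 1

def stochastic_binary(z):
    # Stochastic Binary Neuron: treat the sigmoid output as the probability
    # of firing (outputting 1) on this step.
    return 1 if random.random() < sigmoid(z) else 0

# Same weighted input pushed through each neuron type
z = weighted_input([0.5, -1.0, 2.0], [0.3, 0.8, 0.1], b=0.2)
print(linear(z), binary_threshold(z), rectified_linear(z), sigmoid(z), stochastic_binary(z))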
Perceptron Based Architecture - Will always find a solution within its Test Cases IF a solution exists within the Test Space. Often a solution does not exist, because of which measures (features) were chosen.
  • If you choose the right measures (features) then this is a great learning model (see the toy training sketch after this list)
  • Choosing features is the hardest part though!
  • Once features are "chosen" you have limited your learning process
  • Do not use this learning model for "multi-layer" networks; it doesn't work
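
And a toy perceptron training loop, just to make the "finds a solution if one exists" idea concrete. Again this is my own sketch, not from the course; the AND example and the function names are made up, and the features are hand-chosen, which is exactly why feature choice matters so much:

def predict(xs, ws, b):
    z = b + sum(x * w for x, w in zip(xs, ws))
    return 1 if z >= 0 else 0

def train(data, n_features, epochs=100):
    ws, b = [0.0] * n_features, 0.0
    for _ in range(epochs):                    # each pass = one round of weight adjustments
        errors = 0
        for xs, target in data:
            err = target - predict(xs, ws, b)  # +1, 0, or -1
            if err != 0:
                ws = [w + err * x for w, x in zip(ws, xs)]
                b += err
                errors += 1
        if errors == 0:                        # converged: a separating solution was found
            return ws, b
    return ws, b                               # no guarantee if the data isn't separable

# Toy AND function: linearly separable, so the perceptron will find a solution
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
ws, b = train(data, n_features=2)
print([predict(xs, ws, b) for xs, _ in data])  # [0, 0, 0, 1]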
Important Questions to Ask of Your System:
  • Will it eventually get to a correct answer?
  • How quickly will this happen? (How many evolutions/learnings/weight adjustments)
