Wednesday, January 31, 2007

HTMs - Memory, Machines & Motivation

This blog posting will focus on Rob's presentation on Hierarchical Temporal Memory (HTM) as defined by Jeff Hawkins of Numenta in his book On Intelligence (published in 2004). HTMs, sometimes called memory prediction frameworks, will play a central role in the team's goal of building a generalized AI framework and will serve as the basis of a robotic nervous system. This is just a high-level overview of HTMs as used for our purposes; more details are available here and here.


Prediction Machine: Rob described how HTMs are modeled after a common cortical structure present in the human neocortex. In a nutshell, the human brain stores sensory input in a layered hierarchical memory structure. Each computational unit of this memory framework processes spatial patterns (patterns over space) and temporal patterns (patterns over time). Current sensory patterns are matched against patterns in memory, and the closest matching pattern category is sent up the memory hierarchy. As categories of patterns continue traveling up the hierarchy, an invariant view of the sensory data is formulated. Spatial and temporal patterns are stored in memory with more detailed/specific patterns existing in the lower levels of the hierarchy and generic/invariant models living in the higher levels.
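The "closest matching category" step above can be sketched in a few lines. This is our own toy illustration (in Python for brevity, not our actual .NET code), with invented names; a real HTM node does much more:

```python
# Toy sketch of a single HTM node's spatial matching step: compare the
# current input pattern against stored patterns and pass the winning
# category name up the hierarchy. All names here are illustrative.

def hamming(a, b):
    """Distance between two equal-length binary patterns."""
    return sum(x != y for x, y in zip(a, b))

def classify(pattern, memory):
    """Return the category of the closest stored pattern.

    memory: dict mapping category name -> stored binary pattern.
    """
    return min(memory, key=lambda cat: hamming(pattern, memory[cat]))

memory = {
    "edge":   [1, 1, 0, 0],
    "corner": [1, 0, 1, 0],
}

# A noisy input still resolves to the nearest stored category,
# and that category name is what gets sent up to the parent node.
print(classify([1, 1, 0, 1], memory))  # edge
```

A parent node would then see only sequences of category names like "edge", which is how the hierarchy trades detail for invariance.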

This invariant model of the world is then used to form predictions based on prior and current sensory input patterns. Current sensory inputs are constantly checked against predictions forming the initial basis for a behavioral action (motor or otherwise). Predictions also allow the brain (HTM) to supplement current sensory input to form a complete picture from sensory data that may be missing information (i.e. "fill in the blanks").

Exception & Attention Machine: A prediction that fails miserably against current sensory input can be considered a violation of expectation or an exceptional event. In computer science, exceptional events are called exceptions and are usually handled immediately or allowed to "bubble up" to a high-level handler for processing. The brain also seems to have an "exception handling" mechanism of its own: when current sensory input "surprises" it, the brain commands the body to pay special attention to the source of the surprising input. A good example is when you see an unrecognized motion in your peripheral field of vision; you immediately turn your head to focus your eyes (and senses) on the source. Attention was probably a very old adaptation of the reptilian brain, focusing an organism's full resources on the detection of a predator (or prey).
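The computer science analogy maps almost directly onto code. A hedged sketch of the idea (the threshold, names, and 1-D "sensory value" are all invented for illustration):

```python
# Sketch of the "exception handling" analogy: when the prediction error
# for the current input exceeds a threshold, raise an attention event
# instead of quietly updating memory.

class SurpriseError(Exception):
    """Raised when sensory input badly violates the current prediction."""

def check_input(predicted, actual, threshold=0.5):
    error = abs(predicted - actual)
    if error > threshold:
        # "Bubble up" to a higher-level handler, which could redirect
        # the robot's sensors toward the surprising source.
        raise SurpriseError(f"prediction error {error:.2f}")
    return error

try:
    check_input(predicted=0.2, actual=0.9)
except SurpriseError as exc:
    print("attention:", exc)
```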

MOTIVATION & MOTOR BEHAVIOR
What do HTMs have to do with robots? Well... we're seeking to use HTMs as the primary tool for a generalized AI system that can drive robot behavior. We're writing code to simulate the common cortical algorithm defined by Jeff Hawkins of Numenta. Motor behavior will be implemented by processing the HTM's predictions and sending commands to robotic servos. We're still in discussions about how to model a robotic nervous system with HTMs as a central component. Team members have discussed potentially modeling innate behavioral control mechanisms such as pain and hunger. The team's goal is to get robotic behavior to emerge as a response to its environment versus explicitly coding specific goals and behaviors into the robot. This is one of our prime directives.

LIMITATIONS & REALITIES
The obvious limitations to the implementation of HTMs are the amount of memory and CPU resources required to effectively simulate the neocortex and the amount of training necessary to get the robot/HTM anywhere near operational (i.e. a baby). We're using Microsoft .NET to implement the HTM and MS Robotics Studio to tie the HTM to the robot hardware. We will use Microsoft's provider model when implementing the memory structures... since we know we initially won't have enough RAM to store the memory component, we will probably persist to Flash memory or a hard drive. Other specific goals of our HTM development are complete serialization/deserialization of memory structures and multithreading the processing of sensor nodes.
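The provider-model idea is worth making concrete: the HTM code talks to an abstract store, and a RAM-backed provider can be swapped for a disk/Flash-backed one without touching the HTM logic. A rough sketch (in Python rather than our .NET; class and method names are ours, not part of any Microsoft API):

```python
# Provider-model sketch for HTM memory storage: swap persistence
# strategies behind a common interface. Names are illustrative.

import json
import os
import tempfile

class MemoryProvider:
    def save(self, key, patterns): raise NotImplementedError
    def load(self, key): raise NotImplementedError

class RamProvider(MemoryProvider):
    """Fast, but limited by available RAM."""
    def __init__(self):
        self._store = {}
    def save(self, key, patterns):
        self._store[key] = patterns
    def load(self, key):
        return self._store[key]

class DiskProvider(MemoryProvider):
    """Serializes each node's patterns as JSON on disk (or Flash)."""
    def __init__(self, root):
        self.root = root
    def _path(self, key):
        return os.path.join(self.root, key + ".json")
    def save(self, key, patterns):
        with open(self._path(key), "w") as f:
            json.dump(patterns, f)
    def load(self, key):
        with open(self._path(key)) as f:
            return json.load(f)

provider = DiskProvider(tempfile.mkdtemp())
provider.save("node-0", {"edge": [1, 1, 0, 0]})
print(provider.load("node-0"))
```

The serialization goal mentioned above falls out for free here, since the disk provider round-trips the memory structures through a serialized form.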

WRAP-UP

Some thoughts brought up by team members.

Hawkins talks about the massive amount of feedback that seems to take place going down the memory hierarchy. Trevor mentioned that this is probably due to a set of predictions being sent down (versus just the best prediction) to better correlate (in a Bayesian fashion) with the current sensory input.
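Trevor's point can be illustrated with a toy Bayes update: the set of top-down predictions acts as a prior over categories, which is combined with the bottom-up likelihood of the current sensory input. The numbers and category names below are made up:

```python
# Toy Bayesian combination of top-down predictions (prior) with
# bottom-up sensory evidence (likelihood). Values are illustrative.

def posterior(prior, likelihood):
    """Normalize prior * likelihood over the same category keys."""
    unnorm = {c: prior[c] * likelihood[c] for c in prior}
    total = sum(unnorm.values())
    return {c: p / total for c, p in unnorm.items()}

prior = {"dog": 0.7, "cat": 0.3}        # predictions sent down
likelihood = {"dog": 0.2, "cat": 0.8}   # current sensory input

# Strong sensory evidence can override the top-down expectation.
print(posterior(prior, likelihood))
```

If only the single best prediction were sent down, a node would have nothing to fall back on when that prediction turned out to be wrong.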

Another interesting point brought up by Rob was specific to an HTM implementation issue regarding categorization of the temporal patterns. Bert prototyped an HTM implementation that was string-based, but Rob was wondering if a pre-quantized, numeric-based category structure would be better for the HTM sensor nodes... this would overcome a nuance of Hawkins' design that forces you to evolve the categories over time. Rob's design will require knowing a priori the range of values for sensor input but may provide an interesting alternative.
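One possible reading of Rob's pre-quantized alternative: if the sensor's value range is known a priori, category labels can simply be fixed bin indices rather than strings that evolve during training. A sketch with invented parameters:

```python
# Pre-quantized category sketch: map a sensor reading in a known range
# onto one of a fixed number of bins. Range and bin count are made up.

def make_quantizer(lo, hi, bins):
    """Build a function mapping a reading in [lo, hi] to a bin index."""
    width = (hi - lo) / bins
    def quantize(value):
        value = min(max(value, lo), hi)               # clamp out-of-range input
        return min(int((value - lo) / width), bins - 1)
    return quantize

q = make_quantizer(lo=0.0, hi=5.0, bins=10)           # e.g. a 0-5 V sensor
print(q(0.3), q(2.5), q(5.0))
```

The trade-off is exactly as stated above: the categories never need to evolve, but you must know the sensor's range up front.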

The team talked about how inhibition should be modeled specifically in an HTM implementation. Maybe an HTM node can inhibit the temporal pattern categories of other HTM nodes by some yet-to-be-defined mechanism... we can probably gather some ideas from modern neural networks.

Robotics - Simply Sensors & Servos

Team members have met several times since the last blog posting... high-level overviews of both spheres (Robotics & AI) were presented along with an introduction to MS Robotics Studio. This blog posting will focus on Trev's robotics overview.

Servos & sensors pretty much cover the two major divisions of robotics hardware. Very simply, if you want the robot to move at any junction point (hip, elbow, knee, etc.) - you need a servo. If you want the robot to sense its environment - you need a sensor.

SERVOS
Servos are primarily driven by the rotational movement of a motor. Servos can usually move bidirectionally (forward and backward) and have only 2 primary functions... 1) receive commands and 2) send state information. These functions are usually implemented via attached microcontrollers. State information is derived from the motor's ability to sense its position and track its speed. Some servos report their speeds via a tachometer, which produces a voltage proportional to rotational speed. Central to the role of servo control is the concept of negative feedback, defined as a type of feedback in which the system responds in a direction opposite to the perturbation. A servo controller is a microchip that interfaces directly with the motor controls. The motor and servo controller come to a desired result (command) through a two-way conversation described as "hungry or less hungry". This is not the last time the "hunger" concept will surface during this meeting.
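The negative feedback idea can be shown with a minimal proportional-control loop: the correction always opposes the position error, so the servo converges on the commanded angle. The gain, tick count, and units below are invented for illustration, not taken from any real servo:

```python
# Minimal negative-feedback sketch: each control tick applies a
# correction that opposes the error between position and target.

def step(position, target, gain=0.5):
    """One control tick of proportional negative feedback."""
    error = position - target
    return position - gain * error   # response opposes the perturbation

position, target = 0.0, 90.0         # degrees, made-up example
for _ in range(20):
    position = step(position, target)
print(round(position, 2))            # very close to 90 after 20 ticks
```

Flip the sign of the correction (positive feedback) and the same loop diverges instead of settling, which is why the "opposite direction" part of the definition matters.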

SENSORS
The sensors we will primarily be dealing with can be grouped into 3 major camps... acoustic, electromagnetic and mechanical.
ACOUSTIC: Acoustic sensors detect pressure waves in air or water. Common acoustic-based sensors include microphones (for sound) and ultrasonic sonar for object/obstacle detection (via echo return). Positional data during obstacle detection using sonar may be limited to distance measurement only (without reliably providing the angle relative to the sensor).
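The arithmetic behind echo-return ranging is simple enough to write down: distance is recovered from the round-trip time of the pulse and the speed of sound. A back-of-the-envelope sketch:

```python
# Sonar ranging arithmetic: half the round-trip distance, since the
# pulse travels out to the obstacle and back.

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def echo_distance(round_trip_s):
    """Distance to an obstacle given the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

print(echo_distance(0.01))  # a 10 ms echo is about 1.7 m
```

Note this recovers range only; as mentioned above, the bearing to the obstacle is not reliably available from a single sonar reading.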

ELECTROMAGNETIC: Infrared sensors detect energy from the infrared portion of the EM spectrum (longer wavelength than visible light but shorter than radio waves). Infrared-based robot sensors can be used to detect heat differences in a field of vision and for object/obstacle detection. Ultraviolet robot sensors provide limited obstacle detection & heat detection. Arguably the most important robotics-based sensors revolve around the visible light portion of the EM spectrum. Optical sensors are a primary tool in robotic sensory hardware... your everyday webcam is a perfect example of an optical sensor (so are your eyes). Obviously, optical sensors are used for object detection in a field of vision, and binocular optics (2 webcams) can provide excellent positional and relative-angle information for objects in the field. This "richness" of data also has its dark side... it produces reams of information that, if processed/analyzed at any reasonable frame rate (1-50 frames a second), can bring any modern computer to its knees.
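The "reams of information" remark is easy to quantify. Assuming an uncompressed 640x480, 24-bit webcam stream (a typical 2007-era resolution, chosen here just for illustration):

```python
# Raw bandwidth of an uncompressed video stream at various frame rates.

def raw_mb_per_sec(width, height, bytes_per_pixel, fps):
    """Megabytes per second of raw, uncompressed video."""
    return width * height * bytes_per_pixel * fps / 1e6

for fps in (1, 25, 50):
    rate = raw_mb_per_sec(640, 480, 3, fps)
    print(f"{fps:2d} fps -> {rate:.1f} MB/s")
```

Tens of megabytes per second of pixels, every one of which the vision code has to touch, is what brings the computer to its knees.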
MECHANICAL: Bumpers are a good example of a mechanical sensor. These sensors can provide both a positional value and a force strength value when used for collision detection. The image to the left is a simple LEGO-based bumper implementation using touch sensors, built by young kids at a primary school. Other interesting examples of mechanical sensors are human skin and animal whiskers. A substantial amount of literature on whisker-cortex signalling in the rat and cat exists and may serve as a useful model in small robot development. A robotic skin substrate tied to an array of mechanical sensors may serve as a powerful addition to any robot implementation.

Trev wrapped up the robotics overview by noting that simple robot designs are usually just a collection of servos and sensors controlled by their respective microcontrollers, leading to a master controller (sometimes called a Stamp) that manages the interplay amongst all the pieces via proprietary software (usually specific to the controller chipset). Cabling was also mentioned as an important aspect of robotic design that can affect robotic performance.

TEAM GOALS:
ROBOT DESIGN: The team initially discussed rover-based designs for our first set of robot prototypes, but we then decided to focus on bipedal robot designs in light of the fact that "black box" self-balancing bipedal leg platforms are already on the market. We may fall back to 4-leg designs if we "stumble" too often, but I admire our ambition (even if foolhardy...)

ROBOT BEHAVIOR GOAL: After having major discussions on robotic philosophy, robot feelings and robot love (don't ask)... Rob was getting ready to segue into his presentation on Artificial Intelligence and HTMs. We semi-concluded that our first major milestone should be to tie an HTM-based AI to a bipedal robot whose life mission is survival. Initial thoughts revolved around the concept of hunger (there it goes again): the robot would define its "hunger" as an inverse relationship to its reserve energy level (battery charge level)... while defining its "food" as light energy that can be fed into solar panels, thus raising its charge level.
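One minimal reading of that inverse relationship, with an invented 0-to-1 scale (nothing here is settled design, just a sketch of the idea):

```python
# "Hunger" as the inverse of reserve battery charge: 0.0 means fully
# charged and satisfied, 1.0 means empty and starving. The scale and
# numbers are invented for illustration.

def hunger(charge_fraction):
    """Inverse of battery charge, clamped to the [0, 1] range."""
    return 1.0 - min(max(charge_fraction, 0.0), 1.0)

charge = 0.25
print(hunger(charge))      # 0.75: strongly motivated to find light
charge += 0.50             # "feeding" on light via the solar panels
print(hunger(charge))      # 0.25: motivation fades as charge rises
```

Behavior could then be driven by the hunger value alone: the hungrier the robot, the harder it seeks light, with no light-seeking goal coded explicitly.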

After laughing hysterically at the notion of our robot running head-on into the headlights of an oncoming vehicle or killing itself because it couldn't lift the shades at a window... we decided that a bipedal robot that can effectively move about seeking well-lit areas may be a suitable initial goal.


ROBOTICS LINKS
Segway Robotic Mobility Platform (RMP) - CMU uses RMP for their soccer robot platforms
RoboCup Tournament 2007 - Atlanta, Georgia
Robotics Primer Workbook - collaboration between USC, iRobot & Microsoft Robotics Studio
Dr. Robot store
Robotics Connection - robotics store
ROBOTS DREAMS - robotics blog
Acroname Store - Sensors, Servos & Robotics Accessories

Thursday, January 11, 2007

2007 - Spheres, Bots & Brains.

Spheres of Knowledge - The team defined two major spheres of knowledge plus an integration sphere.
Sphere #1 - Artificial Intelligence - specifically Memory Prediction Frameworks (HTMs)
Sphere #2 - Robotics - specifically humanoid or reptilian (2 or 4 legs), sensory networks (laser, acoustic, visual, tactile feedback, etc.) and motion hardware (servos, motors, gyros, etc.).
Sphere #3 - Integration, CPUs & Memory, Simulation & Training

One member was assigned to each sphere, and the remaining member was assigned to master both spheres of knowledge and the integration between the two. We're hoping this setup is optimal for a 3-man team to absorb the mountains of information in each sphere while maintaining some overlap of knowledge and expertise.

The Artificial Intelligence Sphere will center around memory prediction frameworks popularized by Jeff Hawkins of Numenta. Hawkins outlined an algorithmic framework sometimes referred to as Hierarchical Temporal Memory, which tries to simulate neocortical behavior. The framework is based on the idea that most of the human neocortex is driven by a huge number of computational units (small neural networks) that perform a common cortical algorithm with memory as its central feature (Jeff gave credit to the neuroscientist Vernon Mountcastle for the common-algorithm observation). Hawkins notes that invariant representations of real-world objects (sense-driven) eventually get represented in the neocortex by hierarchies of memory which hold spatial and temporal sensory pattern memories. Once the neocortex is "trained" and populated with these memories, it consequently gains a powerful prediction engine for future (sensory) events. A recent study corroborates the central role that memory seems to play in prediction. The team has decided that a framework based on Hawkins' HTM ideas shows the most promise for our venture. An excellent working knowledge of Bayesian statistical methods will be essential. The other major aspect of this sphere will be the potential integration of knowledge bases (common-sense databases, internet sources, etc.) with the HTM framework to facilitate training and increase the functional intelligence of the AI system. Rob has been selected to lead this sphere.

The Robotics Sphere is really The Substrate Sphere: we were primarily looking for a substrate to apply our AI efforts to, and we all quickly agreed that robotics provided the most challenging (and fun) platform to test our AI-based engines. Since a career can be spent on the development of a particular robotics platform, we've decided to focus on COTS-based robots. Aligning with the recent MS release of Robotics Studio, the team will focus only on platforms that are supported by Robotics Studio. This will leverage our current .NET programming expertise and allow us to integrate any .NET-based AI algorithms while minimizing our time investment on the robotics side of the house by not focusing on proprietary hardware. The important aspects of this sphere are primarily hardware related, including wireless communication between the AI node and the robotic brain, acuity (granularity) of sensors, noise filtration and locomotive strategy. Trev will lead this sphere.

The Third Sphere is an integration of the first 2 spheres, with responsibilities covering development environments, integration, computational hardware (CPUs, memory, etc.), simulation and AI/robot training. Training is a significant issue in both AI and robot development and requires careful planning and implementation. Duties involve mastery of MS Robotics Studio and the Concurrency & Coordination Runtime (CCR), multithreading/multicore concepts, and integration of the AI engines and robotic "nervous system". Non-functional duties involve facilitating cohesiveness between the leaders of both spheres and ensuring that all members are working toward the stated goals.

It is important that each person masters their respective sphere and cross-trains the rest of the team. Each member will present a technical summary by the next meeting covering their corresponding sphere. This action item is meant to ramp up teammates on the relevant technical concepts and to get the team used to adjusting their busy schedules to include time for this new venture.
  • Trev will present a technical robotics summary including robotics jargon & definitions, details on different sensor types and an overview on the initial platform selected - LEGO Mindstorms NXT.
  • Rob will present a summary of the AI sphere concentrating on HTM concepts & algorithms.
  • Bert will present an overview of the MS Robotics Studio and the CCR.

Some relevant links:

Artificial Intelligence Sphere:
Bayesian Models of Inductive Learning- Berkeley PowerPoint
onIntelligence Forum
Hierarchical Temporal Memory - Wikipedia
Science AAAS - article on Brain Science / HTM

Robotics Sphere:
MS Robotics Studio - Lego Mindstorms Tutorial
Lego Mindstorms NXT Community Site
Lego Mindstorms NXT Blog
Robotics Connection Store
The Tech Museum - Robotics