
ADAPTIVE RESONANCE THEORY (ART) NETWORK

Adaptive Resonance Theory (ART) is a family of algorithms for unsupervised learning developed by Carpenter and Grossberg. ART is similar to many iterative clustering algorithms, in which each pattern (exemplar) is processed by:
- finding the "nearest" cluster (a prototype or template) to that exemplar, and
- updating that cluster to be "closer" to the exemplar.

ART - Principle
ART incorporates new data by checking it for similarity against data already learned, i.e., stored in memory. If there is a close enough match, the new data is learned by updating that memory; otherwise, the new data is stored as a new memory. In ART, the changes in the activations of units and in the weights are governed by differential equations. When a match occurs, weights are changed only for a period called the resonance period. A minimal sketch of this match-or-create loop follows.
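The following Python sketch illustrates only the principle, not the underlying differential-equation dynamics. The function names, the binary-vector assumption, and the vigilance value 0.7 are all illustrative assumptions; for simplicity it accepts the first sufficiently similar memory, whereas the full ART search (Steps 1-3 later in this document) first selects the most responsive category.

```python
import numpy as np

# Sketch of the ART principle: compare new data with stored memories and
# either refine the closest match or create a new memory. Binary input
# vectors and fast learning are assumed; `vigilance` is illustrative.

def match_score(pattern, memory):
    # Fraction of the input that survives intersection with the memory.
    return np.sum(np.minimum(pattern, memory)) / np.sum(pattern)

def learn(pattern, memories, vigilance=0.7):
    for memory in memories:
        if match_score(pattern, memory) >= vigilance:
            memory[:] = np.minimum(memory, pattern)  # refine existing memory
            return memory
    memories.append(pattern.astype(float))           # store a new memory
    return memories[-1]
```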

Types
- ART1: designed for discrete (binary) input.
- ART2: designed for continuous input.
- ARTMAP: combines two ART models to form a supervised learning model.

Adaptive Resonance Theory


Adaptive Resonance Theory (ART) aims to solve the stability-plasticity dilemma: how can a system be adaptive enough to handle significant events, yet stable enough to ignore irrelevant ones? Stability of an ANN means that its weights have reached a steady state; in general, new learning would otherwise overwrite previous weights. Adaptability of an ANN is analogous to the plasticity of materials: a material can undergo deformation without losing its characteristics and properties.

Stability Plasticity Dilemma


The network's ability to exhibit stability while maintaining plasticity raises several questions:
- How can a learning system remain plastic (adaptable) enough to learn new things whenever they appear?
- How can it stay stable enough to preserve previously learned knowledge?
- What prevents new learning from washing away the memories of previous learning?
- How does the system know when to switch to its stable mode, to achieve stability without rigidity, and when to switch to its plastic mode, to achieve plasticity without chaos?

Example
Identification and recognition of food varieties, textures, and images.

Classification Process
The ART classification process consists of three phases (a minimal sketch follows this list):

Recognition
A set of patterns is stored in the weights associated with the recognition-layer neurons, one for each classification category. The neuron with the best match outputs 1; all others output 0.

Comparison
The stored pattern of the winning neuron is compared with the input to measure their similarity.

The Search Phase
If no reset signal is generated, the search ends. Otherwise, the winning pattern is inhibited and the remaining stored patterns are searched to seek a correct match.
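A compact sketch of the three phases for a single input. The dot-product recognition score, the `vigilance` default, and the function name are assumptions rather than details from the slides:

```python
import numpy as np

def classify(pattern, bottom_up, top_down, vigilance=0.7):
    """One pass of recognition, comparison, and search for a binary input."""
    inhibited = set()                       # categories knocked out by reset
    while len(inhibited) < len(top_down):
        # Recognition: winner-take-all (winner's output is 1, all others 0;
        # here we simply keep the winner's index).
        scores = [(-np.inf if j in inhibited else bottom_up[j] @ pattern)
                  for j in range(len(top_down))]
        winner = int(np.argmax(scores))
        # Comparison: similarity between the stored pattern and the input.
        similarity = (np.sum(np.minimum(top_down[winner], pattern))
                      / np.sum(pattern))
        if similarity >= vigilance:
            return winner                   # no reset signal: search ends
        inhibited.add(winner)               # reset: seek a better match
    return None                             # no stored pattern matches
```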

Architecture
- Input processing field (F1 layer)
- Cluster units (F2 layer)
- Reset mechanism: controls the degree of similarity of patterns placed on the same cluster

BASIC ARCHITECTURE OF ART1

Input Processing Field


The input processing field is divided into two portions:
- Input portion, F1(a)
- Interface portion, F1(b)

F1(a) simply presents the input vectors; some preprocessing may be performed there. F1(b) combines the input signal from F1(a) with the signal from the F2 layer to compare their similarity.


Cluster Units
This is a competitive layer. The cluster unit with the largest net input is selected to learn the input pattern, and the activations of all other F2 units are set to zero. The interface units then combine information from the input and the winning cluster unit.


Reset Mechanism
A cluster may or may not be allowed to learn the pattern, depending on the similarity between its top-down weights and the input vector. If the cluster unit is not allowed to learn, it is inhibited and a new cluster unit is selected as the candidate.


Adaptive Resonance Model


The basic ART model, ART1, comprises the following components:
1. The short-term memory layer, F1: holds the current input (short-term memory).
2. The recognition layer, F2: contains the long-term memory of the system.
3. The vigilance parameter: controls the generality of the memories. A larger value produces more detailed memories; a smaller value produces more general memories.

Training an ART1 model basically consists of four steps.

Adaptive Resonance Model (2)


Step 1: Send the input from the F1 layer to the F2 layer for processing. The most responsive node within the F2 layer is chosen as the closest match to the input, and a hypothesis is formed. This hypothesis represents what the node will look like after learning has occurred, assuming it is the correct node to be updated.

[Diagram: the input I feeds the F1 layer, which projects up to the F2 nodes y]

F1 (short-term memory) contains a vector of size M, and there are N nodes within F2. Each node within F2 is a vector of size M. The set of nodes within F2 is referred to as y. A small sketch of this step follows.
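A sketch of Step 1 under these dimensions (an input of size M, N nodes in F2). The response rule and the hypothesis form min(w_j, I) follow standard ART1 and are assumptions insofar as the slide leaves them implicit:

```python
import numpy as np

def form_hypothesis(I, F2):
    """Step 1 sketch: I is a binary vector of size M, F2 is an N x M array."""
    responses = F2 @ I                 # how strongly each node responds
    j = int(np.argmax(responses))      # closest-matching node wins
    hypothesis = np.minimum(F2[j], I)  # I*: node j after learning, if correct
    return j, hypothesis
```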

Adaptive Resonance Model (3)


[Diagram: the candidate F2 node sends its hypothesis (I*) back down to the F1 layer, where it is matched against the input I]

Step 2: Once the hypothesis has been formed, it is sent back to the F1 layer for matching. Let T_j(I*) represent the level of matching between I and I* for node j, and let the vigilance parameter ρ be the minimum fraction of the input that must remain in the matched pattern for resonance to occur. Then:

T_j(I*) = |I ∧ I*| / |I|, where A ∧ B = min(A, B) componentwise

If T_j(I*) ≥ ρ, the hypothesis is accepted and assigned to that node. Otherwise, the process moves on to Step 3. A small sketch of this test follows.
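A direct transcription of the match test; the name `resonates` and the example vigilance value are illustrative:

```python
import numpy as np

def match_level(I, I_star):
    # T_j(I*) = |I ∧ I*| / |I|, with ∧ the componentwise minimum.
    return np.sum(np.minimum(I, I_star)) / np.sum(I)

def resonates(I, I_star, rho=0.7):
    # Accept when enough of the input survives in the matched pattern.
    return match_level(I, I_star) >= rho
```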


Adaptive Resonance Model (4)


[Diagram: F1 sends a reset signal up to F2, knocking the current candidate out of the search]

Step 3: If the hypothesis is rejected, a reset command is sent back to the F2 layer. In this situation, the jth node within F2 is no longer a candidate, so the process repeats with the next candidate node (node j+1 in the search order; a small sketch of this selection follows).
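A minimal sketch of how the next candidate could be selected after a reset; the names `responses` and `inhibited` are illustrative assumptions, not from the slides:

```python
import numpy as np

# Step 3 sketch: after a reset, the rejected node is inhibited and the
# next most responsive F2 node becomes the candidate.

def next_candidate(responses, inhibited):
    order = np.argsort(responses)[::-1]    # nodes from strongest to weakest
    for j in order:
        if int(j) not in inhibited:
            return int(j)                  # next candidate to test
    return None                            # every node has been reset
```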


Adaptive Resonance Model (5)


Step 4:
1. If the hypothesis was accepted, its values are assigned to the winning node.
2. If none of the nodes accepted the hypothesis, a new node is created within F2. As a result, the system forms a new memory.

[Diagram: an accepted hypothesis updates the winning F2 node; if every candidate was rejected, a new node y* is added to F2]

In either case, the vigilance parameter ensures that the new information does not cause older knowledge to be forgotten. A sketch of this commit step follows.
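A minimal sketch of the commit step, assuming fast learning (direct assignment); `F2_nodes`, `winner`, and `commit` are illustrative names:

```python
import numpy as np

# Step 4 sketch: commit the accepted hypothesis to the winning node, or
# grow F2 with a new node when every candidate was rejected.

def commit(F2_nodes, winner, hypothesis):
    if winner is not None:
        F2_nodes[winner] = hypothesis       # update existing memory
    else:
        F2_nodes.append(hypothesis.copy())  # form a new memory
    return F2_nodes
```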

FUNDAMENTAL ALGORITHM OF ART NETWORK
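The original slide presents this algorithm as a diagram. Combining the four steps above gives the following minimal end-to-end sketch; it assumes binary input vectors, fast learning, dot-product recognition, and an illustrative default vigilance of 0.7, so it is an interpretation of the slides rather than a definitive implementation:

```python
import numpy as np

class ART1:
    """Minimal ART1 sketch: binary inputs, fast learning.

    The class name, attribute names, and defaults are illustrative
    assumptions; only the match rule and the four training steps come
    from the slides.
    """

    def __init__(self, input_size, vigilance=0.7):
        self.M = input_size            # size of each F1/F2 vector
        self.rho = vigilance           # vigilance parameter
        self.F2 = []                   # long-term memory: list of templates

    def train(self, I):
        inhibited = set()
        while len(inhibited) < len(self.F2):
            # Step 1: most responsive, non-inhibited node forms a hypothesis.
            scores = [(-np.inf if j in inhibited else w @ I)
                      for j, w in enumerate(self.F2)]
            j = int(np.argmax(scores))
            hypothesis = np.minimum(self.F2[j], I)
            # Step 2: match the hypothesis against the input.
            if np.sum(hypothesis) / np.sum(I) >= self.rho:
                self.F2[j] = hypothesis    # Step 4.1: resonance, learn
                return j
            inhibited.add(j)               # Step 3: reset, inhibit node j
        # Step 4.2: no node matched, so form a new memory.
        self.F2.append(I.astype(float))
        return len(self.F2) - 1
```

For example, with vigilance 0.9 the binary patterns [1,1,0,0] and [1,1,1,0] land in separate categories (their match fraction is 2/3), while with vigilance 0.5 the second pattern resonates with the first and is absorbed into the same category.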


Neural Selector
Basically, most applications of neural networks fall into the following five categories:
- Prediction
- Classification
- Data association
- Data conceptualization
- Data filtering

