
Principles of Soft Computing (2nd Edition)

Dr. S. N. Sivanandam
Formerly Professor and Head
Department of Electrical and Electronics Engineering and
Department of Computer Science and Engineering,
PSG College of Technology, Coimbatore

Dr. S. N. Deepa
Assistant Professor
Department of Electrical and Electronics Engineering,
Anna University of Technology, Coimbatore

WILEY



About the Authors

Dr. S. N. Sivanandam completed his BE (Electrical and Electronics Engineering) in 1964 from Government College of Technology, Coimbatore, and MSc (Engineering) in Power System in 1966 from PSG College of Technology, Coimbatore. He acquired his PhD in Control Systems in 1982 from Madras University. He received the Best Teacher Award in the year 2001 and the Dhakshina Murthy Award for Teaching Excellence from PSG College of Technology. He received The CITATION for best teaching and technical contribution in the year 2002 from Government College of Technology, Coimbatore. He has a total teaching experience (UG and PG) of 41 years. The total number of undergraduate and postgraduate projects guided by him for both Computer Science and Engineering and Electrical and Electronics Engineering is around 600. He has worked as Professor and Head of the Computer Science and Engineering Department, PSG College of Technology, Coimbatore. He has been identified as an outstanding person in the field of Computer Science and Engineering in MARQUIS 'Who's Who', October 2003 issue, USA. He has also been identified as an outstanding person in the field of Computational Science and Engineering in 'Who's Who', December 2005 issue, Saxe-Coburg Publications, UK. He has been placed as a VIP member in the continental WHO's WHO Registry of National Business Leaders, Inc., NY, August 24, 2006. A widely published author, he has guided and co-guided 30 PhD research works, and at present 9 PhD research scholars are working under him. The total number of his technical publications in International/National Journals/Conferences is around 700. He has also received a Certificate of Merit 2005-2006 for his paper from The Institution of Engineers (India). He has chaired 7 International Conferences and 30 National Conferences. He is a member of various professional bodies like IE (India), ISTE, CSI, ACS and SSI. He is a technical advisor for various reputed industries and engineering institutions. His research areas include Modeling and Simulation, Neural Networks, Fuzzy Systems and Genetic Algorithm, Pattern Recognition, Multidimensional System Analysis, Linear and Nonlinear Control Systems, Signal and Image Processing, Control Systems, Power Systems, Numerical Methods, Parallel Computing, Data Mining and Database Security.

Dr. S. N. Deepa completed her BE Degree from Government College of Technology, Coimbatore, in 1999, her ME Degree from PSG College of Technology, Coimbatore, in 2004, and her PhD degree in Electrical Engineering from Anna University, Chennai, in the year 2008. She is currently Assistant Professor, Department of Electrical and Electronics Engineering, Anna University of Technology, Coimbatore. She was a gold medalist in her BE Degree Program. She received the G.D. Memorial Award in the year 1997 and the Best Outgoing Student Award from PSG College of Technology in 2004. Her ME Thesis won a National Award from the Indian Society of Technical Education and L&T in 2004. She has published 7 books and 32 papers in International and National Journals. Her research areas include Neural Networks, Fuzzy Logic, Genetic Algorithm, Linear and Nonlinear Control Systems, Digital Control, and Adaptive and Optimal Control.

Contents

Preface
About the Authors

1. Introduction
   Learning Objectives
   1.1 Neural Networks
        1.1.1 Artificial Neural Network: Definition
        1.1.2 Advantages of Neural Networks
   1.2 Application Scope of Neural Networks
   1.3 Fuzzy Logic
   1.4 Genetic Algorithm
   1.5 Hybrid Systems
        1.5.1 Neuro Fuzzy Hybrid Systems
        1.5.2 Neuro Genetic Hybrid Systems
        1.5.3 Fuzzy Genetic Hybrid Systems
   1.6 Soft Computing
   1.7 Summary

2. Artificial Neural Network: An Introduction
   Learning Objectives
   2.1 Fundamental Concept
        2.1.1 Artificial Neural Network
        2.1.2 Biological Neural Network
        2.1.3 Brain vs. Computer - Comparison Between Biological Neuron and Artificial Neuron
   2.2 Evolution of Neural Networks
   2.3 Basic Models of Artificial Neural Network
        2.3.1 Connections
        2.3.2 Learning
             2.3.2.1 Supervised Learning
             2.3.2.2 Unsupervised Learning
             2.3.2.3 Reinforcement Learning
        2.3.3 Activation Functions
   2.4 Important Terminologies of ANNs
        2.4.1 Weights
        2.4.2 Bias
        2.4.3 Threshold
        2.4.4 Learning Rate
        2.4.5 Momentum Factor
        2.4.6 Vigilance Parameter
        2.4.7 Notations
   2.5 McCulloch-Pitts Neuron

             4.6.1.2 Training Algorithm of Discrete Hopfield Net
             4.6.1.3 Testing Algorithm of Discrete Hopfield Net
             4.6.1.4 Analysis of Energy Function and Storage Capacity on Discrete Hopfield Net
        4.6.2 Continuous Hopfield Network
             4.6.2.1 Hardware Model of Continuous Hopfield Network
             4.6.2.2 Analysis of Energy Function of Continuous Hopfield Network
   4.7 Iterative Autoassociative Memory Networks
        4.7.1 Linear Autoassociative Memory (LAM)
        4.7.2 Brain-in-the-Box Network
             4.7.2.1 Training Algorithm for Brain-in-the-Box Model
        4.7.3 Autoassociator with Threshold Unit
             4.7.3.1 Testing Algorithm
   4.8 Temporal Associative Memory Network
   4.9 Summary
   4.10 Solved Problems
   4.11 Review Questions
   4.12 Exercise Problems
   4.13 Projects

5. Unsupervised Learning Networks
   Learning Objectives
   5.1 Introduction
   5.2 Fixed Weight Competitive Nets
        5.2.1 Maxnet
             5.2.1.1 Architecture of Maxnet
             5.2.1.2 Testing/Application Algorithm of Maxnet
        5.2.2 Mexican Hat Net
             5.2.2.1 Architecture
             5.2.2.2 Flowchart
             5.2.2.3 Algorithm
        5.2.3 Hamming Network
             5.2.3.1 Architecture
             5.2.3.2 Testing Algorithm
   5.3 Kohonen Self-Organizing Feature Maps
        5.3.1 Theory
        5.3.2 Architecture
        5.3.3 Flowchart
        5.3.4 Training Algorithm
        5.3.5 Kohonen Self-Organizing Motor Map
   5.4 Learning Vector Quantization
        5.4.1 Theory
        5.4.2 Architecture
        5.4.3 Flowchart
        5.4.4 Training Algorithm
        5.4.5 Variants
             5.4.5.1 LVQ2
             5.4.5.2 LVQ2.1
             5.4.5.3 LVQ3
   5.5 Counterpropagation Networks
        5.5.1 Theory
        5.5.2 Full Counterpropagation Net
             5.5.2.1 Architecture
             5.5.2.2 Flowchart
             5.5.2.3 Training Algorithm
             5.5.2.4 Testing (Application) Algorithm
        5.5.3 Forward-Only Counterpropagation Net
             5.5.3.1 Architecture
             5.5.3.2 Flowchart
             5.5.3.3 Training Algorithm
             5.5.3.4 Testing Algorithm
   5.6 Adaptive Resonance Theory Network
        5.6.1 Theory
             5.6.1.1 Fundamental Architecture
             5.6.1.2 Fundamental Operating Principle
             5.6.1.3 Fundamental Algorithm
        5.6.2 Adaptive Resonance Theory 1
             5.6.2.1 Architecture
             5.6.2.2 Flowchart of Training Process
             5.6.2.3 Training Algorithm
        5.6.3 Adaptive Resonance Theory 2
             5.6.3.1 Architecture
             5.6.3.2 Algorithm
             5.6.3.3 Flowchart
             5.6.3.4 Training Algorithm
             5.6.3.5 Sample Values of Parameter
   5.7 Summary
   5.8 Solved Problems
   5.9 Review Questions
   5.10 Exercise Problems
   5.11 Projects

6. Special Networks
   Learning Objectives
   6.1 Introduction
   6.2 Simulated Annealing Network
   6.3 Boltzmann Machine
        6.3.1 Architecture
        6.3.2 Algorithm
             6.3.2.1 Setting the Weights of the Network
             6.3.2.2 Testing Algorithm
   6.4 Gaussian Machine
   6.5 Cauchy Machine
   6.6 Probabilistic Neural Net
   6.7 Cascade Correlation Network
   6.8 Cognitron Network
   6.9 Neocognitron Network
   6.10 Cellular Neural Network
   6.11 Logicon Projection Network Model
   6.12 Spatio-Temporal Connectionist Neural Network
   6.13 Optical Neural Networks
        6.13.1 Electro-Optical Multipliers
        6.13.2 Holographic Correlators
   6.14 Neuroprocessor Chips
   6.15 Summary
   6.16 Review Questions

7. Introduction to Fuzzy Logic, Classical Sets and Fuzzy Sets
   Learning Objectives
   7.1 Introduction to Fuzzy Logic
   7.2 Classical Sets (Crisp Sets)
        7.2.1 Operations on Classical Sets
             7.2.1.1 Union
             7.2.1.2 Intersection
             7.2.1.3 Complement
             7.2.1.4 Difference (Subtraction)
        7.2.2 Properties of Classical Sets
        7.2.3 Function Mapping of Classical Sets
   7.3 Fuzzy Sets
        7.3.1 Fuzzy Set Operations
             7.3.1.1 Union
             7.3.1.2 Intersection
             7.3.1.3 Complement
             7.3.1.4 More Operations on Fuzzy Sets
        7.3.2 Properties of Fuzzy Sets
   7.4 Summary
   7.5 Solved Problems
   7.6 Review Questions
   7.7 Exercise Problems

8. Classical Relations and Fuzzy Relations
   Learning Objectives
   8.1 Introduction
   8.2 Cartesian Product of Relation
   8.3 Classical Relation
        8.3.1 Cardinality of Classical Relation
        8.3.2 Operations on Classical Relations
        8.3.3 Properties of Crisp Relations
        8.3.4 Composition of Classical Relations
   8.4 Fuzzy Relations
        8.4.1 Cardinality of Fuzzy Relations
        8.4.2 Operations on Fuzzy Relations
        8.4.3 Properties of Fuzzy Relations
        8.4.4 Fuzzy Composition
   8.5 Tolerance and Equivalence Relations
        8.5.1 Classical Equivalence Relation
        8.5.2 Classical Tolerance Relation
        8.5.3 Fuzzy Equivalence Relation
        8.5.4 Fuzzy Tolerance Relation
   8.6 Noninteractive Fuzzy Sets
   8.7 Summary
   8.8 Solved Problems
   8.9 Review Questions
   8.10 Exercise Problems

9. Membership Functions
   Learning Objectives
   9.1 Introduction
   9.2 Features of the Membership Functions
   9.3 Fuzzification
   9.4 Methods of Membership Value Assignments
        9.4.1 Intuition
        9.4.2 Inference
        9.4.3 Rank Ordering
        9.4.4 Angular Fuzzy Sets
        9.4.5 Neural Networks
        9.4.6 Genetic Algorithms
        9.4.7 Induction Reasoning
   9.5 Summary
   9.6 Solved Problems
   9.7 Review Questions
   9.8 Exercise Problems

10. Defuzzification
    Learning Objectives
    10.1 Introduction
    10.2 Lambda-Cuts for Fuzzy Sets (Alpha-Cuts)
    10.3 Lambda-Cuts for Fuzzy Relations
    10.4 Defuzzification Methods
         10.4.1 Max-Membership Principle
         10.4.2 Centroid Method
         10.4.3 Weighted Average Method
         10.4.4 Mean-Max Membership
         10.4.5 Center of Sums
         10.4.6 Center of Largest Area
         10.4.7 First of Maxima (Last of Maxima)

    15.5 Genetic Algorithm vs. Traditional Algorithms
    15.6 Basic Terminologies in Genetic Algorithm
         15.6.1 Individuals
         15.6.2 Genes
         15.6.3 Fitness
         15.6.4 Populations
    15.7 Simple GA
    15.8 General Genetic Algorithm
    15.9 Operators in Genetic Algorithm
         15.9.1 Encoding
              15.9.1.1 Binary Encoding
              15.9.1.2 Octal Encoding
              15.9.1.3 Hexadecimal Encoding
              15.9.1.4 Permutation Encoding (Real Number Coding)
              15.9.1.5 Value Encoding
              15.9.1.6 Tree Encoding
         15.9.2 Selection
              15.9.2.1 Roulette Wheel Selection
              15.9.2.2 Random Selection
              15.9.2.3 Rank Selection
              15.9.2.4 Tournament Selection
              15.9.2.5 Boltzmann Selection
              15.9.2.6 Stochastic Universal Sampling
         15.9.3 Crossover (Recombination)
              15.9.3.1 Single-Point Crossover
              15.9.3.2 Two-Point Crossover
              15.9.3.3 Multipoint Crossover (N-Point Crossover)
              15.9.3.4 Uniform Crossover
              15.9.3.5 Three-Parent Crossover
              15.9.3.6 Crossover with Reduced Surrogate
              15.9.3.7 Shuffle Crossover
              15.9.3.8 Precedence Preservative Crossover
              15.9.3.9 Ordered Crossover
              15.9.3.10 Partially Matched Crossover
              15.9.3.11 Crossover Probability
         15.9.4 Mutation
              15.9.4.1 Flipping
              15.9.4.2 Interchanging
              15.9.4.3 Reversing
              15.9.4.4 Mutation Probability
    15.10 Stopping Condition for Genetic Algorithm Flow
         15.10.1 Best Individual
         15.10.2 Worst Individual
         15.10.3 Sum of Fitness
         15.10.4 Median Fitness
    15.11 Constraints in Genetic Algorithm
    15.12 Problem Solving Using Genetic Algorithm
         15.12.1 Maximizing a Function
    15.13 The Schema Theorem
         15.13.1 The Optimal Allocation of Trials
         15.13.2 Implicit Parallelism
    15.14 Classification of Genetic Algorithm
         15.14.1 Messy Genetic Algorithms
         15.14.2 Adaptive Genetic Algorithms
              15.14.2.1 Adaptive Probabilities of Crossover and Mutation
              15.14.2.2 Design of Adaptive pc and pm
              15.14.2.3 Practical Considerations and Choice of Values for k1, k2, k3 and k4
         15.14.3 Hybrid Genetic Algorithms
         15.14.4 Parallel Genetic Algorithm
              15.14.4.1 Global Parallelization
              15.14.4.2 Classification of Parallel GAs
              15.14.4.3 Coarse-Grained PGAs - The Island Model
         15.14.5 Independent Sampling Genetic Algorithm (ISGA)
              15.14.5.1 Comparison of ISGA with PGA
              15.14.5.2 Components of ISGAs
         15.14.6 Real-Coded Genetic Algorithms
              15.14.6.1 Crossover Operators for Real-Coded GAs
              15.14.6.2 Mutation Operators for Real-Coded GAs
    15.15 Holland Classifier Systems
         15.15.1 The Production System
         15.15.2 The Bucket Brigade Algorithm
         15.15.3 Rule Generation
    15.16 Genetic Programming
         15.16.1 Working of Genetic Programming
         15.16.2 Characteristics of Genetic Programming
              15.16.2.1 Human-Competitive
              15.16.2.2 High-Return
              15.16.2.3 Routine
              15.16.2.4 Machine Intelligence
         15.16.3 Data Representation
              15.16.3.1 Crossing Programs
              15.16.3.2 Mutating Programs
              15.16.3.3 The Fitness Function
    15.17 Advantages and Limitations of Genetic Algorithm
    15.18 Applications of Genetic Algorithm
    15.19 Summary
    15.20 Review Questions
    15.21 Exercise Problems

16. Hybrid Soft Computing Techniques
    Learning Objectives
    16.1 Introduction
    16.2 Neuro-Fuzzy Hybrid Systems
         16.2.1 Comparison of Fuzzy Systems with Neural Networks
         16.2.2 Characteristics of Neuro-Fuzzy Hybrids
         16.2.3 Classifications of Neuro-Fuzzy Hybrid Systems
              16.2.3.1 Cooperative Neural Fuzzy Systems
              16.2.3.2 General Neuro-Fuzzy Hybrid Systems (General NFHS)
         16.2.4 Adaptive Neuro-Fuzzy Inference System (ANFIS) in MATLAB
              16.2.4.1 FIS Structure and Parameter Adjustment
              16.2.4.2 Constraints of ANFIS
              16.2.4.3 The ANFIS Editor GUI
              16.2.4.4 Data Formalities and the ANFIS Editor GUI
              16.2.4.5 More on ANFIS Editor GUI
    16.3 Genetic Neuro-Hybrid Systems
         16.3.1 Properties of Genetic Neuro-Hybrid Systems
         16.3.2 Genetic Algorithm Based Back-Propagation Network (BPN)
              16.3.2.1 Coding
              16.3.2.2 Weight Extraction
              16.3.2.3 Fitness Function
              16.3.2.4 Reproduction of Offspring
              16.3.2.5 Convergence
         16.3.3 Advantages of Neuro-Genetic Hybrids
    16.4 Genetic Fuzzy Hybrid and Fuzzy Genetic Hybrid Systems
         16.4.1 Genetic Fuzzy Rule Based Systems (GFRBSs)
              16.4.1.1 Genetic Tuning Process
              16.4.1.2 Genetic Learning of Rule Bases
              16.4.1.3 Genetic Learning of Knowledge Base
         16.4.2 Advantages of Genetic Fuzzy Hybrids
    16.5 Simplified Fuzzy ARTMAP
         16.5.1 Supervised ARTMAP System
         16.5.2 Comparison of ARTMAP with BPN
    16.6 Summary
    16.7 Solved Problems using MATLAB
    16.8 Review Questions
    16.9 Exercise Problems

17. Applications of Soft Computing
    Learning Objectives
    17.1 Introduction
    17.2 A Fusion Approach of Multispectral Images with SAR (Synthetic Aperture Radar) Image for Flood Area Analysis
         17.2.1 Image Fusion
         17.2.2 Neural Network Classification
         17.2.3 Methodology and Results
              17.2.3.1 Method
              17.2.3.2 Results
    17.3 Optimization of Traveling Salesman Problem using Genetic Algorithm Approach
         17.3.1 Genetic Algorithms
         17.3.2 Schemata
         17.3.3 Problem Representation
         17.3.4 Reproductive Algorithms
         17.3.5 Mutation Methods
         17.3.6 Results
    17.4 Genetic Algorithm-Based Internet Search Technique
         17.4.1 Genetic Algorithms and Internet
         17.4.2 First Issue: Representation of Genomes
              17.4.2.1 String Representation
              17.4.2.2 Array of String Representation
              17.4.2.3 Numerical Representation
         17.4.3 Second Issue: Definition of the Crossover Operator
              17.4.3.1 Classical Crossover
              17.4.3.2 Parent Crossover
              17.4.3.3 Link Crossover
              17.4.3.4 Overlapping Links
              17.4.3.5 Link Pre-Evaluation
         17.4.4 Third Issue: Selection of the Degree of Crossover
              17.4.4.1 Limited Crossover
              17.4.4.2 Unlimited Crossover
         17.4.5 Fourth Issue: Definition of the Mutation Operator
              17.4.5.1 Generational Mutation
              17.4.5.2 Selective Mutation
         17.4.6 Fifth Issue: Definition of the Fitness Function
              17.4.6.1 Simple Keyword Evaluation
              17.4.6.2 Jaccard's Score
              17.4.6.3 Link Evaluation
         17.4.7 Sixth Issue: Generation of the Output Set
              17.4.7.1 Interactive Generation
              17.4.7.2 Post-Generation
    17.5 Soft Computing Based Hybrid Fuzzy Controllers
         17.5.1 Neuro-Fuzzy System
         17.5.2 Real-Time Adaptive Control of a Direct Drive Motor
         17.5.3 GA-Fuzzy Systems for Control of Flexible Robots
              17.5.3.1 Application to Flexible Robot Control
         17.5.4 GP-Fuzzy Hierarchical Behavior Control
         17.5.5 GP-Fuzzy Approach
    17.6 Soft Computing Based Rocket Engine Control
         17.6.1 Bayesian Belief Networks
         17.6.2 Fuzzy Logic Control
         17.6.3 Software Engineering in Marshall's Flight Software Group
         17.6.4 Experimental Apparatus and Facility Turbine Technologies SR-30 Engine
         17.6.5 System Modifications
         17.6.6 Fuel-Flow Rate Measurement System
         17.6.7 Exit Conditions Monitoring

1. Introduction

Learning Objectives
- Scope of soft computing.
- Various components under soft computing.
- Description on artificial neural networks with its advantages and applications.
- An overview of fuzzy logic.
- A note on genetic algorithm.
- The theory of hybrid systems.

1.1 Neural Networks

A neural network is a processing device, either an algorithm or actual hardware, whose design was inspired by the design and functioning of animal brains and components thereof. The computing world has a lot to gain from neural networks, also known as artificial neural networks or neural nets. Neural networks have the ability to learn by example, which makes them very flexible and powerful. For neural networks, there is no need to devise an algorithm to perform a specific task; that is, there is no need to understand the internal mechanisms of that task. These networks are also well suited for real-time systems because of their fast response and computational times, which result from their parallel architecture.

Before discussing artificial neural networks, let us understand how the human brain works.

The human brain is an amazing processor. Its exact workings are still a mystery. The most basic element of the human brain is a specific type of cell, known as the neuron, which doesn't regenerate. Because neurons aren't slowly replaced, it is assumed that they provide us with our abilities to remember, think and apply previous experiences to our every action. The human brain comprises about 100 billion neurons. Each neuron can connect with up to 200,000 other neurons, although 1,000-10,000 interconnections are typical. The power of the human mind comes from the sheer number of neurons and their multiple interconnections. It also comes from genetic programming and learning. There are over 100 different classes of neurons. The individual neurons are complicated. They have a myriad of parts, subsystems and control mechanisms. They convey information via a host of electrochemical pathways. Together these neurons and their connections form a process which is not binary, not stable, and not synchronous. In short, it is nothing like the currently available electronic computers, or even artificial neural networks.


1.1.1 Artificial Neural Network: Definition

An artificial neural network (ANN) may be defined as an information-processing model that is inspired by the way biological nervous systems, such as the brain, process information. This model tries to replicate only the most basic functions of the brain. The key element of ANN is the novel structure of its information processing system. An ANN is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems.
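To make the definition concrete, here is a minimal Python sketch of a single such processing element: it forms a weighted sum of its inputs, adds a bias and passes the result through an activation function. The step activation, the weight values and the logical-AND example are illustrative assumptions, not the book's notation.

def step(net, threshold=0.0):
    # Binary step activation: the neuron "fires" when the net input
    # reaches the threshold (an assumed, illustrative choice).
    return 1 if net >= threshold else 0

def neuron_output(inputs, weights, bias=0.0):
    # One processing element: weighted sum of the inputs plus a bias,
    # passed through the activation function.
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return step(net)

# Example: with these hand-picked weights the neuron computes logical AND.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, "->", neuron_output([x1, x2], [1.0, 1.0], bias=-1.5))

A network is simply many such elements connected together; learning then amounts to adjusting the weights and biases from example data rather than hand-picking them as done here.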

Artificial neural networks, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. In biological systems, learning involves adjustments to the synaptic connections that exist between the neurons. ANNs undergo a similar change when the concept on which they are built leaves the academic environment and is thrown into the harsher world of users who simply want to get a job done on computers accurately all the time. Many neural networks now being designed are statistically quite accurate, but they still leave their users with a bad taste as they falter when it comes to solving problems accurately. They might be 85-90% accurate. Unfortunately, few applications tolerate that level of error.

1.1.2 Advantages of Neural Networks

Neural networks, with their remarkable ability to derive meaning from complicated or imprecise data, could be used to extract patterns and detect trends that are too complex to be noticed by either humans or other computer techniques. A trained neural network could be thought of as an "expert" in the particular category of information it has been given to analyze. This expert could be used to provide projections in new situations of interest and answer "what if" questions. Other advantages of working with an ANN include:

1. Adaptive learning: An ANN is endowed with the ability to learn how to do tasks based on the data given for training or initial experience.
2. Self-organization: An ANN can create its own organization or representation of the information it receives during learning time.
3. Real-time operation: ANN computations may be carried out in parallel. Special hardware devices are being designed and manufactured to take advantage of this capability of ANNs.
4. Fault tolerance via redundant information coding: Partial destruction of a neural network leads to the corresponding degradation of performance. However, some network capabilities may be retained even after major network damage.

Currently, neural networks can't function as a user interface which translates spoken words into instructions for a machine, but someday they would have this skill. Then VCRs, home security systems, CD players, and word processors would simply be activated by voice. Touch screen and voice editing would replace the word processors of today. Besides, spreadsheets and databases would be imparted such a level of usability that would be pleasing to everyone. But for now, neural networks are only entering the marketplace in niche areas where their statistical accuracy is valuable.

Many of these niches indeed involve applications where answers provided by the software programs are not accurate but vague. Loan approval is one such area. Financial institutions make more money if they succeed in having the lowest bad-loan rate. For these institutions, installing systems that are "90% accurate" in selecting the genuine loan applicants might be an improvement over their current selection process. Indeed, some banks have proved that the failure rate on loans approved by neural networks is lower than that on loans approved by their best traditional methods. Also, some credit card companies are using neural networks in their application screening process.

This newest method of looking into the future by analyzing past experiences has generated its own unique set of problems. One such problem is to provide a reason behind a computer-generated answer, say, as to why a particular loan application was denied. It has been difficult to explain how a network learned and why it recommends a particular decision. The inner workings of neural networks are "black boxes." Some people have even called the use of neural networks "voodoo engineering." To justify the decision-making process, several neural network tool makers have provided programs that explain which input through which node dominates the decision-making process. From this information, experts in the application may be able to infer which data plays a major role in decision-making and its importance.

Apart from filling these niche areas, neural network work is also progressing in other, more promising application areas. The next section of this chapter goes through some of these areas and briefly details the current work. The objective is to make the reader aware of various possibilities where neural networks might offer solutions, such as language processing, character recognition, image compression, pattern recognition, etc. Neural networks can be viewed from a multi-disciplinary point of view, as shown in Figure 1-1.

[Figure 1-1: The multi-disciplinary point of view of neural networks: artificial intelligence (computer science), approximation theory and optimization (mathematics), statistical physics and dynamical systems (physics), image/signal processing, control theory and robotics (engineering), and time series and data mining (economics/finance).]

1.2 Application Scope of Neural Networks

The neural networks have good scope of being used in the following areas:

1. Air traffic control could be automated, with the location, altitude, direction and speed of each radar blip taken as input to the network. The output would be the air traffic controller's instruction in response to each blip.
2. Animal behavior, predator/prey relationships and population cycles may be suitable for analysis by neural networks.
3. Appraisal and valuation of property, buildings, automobiles, machinery, etc. should be an easy task for a neural network.

1.4 Genetic Algorithm

Genetic algorithm (GA) is reminiscent of sexual reproduction, in which the genes of two parents combine to form those of their children. When it is applied to problem solving, the basic premise is that we can create an initial population of individuals representing possible solutions to a problem we are trying to solve. Each of these individuals has certain characteristics that make them more or less fit as members of the population. The more fit members will have a higher probability of mating and producing offspring that have a significant chance of retaining the desirable characteristics of their parents than the less fit members. This method is very effective at finding optimal or near-optimal solutions to a wide variety of problems, because it does not impose many of the limitations required by traditional methods. It is an elegant generate-and-test strategy that can identify and exploit regularities in the environment, and it results in solutions that are globally optimal or nearly so.

Genetic algorithms are adaptive computational procedures modeled on the mechanics of natural genetic systems. They express their ability by efficiently exploiting the historical information to speculate on new offspring with expected improved performance. GAs are executed iteratively on a set of coded solutions, called a population, with three basic operators: selection/reproduction, crossover and mutation. They use only the payoff (objective function) information and probabilistic transition rules for moving to the next iteration. They are different from most of the normal optimization and search procedures in the following four ways:

1. GAs work with the coding of the parameter set, not with the parameters themselves;
2. GAs work simultaneously with multiple points, not a single point;
3. GAs search via sampling (a blind search) using only the payoff information;
4. GAs search using stochastic operators, not deterministic rules.

Since a GA works simultaneously on a set of coded solutions, it has very little chance of getting stuck at a local optimum when used as an optimization technique. Again, it does not need any sort of auxiliary information, like the derivative of the optimizing function. Moreover, the resolution of the possible search space is increased by operating on coded (possible) solutions and not on the solutions themselves. Further, this search space need not be continuous. Recently, GAs have been finding widespread applications in solving problems requiring efficient and effective search, in business, scientific and engineering circles, such as synthesis of neural network architectures, the traveling salesman problem, graph coloring, scheduling, numerical optimization, and pattern recognition and image processing.
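As an illustration, the following minimal Python sketch evolves a population of binary-coded solutions with the three basic operators named above: roulette-wheel selection/reproduction, single-point crossover and bit-flip mutation, using only the payoff information. The toy objective (maximize f(x) = x^2 over 5-bit strings) and all parameter values are assumptions chosen for illustration.

import random

BITS, POP_SIZE, GENERATIONS = 5, 20, 30   # illustrative parameters
P_CROSSOVER, P_MUTATION = 0.8, 0.02

def payoff(chromosome):
    # Objective function: decode the bit string to an integer x, return x^2.
    return int("".join(map(str, chromosome)), 2) ** 2

def select(population):
    # Selection/reproduction: roulette wheel, fitness-proportionate.
    # The +1 offset keeps the total weight positive in early generations.
    weights = [payoff(c) + 1 for c in population]
    return random.choices(population, weights=weights, k=2)

def crossover(a, b):
    # Single-point crossover applied with probability P_CROSSOVER.
    if random.random() < P_CROSSOVER:
        point = random.randint(1, BITS - 1)
        return a[:point] + b[point:], b[:point] + a[point:]
    return a[:], b[:]

def mutate(chromosome):
    # Flip each bit independently with probability P_MUTATION.
    return [bit ^ 1 if random.random() < P_MUTATION else bit
            for bit in chromosome]

population = [[random.randint(0, 1) for _ in range(BITS)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    next_generation = []
    while len(next_generation) < POP_SIZE:
        p1, p2 = select(population)
        c1, c2 = crossover(p1, p2)
        next_generation += [mutate(c1), mutate(c2)]
    population = next_generation

best = max(population, key=payoff)
print("best chromosome:", best, "payoff:", payoff(best))

Note that the loop uses only the payoff values and stochastic operators, matching the four distinguishing properties listed above: it manipulates codings, maintains multiple points, samples blindly and applies no deterministic rules.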

1.5 Hybrid Systems

Hybrid systems can be classified into three different systems: neuro fuzzy hybrid systems, neuro genetic hybrid systems and fuzzy genetic hybrid systems. These are discussed in detail in the following sections.

A (^) neuro (^) fuzzy (^) hybrid (^) system (^) is (^) a (^) fuuy (^) system (^) that I (^) 1.5.1 (^) Neuro (^) Fuzzy (^) Hybrid Systems (^) uses (^) a learning algorithm (^) derived (^) from or inspired (^) by (^) neural

nerwork theory w determine its parameters (fuzzy

In other words, a neuro fuzzy hybrid system refers to die combination of fuzzy set theory and sets and fuzzy rules) by processi~g data samples.

(^) neural

ne£Works (^) having (^) advantages (^) of (^) both which (^) are (^) listed (^) below.

1. It can handle any kind of information (numeric, linguistic, logical, etc.).
2. It can manage imprecise, partial, vague or imperfect information.
3. It can resolve conflicts by collaboration and aggregation.
4. It has self-learning, self-organizing and self-tuning capabilities.
5. It doesn't need prior knowledge of relationships of data.
6. It can mimic the human decision-making process.
7. It makes computation fast by using fuzzy number operations.

Neuro fuzzy hybrid systems combine the advantages of fuzzy systems, which deal with explicit knowledge that can be explained and understood, and neural networks, which deal with implicit knowledge that can be acquired by learning. Neural network learning provides a good way to adjust the knowledge of the expert (i.e., artificial intelligence system) and automatically generate additional fuzzy rules and membership functions to meet certain specifications. It helps reduce design time and costs. On the other hand, FL enhances the generalization capability of a neural network system by providing more reliable output when extrapolation is needed beyond the limits of the training data.

1.5.2 Neuro Genetic Hybrid Systems

Genetic algorithms (GAs) have been increasingly applied in ANN design in several ways: topology optimization, genetic training algorithms and control parameter optimization. In topology optimization, a GA is used to select a topology (number of hidden layers, number of hidden nodes, interconnection pattern) for the ANN, which in turn is trained using some training scheme, most commonly back-propagation. In genetic training algorithms, the learning of an ANN is formulated as a weight optimization problem, usually using the inverse mean squared error as a fitness measure. Many of the control parameters, such as learning rate, momentum rate, tolerance level, etc., can also be optimized using GAs. In addition, GAs have been used in many other innovative ways: to create new indicators based on existing ones, select good indicators, evolve optimal trading systems and complement other techniques such as fuzzy logic.
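As a sketch of the genetic-training idea, the fitness of a candidate weight vector can be taken as the inverse of the mean squared error that the resulting network produces on the training data. The tiny single-neuron network and the toy data below are assumptions made purely for illustration.

import numpy as np

# Assumed toy training data: input patterns and target outputs
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
T = np.array([0.0, 1.0, 1.0, 1.0])

def fitness(w):
    """Inverse mean squared error of a single linear neuron with weights w."""
    outputs = X @ w                     # network outputs for all samples
    mse = np.mean((T - outputs) ** 2)   # mean squared error
    return 1.0 / (mse + 1e-9)           # small constant avoids division by zero

# A GA would evolve a population of such weight vectors, selecting,
# crossing over and mutating them so as to maximize this fitness.
print(fitness(np.array([0.6, 0.6])))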

1.5.3 Fuzzy Genetic Hybrid Systems

The optimization abilities of GAs are used to develop the best set of rules to be used by a fuzzy inference engine, and to optimize the choice of membership functions. A particular use of GAs is in fuzzy classification systems, where an object is classified on the basis of the linguistic values of the object attributes. The most difficult part of building a system like this is to find the appropriate set of fuzzy rules. The most obvious approach is to obtain knowledge from experts and translate this into a set of fuzzy rules. But this approach is time consuming. Besides, experts may not be able to put their knowledge into an appropriate form of words. A second approach is to obtain the fuzzy rules through machine learning, whereby the knowledge is automatically extracted or deduced from sample cases. A fuzzy GA is a directed random search over all (discrete) fuzzy subsets of an interval and has features which make it applicable for solving this problem. It is capable of creating the classification rules for a fuzzy system where objects are classified by linguistic terms. Coding the rules genetically enables the system to deal with multivalue FL and is more efficient as it is consistent with numeric coding of fuzzy examples. The training data and randomly generated rules are combined to create the initial population, giving a better starting point for reproduction. Finally, a fitness function measures the strength of the rules, balancing the quality and diversity of the population.
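A bare-bones illustration of this idea follows: each chromosome encodes one classification rule as a pair of linguistic term indices (LOW or HIGH for each of two attributes) plus a class label, and its fitness rewards the rule's membership-weighted support on the training data. The membership shapes, data and encoding are all assumptions for the sake of the sketch.

import random

def membership(term, value):
    # Two crude linguistic terms on [0, 1]: LOW (term 0) and HIGH (term 1).
    return 1.0 - value if term == 0 else value

# Assumed training data: (attribute1, attribute2) -> class
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.8, 0.9), 1)]

def fitness(rule):
    term1, term2, label = rule
    score = 0.0
    for (a1, a2), cls in data:
        # Degree to which the sample matches the rule's antecedent
        degree = min(membership(term1, a1), membership(term2, a2))
        score += degree if cls == label else -degree
    return score

# Random initial rule population; a full fuzzy GA would also apply
# crossover and mutation, balancing rule quality and diversity.
population = [(random.randint(0, 1), random.randint(0, 1), random.randint(0, 1))
              for _ in range(12)]
best = max(population, key=fitness)
print(best, fitness(best))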


1.6 Soft Computing

The two major problem-solving technologies include:

1. hard computing;
2. soft computing.

Hard computing deals with precise models where accurate solutions are achieved quickly. On the other hand, soft computing deals with approximate models and gives solutions to complex problems. The two problem-solving technologies are shown in Figure 1-2.

Soft computing is a relatively new concept, the term really entering general circulation in 1994. The term "soft computing" was introduced by Professor Lotfi Zadeh with the objective of exploiting the tolerance for imprecision, uncertainty and partial truth to achieve tractability, robustness, low solution cost and better rapport with reality. The ultimate goal is to be able to emulate the human mind as closely as possible. Soft computing involves partnership of several fields, the most important being neural networks, GAs and FL. Also included is the field of probabilistic reasoning, employed for its uncertainty control techniques. However, this field is not examined here.

Soft computing uses a combination of GAs, neural networks and FL. A hybrid technique, in fact, would inherit all the advantages, but won't have the less desirable features of single soft computing components. It has to possess a good learning capacity, a better learning time than that of pure GAs and less sensitivity to the problem of local extremes than neural networks. In addition, it has to generate a fuzzy knowledge base, which has a linguistic representation and a very low degree of computational complexity.

An important thing about the constituents of soft computing is that they are complementary, not competitive, offering their own advantages and techniques to partnerships to allow solutions to otherwise unsolvable problems. The constituents of soft computing are examined in turn, following which existing applications of partnerships are described.

Figure 1-2 Problem-solving technologies: hard computing (precise models; symbolic logic reasoning; traditional numerical modeling and search) and soft computing (approximate models; approximate reasoning).

"Negotiation is the communication process of a group of agents in order to reach a mutually accepted agreement on some matter." This definition is typical of the research being done into negotiation and co-ordination in relation to software agents. It is an obvious necessity that when multiple agents interact, they will be required to co-ordinate their efforts and attempt to sort out any conflicts of resources or interest. It is important to appreciate that agents are owned and controlled by people in order to complete tasks on their behalf. An example of a possible multiple-agent-based negotiation scenario is the competition between


long-distance phone call providers. When the consumer picks up the phone and dials, an agent will communicate on the consumer's behalf with all the available network providers. Each provider will make an offer that the consumer agent can accept or reject. A realistic goal would be to select the lowest available price for the call. However, given the first round of offers, network providers may wish to modify their offer to make it more competitive. The new offer is then submitted to the consumer agent and the process continues until a conclusion is reached. One advantage of this process is that the provider can dynamically alter its pricing strategy to account for changes in demand and competition, therefore maximizing revenue. The consumer will obviously benefit from the constant competition between providers. Best of all, the process is entirely autonomous as the agents embody and act on the beliefs and constraints of the parties they represent. Further changes can be made to the protocol so that providers can bid low without being in danger of making a loss. For example, if the consumer chooses to go with the lowest bid but pays the second lowest price, this will take away the incentive to underbid or overbid.

Much of the negotiation theory is based around human behavior models and, as a result, it is often translated using Distributed Artificial Intelligence techniques. The problems associated with machine negotiation are as difficult to solve as they are with human negotiation and involve issues such as privacy, security and deception.

1.7 Summary

The computing world has a lot to gain from neural networks, whose ability to learn by example makes them

whose^ ability^ to^ learn^ by^ example^ makes^ them very^ flexible^ and powerful. In^ case^ of^ neural^ nerworks,

there^ is^ no^ need^ to^ devise^ an^ algorithm^ to^ perform

a specific^ task,^ i.e., there^ is^ no need^ to^ understand the imernal mechanisms

of^ that^ rask.^ Neural networks^ are

also^ well^ suited^ for^ real-time^ systems^ because^ of their

fast^ response^ and computational^ times,^ which^ are

due to^ their parallel architecture.Neural^ nerworks^ also^ contribute^ to^ other^ areas

of^ research^ such^ as^ neurology and^ psychology.^

They^ are regularly^ used^ tO^ model parts^ of^ living organisms and

to^ investigate the internal mechanisms^ of^ the brain. Perhaps^ the most exciting aspect of neural^ nerworks

is^ the^ possibility that someday^ "conscious"^ networks n:aighr^ be^ produced.^ Today,^ many scientists^ believe

that consciousness^ is^ a^ "mechanical"^ property and that "conscious"^ neural^ nerworks^ are^ a realistic possibility.Fuzzy^ logic^ was^ conceived^ as^ a^ better^ method^ for

sorting and handling^ data^ but^ has^ proven^ to^ be^ an

excellent choice^ for^ many control^ system^ applications since

it^ mimics human comrollogic.^ It^ can^ be^ built^ inro

anything from^ small, hand-held producrs^ to^ large,^ computerized

process^ control^ systems.^ It^ uses^ an^ imprecise but

very descriptive language^ to^ deal with input^ data^ more like a human operator.

It^ is^ robust and often^ works^ when first implemented with^ little^ or no tuning.When applied to optimize ANNs^ for^ forecasting and classification problems,

GAs^ can^ be^ used^ to^ search for^ the right combination^ of^ inpur^ data, the most suitable

forecast^ hori7:0n,^ the^ optimal or near-optimal network^ interconnection patterns and weights among the neurons, and the conuol parameters (learning

~te, momentum rate, tolerance^ level,^ etc.) based on the uaining

data^ used^ and the^ pre~set^ criteria.^ Like^ ANNs,

GAs^ do not^ always^ guarantee^ you^ a perfect solution, but

in^ many^ cases,^ you^ can^ arrive^ at an acceprable solution without^ die^ rime and^ expense^ of^ an^ exhaustive^ search.^ Soft^ computing^ is^ a^ relatively^ new concept, the term

really^ entering general^ circulation^ in^ 1994, coined

by

Professor^ Lotfi^ Zadeh^ of^ the^ University^ of^ California,

Berkeley,^ USA,^ it^ encompasses^ several^ fields^ of^ comput- ing. The three that^ have^ been examined in this chapter

are^ neural^ nerworks,^ FL^ and^ GAs.^ Neural networks

are important for their ability^ to^ adapt and learn,^ FL^

for^ its exploitation^ of^ partial^ truth and imprecision, and

GAs

2. Artificial Neural Network: An Introduction

Figure 2-1 Architecture of a simple artificial neuron net.

Figure 2-2 Neural net of pure linear equation.

It should be noted that each neuron has an internal state of its own. This internal state is called the activation or activity level of the neuron, which is the function of the inputs the neuron receives. The activation signal of a neuron is transmitted to other neurons.

Remember, a neuron can send only one signal at a time, which can be transmitted to several other neurons.

To depict the basic operation of a neural net, consider a set of neurons, say X1 and X2, transmitting signals to another neuron, Y. Here X1 and X2 are input neurons, which transmit signals, and Y is the output neuron, which receives signals. Input neurons X1 and X2 are connected to the output neuron Y over weighted interconnection links (W1 and W2) as shown in Figure 2-1.

For the above simple neuron net architecture, the net input has to be calculated in the following way:

    y_in = x1*w1 + x2*w2

where x1 and x2 are the activations of the input neurons X1 and X2, i.e., the output of input signals. The output y of the output neuron Y can be obtained by applying activations over the net input, i.e., the function of the net input:

    y = f(y_in)

    Output = Function (net input calculated)

The function to be applied over the net input is called the activation function. There are various activation functions, which will be discussed in the forthcoming sections. The above calculation of the net input is similar to the calculation of the output of a pure linear straight-line equation (y = mx). The neural net of a pure linear equation is as shown in Figure 2-2. Here, to obtain the output y, the slope m is directly multiplied with the input signal x. This is a linear equation. Thus, when slope and input are linearly varied, the output is also linearly varied, as shown in Figure 2-3. This shows that the weight involved in the ANN is equivalent to the slope of the linear straight line.
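As a concrete illustration of this calculation, the short Python sketch below computes the net input y_in = x1*w1 + x2*w2 for the two-neuron example of Figure 2-1 and applies a binary step activation over it. The step threshold and the sample values are illustrative assumptions, not taken from the text.

def step(y_in, threshold=0.5):
    """Binary step activation: fires 1 if the net input reaches the threshold."""
    return 1 if y_in >= threshold else 0

# Activations of the input neurons X1 and X2 (assumed sample values)
x1, x2 = 0.6, 0.9
# Weights on the interconnection links W1 and W2 (assumed sample values)
w1, w2 = 0.4, 0.7

y_in = x1 * w1 + x2 * w2   # net input to the output neuron Y
y = step(y_in)             # output after applying the activation function

print(f"net input y_in = {y_in:.2f}, output y = {y}")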

2.1.2 Biological Neural Network

It is well known that the human brain consists of a huge number of neurons, approximately 10^11, with numerous interconnections. A schematic diagram of a biological neuron is shown in Figure 2-4.

Figure 2-3 Graph for y = mx.

Figure 2-4 Schematic diagram of a biological neuron.

The biological neuron depicted in Figure 2-4 consists of three main parts:

1. Soma or cell body - where the cell nucleus is located.
2. Dendrites - where the nerve is connected to the cell body.
3. Axon - which carries the impulses of the neuron.

Dendrites are tree-like networks made of nerve fiber connected to the cell

body. An axon is a single, long connection extending from the cell body and carrying signals from the neuron. The end of the axon splits into fine strands. It is found that each strand terminates into a small bulb-like organ called a synapse. It is through the synapse that the neuron introduces its signals to the other neurons. The receiving ends of the synapses on the nearby neurons can be found both on the dendrites and on the cell body. There are approximately 10^4 synapses per neuron in the human brain.

Electric impulses are passed between the synapse and the dendrites. This type of signal transmission involves a chemical process in which specific transmitter substances are released from the sending side of the junction. This results in increase or decrease in the electric potential inside the body of the receiving cell. If the electric potential reaches a threshold then the receiving cell fires and a pulse or action potential of fixed strength and duration is sent out through the axon to the synaptic junctions of the other cells. After firing, a cell has to wait for a period of time called the refractory period before it can fire again. The synapses are said to be inhibitory if they let passing impulses hinder the firing of the receiving cell, or excitatory if they let passing impulses cause the firing of the receiving cell.

Figure 2-5 Mathematical model of artificial neuron.

Table 2-1 Terminology relationships between biological and artificial neurons

Biological neuron    Artificial neuron
Cell                 Neuron
Dendrites            Weights or interconnections
Soma                 Net input
Axon                 Output

Figure 2-5 shows a mathematical representation of the above-discussed chemical processing taking place in an artificial neuron. In this model, the net input is elucidated as

    y_in = x1*w1 + x2*w2 + ... + xn*wn = sum(x_i*w_i) for i = 1 to n

where i represents the ith processing element. The activation function is applied over it to calculate the output. The weight represents the strength of the synapse connecting the input and the output neurons. A positive weight corresponds to an excitatory synapse, and a negative weight corresponds to an inhibitory synapse. The terms associated with the biological neuron and their counterparts in the artificial neuron are presented in Table 2-1.

2.1.3 Brain vs. Computer - Comparison Between Biological Neuron and Artificial Neuron

A comparison could be made between biological and artificial neurons on the basis of the following criteria:

1. Speed: The time of execution in the ANN is of a few nanoseconds whereas in the case of a biological neuron it is of a few milliseconds. Hence, the artificial neuron modeled using a computer is faster.

2. Processing: Basically, the biological neuron can perform massive parallel operations simultaneously. The artificial neuron can also perform several parallel operations simultaneously, but, in general, the artificial neuron network process is faster than that of the brain.

3. Size and complexity: The total number of neurons in the brain is about 10^11 and the total number of interconnections is about 10^15. Hence, it can be noted that the complexity of the brain is comparatively higher, i.e., the computational work takes place not only in the brain cell body, but also in the axon, synapse, etc. On the other hand, the size and complexity of an ANN is based on the chosen application and the network designer. The size and complexity of a biological neuron is more than that of an artificial neuron.

4. Storage capacity (memory): The biological neuron stores the information in its interconnections or in synapse strength, but in an artificial neuron it is stored in its contiguous memory locations. In an artificial neuron, the continuous loading of new information may sometimes overload the memory locations. As a result, some of the addresses containing older memory locations may be destroyed. But in the case of the brain, new information can be added in the interconnections by adjusting the strength without destroying the older information. A disadvantage related to the brain is that sometimes its memory may fail to recollect the stored information, whereas in an artificial neuron, once the information is stored in its memory locations, it can be retrieved. Owing to these facts, the adaptability is more toward an artificial neuron.

5. Tolerance: The biological neuron possesses fault tolerant capability whereas the artificial neuron has no fault tolerance. The distributed nature of the biological neurons enables them to store and retrieve information even when the interconnections in them get disconnected. Thus biological neurons are fault tolerant. But in the case of artificial neurons, the information gets corrupted if the network interconnections are disconnected. Biological neurons can accept redundancies, which is not possible in artificial neurons. Even when some cells die, the human nervous system appears to be performing with the same efficiency.

6. Control mechanism: In an artificial neuron modeled using a computer, there is a control unit present in the Central Processing Unit, which can transfer and control precise scalar values from unit to unit, but there is no such control unit for monitoring in the brain. The strength of a neuron in the brain depends on the active chemicals present and on whether the neuron connections are strong or weak as a result of the structure layout rather than the individual synapses. However, the ANN possesses simpler interconnections and is free from chemical actions similar to those taking place in the brain (biological neuron). Thus, the control mechanism of an artificial neuron is very simple compared to that of a biological neuron.

So, we have gone through a comparison between ANNs and biological neural networks. In short, we can

say that an ANN possesses the following characteristics:

1. It is a neurally implemented mathematical model.
2. There exist a large number of highly interconnected processing elements called neurons in an ANN.
3. The interconnections with their weighted linkages hold the informative knowledge.
4. The input signals arrive at the processing elements through connections and connecting weights.
5. The processing elements of the ANN have the ability to learn, recall and generalize from the given data by suitable assignment or adjustment of weights.
6. The computational power can be demonstrated only by the collective behavior of neurons, and it should be noted that no single neuron carries specific information.

The above-mentioned (^) characteristic.s (^) make (^) the (^) ANNsbe (^) noted that no single neuron (^) carries (^) specific information. (^) as (^) connectionist models, parallel distributed processing

models, (^) self-organizing (^) systems, (^) neuro-computing -- (^) systems (^) and (^) neuro-morphic (^) systems.

Artificial (^) Neural (^) Network: (^) An (^) Introduction Figure (^) 2·7 (^) Multilayer (^) feed-forward^ layer^ Input Input Output (^) network. ~^ --------.. -£

Figure 2·8 (^) (A) (^) Single node (^) wirh (^) own (^) feedback. (^) {B) (^) Comperirive (A) (^) (B) -£ Feedback (^) ners. neurous Output

nodes (^) with (^) various weighrs, resulting (^) in&MJ.n)rnp;u~~~eope~

Thus, a single-laye1 feed-forward

A mu t1 erfeed-forward network (Figure 2-?)^ netw rk is formed.

is formed by the interconnection of several layers.

(^) The

input layer (^) is (^) that which receives the input and this layer (^) has (^) no function except buffering (^) the (^) input si nal.

The (^) output (^) layer generates the (^) output (^) of (^) the network. (^) Any (^) layer that (^) is (^) formed between (^) e input (^) and output

layers is called hidden layer. This hidde-n layer is

(^) internal (^) to (^) the (^) network and (^) has (^) no (^) direct contact (^) with (^) the

external (^) environment. (^) It should (^) be (^) noted that there (^) may (^) be (^) zero (^) to (^) several (^) hidden (^) layers (^) in (^) an (^) ANN. More the

number (^) of (^) the (^) hidden (^) layers, (^) more (^) is (^) Ute (^) com (^) lexi (^) f (^) Ute (^) network This (^) may, (^) however, (^) provide (^) an (^) efficient

output (^) response. (^) In (^) case (^) of out (^) ut (^) from (^) one (^) layer (^) is (^) connected to (^) d

A (^) n'etw.Qrk (^) is (^) said (^) m (^) be (^) a (^) feed~forward (^) nerwork^ evlill' (^) node in (^) the (^) next (^) layer. (^) if no neuron (^) in (^) the (^) output (^) layer (^) is (^) an (^) input to a node (^) in

the (^) same (^) layer (^) or in the preceding (^) layer. (^) On (^) the other hand, when ou (^) uts (^) can (^) be (^) directed (^) back as (^) inputs (^) to

same (^) or pr..t:eding (^) layer (^) nodes then (^) it (^) results (^) in (^) me If (^) the (^) feedback (^) of (^) clte (^) om put (^) of (^) clte (^) processing (^) elements (^) ts (^) · recred (^) back (^) at (^) input (^) tO (^) the (^) processing (^) formation (^) e (^) back (^) networ. (^).

elements in the same layer r.fen ic is tailed ilueral

(^) feedbi:Uk. (^) Recurrent networks (^) are (^) feedback networks

with (^) d('ied (^) loop. (^) Figure (^) 2~8(A) (^) shows (^) a (^) simple (^) recurrent neural network having a single neuron with I^ j 2.3 (^) Basic (^) Models (^) of (^) Artificial (^) Neural (^) Network

Figure (^) 2·9 (^) Single·layer (^) recurrent^0 wnm (^) ..^ 0>··^ ~"··^ ®········^ ..^ ··~~! (^) ... (^) network.

(^0) "" (^) Vn^ Input (^) layer .· .. <:.:..::,/....... ··, Y, (^) )---'----- Figure (^) 2·10 (^) Multilayer (^) recurrent^ 0·······....~n2^0 "'" v~ X •"\ (^) -:::: (^) ne[Work.

feedback (^) to (^) itself. (^) Figure (^) 2~9 (^) shows (^) a single· (^) layer (^) network with a feedback connection in which a processing

element's output can be directed back (^) ro (^) the (^) processing element itself or to (^) clte (^) other processing element or

The architecture (^) of (^) a (^) competitive (^) layer (^) is (^) shown into both. (^) Figure (^) 2~8(8), (^) the competitive interconneccions having

fixed weights of -e. This net is called Maxnet,

and (^) will (^) be (^) discussed in the unsupervised learning network

category. (^) Apart from the network architectures (^) discussed (^) so (^) far, (^) there (^) also (^) exists (^) another type (^) of (^) archirec~

rure (^) with lateral feedback, which (^) is (^) called (^) the on·center--off-surround (^) or (^) latmzl (^) inhibition (^) strUCture. (^) In this

~ (^) ----

20 r< (^0) '

Artificial^ Neural^ Network:^ An^ Introduction ,, o;,L^1 ;~''^?^ rl^ 'h-'"'""==::':::C:=:=:':: 1.•^ __^ ~.~;-.^ Flgure2-11~on.&~r~ c<^ ,.^ ,'^ -~-' o'^ ..;'s~?ucture,^ each^ processing^ neuron^ receives c~ ·'

two^ differem^ classes^ of^ inputs-^ "excitatory"^ input

&om^ nearby^ ~

processing elements and^ "inhibitory"^ inputs^ from

more^ disramly_lggted..pro@~^ elements. This

cype^ of

inter~^ is^ shown^ in^ Figure"2:-1T:·--·--···-------

In^ Figure^ 2-11, the connections with open^ circles

are^ excitatory connections and^ the^ links^ with^ solid

con- nective^ circles^ are^ inhibitory connections.^ From^ Figure

2-10,^ it^ can^ be^ noted that a^ processing^ element output can^ be^ directed^ back^ w^ the^ nodes^ in^ a^ preceding

layer,^ forming a^ multilayer^ recunmt^ network.^ Nso,

in^ these networks,^ a^ processing^ dement output^ can^ be^ directed

back^ to^ rhe^ processing^ element^ itself^ and^ to^ other

pro- cessing^ elemenrs^ in^ the^ same^ layer.^ Thus,^ the^ various

network architecrures^ as^ discussed^ from^ Figures^ 2~6-2· can^ be^ suitably^ used^ for^ giving^ effective^ solution

ro^ a problem^ by^ using^ ANN. I^ 2.3,2^ Learning^ The^ main^ property of^ an^ ANN^ is^ its^ capability^ to

learn.^ Learning or training^ is^ a^ process^ by^ means

of^ which a neural network adapts^ itself^ to^ a stimulus^ by^ making$rop~~rer

adjustm~^ resulting^ in^ the^ production of^ desired^ response.^ Broadly,^ there^ are^ nvo^ kinds

o{b;ning^ in^ ANNs:

1.^ Parameter^ learning:^ h^ updates^ the^ connecting

weights^ in^ a^ neural^ net.

2.^ Strncttm^ learning:^ It^ focuses^ on^ the^ change^ in

network^ structure^ (which^ includes^ the^ number^ of

processing elemems^ as^ well^ as^ rheir^ connection^ types).The^ above^ two^ types^ oflearn.ing^ can^ be^ performed^ simultaneously

or^ separately.^ Apart^ from^ these^ two^ categories of^ learning, the learning^ in^ an^ ANN^ can^ be^ generally

classified^ imo three^ categories^ as:^ supervised^ learning; unsupervised^ learning;^ reinforcement^ learning.^ Let

us^ discuss^ rhese^ learning^ types^ in^ detail.

2-_3,2,^1 Supervised Learning The learning^ here^ is^ performed with^ the^ help^ of a

teacher.^ Let^ us^ take^ the^ example^ of^ the^ learning

process of^ a^ small^ child. The child^ doesn't^ know^ how^ to

readlwrite. He/she^ is^ being taught^ by^ the^ parenrs

at home and^ by^ the^ reacher^ in^ school.^ The children^ are^ trained and molded

to^ recognize^ rhe^ alphabets, numerals,^ etc. Their^ each^ and^ every^ action^ is^ supervised^ by^ a^ teacher.

Acrually,^ a^ child^ works^ on^ the^ basis^ of^ the output

that he/She^ has^ to^ produce.^ All^ these^ real-time^ events^

involve^ supervised learning^ methodology.^ Similarly,

in^ ANNs following^ the^ supervised^ learning,^ each^ input vector

re^ uires^ a cor^ din^ rar^ et^ vector,^ which^ represents the^ desired^ output. The input^ vecror^ along with^

the^ target^ vector^ is^ called^ trainin

informed^ precisely^ about what should^ be^ emitted

as^ output. The block^ 1a working^ of^ a supervised learning network.During training.^ the^ input^ vector^ is^ presented

to^ the network, which^ results^ in^ an^ output^ vecror.

This outpur vector^ is^ the actual output^ vecwr.^ Then^ the

actual^ output vector^ is^ compared with the desired

(target) output^ ·vector.^ If there^ exists^ a difference^ berween

the^ two^ output^ vectors^ then^ an^ error^ signal^ is^ generated

2.3^ Basic^ Models^ of^ Artificial , " by

Neural^ Network^ Neural^ -+networkX(lnpu :)^ w^ ErrorError^ <^ (0-Y)signal^ signalsgenerator^ Figure^ 2-12^ Supervised^ learning. 2.3,2,2 Unsupervised Learning

21 y (Actual output) b(Desi ·ad output) ence, the

The learning^ here^ is^ performed without^ the^ help

of^ a^ teacher.^ Consider the learning^ process^ of^ a tadpole, it learns^ by^ itself,^ that^ is,^ a^ child^ fish^ learns^ to^ swim

by^ itself,^ it^ is^ not taught^ by^ its^ mother. Thus,^ its

learning process^ is^ independent and^ is^ nor^ supervised^ by

a^ teacher.^ In^ ANNs^ following^ unsupervised learning,

the

^ input^ vectors^ of^ simil~pe^ are^ grouped without

th^ use^ of training^ da.ta^ t^ specify^ ~ch

'~^ group^ looks^ or^ to^ which group a number^ beloogf

n^ e training^ process,^ efietwork^ receives^ rhe

input

~-·~paii:erns^ and^ organizes^ these^ patterns^ to^ form

clusters.^ When a^ new^ input panern^ is^ applied,

the^ neural ··^ network^ gives^ an^ output^ response^ i^ dicar.ing..ili_~c

which^ the^ input^ pattern^ belongs.^ If^ for^

an^ input, a pattern^ class^ cannot^ be^ found^ the^ a^ new^ class

is^ generated The block^ 1agram^ of^ unsupervised learning

is shown^ in^ Figure^ 2~13.^ From^ Figure^ 2·13^ it^ is^ clear^ that^ there^ is^ no^

feedback^ from^ the^ environment^ to^ inform what the outputs should^ be^ or whether^ the^ outputs^ are^ correct.^ In

this^ case,^ the network must itself^ discover^ patterns~~ lariries,^ features^ or^ categories^ from^ the^ input data

and^ relations^ for^ the^ input data^ over^ (heOUtj:lut.

While discovering^ all^ these^ features,^ the network undergoes

change^ m^ Its^ parameters.^ I^ h1s^ process^ IS^ called

self

organizing^ in^ which^ exact^ clusters^ will^ be^ formed

by^ discovering^ similarities and dissimilarities among the objects. 2.3.2.3^ Reinforcement Learning This learning^ process^ is^ similar^ ro^ supervised^ learning.

In^ the^ case^ of^ supervised learning, the correct^

rarget output^ values^ are^ known^ for^ each^ input^ pattern.

But,^ in^ some^ cases,^ less^ information might be

available.

y X (lnpu al^ output) ~ Figure 2-13 Unsupervised learning.
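Returning to the self-organizing idea of Section 2.3.2.2, the sketch below implements a bare-bones competitive (winner-take-all) clustering rule: each input pattern is assigned to the nearest cluster unit, and only that unit's weight vector is moved toward the input, so clusters form without any teacher. All numerical values here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(1)

# Assumed unlabeled input patterns forming two loose groups
patterns = np.vstack([rng.normal(0.0, 0.1, size=(10, 2)),
                      rng.normal(1.0, 0.1, size=(10, 2))])

units = rng.uniform(0.0, 1.0, size=(2, 2))  # two cluster units (weight vectors)
alpha = 0.3                                 # assumed learning rate

for _ in range(20):
    for p in patterns:
        winner = np.argmin(np.linalg.norm(units - p, axis=1))  # nearest unit
        units[winner] += alpha * (p - units[winner])           # move it closer

print(units)   # each row should settle near one group's center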