Artificial neural networks : an introduction, Kevin L. Priddy and Paul E. Keller

Label
Artificial neural networks : an introduction
Title
Artificial neural networks
Title remainder
an introduction
Statement of responsibility
Kevin L. Priddy and Paul E. Keller
Creator
Priddy, Kevin L.
Contributor
Keller, Paul E.
Subject
Neural networks (Computer science)
Language
eng
Summary
This tutorial text provides the reader with an understanding of artificial neural networks (ANNs) and their application, beginning with the biological systems that inspired them, through the learning methods that have been developed and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers) and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.
Member of
Cataloging source
CaBNvSL
Dewey number
006.3/2
Illustrations
illustrations
Index
index present
LC call number
QA76.87
LC item number
.P736 2005e
Literary form
non fiction
Nature of contents
  • dictionaries
  • bibliography
Series statement
Tutorial texts in optical engineering
Series volume
v. TT68
Target audience
  • adult
  • specialized
Label
Artificial neural networks : an introduction, Kevin L. Priddy and Paul E. Keller
Publication
Bellingham, Washington : SPIE, 2005
Note
  • "SPIE digital library."
  • Title from PDF title page (viewed 8/23/09)
Antecedent source
file reproduced from original
Bibliography note
Includes bibliographical references (pages 151-162) and index
http://library.link/vocab/branchCode
  • net
Carrier category
online resource
Carrier category code
cr
Carrier MARC source
rdacarrier
Color
black and white
Content category
text
Content type code
txt
Content type MARC source
rdacontent
Contents
  • Chapter 1. Introduction. 1.1. The neuron -- 1.2. Modeling neurons -- 1.3. The feedforward neural network -- 1.4. Historical perspective on computing with artificial neurons
  • Chapter 2. Learning methods. 2.1. Supervised training methods -- 2.2. Unsupervised training methods
  • Chapter 3. Data normalization. 3.1. Statistical or Z-score normalization -- 3.2. Min-max normalization -- 3.3. Sigmoidal or SoftMax normalization -- 3.4. Energy normalization -- 3.5. Principal components normalization
  • Chapter 4. Data collection, preparation, labeling, and input coding. 4.1. Data collection -- 4.2. Feature selection and extraction
  • Chapter 5. Output coding. 5.1. Classifier coding -- 5.2. Estimator coding
  • Chapter 6. Post-processing
  • Chapter 7. Supervised training methods. 7.1. The effects of training data on neural network performance -- 7.2. Rules of thumb for training neural networks -- 7.3. Training and testing
  • Chapter 8. Unsupervised training methods. 8.1. Self-organizing maps (SOMs) -- 8.2. Adaptive resonance theory network
  • Chapter 9. Recurrent neural networks. 9.1. Hopfield neural networks -- 9.2. The bidirectional associative memory (BAM) -- 9.3. The generalized linear neural network -- 9.4. Real-time recurrent network -- 9.5. Elman recurrent network
  • Chapter 10. A plethora of applications. 10.1. Function approximation -- 10.2. Function approximation-Boston housing example -- 10.3. Function approximation-cardiopulmonary modeling -- 10.4. Pattern recognition-tree classifier example -- 10.5. Pattern recognition-handwritten number recognition example -- 10.6. Pattern recognition-electronic nose example -- 10.7. Pattern recognition-airport scanner texture recognition example -- 10.8. Self organization-serial killer data-mining example -- 10.9. Pulse-coupled neural networks-image segmentation example
  • Chapter 11. Dealing with limited amounts of data. 11.1. K-fold cross-validation -- 11.2. Leave-one-out cross-validation -- 11.3. Jackknife resampling -- 11.4. Bootstrap resampling
  • Appendix A. The feedforward neural network. A.1. Mathematics of the feedforward process -- A.2. The backpropagation algorithm -- A.3. Alternatives to backpropagation
  • Appendix B. Feature saliency
  • Appendix C. Matlab code for various neural networks. C.1. Matlab code for principal components normalization -- C.2. Hopfield network -- C.3. Generalized neural network -- C.4. Generalized neural network example -- C.5. ART-like network -- C.6. Simple perceptron algorithm -- C.7. Kohonen self-organizing feature map
  • Appendix D. Glossary of terms -- References -- Index
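For a sense of the hands-on material catalogued above, the following is a minimal illustrative sketch of the simple perceptron algorithm listed as item C.6 of the contents. It is not taken from the book (which supplies MATLAB code in Appendix C); it is written in Python, with function and variable names chosen for this sketch only, and trains a single perceptron on the logical AND function.

import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=50):
    # Learn weights w and bias b so that step(w . x + b) reproduces the targets y.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, target in zip(X, y):
            pred = 1 if np.dot(w, xi) + b > 0 else 0
            err = target - pred      # perceptron rule: weights change only on mistakes
            w += lr * err * xi
            b += lr * err
    return w, b

# Usage: the AND truth table as a toy training set.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([1 if np.dot(w, xi) + b > 0 else 0 for xi in X])   # expected output: [0, 0, 0, 1]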
Control code
ocn435804266
Dimensions
unknown
Extent
1 online resource (ix, 165 pages)
Form of item
online
Isbn
9780819478726
Media category
computer
Media MARC source
rdamedia
Media type code
c
Other control number
10.1117/3.633187
Other physical details
illustrations
http://library.link/vocab/recordID
.b36997602
Specific material designation
remote
System control number
  • (OCoLC)435804266
  • spie0819478725

Library Locations

    • Deakin University Library - Geelong Waurn Ponds Campus
      75 Pigdons Road, Waurn Ponds, Victoria, 3216, AU
      -38.195656 144.304955