BIBLIOTHEQUE ST
Author detail
Author: Paul E. Keller
Documents available by this author
Artificial Neural Networks / Kevin L. Priddy
Title: Artificial Neural Networks: An Introduction
Document type: printed text
Authors: Kevin L. Priddy; Paul E. Keller
Publisher: New Delhi: Prentice-Hall of India
Publication year: 2007
Extent: 166 p.
Presentation: illustrated cover, figures, images, tables, index
Format: 23.5 × 17.5 cm
ISBN: 978-81-203-3229-4
Language: English (eng); original language: English (eng)
Dewey class: 621 Applied physics
Abstract:
This concise tutorial text provides the reader with an understanding of artificial neural networks (ANNs) and their application, beginning with the biological systems which inspired them, through the learning methods that have been developed and the data collection processes, to the many ways ANNs are being used today. The material is presented with a minimum of math (although the mathematical details are included in the appendices for interested readers), and with a maximum of hands-on experience. All specialized terms are included in a glossary. The result is a highly readable text that will teach the engineer the guiding principles necessary to use and apply artificial neural networks.
Contents:
CHAPTER 1.
Introduction
Chapter Outline -
1.1. The Neuron
1.2. Modeling Neurons
1.3. The Feedforward Neural Network
1.3.1. The Credit-Assignment Problem
1.3.2. Complexity
1.4. Historical Perspective on Computing with Artificial Neurons
CHAPTER 2.
Learning Methods
Chapter Outline -
2.1. Supervised Training Methods
2.2. Unsupervised Training Methods
CHAPTER 3.
Data Normalization
Chapter Outline -
3.1. Statistical or Z-Score Normalization
3.2. Min-Max Normalization
3.3. Sigmoidal or SoftMax Normalization
3.4. Energy Normalization
3.5. Principal Components Normalization
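The first two normalization schemes in this chapter outline are standard preprocessing steps. As a quick illustrative sketch (not taken from the book's own code), z-score and min-max normalization of a feature vector might look like this in Python:

```python
from statistics import mean, stdev

def z_score(xs):
    """Statistical (z-score) normalization: zero mean, unit variance."""
    m, s = mean(xs), stdev(xs)
    return [(x - m) / s for x in xs]

def min_max(xs, lo=0.0, hi=1.0):
    """Min-max normalization: linearly rescale values into [lo, hi]."""
    a, b = min(xs), max(xs)
    return [lo + (hi - lo) * (x - a) / (b - a) for x in xs]

data = [2.0, 4.0, 6.0, 8.0]
print(min_max(data))  # endpoints map to 0.0 and 1.0
```

Both functions assume a one-dimensional feature; in practice each input feature of a network is normalized independently over the training set.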
CHAPTER 4.
Data Collection, Preparation, Labeling, and Input Coding
Chapter Outline -
4.1. Data Collection
4.1.1. Data-Collection Plan
4.1.2. Biased Data Set
4.1.3. Amount of Data
4.1.4. Features/Measurements
4.1.5. Data Labeling
4.2. Feature Selection and Extraction
4.2.1. The Curse of Dimensionality
4.2.2. Feature Reduction/Dimensionality Reduction
4.2.3. Feature Distance Metrics
CHAPTER 5.
Output Coding
Chapter Outline -
5.1. Classifier Coding
5.2. Estimator Coding
CHAPTER 6.
Post-processing
CHAPTER 7.
Supervised Training Methods
Chapter Outline -
7.1. The Effects of Training Data on Neural Network Performance
7.1.1. Comparative Analysis
7.2. Rules of Thumb for Training Neural Networks
7.2.1. Foley’s Rule
7.2.2. Cover’s Rule
7.2.3. VC Dimension
7.2.4. The Number of Hidden Layers
7.2.5. Number of Hidden Neurons
7.2.6. Transfer Functions
7.3. Training and Testing
7.3.1. Split-Sample Testing
7.3.2. Use of Validation Error
7.3.3. Use of Validation Error to Select Number of Hidden Neurons
CHAPTER 8.
Unsupervised Training Methods
Chapter Outline -
8.1. Self-Organizing Maps (SOMs)
8.1.1. SOM Training
8.1.2. An Example Problem Solution Using the SOM
8.2. Adaptive Resonance Theory Network
CHAPTER 9.
Recurrent Neural Networks
Chapter Outline -
9.1. Hopfield Neural Networks
9.2. The Bidirectional Associative Memory (BAM)
9.3. The Generalized Linear Neural Network
9.3.1. GLNN Example
9.4. Real-Time Recurrent Network
9.5. Elman Recurrent Network
CHAPTER 10.
A Plethora of Applications
Chapter Outline -
10.1. Function Approximation
10.2. Function Approximation—Boston Housing Example
10.3. Function Approximation—Cardiopulmonary Modeling
10.4. Pattern Recognition—Tree Classifier Example
10.5. Pattern Recognition—Handwritten Number Recognition Example
10.6. Pattern Recognition—Electronic Nose Example
10.7. Pattern Recognition—Airport Scanner Texture Recognition Example
10.8. Self-Organization—Serial Killer Data-Mining Example
10.9. Pulse-Coupled Neural Networks—Image Segmentation Example
CHAPTER 11.
Dealing with Limited Amounts of Data
Chapter Outline -
11.1. K-fold Cross-Validation
11.2. Leave-one-out Cross-Validation
11.3. Jackknife Resampling
11.4. Bootstrap Resampling
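The chapter's first two resampling schemes differ only in fold count. As a hedged sketch (illustrative, not the book's code), generating k-fold train/validation index splits can be done as follows; with k equal to the sample count the same routine yields leave-one-out cross-validation:

```python
def k_fold_indices(n, k):
    """Yield (train, validation) index splits for k-fold cross-validation.

    Each of the k folds serves exactly once as the validation set;
    with k == n this degenerates to leave-one-out cross-validation.
    """
    folds = [list(range(i, n, k)) for i in range(k)]
    for i, val in enumerate(folds):
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, val

for train, val in k_fold_indices(6, 3):
    print(val)  # each index appears in exactly one validation fold
```

A model is then trained k times, once per split, and the k validation errors are averaged to estimate generalization performance.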
Appendix A. The Feedforward Neural Network
Chapter Outline -
A.1. Mathematics of the Feedforward Process
A.2. The Backpropagation Algorithm
A.2.1. Generalized Delta Rule
A.2.2. Backpropagation Process
A.2.3. Advantages and Disadvantages of Backpropagation
A.3. Alternatives to Backpropagation
A.3.1. Conjugate Gradient Descent
A.3.2. Cascade Correlation
A.3.3. Second-Order Gradient Techniques
A.3.4. Evolutionary Computation
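The generalized delta rule named in A.2.1 is the core weight update of backpropagation. As a minimal illustrative sketch for a single sigmoid neuron (an assumption-laden toy, not the appendix's full multilayer derivation):

```python
import math

def sigmoid(net):
    """Logistic transfer function."""
    return 1.0 / (1.0 + math.exp(-net))

def delta_rule_step(w, x, target, lr=0.5):
    """One generalized-delta-rule weight update for a single sigmoid neuron."""
    out = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
    delta = (target - out) * out * (1.0 - out)  # error scaled by the sigmoid slope
    return [wi + lr * delta * xi for wi, xi in zip(w, x)]
```

In a multilayer network the same delta quantity is propagated backward through the hidden layers, which is what gives the backpropagation algorithm its name.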
Copies (2)
Barcode     Call number   Medium   Location                              Section        Availability
10/158521   L/621.859     Book     Bibliothèque Science et Technologie   undetermined   Not for loan
10/158522   L/621.859     Book     Bibliothèque Science et Technologie   undetermined   Available