Recurrent Neural Networks for Prediction
Learning Algorithms, Architectures and Stability
Mandic, Danilo P.; Chambers, Jonathon A.
Publisher: John Wiley & Sons Inc
Publication date: 08/2001
Pages: 304
Binding: Hardcover
Language: English
ISBN: 9780471495178
Ships in: 15 to 20 days
790
Introduction.
Fundamentals.
Network Architectures for Prediction.
Activation Functions Used in Neural Networks.
Recurrent Neural Networks Architectures.
Neural Networks as Nonlinear Adaptive Filters.
Stability Issues in RNN Architectures.
Data-Reusing Adaptive Learning Algorithms.
A Class of Normalised Algorithms for Online Training of Recurrent Neural Networks.
Convergence of Online Learning Algorithms in Neural Networks.
Some Practical Considerations of Predictability and Learning Algorithms for Various Signals.
Exploiting Inherent Relationships Between Parameters in Recurrent Neural Networks.
Appendix A: The O Notation and Vector and Matrix Differentiation.
Appendix B: Concepts from the Approximation Theory.
Appendix C: Complex Sigmoid Activation Functions, Holomorphic Mappings and Modular Groups.
Appendix D: Learning Algorithms for RNNs.
Appendix E: Terminology Used in the Field of Neural Networks.
Appendix F: On the A Posteriori Approach in Science and Engineering.
Appendix G: Contraction Mapping Theorems.
Appendix H: Linear GAS Relaxation.
Appendix I: The Main Notions in Stability Theory.
Appendix J: Deseasonalising Time Series.
References.
Index.