Bruges, Belgium, April 24-26, 1996
Content of the proceedings
Learning and generalization I
Recurrent models
Fuzzy neural networks
Invited paper I
Self-organizing maps
Incremental learning
Invited paper II
Classification
Mathematical aspects of neural networks
Natural and artificial vision
Neural networks and statistics
Invited paper III
Learning and generalization II
Prediction
Learning and generalization I
ES1996-61
Synaptic efficiency modulations for context integration: The meta ODWE architecture
N. Pican
ES1996-11
Using a Meta Neural Network for RPROP parameter adaptation
C. McCormack
ES1996-12
On unlearnable problems -or- A model for premature saturation in backpropagation learning
C. Goerick, W. von Seelen
ES1996-19
Regulated Activation Weights Neural Network (RAWN)
H.A.B. te Braake, H.J.L. van Can, G. van Straten, H.B. Verbruggen
ES1996-20
Practicing Q-learning
J. Bruske, I. Ahrns, G. Sommer
Recurrent models
ES1996-1
Adaptative time constants improve the dynamic features of recurrent neural networks
J.-P. Draye, D. Pavisic, G. Cheron, G. Libert
ES1996-73
An adaptive technique for pattern recognition by the random neural network
M. Mokhtari, H. Akdag
ES1996-9
Negative initial weights improve learning in recurrent neural networks
D. Pavisic, J.-P. Draye, R. Teran, G. Calderon, G. Cheron, G. Libert
Fuzzy neural networks
ES1996-13
The extraction of Sugeno fuzzy rules from neural networks
A.L. Cechin, U. Epperlein, W. Rosenstiel, B. Koppenhoefer
ES1996-69
Neural versus neurofuzzy systems for credit approval
S. Piramuthu
Invited paper I
ES1996-501
Growing self-organizing networks - Why?
B. Fritzke
Self-organizing maps
ES1996-6
Identification of gait patterns with self-organizing maps based on ground reaction force
M. Köhle, D. Merkl
ES1996-23
A self-organizing map for analysis of high-dimensional feature spaces with clusters of highly differing feature density
S. Schünemann, B. Michaelis
ES1996-27
Quantization vs Organization in the Kohonen S.O.M.
J.-C. Fort, G. Pagès
ES1996-57
On the critical points of the 1-dimensional competitive learning vector quantization algorithm
D. Lamberton, G. Pagès
ES1996-74
A Kohonen map representation to avoid misleading interpretations
M. Cottrell, E. de Bodt
Incremental learning
ES1996-14
FlexNet - A flexible neural network construction algorithm
K. Mohraz, P. Protzel
ES1996-42
Incremental category learning in a real world artifact using growing dynamic cell structures
C. Scheier
ES1996-53
Towards constructive and destructive dynamic network configuration
S. Wermter, M. Meurer
ES1996-60
Combining sigmoids and radial basis functions in evolutive neural architectures
R. Chentouf, C. Jutten
Invited paper II
ES1996-502
Biologically inspired eye movements for visually guided navigation of mobile robots
F. Mura, N. Martin, N. Franceschini
Classification
ES1996-25
Representation of obstacles in a neural network based classifier system
N.R. Ball
ES1996-56
Investigating lexical access using neural nets
M. Patel
Mathematical aspects of neural networks
ES1996-16
Rates of approximation of real-valued boolean functions by neural networks
K. Hlavackova, V. Kurkova
ES1996-40
Bounds on the degree of high order binary perceptrons
E. Mayoraz
ES1996-54
A fast Bayesian algorithm for Boolean functions synthesis by means of perceptron networks
A. Catala Mallofre, J. Aguilar-Martin, B. Morcego Seix, N. Piera i Carrete
ES1996-48
Accommodating relevance in neural networks
H. Wang, D. Bell
Natural and artificial vision
ES1996-2
Neural model for visual contrast detection
E. Littmann, H. Neumann, L. Pessoa
ES1996-5
Application of high-order Boltzmann machines in OCR
A. de la Hera, M. Grana, A. D'Anjou, F.X. Albizuri
ES1996-22
Simulation of an inner plexiform layer neural circuit in vertebrate retina leads to sustained and transient excitation
G. Maguire, X. Yang
ES1996-41
Analysis of visual information by receptive field dynamics
C. Born
ES1996-55
Neurotransmitter dynamics in a model of a movement detecting visual system
H.A.K. Mastebroek
Neural networks and statistics
ES1996-34
Recurrent least square learning for quasi-parallel principal component analysis
W. Kasprzak, A. Cichocki
ES1996-70
Interpreting data through neural and statistical tools
A. Guérin-Dugué, C. Avilez-Cruz, P.M. Palagi
ES1996-7
Error rate estimation via cross-validation and learning curve theory
A. Varfis, L. Corleto
ES1996-15
Maximum covariance method for weight initialization of multilayer perceptron network
M. Lehtokangas, P. Korpisaari, K. Kaski
Invited paper III
ES1996-503
Neural approaches to independent component analysis and source separation
J. Karhunen
Learning and generalization II
ES1996-32
Constraining of weights using regularities
J.N. Kok, E. Marchiori, M. Marchiori, C. Rossi
ES1996-51
Regularization and neural computation: application to aerial images analysis
E. Schaeffer, P. Bourret, S. Montrozier
ES1996-58
An algorithm for training multilayer networks on non-numerical data
W. Kowalczyk
ES1996-62
A correlation-based network for real-time processing
J. Ngole
ES1996-75
Evolving neural network learning behaviours with set-based chromosomes
S.M. Lucas
Prediction
ES1996-4
Neural network application: rainfall forecasting system in Hong Kong
T.W.S. Chow, S. Cho
ES1996-26
Prediction of dynamical systems with composition networks
Y. Moreau, J. Vandewalle
ES1996-28
Fast signal recognition and detection using ART1 neural networks and nonlinear preprocessing units based on time delay embeddings
R. Dogaru, A.T. Murgan, C. Comaniciu
ES1996-38
An analysis of the metric structure of the weight space of feedforward networks and its application to time series modeling and prediction
A. Ossen, S.M. Rüger
ES1996-59
Time series prediction using neural networks and its application to artificial human walking
R.S. Venema, A. Ypma, J.A.G. Nijhuis, L. Spaanenburg
Abstract: One of the main issues in time series research is prediction. Using a tapped-delay neural network, we formulate the optimal network size in terms of the signal correlation time. We then develop biofeedback-driven neurocontrol for artificial human walking, giving detail on the signal preprocessing. Finally, we indicate the need for a hierarchical network architecture to eliminate the oscillatory effects inherent in the human motor control problem.
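As a rough illustration of the tapped-delay approach mentioned in this abstract (not the authors' implementation; the function names, the 1/e correlation-time criterion, and the toy signal are assumptions made for the example), the following Python sketch sizes the delay window from the signal's correlation time and trains a small one-hidden-layer predictor on the resulting windows.

```python
# Minimal sketch of a tapped-delay predictor: the window length is taken
# from the signal's correlation time, and a small feedforward network
# maps each window to a one-step-ahead prediction. Assumed, illustrative code.
import numpy as np

def correlation_time(x, threshold=1.0 / np.e):
    """Smallest lag at which the normalized autocorrelation drops below threshold."""
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[len(x) - 1:]
    acf = acf / acf[0]
    below = np.where(acf < threshold)[0]
    return int(below[0]) if len(below) else len(x) // 4

def make_windows(x, d):
    """Stack tapped-delay windows x[t-d..t-1] with target x[t]."""
    X = np.array([x[t - d:t] for t in range(d, len(x))])
    return X, x[d:]

# Toy signal: noisy sine wave standing in for a measured time series.
rng = np.random.default_rng(0)
t = np.arange(1000)
signal = np.sin(2 * np.pi * t / 50) + 0.05 * rng.standard_normal(len(t))

d = correlation_time(signal)            # number of taps tied to correlation time
X, y = make_windows(signal, d)

# One hidden tanh layer, trained by full-batch gradient descent on squared error.
n_hidden = 10
W1 = 0.1 * rng.standard_normal((d, n_hidden))
b1 = np.zeros(n_hidden)
w2 = 0.1 * rng.standard_normal(n_hidden)
b2 = 0.0
lr = 0.01

for epoch in range(200):
    h = np.tanh(X @ W1 + b1)            # hidden activations
    pred = h @ w2 + b2                  # linear output: next-sample prediction
    err = pred - y
    grad_out = 2 * err / len(y)
    w2 -= lr * h.T @ grad_out
    b2 -= lr * grad_out.sum()
    grad_h = np.outer(grad_out, w2) * (1 - h ** 2)
    W1 -= lr * X.T @ grad_h
    b1 -= lr * grad_h.sum(axis=0)

print("window length:", d, "final MSE:", float(np.mean(err ** 2)))
```

A longer correlation time simply widens the input window; the predictor itself remains a plain feedforward map from the tapped-delay window to the next sample.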