Bruges, Belgium, April 16-18, 1997
Content of the proceedings
Models I
Biological modelling and vision
Invited session: Object and scene recognition - experimental results and models
Statistics and rule extraction
Forecasting and chaos
Invited session: Theory of neural networks with statistical mechanics approaches
Self-organizing maps
Learning I
Invited session: Source separation and Independent Component Analysis
Learning II
Models II
Models I
ES1997-20
Speaker normalization with a mixture of recurrent networks
E. Trentin, D. Giuliani
ES1997-55
Neural network cooperation for handwritten digit recognition: a comparison of four methods
Y. Autret, A. Thepaut
Abstract:
In this paper we focus on "off-line" recognition of isolated handwritten digits written by unknown writers. We present an approach that aims at both simplicity and performance. Two simple neural solutions are first presented; their main characteristic is that non-neural preprocessing is reduced as much as possible. A simple method for integrating the two networks is then proposed. The method has been evaluated on a database of 12,000 digits from the French postal services (SRTP); this database is kept confidential.
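The abstract does not spell out the integration rule, so the following is only an illustrative sketch of one common way to combine two digit classifiers: averaging their per-class outputs and taking the most likely class. The function name and the placeholder data are assumptions, not taken from the paper.

import numpy as np

def combine_by_averaging(probs_a, probs_b):
    # Average the class-probability outputs of two classifiers and pick the
    # most likely digit for each sample.  probs_a and probs_b have shape
    # (n_samples, 10).  This averaging rule is a generic illustration, not
    # necessarily the integration method used in the paper.
    combined = (probs_a + probs_b) / 2.0
    return np.argmax(combined, axis=1)

# Hypothetical usage with random stand-ins for the two networks' outputs:
rng = np.random.default_rng(0)
p1 = rng.dirichlet(np.ones(10), size=5)
p2 = rng.dirichlet(np.ones(10), size=5)
print(combine_by_averaging(p1, p2))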
ES1997-56
Probabilistic self organized map - application to classification
F. Anouar, F. Badran, S. Thiria
ES1997-35
Performance of weighted radial basis function classifiers
L.M. Reyneri, M. Sgarbi
Abstract:
This paper describes Weighted Radial Basis Functions, a neuro-fuzzy unification algorithm which combines Perceptrons and Radial Basis Functions. The algorithm has been tested as a pattern classifier in practical applications, and its performance is compared against that of other neural classifiers. The proposed algorithm performs comparably to or better than other neural algorithms, although it can be trained much faster, and it can also act as a neuro-fuzzy unification algorithm.
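The abstract does not give the exact WRBF formula; as a rough, assumption-laden sketch of the unification idea, a single unit can be parametrized so that it behaves like a Gaussian RBF for one setting of an exponent and degenerates towards a weighted linear combination for another. The function below is purely illustrative and is not the paper's definition.

import numpy as np

def generalized_unit(x, centers, weights, exponent=2.0):
    # Illustrative unit spanning RBF-like and perceptron-like behaviour.
    # Computes d = sum_i weights_i * |x_i - centers_i|**exponent and returns
    # exp(-d).  With exponent=2 this is a weighted Gaussian RBF; with centers
    # at zero and exponent=1 the argument reduces to a weighted sum of input
    # magnitudes, i.e. closer to a perceptron's linear combination.
    d = np.sum(weights * np.abs(x - centers) ** exponent)
    return np.exp(-d)

x = np.array([0.2, -0.5, 1.0])
print(generalized_unit(x, np.zeros(3), np.ones(3), exponent=2.0))  # RBF-like
print(generalized_unit(x, np.zeros(3), np.ones(3), exponent=1.0))  # sum-like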
ES1997-42
Equivalence of a radial basis function NN and a perceptron
I. Grabec
Biological modelling and vision
ES1997-65
Synchronization and oscillations in the visual cortex: a stochastic model using a spike memory term
M. Mougeot
ES1997-40
From retinal circuits to motion processing: a neuromorphic approach to velocity estimation
A. Torralba, J. Hérault
ES1997-51
Size invariance by dynamic scaling in neural vision systems
G. Meierfrankenfeld, K. Kopecz
ES1997-60
Object recognition with banana wavelets
N. Krüger, G. Peters
Invited session: Object and scene recognition - experimental results and models
ES1997-501
Neural network for color contrast gain control
M. D'Zmura
ES1997-502
Neuronal theories and technical systems for face recognition
R.P. Würtz
ES1997-503
How can the visual system process a natural scene in under 150ms? On the role of asynchronous spike propagation
S. Thorpe, J. Gautrais
ES1997-504
Aspects of psychological computation in scene and face recognition
P. Schyns
ES1997-505
Scene categorisation by curvilinear component analysis of low frequency spectra
J. Hérault, A. Olivia, A. Guérin-Dugué
Statistics and rule extraction
ES1997-43
Two neural network methods for multidimensional scaling
M. C. van Wezel, J. N. Kok, W. A. Kosters
ES1997-3
Sequential hypotheses tests for modelling neural networks
U. Anders, O. Korn
ES1997-68
Extraction of crisp logical rules using constrained backpropagation networks
W. Duch, R. Adamczak, K. Grabczewski, M. Ishikawa, H. Ueda
ES1997-72
Knowledge extraction from neural networks for signal interpretation
F. Alexandre, J.-F. Remm
ES1997-17
Equivalent error bars for neural network classifiers trained by Bayesian inference
P. Sykacek
Forecasting and chaos
ES1997-53
d-NARMA neural networks: a connectionist extension of ARARMA models
D. Bonnet, V. Perrault, A. Grumbach
ES1997-80
New criterion of identification in the multilayered perceptron modelling
M. Mangeas, M. Cottrell, J.-F. Yao
ES1997-2
Error measure for identifying chaotic attractors
T. Hrycej
ES1997-47
A fuzzy Artmap neural system for the prediction of turbulent velocity fields
J. Ferre-Gine, R. Rallo, A. Arenas, F. Giralt
Invited session: Theory of neural networks with statistical mechanics approaches
ES1997-506
Numerical simulations of an optimal algorithm for supervised learning
A. Buhot, J.-M. Torres Moreno, M. B. Gordon
ES1997-507
Exact asymptotic estimates of the storage capacities of the committee machines with overlapping and non-overlapping receptive fields
C. Kwon, J.-H. Oh
ES1997-508
Optimal tuning curves for neurons spiking as a Poisson process
N. Brunel, J.P. Nadal
ES1997-509
Bayesian online learning in the perceptron
O. Winther, S. A. Solla
ES1997-510
Precursor networks for training the binary perceptron
B. Van Rompaey
Self-organizing maps
ES1997-36
State-space modeling using self-organizing maps
H. Hyötyniemi
Abstract:
Dynamic processes can be modeled using self-organizing maps, for example by using the states of the system as input feature vectors to the self-organization algorithm. The neural net converges according to the distribution of the state variables and, in principle, this map can be seen as a compressed representation of the state space. However, the distance metric on which Kohonen-type adaptation algorithms are based is not justified when measuring distances between process states. In this paper, the process to be modeled is modified so that the 'static' norm between states is applicable also when measuring distances between trajectories.
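The paper's actual modification of the process is not reproduced here; the sketch below only shows a standard one-dimensional Kohonen update applied to state vectors, with the (assumed, purely illustrative) trick of stacking consecutive states so that the ordinary Euclidean norm also reflects the local trajectory.

import numpy as np

def train_som(samples, n_units=20, n_iter=2000, lr0=0.5, sigma0=5.0, seed=0):
    # Minimal one-dimensional Kohonen SOM.  samples has shape (n_samples, dim);
    # each row may be a single process state or several consecutive states
    # stacked into one vector (the stacking is an illustrative assumption,
    # not the method of the paper).
    rng = np.random.default_rng(seed)
    weights = rng.standard_normal((n_units, samples.shape[1])) * 0.1
    for t in range(n_iter):
        x = samples[rng.integers(len(samples))]
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))  # best-matching unit
        lr = lr0 * np.exp(-t / n_iter)                        # decaying learning rate
        sigma = sigma0 * np.exp(-t / n_iter)                  # shrinking neighbourhood
        h = np.exp(-((np.arange(n_units) - bmu) ** 2) / (2 * sigma ** 2))
        weights += lr * h[:, None] * (x - weights)            # Kohonen update
    return weights

# Toy example: states of a damped oscillation, two consecutive states stacked.
t = np.linspace(0, 10, 500)
states = np.column_stack([np.exp(-0.2 * t) * np.sin(t), np.exp(-0.2 * t) * np.cos(t)])
stacked = np.hstack([states[:-1], states[1:]])
print(train_som(stacked).shape)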
ES1997-41
Almost sure convergence of the one-dimensional Kohonen algorithm
M. Benaïm, J.-C. Fort, G. Pagès
ES1997-54
Self organizing map for adaptive non-stationary clustering: some experimental results on color quantization of image sequences
A.I. Gonzales, M. Grana, A. D'Anjou, F.X. Albizuri, M. Cottrell
ES1997-75
Measuring topology preservation in maps of real-world data
M. Herrmann, H.-U. Bauer, T. Villmann
ES1997-77
Kohonen maps versus vector quantization for data analysis
E. de Bodt, M. Verleysen, M. Cottrell
Learning I
ES1997-28
The class of linear separability method
M. Tajine, D. Elizondo, E. Fiesler, J. Korczak
ES1997-57
Connectionist rule processing using recursive auto-associative memory
M. St Aubyn, N. Davey
ES1997-62
An algebra for recognition of spatio-temporal forms
G. Vaucher
ES1997-67
Application of a self-learning controller with continuous control signals based on the DOE-approach
M. Riedmiller
Invited session: Source separation and Independent Component Analysis
ES1997-512
From source separation to Independent Component Analysis: an introduction to the special session
C. Jutten
ES1997-513
A learning algorithm for the blind separation of non-zero skewness source signals with no spurious equilibria
S. Choi, R.-W. Liu
ES1997-514
A competitive learning algorithm for separating binary sources
P. Pajunen
ES1997-515
A linear adaptive neural network for extraction of independent components
Z. Malouche, O. Macchi
ES1997-516
Blind equalization with a linear feedforward neural network
X.-R. Cao, J. Zhu, J. Si
ES1997-517
Optimization of the asymptotic performance of time-domain convolutive source separation algorithms
N. Charkani, Y. Deville
ES1997-518
Nonlinear source separation: the post-nonlinear mixtures
A. Taleb, C. Jutten
ES1997-519
I.C.A.: conditions on cumulants and information theoretic approach
J.-P. Nadal, N. Parga
ES1997-520
Nonlinearity and separation capability: further justification for the ICA algorithm with a learned mixture of parametric densities
L. Xu, C.C. Cheung, J. Ruan, S.-I. Amari
ES1997-521
Independence is far from normal
M. Girolami, C. Fyfe
Learning II
ES1997-70
Composition methods for the integration of dynamical neural networks
Y. Moreau, J. Vandewalle
ES1997-9
Order between logic networks and stable neural networks
Y. Kobuchi
ES1997-39
Inductive learning in animat-based neural networks
M. Wexler
Models II
ES1997-8
Evidence of efficiency of recurrent neural networks with ARMA-like units
J.P. Draye, D. Pavisic, G. Cheron, G. Libert
ES1997-66
Recurrent neural networks and motor programs
P. I. Miller
ES1997-15
Time delay neural network for target-intercept problem solving
A. Korgul
ES1997-21
Neural network adaptive wavelets for function approximation
F. Yong, T.W.S. Chow