Digital Library
 
 
 
  Syntactic Neural Networks
 
 
Title: Syntactic Neural Networks
Authors: Lucas, S. M.; Damper, R. I.
Appeared in: Connection Science
Paging: Volume 2 (1990), no. 3, pages 195-221
Year: 1990
Contents: We introduce a new connectionist paradigm which views neural networks as implementations of syntactic pattern recognition algorithms. Thus, learning is seen as a process of grammatical inference and recognition as a process of parsing. Naturally, the possible realizations of this theme are diverse; in this paper we present some initial explorations of the case where the pattern grammar is context-free, inferred (from examples) by a separate procedure, and then mapped onto a connectionist network. Unlike most neural networks, for which structure is pre-defined, the resulting network has as many levels as are necessary and arbitrary connections between levels. Furthermore, by the addition of a delay element, the network becomes capable of dealing with time-varying patterns in a simple and efficient manner. Since grammatical inference algorithms are notoriously expensive computationally, we place an important restriction on the type of context-free grammars which can be inferred. This dramatically reduces complexity. The resulting grammars are called 'strictly-hierarchical' and map straightforwardly onto a temporal connectionist parser (TCP) using a relatively small number of neurons. The new paradigm is applicable to a variety of pattern-processing tasks such as speech recognition and character recognition. We concentrate here on hand-written character recognition; performance in other problem domains will be reported in future publications. Results are presented to illustrate the performance of the system with respect to a number of parameters, namely, the inherent variability of the data, the nature of the learning (supervised or unsupervised) and the details of the clustering procedure used to limit the number of non-terminals inferred. In each of these cases (eight in total), we contrast the performance of a stochastic and a non-stochastic TCP. The stochastic TCP does have greater powers of discrimination, but in many cases the results were very similar. If this result holds in practical situations it is important, because the non-stochastic version has a straightforward implementation in silicon.
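Note: as a rough illustration of the 'strictly-hierarchical' restriction the abstract describes (each level of the grammar derives only from the level directly below it), the following toy recognizer reduces an input string level by level. The grammar, rule format, and names here are invented for illustration and are not taken from the paper; the actual temporal connectionist parser (TCP) is a neural implementation, not this symbolic sketch.

```python
# Hypothetical strictly-hierarchical grammar: level 0 is the terminals, and
# each rule rewrites a pair of symbols from one level into a single symbol
# on the next level up. (Rule set invented for this example.)
GRAMMAR = {
    1: {("a", "b"): "AB", ("b", "a"): "BA"},  # level-1 symbols from terminals
    2: {("AB", "BA"): "S"},                   # level-2 start symbol from level-1 symbols
}

def recognize(seq, grammar, start="S"):
    """Bottom-up, level-by-level recognition of a strictly-hierarchical grammar.

    Each pass scans the current level left to right, replacing adjacent pairs
    that match a rule with the next level's symbol; the input is accepted if
    it eventually reduces to the start symbol.
    """
    level = list(seq)
    for rules in (grammar[k] for k in sorted(grammar)):
        reduced = []
        i = 0
        while i < len(level):
            pair = tuple(level[i:i + 2])
            if pair in rules:
                reduced.append(rules[pair])
                i += 2
            else:
                reduced.append(level[i])
                i += 1
        level = reduced
    return level == [start]
```

Because every rule only references the level immediately below, a single pass per level suffices; this is the property that keeps both inference cost and the size of the resulting parser network small.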
Publisher: Taylor & Francis
Source file: Elektronische Wetenschappelijke Tijdschriften (Electronic Scientific Journals)
 
 

 
 
 Koninklijke Bibliotheek - National Library of the Netherlands