In this work the kernel Adaline algorithm is presented. The new algorithm is a generalisation of Widrow and Hoff's linear Adaline and makes it possible to approximate non-linear functional relationships. Like the linear Adaline, the proposed neural network algorithm minimises the least-mean-squared (LMS) cost function. The kernel Adaline's cost function is guaranteed to be convex, so the method does not suffer from the local optima that afflict conventional neural networks. The algorithm uses potential function operators due to Aizerman and colleagues to map the training points, in a first stage, into a very high-dimensional non-linear “feature” space. In a second stage the algorithm determines the LMS solution in this space. Weight decay regularisation avoids overfitting and can be performed efficiently. The kernel Adaline algorithm works in a sequential fashion, is conceptually simple, and is numerically robust. The method shows high performance in tasks such as one-dimensional curve fitting, system identification, and speech processing.
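
To make the two-stage procedure concrete, the following is a minimal sketch in Python of a kernelised, sequential LMS update in dual form with weight decay. It is not the paper's implementation: the Gaussian kernel (standing in for Aizerman's potential functions), the class name KernelAdalineSketch, and the parameters eta, lam, and gamma are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(X, Y, gamma=1.0):
    # k(x, y) = exp(-gamma * ||x - y||^2), evaluated for all pairs (assumed kernel choice)
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

class KernelAdalineSketch:
    # Hypothetical sketch: sequential LMS in the kernel-induced feature space,
    # represented by dual coefficients alpha, with weight decay shrinkage.
    def __init__(self, gamma=1.0, eta=0.2, lam=1e-3, epochs=100):
        self.gamma = gamma    # kernel width
        self.eta = eta        # LMS learning rate
        self.lam = lam        # weight decay strength
        self.epochs = epochs

    def fit(self, X, y):
        self.X_ = X
        n = X.shape[0]
        K = gaussian_kernel(X, X, self.gamma)   # Gram matrix of the training set
        self.alpha_ = np.zeros(n)               # dual coefficients (implicit feature-space weights)
        for _ in range(self.epochs):
            for j in range(n):                  # sequential, sample-by-sample pass
                f_j = self.alpha_ @ K[:, j]     # current prediction for x_j
                e_j = y[j] - f_j                # LMS error
                self.alpha_ *= (1.0 - self.eta * self.lam)  # weight decay shrinkage
                self.alpha_[j] += self.eta * e_j            # LMS correction
        return self

    def predict(self, X_new):
        K_new = gaussian_kernel(self.X_, X_new, self.gamma)
        return self.alpha_ @ K_new

# Usage sketch: fitting a noisy one-dimensional curve, in the spirit of the
# curve-fitting experiments mentioned above (data and settings are invented).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.linspace(-3, 3, 60)[:, None]
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(60)
    model = KernelAdalineSketch(gamma=2.0).fit(X, y)
    print(np.round(model.predict(X[:5]), 2))
```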