Fig 2. Results for the polynomial kernel, degree 8.
Source publication
Article
An algebraic curve is defined as the zero set of a multivariate polynomial. We consider the problem of fitting an algebraic curve to a set of vectors given an additional set of vectors labelled as interior or exterior to the curve. The problem of fitting a linear curve in this way is shown to lend itself to a support vector representation, allowing...
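As a rough illustration of this formulation, the sketch below fits an interior/exterior-labelled point set with a degree-8 polynomial kernel and takes the zero level set of the decision function as the fitted curve. It is hedged: it uses scikit-learn's standard soft-margin SVC rather than the authors' exact support vector construction, and the toy data and parameter values are invented.

```python
# Hypothetical sketch: approximate the interior/exterior curve-fitting idea with
# a standard soft-margin SVM and a polynomial kernel (degree d = 8, as used for
# Figure 2 in the source), rather than the paper's exact formulation.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Toy data: interior points (label +1) near the origin,
# exterior points (label -1) on a surrounding annulus.
interior = rng.normal(scale=0.4, size=(100, 2))
angles = rng.uniform(0, 2 * np.pi, 100)
exterior = np.c_[np.cos(angles), np.sin(angles)] * rng.uniform(1.5, 2.5, (100, 1))
X = np.vstack([interior, exterior])
y = np.r_[np.ones(len(interior)), -np.ones(len(exterior))]

# Polynomial kernel of degree 8; C plays the role of the trade-off parameter c.
clf = SVC(kernel="poly", degree=8, coef0=1.0, C=1.0).fit(X, y)

# The fitted "algebraic curve" is the zero level set of the decision function.
xx, yy = np.meshgrid(np.linspace(-3, 3, 300), np.linspace(-3, 3, 300))
f = clf.decision_function(np.c_[xx.ravel(), yy.ravel()]).reshape(xx.shape)
plt.contour(xx, yy, f, levels=[0.0])
plt.scatter(*interior.T, marker="+", label="interior")
plt.scatter(*exterior.T, marker="x", label="exterior")
plt.legend()
plt.show()
```

Re-running the last block with several values of C reproduces the kind of comparison shown in Figure 2, where the zero contour is drawn for various values of c.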

Contexts in source publication

Context 1
... the first test, simple morphological operations were used to preprocess the binary image of Figure 1 into sets of interior, exterior, and edge points, as marked in Figure 2 by the crosses, plus-signs and dots, respectively. The figure also includes the zero contours of the resultant function for various values of c. ...
Context 2
... figure also includes the zero contours of the resultant function for various values of c. An 8th order polynomial kernel was used for all the curves of Figure 2, that is, we chose d = 8. Note that the same image was used in a similar test in both [2] and [7]. ...
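The "simple morphological operations" mentioned in Context 1 are not spelled out; the sketch below shows one plausible way to derive interior, exterior and edge point sets from a binary image using erosion and dilation with scipy.ndimage. The margin parameter and helper name are illustrative, not from the source.

```python
# Hypothetical preprocessing step: derive interior, exterior and edge point sets
# from a binary image with basic morphology (one plausible reading of the
# source's "simple morphological operations").
import numpy as np
from scipy import ndimage

def split_points(binary_image, margin=3):
    """Return (interior, exterior, edge) pixel coordinates as (N, 2) arrays."""
    shape = binary_image.astype(bool)
    eroded = ndimage.binary_erosion(shape, iterations=margin)
    dilated = ndimage.binary_dilation(shape, iterations=margin)

    interior = np.argwhere(eroded)         # safely inside the shape
    exterior = np.argwhere(~dilated)       # safely outside the shape
    edge = np.argwhere(dilated & ~eroded)  # band around the boundary
    return interior, exterior, edge

# Example on a synthetic disc-shaped binary image.
yy, xx = np.mgrid[:64, :64]
disc = (xx - 32) ** 2 + (yy - 32) ** 2 < 20 ** 2
interior, exterior, edge = split_points(disc)
```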

Similar publications

Article
Support vector machine (SVM) has been shown powerful in binary classification problems. In order to accommodate SVM to speaker verification problem, the concept of sequence kernel has been developed, which maps variable-length speech data into fixed-dimension vectors. However, constructing a suitable sequence kernel for speaker verification is stil...
Article
Support Vector Machines (SVM) are a new and very promising technique in statistical learning theory, proposed by V. Vapnik in 1995 [Vap95]. In this article we address the issue of using the SVM technique for text-independent speaker verification experiments by proposing a new feature representation based on GMM to construct the input vector of the S...
Conference Paper
In anchor modeling, each speaker utterance is represented as a fixed-length location vector in the space of reference speakers by scoring against a set of anchor models. SVM-based speaker verification systems using the anchor location representation have been studied in previously reported work with promising results. In this paper, linear combinat...
Article
In this paper, we propose to combine an efficient image representation based on local descriptors with a Support Vector Machine classifier in order to perform object categorization. For this purpose, we apply kernels defined on sets of vectors. After testing different combinations of kernel / local descriptors, we have been able to identify a very per...
Conference Paper
Sequence-derived structural and physicochemical features have been used to develop models for predicting protein families. Here, we test the hypothesis that high-level functional groups of proteins may be classified by a very small set of global features directly extracted from sequence alone. To test this, we represent each protein using a small n...

Citations

... triangulated meshes) are the most common representation, recent years have seen increasing interest in the use of implicit shape models (Carr et al., 2001; Walder et al., 2003; Shen et al., 2004; Schölkopf et al., 2005). Implicit shape models (or simply implicits) use an embedding function f : X → R that defines the hyper-surface implicitly by way of its zero level set f⁻¹(0). ...
... Another method that does not use normal vectors is the Slab SVM – a generalisation of the one-class SVM that essentially applies the "maximum margin" regulariser to the problem of implicit surface modelling (Schölkopf et al., 2005). This is a natural application of kernel methods, and a related approach was taken in (Walder et al., 2003), which generalises the SVM classifier – an alternative which we briefly introduce after the following description of the Slab SVM. ...
... This causes no problems in rendering an implicit in three dimensions (since the extraneous zero set will always be obscured), but the property that the sign of the function indicates whether a given point is inside the shape no longer holds. The method of Walder et al. (2003), on the other hand, is similar to the above but rather than including the term const(f) in the objective, introduces inequality constraints that force the function to be greater/less than the value plus/minus one at some additional points interior/exterior to the target manifold – thereby typically requiring normal vectors in order to derive these additional points. The goal of the present work, then, is to preserve the strengths of the Slab SVM (especially the fact that it does not use normal vectors) while producing embedding functions with the behaviour depicted on the right side of Figure 1, as these are more suitable for inside/outside tests etc. ...
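Schematically, the ±1 inequality-constraint construction attributed to Walder et al. (2003) in the excerpt above can be paraphrased as the following optimisation problem; this is a hedged reconstruction from the description, not the authors' exact objective.

```latex
\min_{f \in \mathcal{H}} \; \|f\|_{\mathcal{H}}^{2}
\quad \text{s.t.} \quad
f(\mathbf{x}_m) = 0 \;\; \text{(on-manifold points)}, \quad
f(\mathbf{x}_i^{+}) \ge 1 \;\; \text{(interior points)}, \quad
f(\mathbf{x}_j^{-}) \le -1 \;\; \text{(exterior points)}.
```

Here the on-manifold points are driven onto the zero level set, while the auxiliary interior/exterior points pin down the sign of the embedding function, which is why such points (and hence, typically, normal vectors) are needed.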
Article
We discuss the problem of fitting an implicit shape model to a set of points sampled from a co-dimension one manifold of arbitrary topology. The method solves a non-convex optimisation problem in the embedding function that defines the implicit by way of its zero level set. By assuming that the solution is a mixture of radial basis functions of varying widths we attain the globally optimal solution by way of an equivalent eigenvalue problem, without using or constructing as an intermediate step the normal vectors of the manifold at each data point. We demonstrate the system on two and three dimensional data, with examples of missing data interpolation and set operations on the resultant shapes.
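To illustrate in generic terms how an RBF expansion can reduce such a fitting objective to an eigenvalue problem, the toy sketch below minimises a Rayleigh quotient over RBF coefficients. This is not the paper's actual formulation; the kernel width, regularisation jitter and objective are invented purely to show the mechanics.

```python
# Generic illustration (not the cited paper's objective): expand the embedding
# function in RBFs centred at the data, f(x) = sum_i a[i] * k(x, x_i), penalise
# f on the data points, and exclude the trivial f = 0 solution with a quadratic
# normalisation constraint.  The minimiser is then a generalized eigenvector.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)
theta = rng.uniform(0, 2 * np.pi, 80)
X = np.c_[np.cos(theta), np.sin(theta)] + rng.normal(scale=0.02, size=(80, 2))

sigma = 0.5
K = np.exp(-cdist(X, X, "sqeuclidean") / (2 * sigma ** 2))  # Gram matrix

A = K @ K                        # data-fit term: sum_m f(x_m)^2 = a^T K K a
B = K + 1e-8 * np.eye(len(X))    # normalisation a^T K a = const (regularised)

# The smallest generalized eigenvector of (A, B) minimises the Rayleigh quotient.
vals, vecs = eigh(A, B)
a = vecs[:, 0]

def f(x):
    """Evaluate the implicit function at query points x of shape (N, 2)."""
    return np.exp(-cdist(x, X, "sqeuclidean") / (2 * sigma ** 2)) @ a
```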
... In the machine learning literature, there has been renewed interest in the use of max-margin methods as well as Gaussian Processes (GP) for solving non-linear non-parametric regression problems [73,90,115]. Work in [114] presented an approach to algebraic curve fitting using a support vector formulation that fit a global implicit function whose zero-level set contained a specific set of data points. The problem is formulated in a manner nearly identical to classical hard-margin linear SVM classification, with the penalty function proportional to the value of the implicit function evaluated at the specified set of input points. ...
... To avoid a degenerate solution, a set of off-manifold points are also specified and associated with labels +1 and −1 depending on whether they are interior or exterior to the surface of interest. The authors of [114] also proposed a modified rational polynomial kernel function as being more suitable for the geometric fitting problem. More recently, work in [92] demonstrated how algorithms based on GPs can be implemented efficiently and scaled to large datasets, and showed an application to surface reconstruction. ...
... In SVM-based methods, there is the common question of how to select the value of the so-called c-parameter that controls the trade-off between the quality of the fit to the data and the regularization. Incorrectly set values can sometimes lead to topologically unstable solutions and to reconstructions that qualitatively appear over-smoothed [114]. In the case of GP-based methods [92], the constraints on the off-manifold points are currently given by specifying not just the locations of the points but also the function values at those points. ...
Article
In recent years, there has been a resurgence in the use of raw point cloud data as the geometric primitive of choice for several modeling tasks such as rendering, editing and compression. Algorithms using this representation often require reliable additional information such as the curve tangent or surface normal at each point. Estimation of these quantities requires the selection of an appropriate scale of analysis to accommodate sensor noise, density variation and sparsity in the data. To this goal, we present a new class of locally semi-parametric estimators that allows analysis of accuracy with finite samples, as well as explicitly addresses the problem of selecting optimal support volume for local fitting. Experiments on synthetic and real data validate the behavior predicted by the model, and show competitive performance and improved stability over leading alternatives that require a preset scale.
Conference Paper
In this paper, we propose an eigenvalue analysis based bandwidth selection method for fitting curves to noisy point clouds. Firstly, the selection of a suitable local region radius (bandwidth) is carried out by using the normalized maximum eigenvalue. Then the regression line of the local region is used for ordering the data. Finally, a 3rd-order polynomial is fitted to the local data by using the least-squares method.
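A minimal sketch of the ordering and fitting steps described in this abstract follows; the eigenvalue-based bandwidth selection is omitted, and the function name and toy data are illustrative only.

```python
# Illustrative sketch of the last two steps in the abstract above: order the
# points of a local neighbourhood along their principal (regression) direction,
# then least-squares fit a 3rd-order polynomial.  Bandwidth selection via the
# normalized maximum eigenvalue is omitted; names and data are illustrative.
import numpy as np

def fit_local_cubic(points):
    """points: (N, 2) array of a local, noisy point-cloud neighbourhood."""
    origin = points.mean(axis=0)
    centred = points - origin

    # Principal direction of the neighbourhood (acts as the regression line).
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    t = centred @ vt[0]          # parameter along the line (used for ordering)
    d = centred @ vt[1]          # signed offset from the line

    order = np.argsort(t)
    coeffs = np.polyfit(t[order], d[order], deg=3)   # cubic least squares
    return coeffs, vt, origin

# Example: noisy samples from a sine arc.
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 60)
pts = np.c_[x, np.sin(2 * x)] + rng.normal(scale=0.01, size=(60, 2))
coeffs, frame, origin = fit_local_cubic(pts)
```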