A heuristic for free parameter optimization with support vector machines

M Boardman, T Trappenberg - The 2006 IEEE International Joint Conference on Neural Network …, 2006 - ieeexplore.ieee.org
A heuristic is proposed to address free parameter selection for Support Vector Machines, with the goals of improving generalization performance and providing greater insensitivity to training set selection. The many local extrema in these optimization problems make gradient descent algorithms impractical. The central element of the proposed heuristic is the inclusion of a model complexity measure to improve generalization performance. We also use simulated annealing to improve parameter search efficiency compared to an exhaustive grid search, and compute an intensity-weighted centre of mass of the best-scoring points to reduce volatility. We examine two standard classification problems for comparison, and apply the heuristic to classification problems in bioinformatics and retinal electrophysiology.
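
The abstract gives enough structure for a rough illustration. Below is a minimal sketch of that kind of search, not the authors' exact procedure: simulated annealing over (log2 C, log2 gamma) for an RBF-kernel SVM, an objective that combines cross-validated accuracy with a simple complexity proxy (here the fraction of support vectors, an assumption), and an intensity-weighted centre of mass of the best accepted points. The dataset, penalty weight, cooling schedule, and scikit-learn implementation are all stand-ins chosen for the sketch.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = load_breast_cancer(return_X_y=True)  # stand-in dataset, not from the paper

def objective(log2_c, log2_gamma, alpha=0.1):
    # Cross-validated accuracy penalized by a complexity proxy
    # (fraction of support vectors); the paper's measure may differ.
    clf = SVC(C=2.0 ** log2_c, gamma=2.0 ** log2_gamma, kernel="rbf")
    acc = cross_val_score(clf, X, y, cv=5).mean()
    clf.fit(X, y)
    complexity = clf.n_support_.sum() / len(y)
    return acc - alpha * complexity

# Simulated annealing in the (log2 C, log2 gamma) plane instead of a full grid search.
state = np.array([0.0, -3.0])
current = objective(*state)
accepted = [(current, state.copy())]
T = 1.0
for _ in range(100):
    candidate = state + rng.normal(scale=1.0, size=2)
    cand = objective(*candidate)
    # Metropolis acceptance: always take improvements, occasionally take worse moves.
    if cand > current or rng.random() < np.exp((cand - current) / T):
        state, current = candidate, cand
        accepted.append((current, state.copy()))
    T *= 0.95  # geometric cooling schedule (assumed)

# Intensity-weighted centre of mass of the best accepted points, to damp volatility.
top = sorted(accepted, key=lambda t: t[0], reverse=True)[:10]
weights = np.array([s for s, _ in top])
points = np.array([p for _, p in top])
centre = points.T @ weights / weights.sum()
print("selected (log2 C, log2 gamma):", centre)

In this sketch the averaging step weights each of the top candidate points by its objective value, so a single noisy high scorer pulls the final parameter choice less than it would under a plain arg-max over the grid.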