International Journal of Data Science and Analysis
Volume 5, Issue 3, June 2019, Pages: 42-51
Received: Jun. 30, 2019; Accepted: Jul. 24, 2019; Published: Aug. 7, 2019
Yeqian Liu, Department of Mathematical Sciences, Middle Tennessee State University, Murfreesboro, USA
The support vector machine (SVM) has become very popular in the machine learning literature and has recently received much attention from statisticians. For the multicategory classification problem, the commonly used multicategory SVM is formulated within the frequentist framework. In this paper, we develop a multicategory support vector machine under the Bayesian framework. Numerical studies were performed using the EM algorithm and the Gibbs sampler. Our results show that the classification accuracy of the Bayesian approach is comparable to that of frequentist approaches, while the Bayesian approach has the added advantage of providing estimates of the uncertainty in its predictions.
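The data augmentation underlying the Gibbs sampler can be illustrated in the binary case: Polson and Scott (2011) represent the SVM hinge pseudo-likelihood exp(-2 max(1 - y_i x_i'β, 0)) as a scale mixture of normals with a latent scale λ_i per observation, which yields conjugate conditionals: 1/λ_i | β follows an inverse Gaussian distribution, and β | λ is Gaussian. The sketch below is an illustrative binary linear version on simulated toy data, not the paper's multicategory algorithm; the variable names, priors, and toy data are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary data: labels y in {-1, +1} from a linear rule plus noise
n, p = 200, 2
X = rng.normal(size=(n, p))
w_true = np.array([2.0, -1.0])
y = np.sign(X @ w_true + 0.3 * rng.normal(size=n))

Z = y[:, None] * X        # z_i = y_i * x_i, so the margin is z_i' beta
prior_var = 100.0         # diffuse N(0, prior_var * I) prior on beta

beta = np.zeros(p)
draws = []
for it in range(2000):
    # Step 1: lambda_i | beta, with 1/lambda_i ~ InverseGaussian(1/|1 - z_i'beta|, 1)
    m = 1.0 / np.abs(1.0 - Z @ beta).clip(min=1e-8)
    lam = 1.0 / rng.wald(m, 1.0)

    # Step 2: beta | lambda is Gaussian:
    #   precision P = Z' diag(1/lambda) Z + I / prior_var
    #   mean      b = P^{-1} Z' diag(1/lambda) (1 + lambda)
    W = Z / lam[:, None]
    P = Z.T @ W + np.eye(p) / prior_var
    b = np.linalg.solve(P, W.T @ (1.0 + lam))
    L = np.linalg.cholesky(P)
    beta = b + np.linalg.solve(L.T, rng.normal(size=p))  # N(b, P^{-1}) draw

    if it >= 500:            # discard burn-in
        draws.append(beta)

beta_hat = np.mean(draws, axis=0)            # posterior mean classifier
acc = np.mean(np.sign(X @ beta_hat) == y)    # training accuracy
```

Because the output is a posterior sample rather than a point estimate, the spread of the `draws` directly quantifies the prediction uncertainty that the abstract highlights as the advantage of the Bayesian formulation.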
Data Augmentation and Bayesian Methods for Multicategory Support Vector Machines. International Journal of Data Science and Analysis, Vol. 5, No. 3, 2019, pp. 42-51.
Copyright © 2019. The authors retain the copyright of this article.
This article is an open access article distributed under the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.