Support vector machines: hype or hallelujah?

Published: 01 December 2000, in ACM SIGKDD Explorations Newsletter, Volume 2, Issue 2 (Special issue on "Scalable data mining algorithms").

References

[1] Barabino N., Pallavicini M., Petrolini A., Pontil M. and Verri A. Support vector machines vs multi-layer perceptrons in particle identification. In Proceedings of the European Symposium on Artificial Neural Networks '99, D-Facto Press, Belgium, p. 257-262, 1999.
[2] Bennett K. and Bredensteiner E. Geometry in Learning. In Geometry at Work, C. Gorini, editor, Mathematical Association of America, Washington, D.C., p. 132-145, 2000.
[3] Bennett K. and Bredensteiner E. Duality and Geometry in SVMs. In P. Langley, editor, Proc. of 17th International Conference on Machine Learning, Morgan Kaufmann, San Francisco, p. 65-72, 2000.
[4] Bennett K., Demiriz A. and Shawe-Taylor J. A Column Generation Algorithm for Boosting. In P. Langley, editor, Proc. of 17th International Conference on Machine Learning, Morgan Kaufmann, San Francisco, p. 57-64, 2000.
[5] Bennett K., Wu D. and Auslender L. On support vector decision trees for database marketing. Research Report No. 98-100, Rensselaer Polytechnic Institute, Troy, NY, 1998.
[6] Bradley P., Mangasarian O. and Musicant D. Optimization in Massive Datasets. To appear in J. Abello, P. Pardalos and M. Resende (eds.), Handbook of Massive Datasets, Kluwer, 2000.
[7] Brown M., Grundy W., Lin D., Cristianini N., Sugnet C., Furey T., Ares M. Jr. and Haussler D. Knowledge-based Analysis of Microarray Gene Expression Data using Support Vector Machines. Proceedings of the National Academy of Sciences, 97(1), p. 262-267, 2000.
[8] Burges C. A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2, p. 121-167, 1998.
[9] Campbell C. and Bennett K. A Linear Programming Approach to Novelty Detection. To appear in Advances in Neural Information Processing Systems, 14, Morgan Kaufmann, 2001.
[10] Chapelle O. and Vapnik V. Model selection for support vector machines. To appear in Advances in Neural Information Processing Systems, 12, ed. S. A. Solla, T. K. Leen and K.-R. Muller, MIT Press, 2000.
[11] Cortes C. and Vapnik V. Support vector networks. Machine Learning, 20, p. 273-297, 1995.
[12] Crisp D. and Burges C. A geometric interpretation of ν-SVM classifiers. Advances in Neural Information Processing Systems, 12, ed. S. A. Solla, T. K. Leen and K.-R. Muller, MIT Press, 2000.
[13] Cristianini N., Campbell C. and Shawe-Taylor J. Dynamically adapting kernels in support vector machines. Advances in Neural Information Processing Systems, 11, ed. M. Kearns, S. A. Solla and D. Cohn, MIT Press, p. 204-210, 1999.
[14] Cristianini N. and Shawe-Taylor J. An Introduction to Support Vector Machines and other Kernel-based Learning Methods. Cambridge University Press, 2000. www.support-vector.net.
[15] Collobert R. and Bengio S. SVMTorch web page, http://www.idiap.ch/learning/SVMTorch.html
[16] DeCoste D. and Scholkopf B. Training Invariant Support Vector Machines. To appear in Machine Learning, Kluwer, 2001.
[17] Drucker H., Wu D. and Vapnik V. Support vector machines for spam categorization. IEEE Trans. on Neural Networks, 10, p. 1048-1054, 1999.
[18] Drucker H., Burges C., Kaufman L., Smola A. and Vapnik V. Support vector regression machines. In M. Mozer, M. Jordan and T. Petsche (eds.), Advances in Neural Information Processing Systems, 9, MIT Press, Cambridge, MA, 1997.
[19] Dumais S., Platt J., Heckerman D. and Sahami M. Inductive Learning Algorithms and Representations for Text Categorization. 7th International Conference on Information and Knowledge Management, 1998.
[20] Fernandez R. and Viennet E. Face identification using support vector machines. Proceedings of the European Symposium on Artificial Neural Networks (ESANN99), D-Facto Press, Brussels, p. 195-200, 1999.
[21] Ferris M. and Munson T. Semi-smooth support vector machines. Data Mining Institute Technical Report 00-09, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2000.
[22] Ferris M. and Munson T. Interior point methods for massive support vector machines. Data Mining Institute Technical Report 00-05, Computer Sciences Department, University of Wisconsin, Madison, Wisconsin, 2000.
[23] Friess T.-T., Cristianini N. and Campbell C. The kernel adatron algorithm: a fast and simple learning procedure for support vector machines. 15th Intl. Conf. Machine Learning, Morgan Kaufmann Publishers, p. 188-196, 1998.
[24] Furey T., Cristianini N., Duffy N., Bednarski D., Schummer M. and Haussler D. Support Vector Machine Classification and Validation of Cancer Tissue Samples using Microarray Expression Data. Bioinformatics, 16, p. 906-914, 2000.
[25] Golub T., Slonim D., Tamayo P., Huard C., Gaasenbeek M., Mesirov J., Coller H., Loh M., Downing J., Caligiuri M., Bloomfield C. and Lander E. Molecular Classification of Cancer: Class Discovery and Class Prediction by Gene Expression Monitoring. Science, 286, p. 531-537, 1999.
[26] Guyon I., Matic N. and Vapnik V. Discovering informative patterns and data cleaning. In U. M. Fayyad, G. Piatetsky-Shapiro, P. Smyth and R. Uthurusamy, editors, Advances in Knowledge Discovery and Data Mining, MIT Press, p. 181-203, 1996.
[27] Guyon I. Web page on SVM Applications, http://www.clopinet.com/isabelle/Projects/SVM/applist.html
[28] Jaakkola T., Diekhans M. and Haussler D. A discriminative framework for detecting remote protein homologies. MIT Preprint, 1999.
[29] Joachims T. Text categorization with support vector machines: learning with many relevant features. Proc. European Conference on Machine Learning (ECML), 1998.
[30] Joachims T. Estimating the Generalization Performance of an SVM Efficiently. In Proceedings of the 17th International Conference on Machine Learning, Morgan Kaufmann, p. 431-438, 2000.
[31] Joachims T. Text categorization with support vector machines: learning with many relevant features. Proc. European Conference on Machine Learning (ECML), 1998.
[32] Joachims T. Web page on SVMLight: http://www.ai.cs.uni-dortmund.de/SOFTWARE/SVM_LIGHT/svm_light.eng.html
[33] Keerthi S., Shevade S., Bhattacharyya C. and Murthy K. Improvements to Platt's SMO algorithm for SVM classifier design. Technical Report, Dept. of CSA, Bangalore, India, 1999.
[34] Keerthi S., Shevade S., Bhattacharyya C. and Murthy K. A Fast Iterative Nearest Point Algorithm for Support Vector Machine Classifier Design. Technical Report TR-ISL-99-03, Intelligent Systems Lab, Dept. of Computer Science and Automation, Indian Institute of Science, Bangalore, India (accepted for publication in IEEE Transactions on Neural Networks), 1999.
[35] Luenberger D. Linear and Nonlinear Programming. Addison-Wesley, 1984.
[36] Mangasarian O. and Musicant D. Massive Support Vector Regression. Data Mining Institute Technical Report 99-02, Dept. of Computer Science, University of Wisconsin-Madison, August 1999.
[37] Mangasarian O. and Musicant D. Lagrangian Support Vector Regression. Data Mining Institute Technical Report 00-06, June 2000.
[38] Mukherjee S., Tamayo P., Slonim D., Verri A., Golub T., Mesirov J. and Poggio T. Support Vector Machine Classification of Microarray Data. MIT AI Memo No. 1677 and MIT CBCL Paper No. 182.
[39] ORL dataset: Olivetti Research Laboratory, 1994. http://www.uk.research.att.com/facedatabase.html
[40] Osuna E., Freund R. and Girosi F. Training Support Vector Machines: an Application to Face Detection. Proceedings of CVPR'97, Puerto Rico, 1997.
[41] Osuna E., Freund R. and Girosi F. Proc. of IEEE NNSP, Amelia Island, FL, p. 24-26, 1997.
[42] Osuna E. and Girosi F. Reducing the Run-time Complexity in Support Vector Machines. In B. Scholkopf, C. Burges and A. Smola (eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, p. 271-284, 1999.
[43] Platt J. Fast training of SVMs using sequential minimal optimization. In B. Scholkopf, C. Burges and A. Smola (eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, p. 185-208, 1999.
[44] Papageorgiou C., Oren M. and Poggio T. A General Framework for Object Detection. Proceedings of International Conference on Computer Vision, p. 555-562, 1998.
[45] Raetsch G., Demiriz A. and Bennett K. Sparse regression ensembles in infinite and finite hypothesis space. NeuroCOLT2 Technical Report, Royal Holloway College, London, September 2000.
[46] Rychetsky M., Ortmann S. and Glesner M. Support Vector Approaches for Engine Knock Detection. Proc. International Joint Conference on Neural Networks (IJCNN 99), July 1999, Washington, USA.
[47] Roobaert D. Improving the Generalization of Linear Support Vector Machines: an Application to 3D Object Recognition with Cluttered Background. Proc. Workshop on Support Vector Machines at the 16th International Joint Conference on Artificial Intelligence, July 31-August 6, Stockholm, Sweden, p. 29-33, 1999.
[48] Scholkopf B., Bartlett P., Smola A. and Williamson R. Support vector regression with automatic accuracy control. In L. Niklasson, M. Boden and T. Ziemke, editors, Proceedings of the 8th International Conference on Artificial Neural Networks, Perspectives in Neural Computing, Springer Verlag, Berlin, 1998.
[49] Scholkopf B., Bartlett P., Smola A. and Williamson R. Shrinking the Tube: A New Support Vector Regression Algorithm. To appear in M. S. Kearns, S. A. Solla and D. A. Cohn (eds.), Advances in Neural Information Processing Systems, 11, MIT Press, Cambridge, MA, 1999.
[50] Scholkopf B., Burges C. and Smola A. Advances in Kernel Methods: Support Vector Learning. MIT Press, Cambridge, MA, 1998.
[51] Scholkopf B., Platt J. C., Shawe-Taylor J., Smola A. J. and Williamson R. C. Estimating the support of a high-dimensional distribution. Microsoft Research Technical Report MSR-TR-99-87, 1999.
[52] Scholkopf B., Shawe-Taylor J., Smola A. and Williamson R. Kernel-dependent support vector error bounds. Ninth International Conference on Artificial Neural Networks, IEE Conference Publication No. 470, p. 304-309, 1999.
[53] Scholkopf B., Smola A. and Muller K.-R. Kernel principal component analysis. In B. Scholkopf, C. Burges and A. Smola, editors, Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, p. 327-352, 1999.
[54] Scholkopf B., Smola A., Williamson R. and Bartlett P. New support vector algorithms. To appear in Neural Computation, 1999.
[55] Scholkopf B., Sung K., Burges C., Girosi F., Niyogi P., Poggio T. and Vapnik V. Comparing Support Vector Machines with Gaussian Kernels to Radial Basis Function Classifiers. IEEE Transactions on Signal Processing, 45, p. 2758-2765, 1997.
[56] Smola A., Bartlett P., Scholkopf B. and Schuurmans D. (eds.), Advances in Large Margin Classifiers, MIT Press, 1999.
[57] Shawe-Taylor J. and Cristianini N. Margin distribution and soft margin. In A. Smola, P. Bartlett, B. Scholkopf and D. Schuurmans (eds.), Advances in Large Margin Classifiers, Chapter 2, MIT Press, 1999.
[58] Smola A. and Scholkopf B. A tutorial on support vector regression. NeuroCOLT2 TR 1998-03, 1998.
[59] Smola A. and Scholkopf B. From Regularization Operators to Support Vector Kernels. In M. Mozer, M. Jordan and T. Petsche (eds.), Advances in Neural Information Processing Systems, 9, MIT Press, Cambridge, MA, 1997.
[60] Smola A., Scholkopf B. and Muller K.-R. The connection between regularization operators and support vector kernels. Neural Networks, 11, p. 637-649, 1998.
[61] Smola A., Williamson R., Mika S. and Scholkopf B. Regularized principal manifolds. In Computational Learning Theory: 4th European Conference, volume 1572 of Lecture Notes in Artificial Intelligence, Springer, p. 214-229, 1999.
[62] Tax D. and Duin R. Data domain description by Support Vectors. In Proceedings of ESANN99, ed. M. Verleysen, D-Facto Press, Brussels, p. 251-256, 1999.
[63] Tax D., Ypma A. and Duin R. Support vector data description applied to machine vibration analysis. In M. Boasson, J. Kaandorp, J. Tonino and M. Vosselman (eds.), Proc. 5th Annual Conference of the Advanced School for Computing and Imaging (Heijen, NL, June 15-17), p. 398-405, 1999.
[64] http://www.ics.uci.edu/mlearn/MLRepository.html
[65] Vapnik V. The Nature of Statistical Learning Theory. Springer, New York, 1995.
[66] Vapnik V. Statistical Learning Theory. Wiley, 1998.
[67] Weston J., Gammerman A., Stitson M., Vapnik V., Vovk V. and Watkins C. Support Vector Density Estimation. In B. Scholkopf, C. Burges and A. Smola (eds.), Advances in Kernel Methods: Support Vector Learning, MIT Press, Cambridge, MA, p. 293-306, 1999.
[68] Vapnik V. and Chapelle O. Bounds on error expectation for Support Vector Machines. Submitted to Neural Computation, 1999.
[69] Weston J., Mukherjee S., Chapelle O., Pontil M., Poggio T. and Vapnik V. Feature Selection for SVMs. To appear in Advances in Neural Information Processing Systems, 14, Morgan Kaufmann, 2001.
[70] http://kernel-machines.org/
[71] Zien A., Ratsch G., Mika S., Scholkopf B., Lemmen C., Smola A., Lengauer T. and Muller K.-R. Engineering Support Vector Machine Kernels That Recognize Translation Initiation Sites. Presented at the German Conference on Bioinformatics, 1999.
