Soft Margin Bayes-Point-Machine Classification via Adaptive Direction Sampling

Research output: Chapter in book/report/conference proceeding › Conference contribution › Research › peer reviewed

Authors

Karsten Vogt, Jörn Ostermann

Details

Original language: English
Title of host publication: Image Analysis - 20th Scandinavian Conference
Subtitle of host publication: SCIA 2017, Proceedings
Editors: Puneet Sharma, Filippo Maria Bianchi
Pages: 313-324
Number of pages: 12
Publication status: Published - 19 May 2017
Event: 20th Scandinavian Conference on Image Analysis, SCIA 2017 - Tromsø, Norway
Duration: 12 Jun 2017 - 14 Jun 2017

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10269 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Abstract

Supervised machine learning is an important building block for many applications that involve data processing and decision making. Good classifiers are trained to produce accurate predictions on a training set while also generalizing well to unseen data. To this end, Bayes-Point-Machines (BPM) were proposed in the past as a generalization of margin-maximizing classifiers, such as Support-Vector-Machines (SVM). For BPMs, the optimal classifier is defined as an expectation over an appropriately chosen posterior distribution, which can be estimated via Markov-Chain-Monte-Carlo (MCMC) sampling. In this paper, we propose three improvements on the original BPM classifier. First, our new statistical model is regularized based on the sample size and allows for a true soft-margin formulation without the need to hand-tune any nuisance parameters. Secondly, this model can handle multi-class problems natively. Finally, our fast adaptive MCMC sampler uses Adaptive Direction Sampling (ADS) and can generate a sample from the proposed posterior with a runtime complexity quadratic in the size of the training set. Therefore, we call our new classifier the Multi-class-Soft-margin-Bayes-Point-Machine (MS-BPM). We have evaluated the generalization capabilities of our approach on several datasets and show that our soft-margin model significantly improves on the original BPM, especially for small training sets, and is competitive with SVM classifiers. We also show that class membership probabilities generated from our model improve on Platt scaling, a popular method to derive calibrated probabilities from maximum-margin classifiers.
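The central idea of the abstract — the Bayes point as an expectation over a posterior of consistent classifiers, estimated by sampling — can be illustrated with a deliberately naive sketch. A simple rejection sampler stands in here for the paper's adaptive MCMC (ADS) scheme and its soft-margin posterior; the toy data, the `sample_version_space` helper, and the sample count are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data, labels in {-1, +1}
X = np.array([[1.0, 2.0], [2.0, 3.0], [-1.0, -1.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])

def sample_version_space(X, y, n_samples=2000):
    """Rejection-sample unit weight vectors that classify every
    training point correctly (the hard-margin 'version space').
    A simplistic stand-in for the paper's adaptive MCMC sampler."""
    samples = []
    while len(samples) < n_samples:
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        if np.all(y * (X @ w) > 0):   # consistent with every label
            samples.append(w)
    return np.array(samples)

ws = sample_version_space(X, y)
bayes_point = ws.mean(axis=0)         # expectation over the posterior

predictions = np.sign(X @ bayes_point)
print(predictions)                    # matches y: [ 1.  1. -1. -1.]
```

Averaging the sampled hyperplanes approximates the posterior-mean classifier; the paper's contribution is a soft-margin, multi-class posterior and an ADS-based sampler that makes this estimation efficient, neither of which this sketch attempts.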

Cite this

Vogt, K., & Ostermann, J. (2017). Soft Margin Bayes-Point-Machine Classification via Adaptive Direction Sampling. In P. Sharma & F. M. Bianchi (Eds.), Image Analysis - 20th Scandinavian Conference: SCIA 2017, Proceedings (pp. 313-324). Lecture Notes in Computer Science, Vol. 10269 LNCS. https://doi.org/10.1007/978-3-319-59126-1_26
@inproceedings{844fccaaf9d64129a479067260a02eab,
title = "Soft Margin Bayes-Point-Machine Classification via Adaptive Direction Sampling",
author = "Karsten Vogt and J{\"o}rn Ostermann",
note = "Funding information: This work was supported by the German Science Foundation (DFG) under grant OS 295/4-1.; 20th Scandinavian Conference on Image Analysis, SCIA 2017; Conference date: 12-06-2017 through 14-06-2017",
year = "2017",
month = may,
day = "19",
doi = "10.1007/978-3-319-59126-1_26",
language = "English",
isbn = "9783319591254",
series = "Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)",
pages = "313--324",
editor = "Puneet Sharma and Bianchi, {Filippo Maria}",
booktitle = "Image Analysis - 20th Scandinavian Conference",

}

TY - GEN

T1 - Soft Margin Bayes-Point-Machine Classification via Adaptive Direction Sampling

AU - Vogt, Karsten

AU - Ostermann, Jörn

N1 - Funding information: This work was supported by the German Science Foundation (DFG) under grant OS 295/4-1.

PY - 2017/5/19

Y1 - 2017/5/19

UR - http://www.scopus.com/inward/record.url?scp=85020458313&partnerID=8YFLogxK

U2 - 10.1007/978-3-319-59126-1_26

DO - 10.1007/978-3-319-59126-1_26

M3 - Conference contribution

AN - SCOPUS:85020458313

SN - 9783319591254

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 313

EP - 324

BT - Image Analysis - 20th Scandinavian Conference

A2 - Sharma, Puneet

A2 - Bianchi, Filippo Maria

T2 - 20th Scandinavian Conference on Image Analysis, SCIA 2017

Y2 - 12 June 2017 through 14 June 2017

ER -
