Unifying Divergence Minimization and Statistical Inference Via Convex Duality
2006
Conference Paper
In this paper we unify divergence minimization and statistical inference by means of convex duality. In the process, we prove that, as a special case, the dual of approximate maximum entropy estimation is maximum a posteriori estimation. Moreover, our treatment leads to stability and convergence bounds for many statistical learning problems. Finally, we show how an algorithm by Zhang can be used to solve this class of optimization problems efficiently.
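A minimal sketch of the duality the abstract alludes to, in notation assumed here rather than taken from the paper (p_0 a reference distribution, \phi a feature map, \hat\mu empirical moments, \varepsilon a slack radius): the approximate maximum entropy problem

\[
\min_{p}\ \mathrm{KL}(p\,\|\,p_0)\quad\text{s.t.}\quad \big\|\mathbb{E}_{p}[\phi(x)]-\hat\mu\big\|\le\varepsilon
\]

has the Fenchel dual

\[
\max_{\lambda}\ \langle\lambda,\hat\mu\rangle-\log\!\int p_0(x)\,e^{\langle\lambda,\phi(x)\rangle}\,dx-\varepsilon\,\|\lambda\|_{*},
\]

where \|\cdot\|_{*} denotes the dual norm. Reading the penalty \varepsilon\|\lambda\|_{*} as a negative log-prior turns the dual into a maximum a posteriori problem; for instance, a squared-\ell_2 relaxation of the moment constraint corresponds to a Gaussian prior on \lambda.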
Author(s): Altun, Y. and Smola, AJ.
Book Title: Learning Theory
Journal: Learning Theory: 19th Annual Conference on Learning Theory (COLT 2006)
Pages: 139-153
Year: 2006
Month: June
Editors: Lugosi, G. and Simon, H.-U.
Publisher: Springer
Department(s): Empirical Inference
Bibtex Type: Conference Paper (inproceedings)
DOI: 10.1007/11776420_13
Event Name: 19th Annual Conference on Learning Theory (COLT 2006)
Event Place: Pittsburgh, PA, USA
Address: Berlin, Germany
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik
BibTeX:
@inproceedings{5704,
  title = {Unifying Divergence Minimization and Statistical Inference Via Convex Duality},
  author = {Altun, Y. and Smola, AJ.},
  journal = {Learning Theory: 19th Annual Conference on Learning Theory (COLT 2006)},
  booktitle = {Learning Theory},
  pages = {139--153},
  editor = {Lugosi, G. and Simon, H.-U.},
  publisher = {Springer},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {Berlin, Germany},
  month = jun,
  year = {2006},
  doi = {10.1007/11776420_13}
}