Prof. Dr. Michael Kohler

 


Dissertation, Habilitation Thesis, Books

 

  1. M. Kohler. Nichtparametrische Regressionsschätzung mit Splines. Dissertation, Universität Stuttgart, 1997.
  2. M. Kohler. Analyse von nichtparametrischen Regressionsschätzern unter minimalen Voraussetzungen. Habilitation thesis, Universität Stuttgart. Shaker Verlag, 2000.
  3. L. Györfi, M. Kohler, A. Krzyzak and H. Walk. A Distribution-Free Theory of Nonparametric Regression. Springer Series in Statistics, Springer-Verlag New York, 2002.
  4. J. Eckle-Kohler and M. Kohler. Eine Einführung in die Statistik und ihre Anwendungen. Springer-Verlag, Berlin, 2009.
  5. L. Devroye, B. Karasözen, M. Kohler and R. Korn (Eds.). Recent Developments in Applied Probability and Statistics - Dedicated to the Memory of Jürgen Lehn. Springer, 2010.


Papers Submitted for Publication

  1. A. Braun, M. Kohler and H. Walk. On the rate of convergence of a neural network regression estimate learned by gradient descent. Downloadable as pdf file.
  2. M. Kohler and S. Kersting. Uncertainty quantification based on (imperfect) simulation models with estimated input distributions. Downloadable as pdf file.
  3. M. Kohler and A. Krzyzak. Over-parametrized neural networks learned by gradient descent can generalize especially well. Downloadable as pdf file.
  4. M. Kohler and A. Krzyzak. Analysis of the rate of convergence of an over-parametrized deep neural network estimate learned by gradient descent. Downloadable as pdf file.
  5. M. Kohler and A. Krzyzak. On the rate of convergence of an over-parametrized deep neural network regression estimate with ReLU activation function learned by gradient descent. Downloadable as pdf file.
  6. S. Drews and M. Kohler. Analysis of the expected L_2 error of an over-parametrized deep neural network estimate learned by gradient descent without regularization. Downloadable as pdf file.
  7. M. Kohler and A. Krzyzak. On the rate of convergence of an over-parametrized Transformer classifier learned by gradient descent. Downloadable as pdf file.
  8. M. Kohler, A. Krzyzak and A. Sänger. Learning of deep convolutional network image classifiers via stochastic gradient descent and over-parametrization. Downloadable as pdf file.


Publications in Refereed Journals

  1. M. Kohler. On the universal consistency of a least squares spline regression estimator. Mathematical Methods of Statistics 6, pp. 349-364, 1997.
  2. L. Györfi, M. Kohler and H. Walk. Weak and strong universal consistency of semi-recursive kernel and partitioning regression estimates. Statistics & Decisions 16, pp. 1-18, 1998.
  3. M. Kohler. Nonparametric Regression Function Estimation Using Interaction Least Squares Splines and Complexity Regularization. Metrika 47, pp. 147-163, 1998.
  4. M. Kohler. Universally Consistent Regression Function Estimation Using Hierarchical B-Splines. Journal of Multivariate Analysis 67, pp. 138-164, 1999.
  5. M. Kohler. Nonparametric estimation of piecewise smooth regression functions. Statistics and Probability Letters 43, pp. 49-55, 1999.
  6. A. Antos, L. Györfi and M. Kohler. Lower bounds on the rate of convergence of nonparametric regression estimates. Journal of Statistical Planning and Inference 83, pp. 91-100, 2000.
  7. M. Kohler. Inequalities for uniform deviations of averages from expectations with applications to nonparametric regression. Journal of Statistical Planning and Inference 89, pp. 1-23, 2000.
  8. M. Kohler and A. Krzyzak. Nonparametric regression estimation using penalized least squares. IEEE Transactions on Information Theory 47, pp. 3054-3058, 2001.
  9. M. Kohler, K. Máthé and M. Pintér. Prediction from randomly right censored data. Journal of Multivariate Analysis 80, pp. 73-100, 2002.
  10. M. Kohler, A. Krzyzak and D. Schäfer. Application of structural risk minimization to multivariate smoothing spline regression estimates. Bernoulli 8, pp. 1-15, 2002.
  11. J. Dippon, P. Fritz and M. Kohler. A statistical approach to case based reasoning, with application to breast cancer data. Computational Statistics and Data Analysis 40, pp. 579-602, 2002.
  12. M. Kohler. Nonlinear orthogonal series estimates for random design regression. Journal of Statistical Planning and Inference 115, pp. 491-520, 2003.
  13. M. Kohler. Universal consistency of local polynomial kernel regression estimates. Annals of the Institute of Statistical Mathematics 54, pp. 879-899, 2003.
  14. M. Hamers and M. Kohler. A bound on the expected maximal deviation of averages from their means. Statistics and Probability Letters 62, pp. 137-144, 2003.
  15. M. Kohler, A. Krzyzak and H. Walk. Strong consistency of automatic kernel regression estimates. Annals of the Institute of Statistical Mathematics 55, pp. 287-308, 2003.
  16. M. Hamers and M. Kohler. How well can a regression function be estimated if the distribution of the (random) design is concentrated on a finite set? Journal of Statistical Planning and Inference 123, pp. 377-394, 2004.
  17. M. Kohler. Estimation of smooth regression functions from stationary and ergodic observations via least squares. Mathematical Methods of Statistics 13, pp. 460-471, 2004.
  18. M. Kohler and A. Krzyzak. Adaptive regression estimation with multilayer feedforward neural networks. Journal of Nonparametric Statistics 17, pp. 891-913, 2005.
  19. M. Kohler, A. Krzyzak and H. Walk. Rates of convergence for partitioning and nearest neighbor regression estimates with unbounded data. Journal of Multivariate Analysis 97, pp. 311-323, 2006.
  20. M. Kohler. Nonparametric regression with additional measurement errors in the dependent variable. Journal of Statistical Planning and Inference 136, pp. 3339-3361, 2006.
  21. M. Hamers and M. Kohler. Nonasymptotic bounds on the L_2 error of neural network regression estimates. Annals of the Institute of Statistical Mathematics 58, pp. 131-151, 2006.
  22. M. Kohler and A. Krzyzak. Asymptotic confidence intervals for Poisson regression. Preprint downloadable as ps and pdf file. Journal of Multivariate Analysis 98, pp. 1072-1094, 2007.
  23. L. Györfi and M. Kohler. Nonparametric estimation of conditional distributions. Preprint downloadable as ps and pdf file. IEEE Transactions on Information Theory 53, pp. 1872-1879, 2007.
  24. M. Kohler and A. Krzyzak. On the rate of convergence of local averaging plug-in classification rules under a margin condition. Preprint downloadable as ps and pdf file. IEEE Transactions on Information Theory 53, pp. 1735-1742, 2007.
  25. D. Egloff, M. Kohler and N. Todorovic. A dynamic look-ahead Monte Carlo algorithm for pricing American options. Downloadable as ps and pdf file. Annals of Applied Probability 17, pp. 1138-1171, 2007.
  26. M. Kohler. A regression based smoothing spline Monte Carlo algorithm for pricing American options. Downloadable as ps and pdf file. AStA Advances in Statistical Analysis 92, pp. 153-178, 2008.
  27. M. Kohler. Multivariate orthogonal series estimates for random design regression. Journal of Statistical Planning and Inference 138, pp. 3217-3237, 2008.
  28. J. Eckle-Kohler, M. Kohler and J. Mehnert. Automatic recognition of German news focussing on future-directed beliefs and intentions. Computer Speech and Language 22, pp. 394-414, 2008.
  29. M. Kohler, A. Krzyzak and H. Walk. Upper bounds for Bermudan options on Markovian data using nonparametric regression and a reduced number of nested Monte Carlo steps. Downloadable as ps and pdf file. Statistics & Decisions 26, pp. 275-288, 2008.
  30. M. Kohler, A. Krzyzak and H. Walk. Optimal global rates of convergence in nonparametric regression with unbounded data. Journal of Statistical Planning and Inference 123, pp. 1286-1296, 2009.
  31. A. M. Bagirov, C. Clausen and M. Kohler. Estimation of a regression function by maxima of minima of linear functions. Downloadable as ps and pdf file. IEEE Transactions on Information Theory 55, pp. 833-845, 2009.
  32. A. M. Bagirov, C. Clausen and M. Kohler. An L2-boosting algorithm for estimation of a regression function. Downloadable as ps and pdf file. IEEE Transactions on Information Theory 56, pp. 1417-1429, 2009.
  33. M. Kohler, A. Krzyzak and N. Todorovic. Pricing of high-dimensional American options by neural networks. Preprint downloadable as ps and pdf file. Mathematical Finance 20, pp. 383-410, 2010.
  34. A. M. Bagirov, C. Clausen and M. Kohler. An algorithm for the estimation of a regression function by continuous piecewise linear functions. Downloadable as ps and pdf file. Computational Optimization and Applications 45, pp. 159-179, 2010.
  35. A. Fromkorth and M. Kohler. Analysis of least squares regression estimates in case of additional errors in the variables. Downloadable as ps and pdf file. Journal of Statistical Planning and Inference 141, pp. 172-188, 2011.
  36. M. Kohler and J. Mehnert. Analysis of the rate of convergence of least squares neural network regression estimates in case of measurement errors. Downloadable as ps and pdf file. Neural Networks 24, pp. 273-279, 2011.
  37. M. Kohler, A. Krzyzak and H. Walk. Estimation of the essential supremum of a regression function. Downloadable as ps and pdf file. Statistics and Probability Letters 81, pp. 685-693, 2011.
  38. M. Kohler, D. Jones and H. Walk. Weakly universally consistent forecasting of stationary and ergodic time series. Downloadable as ps and pdf file. IEEE Transactions on Information Theory 58, pp. 1191-1202, 2012.
  39. M. Kohler and A. Krzyzak. Nonparametric estimation of non-stationary velocity fields from 3D particle tracking velocimetry data. Preprint downloadable as ps and pdf file. Computational Statistics and Data Analysis 56, pp. 1566-1580, 2012.
  40. L. Devroye, T. Felber, M. Kohler and A. Krzyzak. L_1-consistent estimation of the density of residuals in random design regression models. Downloadable as ps and pdf file. Statistics and Probability Letters 82, pp. 173-179, 2012.
  41. M. Kohler and A. Krzyzak. Pricing of American options in discrete time using least squares estimates with complexity penalties. Downloadable as ps and pdf file. Journal of Statistical Planning and Inference 142, pp. 2289-2307, 2012.
  42. M. Kohler and H. Walk. On data-based optimal stopping under stationarity and ergodicity. Downloadable as ps and pdf file. Bernoulli 19, pp. 931-953, 2013. Supplementary material: proof of Lemma 1 and Lemma 2: pdf file.
  43. L. Devroye, T. Felber and M. Kohler. Estimation of a density using real and artificial data. Downloadable as ps and pdf file. IEEE Transactions on Information Theory 59, pp. 1917-1928, 2013.
  44. I. Hertel and M. Kohler. Estimation of the optimal design of a nonlinear parametric regression problem via Monte Carlo experiments. Downloadable as pdf and ps file. Computational Statistics and Data Analysis 59, pp. 1-12, 2013.
  45. A. Fromkorth and M. Kohler. On the consistency of regression based Monte Carlo methods for pricing Bermudan options in case of estimated financial models. Downloadable as pdf file. Mathematical Finance 25, pp. 371-399, 2015.
  46. D. Furer, M. Kohler and A. Krzyzak. Fixed design regression estimation based on real and artificial data. Downloadable as pdf and ps file. Journal of Nonparametric Statistics 25, pp. 223-241, 2013.
  47. M. Kohler and A. Krzyzak. Optimal global rates of convergence for interpolation problems with random design. Downloadable as pdf and ps file. Statistics and Probability Letters 83, pp. 1871-1879, 2013.
  48. T. Felber, D. Jones, M. Kohler and H. Walk. Weakly universally consistent static forecasting of stationary and ergodic time series via local averaging and least squares estimates. Downloadable as ps and pdf file. Journal of Statistical Planning and Inference 143, pp. 1689-1707, 2013.
  49. A. Bott, L. Devroye and M. Kohler. Estimation of a distribution from data with small measurement errors. Downloadable as pdf and ps file. Electronic Journal of Statistics 7, pp. 2457-2476, 2013.
  50. J. Kruppa, Y. Liu, G. Biau, M. Kohler, I. König, J. Malley and A. Ziegler. Probability estimation with machine learning methods for dichotomous and multi-category outcome: Theory. Biometrical Journal 56, pp. 534-563, 2014.
  51. D. Jones, M. Kohler, A. Krzyzak and A. Richter. Empirical comparison of nonparametric regression estimates on real data. Downloadable as pdf and ps file. Communications in Statistics - Simulation and Computation 45, pp. 2309-2319, 2016.
  52. M. Kohler, A. Krzyzak and H. Walk. Nonparametric recursive quantile estimation. Downloadable as pdf file. Statistics and Probability Letters 93, pp. 102-107, 2014.
  53. M. Kohler. Optimal global rates of convergence for noiseless regression estimation problems with adaptively chosen design. Downloadable as pdf and ps file. Journal of Multivariate Analysis 132, pp. 197-208, 2014.
  54. T. Felber, M. Kohler and A. Krzyzak. Adaptive density estimation based on real and artificial data. Downloadable as pdf and ps file. Journal of Nonparametric Statistics 27, pp. 1-18, 2015.
  55. M. Kohler, F. Müller and H. Walk. Estimation of a regression function corresponding to latent variables. Downloadable as pdf file. Journal of Statistical Planning and Inference 162, pp. 88-109, 2015.
  56. G. Enss, B. Götz, M. Kohler, A. Krzyzak and R. Platz. Nonparametric estimation of a maximum of quantiles. Downloadable as pdf file. Electronic Journal of Statistics 8, pp. 3176-3192, 2014.
  57. D. Furer and M. Kohler. Smoothing spline regression estimation based on real and artificial data. Downloadable as pdf file. Metrika 78, pp. 711-746, 2015.
  58. T. Felber, M. Kohler and A. Krzyzak. Adaptive density estimation from data with small measurement errors. Downloadable as pdf file. IEEE Transactions on Information Theory 61, pp. 3446-3456, 2015.
  59. A. Bott, T. Felber and M. Kohler. Estimation of a density in a simulation model. Downloadable as pdf and ps file. Journal of Nonparametric Statistics 27, pp. 271-285, 2015.
  60. A. Bott and M. Kohler. Adaptive estimation of a conditional density. Downloadable as pdf file. International Statistical Review 84, pp. 291-316, 2016.
  61. A. Bott and M. Kohler. Nonparametric estimation of a conditional density. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 69, pp. 189-214, 2017.
  62. M. Kohler and A. Krzyzak. Estimation of a jump point in random design regression. Downloadable as pdf and ps file. Statistics and Probability Letters 106, pp. 247-255, 2015.
  63. G. Enss, M. Kohler, A. Krzyzak and R. Platz. Nonparametric quantile estimation based on surrogate models. Downloadable as pdf file. IEEE Transactions on Information Theory 62, pp. 5727-5739, 2016.
  64. A. Bott, T. Felber, M. Kohler and L. Kristl. Estimation of a time dependent density. Downloadable as pdf file. Journal of Statistical Planning and Inference 180, pp. 81-107, 2017.
  65. M. Hansmann and M. Kohler. Estimation of quantiles from data with additional measurement errors. Downloadable as pdf file. Statistica Sinica 27, pp. 1661-1673, 2017.
  66. A. Kelava, M. Kohler and A. Krzyzak. Nonparametric estimation of a latent variable model. Downloadable as pdf and ps file. Revised version published in Journal of Multivariate Analysis 154, pp. 112-134, 2017.
  67. M. Kohler and A. Krzyzak. Nonparametric regression based on hierarchical interaction models. Downloadable as pdf file. IEEE Transactions on Information Theory 63, pp. 1620-1630, 2017.
  68. M. Kohler, A. Krzyzak, R. Tent and H. Walk. Nonparametric quantile estimation using importance sampling. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 70, pp. 439-465, 2018.
  69. B. Bauer, L. Devroye, M. Kohler, A. Krzyzak and H. Walk. Nonparametric estimation of a function from noiseless observations at random points. Downloadable as pdf file. Journal of Multivariate Analysis 160, pp. 90-104, 2017.
  70. M. Kohler and A. Krzyzak. Adaptive estimation of quantiles in a simulation model. Downloadable as pdf file. IEEE Transactions on Information Theory 64, pp. 501-512, 2018.
  71. B. Bauer, F. Heimrich, M. Kohler and A. Krzyzak. On estimation of surrogate models for high-dimensional computer experiments. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 71, pp. 107-136, 2019.
  72. M. Kohler, A. Krzyzak, S. Mallapur and R. Platz. Uncertainty Quantification in Case of Imperfect Models: A Non-Bayesian Approach. Downloadable as pdf file. Scandinavian Journal of Statistics 45, pp. 729-752, 2018.
  73. F. Heimrich, M. Kohler and L. Kristl. Nonparametric estimation of time-dependent quantiles in a simulation model. Downloadable as pdf file, supplementary material. Statistica Sinica 30, pp. 135-151, 2020.
  74. M. Hansmann, M. Kohler and H. Walk. On the strong universal consistency of local averaging regression estimates. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 71, pp. 1233-1263, 2019.
  75. M. Hansmann, M. Kohler and H. Walk. Correction to: On the strong universal consistency of local averaging regression estimates. Annals of the Institute of Statistical Mathematics 71, pp. 1265-1269, 2019.
  76. M. Kohler and A. Krzyzak. Estimating quantiles in imperfect simulation models using conditional density estimation. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 72, pp. 123-155, 2020.
  77. B. Bauer and M. Kohler. On deep learning as a remedy for the curse of dimensionality in nonparametric regression. Downloadable as pdf file. Annals of Statistics 47, pp. 2261-2285, 2019.
  78. M. Kohler and A. Krzyzak. Estimation of a density from an imperfect simulation model. Downloadable as pdf file. IEEE Transactions on Information Theory 65, pp. 1535-1546, 2019.
  79. M. Hansmann and M. Kohler. Estimation of conditional quantiles from data with additional measurement errors. Downloadable as pdf file. Journal of Statistical Planning and Inference 200, pp. 176-195, 2019.
  80. M. Kohler and A. Krzyzak. Estimation of extreme quantiles in a simulation model. Downloadable as pdf file. Journal of Nonparametric Statistics 31, pp. 393-419, 2019.
  81. M. Kohler and R. Tent. Nonparametric Quantile Estimation Using Surrogate Models and Importance Sampling. Downloadable as pdf file. Metrika 83, pp. 141-169, 2020.
  82. M. Kohler and S. Langer. Discussion of "Nonparametric regression using deep neural networks with ReLU activation function". Downloadable as pdf file. Annals of Statistics 48, pp. 1906-1910, 2020.
  83. B. Götz, S. Kersting and M. Kohler. Estimation of an improved surrogate model in uncertainty quantification by neural networks. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 73, pp. 249-281, 2021.
  84. M. Kohler and S. Langer. On the rate of convergence of fully connected very deep neural network regression estimates. Downloadable as pdf file. Annals of Statistics 49, pp. 2231-2249, 2021.
  85. M. Kohler and A. Krzyzak. Estimation of a density using an improved surrogate model. Preprint downloadable as pdf file. Electronic Journal of Statistics 15, pp. 1-41, 2021.
  86. M. Kohler and A. Krzyzak. Over-parametrized deep neural networks minimizing the empirical risk do not generalize well. Downloadable as pdf file. Bernoulli 27, pp. 2564-2597, 2021.
  87. M. Kohler, A. Krzyzak and S. Langer. Estimation of a function of low local dimensionality by deep neural networks. Downloadable as pdf file. IEEE Transactions on Information Theory 68, pp. 4032-4042, 2022.
  88. M. Kohler, A. Krzyzak and B. Walter. On the rate of convergence of image classifiers based on convolutional neural networks. Downloadable as pdf file. Annals of the Institute of Statistical Mathematics 74, pp. 1085-1108, 2022.
  89. M. Kohler and A. Krzyzak. On the rate of convergence of a deep recurrent neural network estimate in a regression problem with dependent data. Downloadable as pdf file. Bernoulli 29, pp. 1663-1685, 2023.
  90. M. Kohler, S. Langer and U. Reif. Estimation of a regression function on a manifold by fully connected deep neural networks. Downloadable as pdf file. Journal of Statistical Planning and Inference 222, pp. 160-181, 2023.
  91. I. Gurevych, M. Kohler and G. Sahin. On the rate of convergence of a classifier based on a Transformer encoder. Downloadable as pdf file. IEEE Transactions on Information Theory 68, pp. 8139-8155, 2022.
  92. A. Braun, M. Kohler, S. Langer and H. Walk. Convergence rates for shallow neural networks learned by gradient descent. Downloadable as pdf file. Bernoulli 30, pp. 475-502, 2023.
  93. M. Kohler and B. Walter. Analysis of convolutional neural network image classifiers in a rotational symmetric model. Downloadable as pdf file. Supplement downloadable as pdf file. IEEE Transactions on Information Theory 69, pp. 5203-5218, 2023.
  94. A. Braun, M. Kohler, J. Cho and A. Krzyzak. Analysis of the rate of convergence of neural network regression estimates which are easy to implement. Downloadable as pdf file. Electronic Journal of Statistics 18, pp. 553-598, 2024.
  95. S. Drews and M. Kohler. On the universal consistency of an over-parametrized deep neural network estimate learned by gradient descent. Downloadable as pdf file. To appear in Annals of the Institute of Statistical Mathematics, 2024.
  96. M. Kohler, A. Krzyzak and B. Walter. Analysis of the rate of convergence of an over-parametrized convolutional neural network image classifier learned by gradient descent. Downloadable as pdf file. To appear in Journal of Statistical Planning and Inference, 2024.
  97. M. Kohler and S. Langer. Statistical theory for image classification using deep convolutional neural networks with cross-entropy loss under the hierarchical max-pooling model. Preprint downloadable as pdf file. To appear in Journal of Statistical Planning and Inference, 2024.


Conference Contributions, Proceedings

  1. L. Györfi and M. Kohler. Nonparametric regression estimation. In: Lecture Notes (L. Györfi, ed.) of the Advanced School Principles of Nonparametric Learning, held in July 2001 at CISM, Udine, pp. 55-122. Springer-Verlag, 2002.
  2. M. Kohler and A. Krzyzak. Rates of convergence for adaptive regression estimates with multiple hidden layer feedforward neural networks. Proceedings of the 2005 IEEE International Symposium on Information Theory, Adelaide, Australia, September 4-9, 2005, pp. 1436-1440.
  3. M. Kohler. A Review on Regression-based Monte Carlo Methods for Pricing American Options. Downloadable as ps and pdf file. In: L. Devroye, B. Karasözen, M. Kohler and R. Korn (Eds.), Recent Developments in Applied Probability and Statistics - Dedicated to the Memory of Jürgen Lehn. Springer, 2010.

Unpublished Papers

  1. M. Kohler, S. Kul and K. Máthé. Least squares estimates for censored regression.
  2. G. Beliakov and M. Kohler. Estimation of regression functions by Lipschitz continuous functions. Downloadable as ps file.
  3. M. Kohler. Universally consistent upper bounds for Bermudan options based on Monte Carlo and nonparametric regression. Downloadable as ps and pdf file.