Cite this article as:

Gudkov A. A., Mironov S. V., Faizliev A. R. On the Convergence of a Greedy Algorithm for the Solution of the Problem for the Construction of Monotone Regression. Izv. Saratov Univ. (N. S.), Ser. Math. Mech. Inform., 2017, vol. 17, iss. 4, pp. 431-440. DOI: https://doi.org/10.18500/1816-9791-2017-17-4-431-440


Language: Russian
UDC: 519.6

On the Convergence of a Greedy Algorithm for the Solution of the Problem for the Construction of Monotone Regression

Abstract: 

The paper presents greedy algorithms that use a Frank-Wolfe-type approach for finding a sparse monotonic regression. The problem of finding a monotonic regression arises in smoothing empirical data, in problems of dynamic programming, in mathematical statistics, and in many other applied settings. The problem is to find a non-decreasing sequence of points with the lowest error of approximation to a given set of points in the plane. The problem of constructing a monotonic regression can be formulated as a convex programming problem with linear constraints and is an NP-hard problem. The paper also contains estimates of the rate of convergence for the presented greedy algorithms.
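To illustrate the problem the abstract describes (not the paper's own Frank-Wolfe-type algorithms), the classical pool-adjacent-violators algorithm discussed in [5, 31] computes the least-squares non-decreasing fit to a sequence of points. The sketch below assumes unit weights by default and returns the fitted values:

```python
def isotonic_regression(y, w=None):
    """Least-squares non-decreasing fit via pool-adjacent-violators (PAV)."""
    n = len(y)
    w = [1.0] * n if w is None else list(w)
    # Each block holds [fitted value, total weight, number of points pooled].
    blocks = []
    for i in range(n):
        blocks.append([y[i], w[i], 1])
        # Pool the last two blocks while they violate monotonicity.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            v2, w2, c2 = blocks.pop()
            v1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(w1 * v1 + w2 * v2) / wt, wt, c1 + c2])
    # Expand each block back to its pooled points.
    out = []
    for v, _, c in blocks:
        out.extend([v] * c)
    return out
```

For example, `isotonic_regression([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean, yielding `[1, 2.5, 2.5, 4]`. PAV solves this convex problem exactly in O(n) pooling operations; the greedy Frank-Wolfe-type methods studied in the paper instead build the approximant iteratively with the convergence rates estimated there.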

References

1. Chen Y. Aspects of Shape-constrained Estimation in Statistics: PhD Thesis. University of Cambridge, 2013. 143 p.

2. Robertson T., Wright F., Dykstra R. Order Restricted Statistical Inference. New York, John Wiley & Sons, 1988. 544 p.

3. Barlow R., Brunk H. The Isotonic Regression Problem and Its Dual. Journal of the American Statistical Association, 1972, vol. 67, no. 1, pp. 140–147. DOI: https://doi.org/10.2307/2284712.

4. Dykstra R. An Isotonic Regression Algorithm. Journal of Statistical Planning and Inference, 1981, vol. 5, no. 1, pp. 355–363.

5. Best M. J., Chakravarti N. Active set algorithms for isotonic regression: a unifying framework. Math. Program., 1990, vol. 47, no. 3, pp. 425–439. DOI: https://doi.org/10.1007/BF01580873.

6. Best M., Chakravarti N., Ubhaya V. Minimizing Separable Convex Functions Subject to Simple Chain Constraints. SIAM Journal on Optimization, 2000, vol. 10, no. 3, pp. 658–672. DOI: https://doi.org/10.1137/S1052623497314970.

7. Stromberg U. An Algorithm for Isotonic Regression with Arbitrary Convex Distance Function. Computational Statistics & Data Analysis, 1991, vol. 11, no. 1, pp. 205–219.

8. Ahuja R., Orlin J. A Fast Scaling Algorithm for Minimizing Separable Convex Functions Subject to Chain Constraints. Operations Research, 2001, vol. 49, no. 5, pp. 784–789. DOI: https://doi.org/10.1287/opre.49.5.784.10601.

9. Hansohm J. Algorithms and Error Estimations for Monotone Regression on Partially Preordered Sets. Journal of Multivariate Analysis, 2007, vol. 98, no. 1, pp. 1043–1050. DOI: https://doi.org/10.1016/j.jmva.2006.11.001.

10. Dykstra R., Robertson T. An Algorithm for Isotonic Regression for Two or More Independent Variables. Ann. Statist., 1982, vol. 10, no. 1, pp. 708–719. DOI: https://doi.org/10.1214/aos/1176345866.

11. Burdakov O., Grimvall A., Hussian M. A Generalised PAV Algorithm for Monotonic Regression in Several Variables. COMPSTAT: Proc. 16th Symposium in Computational Statistics. Prague, Czech Republic, 2004, pp. 761–767.

12. Bach F. Efficient Algorithms for Non-convex Isotonic Regression through Submodular Optimization. CoRR, 2017, vol. abs/1707.09157. Available at: http://arxiv.org/abs/1707.09157 (accessed 10 September 2017).

13. Diggle P., Morris S., Morton-Jones T. Case-control isotonic regression for investigation of elevation in risk around a point source. Statistics in Medicine, 1999, vol. 18, no. 13, pp. 1605–1613. DOI: https://doi.org/10.1002/(SICI)1097-0258(19990715)18:13<1605::AID-SIM146>3.0.CO;2-V.

14. Cai Y., Judd K. L. Shape-preserving dynamic programming. Math. Meth. Oper. Res., 2013, vol. 77, pp. 407–421. DOI: https://doi.org/10.1007/s00186-012-0406-5.

15. Frank M., Wolfe Ph. An algorithm for quadratic programming. Naval Research Logistics Quarterly, 1956, vol. 3, no. 1–2, pp. 95–110. DOI: https://doi.org/10.1002/nav.3800030109.

16. Levitin E. S., Polyak B. T. Constrained minimization methods. USSR Comp. Math. & M. Phys., 1966, vol. 6, no. 5, pp. 1–50.

17. Clarkson K. L. Coresets, sparse greedy approximation, and the Frank–Wolfe algorithm. ACM Transactions on Algorithms, 2010, vol. 6, no. 4, pp. 1–30. DOI: https://doi.org/10.1145/1824777.1824783.

18. Freund R. M., Grigas P. New analysis and results for the Frank-Wolfe method. Math. Program., 2016, vol. 155, no. 1–2, pp. 199–230. DOI: https://doi.org/10.1007/s10107-014-0841-6.

19. Jaggi M. Revisiting Frank–Wolfe: Projection-free sparse convex optimization. ICML’13: Proc. 30th International Conference on Machine Learning. Atlanta, GA, USA, 2013, pp. 427–435.

20. Friedman J. Greedy Function Approximation: A Gradient Boosting Machine. Ann. Statist., 2001, vol. 29, no. 5, pp. 1189–1232. DOI: https://doi.org/10.1214/aos/1013203451.

21. Davis G., Mallat S., Avellaneda M. Adaptive greedy approximation. Constr. Approx., 1997, vol. 13, pp. 57–98. DOI: https://doi.org/10.1007/BF02678430.

22. Jones L. On a conjecture of Huber concerning the convergence of projection pursuit regression. Ann. Statist., 1987, vol. 15, no. 2, pp. 880–882. DOI: https://doi.org/10.1214/aos/1176350382.

23. Barron A. R., Cohen A., Dahmen W., DeVore R. A. Approximation and Learning by Greedy Algorithms. Ann. Statist., 2008, vol. 36, no. 1, pp. 64–94. DOI: https://doi.org/10.1214/009053607000000631.

24. DeVore R. A., Temlyakov V. N. Some remarks on greedy algorithms. Advances in Computational Mathematics, 1996, vol. 5, pp. 173–187. DOI: https://doi.org/10.1007/BF02124742.

25. Konyagin S. V., Temlyakov V. N. A remark on greedy approximation in Banach spaces. East J. Approx., 1999, vol. 5, no. 3, pp. 365–379.

26. Nguyen H., Petrova G. Greedy Strategies for Convex Optimization. Calcolo, 2017, vol. 54, no. 1, pp. 207–224. DOI: https://doi.org/10.1007/s10092-016-0183-2.

27. Temlyakov V. N. Dictionary descent in optimization. Analysis Mathematica, 2016, vol. 42, no. 1, pp. 69–89. DOI: https://doi.org/10.1007/s10476-016-0106-0.

28. DeVore R. A., Temlyakov V. N. Convex optimization on Banach spaces. Found. Comput. Math., 2016, vol. 16, no. 2, pp. 369–394. DOI: https://doi.org/10.1007/s10208-015-9248-x.

29. Sidorov S., Mironov S., Pleshakov M. Dual Greedy Algorithm for Conic Optimization Problem. CEUR-WS, 2016, vol. 1623, pp. 276–283.

30. Temlyakov V. N. Greedy approximation in convex optimization. Constr. Approx., 2015, vol. 41, no. 2, pp. 269–296. DOI: https://doi.org/10.1007/s00365-014-9272-0.

31. De Leeuw J., Hornik K., Mair P. Isotone Optimization in R: Pool-Adjacent-Violators Algorithm (PAVA) and Active Set Methods. Journal of Statistical Software, 2009, vol. 32, no. 5, pp. 1–24. DOI: https://doi.org/10.18637/jss.v032.i05.

32. Chepoi V., Cogneau D., Fichet B. Polynomial algorithms for isotonic regression. Lecture Notes–Monograph Series, 1997, vol. 31, pp. 147–160. DOI: https://doi.org/10.1214/lnms/1215454134
