Abstract
A new algorithm for isotonic regression is presented based on recursively partitioning the solution space. We develop efficient methods for each partitioning subproblem through an equivalent representation as a network flow problem, and prove that this sequence of partitions converges to the global solution. These network flow problems can further be decomposed in order to solve very large problems. Success of isotonic regression in prediction and our algorithm's favorable computational properties are demonstrated through simulated examples as large as 2 × 10⁵ variables and 10⁷ constraints.
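To illustrate the problem the paper addresses: isotonic regression finds the fit closest to the data (in squared error) subject to order constraints. The sketch below is *not* the paper's recursive-partitioning algorithm; it is the classic pool-adjacent-violators algorithm (PAVA) for the special case of a simple chain ordering, shown only to make the problem concrete.

```python
def isotonic_regression(y):
    """Return the non-decreasing fit minimizing sum of (y_i - x_i)^2.

    Classic pool-adjacent-violators algorithm (PAVA) for the 1-D
    (totally ordered) case; the paper's algorithm handles general
    partial orders via network flow instead.
    """
    # Each block stores [sum of values, count]; a block's fitted
    # value is its mean. Adjacent blocks that violate monotonicity
    # are merged ("pooled") until the sequence of means is sorted.
    blocks = []
    for v in y:
        blocks.append([v, 1])
        # Pool while the previous block's mean exceeds the last one's
        # (compared via cross-multiplication to avoid division).
        while len(blocks) > 1 and \
                blocks[-2][0] * blocks[-1][1] > blocks[-1][0] * blocks[-2][1]:
            total, count = blocks.pop()
            blocks[-1][0] += total
            blocks[-1][1] += count
    fit = []
    for total, count in blocks:
        fit.extend([total / count] * count)
    return fit


# Example: [3, 1, 2] violates monotonicity, so the first two points
# pool to their mean and the result is the non-decreasing [2, 2, 2].
print(isotonic_regression([3, 1, 2]))
```

PAVA runs in linear time on a chain, but the chain structure is essential; over a general partial order the problem becomes the large constrained program that the paper's partitioning and network-flow decomposition are designed to solve at scale.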
| Original language | English |
|---|---|
| Title of host publication | Advances in Neural Information Processing Systems 23 |
| Subtitle of host publication | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 |
| State | Published - 2010 |
| Event | 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010, Vancouver, BC, Canada. Duration: 6 Dec 2010 → 9 Dec 2010 |