LAPSE:2025.0392
Published Article

Interval Hessian-based Optimization Algorithm for Unconstrained Non-convex Problems
June 27, 2025
Abstract
Second-order optimization algorithms that leverage the exact Hessian or an approximation of it have been proven to achieve faster convergence rates than first-order methods. However, their application to training deep neural network models, partial differential equation-based optimization problems, and large-scale non-convex problems is hindered by the high computational cost of evaluating the Hessian, inverting it to find the search direction, and ensuring its positive-definiteness. Accordingly, we propose a new search direction based on an interval Hessian and incorporate it into a line-search framework. We apply our algorithm to a set of 210 problems and show that it converges to a local minimum for 70% of them. We also compare our algorithm with other approaches and illustrate that it is competitive with other methods in finding a local minimum while requiring fewer O(n³) operations.
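The abstract describes forming a positive-definite search direction from an interval enclosure of the Hessian and embedding it in a line search. The paper's exact construction is not reproduced here; the sketch below is only a plausible illustration of the general idea, using a Gershgorin-style eigenvalue lower bound over the interval family (in the spirit of the αBB method of Adjiman et al., cited below) to shift the midpoint Hessian into positive-definiteness. All function and variable names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def interval_pd_hessian(H_lo, H_hi):
    """Build a positive-definite surrogate from an interval Hessian [H_lo, H_hi].

    Illustrative only: uses a Gershgorin-style lower bound on the minimum
    eigenvalue over all symmetric matrices in the interval, then shifts the
    midpoint Hessian if that bound is negative.
    """
    H_mid = 0.5 * (H_lo + H_hi)
    n = H_lo.shape[0]
    # lambda_min >= min_i ( lo_ii - sum_{j != i} max(|lo_ij|, |hi_ij|) )
    radii = np.maximum(np.abs(H_lo), np.abs(H_hi))
    np.fill_diagonal(radii, 0.0)
    lam_lo = np.min(np.diag(H_lo) - radii.sum(axis=1))
    shift = max(0.0, -lam_lo) + 1e-8  # small margin keeps B strictly PD
    return H_mid + shift * np.eye(n)

def backtracking_step(f, grad, x, H_lo, H_hi, alpha0=1.0, rho=0.5, c=1e-4):
    """One Armijo backtracking line-search step along the interval-Hessian direction."""
    g = grad(x)
    B = interval_pd_hessian(H_lo, H_hi)
    p = np.linalg.solve(B, -g)  # the O(n^3) solve the abstract refers to
    alpha, fx = alpha0, f(x)
    while f(x + alpha * p) > fx + c * alpha * g @ p:
        alpha *= rho
    return x + alpha * p
```

Since B is positive definite by construction, p is always a descent direction, so the Armijo backtracking loop is guaranteed to terminate for a differentiable f.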
Record ID
Keywords
Interval Hessian, Line-search framework, Non-Convex optimization, Second-order optimization
Subject
Suggested Citation
Sharma A, Dingwani G, Bajaj I. Interval Hessian-based Optimization Algorithm for Unconstrained Non-convex Problems. Systems and Control Transactions 4:1492-1497 (2025) https://doi.org/10.69997/sct.135173
Author Affiliations
Sharma A: Indian Institute of Technology Kanpur, Department of Chemical Engineering, Kanpur, Uttar Pradesh, India
Dingwani G: Indian Institute of Technology Roorkee, Department of Chemical Engineering, Roorkee, Uttarakhand, India
Bajaj I: Indian Institute of Technology Kanpur, Department of Chemical Engineering, Kanpur, Uttar Pradesh, India
Journal Name
Systems and Control Transactions
Volume
4
First Page
1492
Last Page
1497
Year
2025
Publication Date
2025-07-01
Version Comments
Original Submission
Other Meta
PII: 1492-1497-1504-SCT-4-2025, Publication Type: Journal Article
Record Map
Published Article

LAPSE:2025.0392
This Record
Data

LAPSE:2025.0005
Interval Hessian-based Optimization...
External Link

https://doi.org/10.69997/sct.135173
Article DOI
Record Statistics
Record Views
514
Version History
[v1] (Original Submission)
Jun 27, 2025
Verified by curator on
Jun 27, 2025
This Version Number
v1
Citations
Most Recent
This Version
URL
http://psecommunity.org/LAPSE:2025.0392
Record Owner
PSE Press
Links to Related Works
Directly Related to This Work
Supplementary Material
Article DOI
References Cited
- Adjiman C, Dallwig S, Floudas C, Neumaier A. A global optimization method, αBB, for general twice-differentiable constrained NLPs - I. Theoretical advances. Comput Chem Eng 22:1137-1158 (1998) https://doi.org/10.1016/S0098-1354(98)00027-1
- Nocedal J, Wright SJ. Numerical Optimization. Springer (2006)
- Puranik Y, Sahinidis NV. Bounds tightening based on optimality conditions for nonconvex box-constrained optimization. J Glob Opt 67:59-77 (2016) https://doi.org/10.1007/s10898-016-0491-8
- Moré JJ, Wild SM. Benchmarking derivative-free optimization algorithms. SIAM J Opt 20:172-191 (2009) https://doi.org/10.1137/080724083
- Xu P, Roosta F, Mahoney MW. Second-order optimization for non-convex machine learning: an empirical study. SIAM, pp 199-207 (2020) https://doi.org/10.1137/1.9781611976236.23

