
Solving Economic Models with Neural Networks Without Backpropagation

Number: 196
Date: April 2025
Author: Julien Pascal
Abstract

This paper presents a novel method for solving high-dimensional economic models with neural networks when the exact calculation of the gradient by backpropagation is impractical or inapplicable. The method relies on the gradient-free bias-corrected Monte Carlo (bc-MC) operator, which constitutes, under certain conditions, an asymptotically unbiased estimator of the gradient of the loss function. The method is well-suited to high-dimensional models, as it requires only two evaluations of a residual function to approximate the gradient of the loss function, regardless of the model dimension. I show that the gradient-free bc-MC operator retains its appealing properties as long as the economic model satisfies Lipschitz continuity, which makes the method particularly attractive in situations involving non-differentiable loss functions. I demonstrate the broad applicability of the gradient-free bc-MC operator by solving large-scale overlapping generations (OLG) models with aggregate uncertainty, including scenarios in which borrowing constraints introduce non-differentiability into the household optimization problems.
Keywords: Dynamic Programming, Neural Networks, Machine Learning, Monte Carlo, Overlapping Generations, Occasionally Binding Constraints.
JEL: C45, C61, C63, C68, E32, E37.
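To give a concrete sense of the two-evaluation, gradient-free idea described in the abstract, the sketch below implements a generic random-direction (SPSA-style) estimator of the gradient of a Monte Carlo residual loss. This is only an illustrative sketch under stated assumptions, not the paper's bc-MC operator; the residual function, perturbation size mu, and all names are hypothetical.

```python
import numpy as np

def two_point_gradient_estimate(residual_fn, theta, x_batch, mu=1e-3, rng=None):
    """Estimate the gradient of the Monte Carlo loss L(theta) = mean_i R(theta, x_i)^2
    from only two loss evaluations, via a random-direction central difference
    (SPSA-style). Illustrative only; this is NOT the paper's bc-MC operator."""
    rng = np.random.default_rng() if rng is None else rng
    # Rademacher (+/-1) perturbation direction with the same shape as theta
    delta = rng.choice([-1.0, 1.0], size=theta.shape)
    loss_plus = np.mean(residual_fn(theta + mu * delta, x_batch) ** 2)
    loss_minus = np.mean(residual_fn(theta - mu * delta, x_batch) ** 2)
    # Central difference along delta; with +/-1 entries, dividing by delta
    # elementwise is the same as multiplying by delta
    return (loss_plus - loss_minus) / (2.0 * mu) * delta

# Hypothetical usage: linear residual R(theta, x) = x @ theta - 1 on a small batch
rng = np.random.default_rng(0)
theta = np.zeros(5)
x_batch = rng.normal(size=(64, 5))
grad = two_point_gradient_estimate(lambda th, xb: xb @ th - 1.0, theta, x_batch, rng=rng)
theta -= 0.1 * grad  # one stochastic-approximation update step
```

Note that the two residual-function evaluations per update are independent of the dimension of theta, which is the property the abstract highlights for high-dimensional models.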

Download: Cahier d'étude 196 (PDF, 2 MB)