
SMU SOE Lee Kong Chian Seminar in Econometrics (Oct 25, 2021, 9.00am-10.30am): Artificial Neural Network Estimation and Inference on Functionals of Nonparametric Conditional Moment Restrictions


 

 

TOPIC:  

ARTIFICIAL NEURAL NETWORK ESTIMATION AND INFERENCE ON FUNCTIONALS OF NONPARAMETRIC CONDITIONAL MOMENT RESTRICTIONS

 

Paper 1: Efficient Estimation of Weighted Average Derivatives in NPIV Models: Comparisons of Neural Network Estimators (by Jiafeng Chen, Xiaohong Chen, Elie Tamer)
 
We investigate the computational performance of various Artificial Neural Networks (ANNs) in nonparametric instrumental variables (NPIV) models with high-dimensional covariates that are relevant to empirical work in economics. We present two efficient procedures for estimation and inference on (weighted) average derivatives: an optimally weighted sieve minimum distance (OP-SMD) procedure and a sieve efficient score equation (ES) procedure. Both procedures use ANN sieves to approximate the unknown NPIV function and are asymptotically first-order equivalent, although their finite-sample performances differ. We provide a detailed practitioner’s recipe for implementing both efficient estimators and their inefficient counterparts (the identity-weighted SMD and the inefficient score estimators). This involves choosing tuning parameters not only for the unknown functions (the NPIV function and the conditional expectations) but also for estimating the optimal weights in the OP-SMD estimator and the Riesz representers in the ES estimator. We conduct a large set of Monte Carlo experiments that compare finite-sample performance in complicated designs involving NPIV functions of up to 13 continuous covariates and various underlying nonlinearities and covariate correlations. Some of the takeaways are: 1) tuning and optimization are more delicate in ANN estimation; 2) given proper tuning, ANNs with different layers and different activations can all perform well; 3) ANN OP-SMD estimators and ANN inefficient score estimators perform adequately; 4) stable inference is more difficult to achieve with ANN (than spline) estimators; 5) there appears to be a gap between current implementation and approximation theory. Finally, we apply ANN NPIV to estimate average derivatives in two empirical demand examples.
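For orientation only (the notation below is ours and may differ from the paper's): the NPIV structural function h_0 and a weighted average derivative functional theta_0 of the kind discussed above can be written as

  E[Y - h_0(X) \mid W] = 0, \qquad \theta_0 = E\big[\, w(X)\, \partial h_0(X)/\partial x_1 \,\big],

and an SMD estimator replaces h_0 by an ANN sieve and minimizes a (possibly optimally weighted) quadratic form in the estimated conditional moments,

  \hat{h} = \arg\min_{h \in \mathcal{H}_{\mathrm{ANN}}} \ \frac{1}{n}\sum_{i=1}^{n} \hat{m}(W_i, h)^{\top}\, \hat{\Sigma}(W_i)^{-1}\, \hat{m}(W_i, h), \qquad m(W, h) = E[\,Y - h(X) \mid W\,],

where \hat{\Sigma}(W) = I gives the identity-weighted SMD and an estimate of the optimal weighting gives the OP-SMD version.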

Paper 2: Neural Network Inference on Nonparametric Conditional Moment Restrictions with Weakly Dependent Data (by Xiaohong Chen, Yuan Liao and Weichen Wang)
 
Deep Neural Networks (DNNs) are nonlinear sieves that can approximate nonlinear functions of high-dimensional variables much more flexibly than various linear sieves (or series). This paper considers quasi-likelihood ratio (QLR) based inference on expectation functionals of time-series nonparametric conditional moment restrictions via DNNs. We estimate the expectation functional and the DNN-sieve-approximated structural functions jointly by minimizing a modified optimally weighted minimum distance criterion. Regardless of whether the expectation functional is regular (i.e., root-T estimable) or not, the DNN QLR statistic is shown to be asymptotically chi-square distributed, and can therefore be used to construct confidence bands. DNN QLR inference on averaged partial means and averaged squared partial derivatives in nonparametric conditional mean instrumental variables (NPIV) and nonparametric conditional quantile IV (NPQIV) models is presented as an illustration. A small Monte Carlo study shows the finite-sample performance of the procedure.
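As a rough schematic (again in our notation; the paper's construction may differ in detail): writing \hat{Q}_T(\theta, h) for the modified optimally weighted minimum distance criterion, the joint estimator solves

  (\hat{\theta}, \hat{h}) = \arg\min_{\theta,\; h \in \mathcal{H}_{\mathrm{DNN}}} \hat{Q}_T(\theta, h),

and the profiled QLR statistic for a candidate value \theta is

  \mathrm{QLR}_T(\theta) = T \Big[ \min_{h \in \mathcal{H}_{\mathrm{DNN}}} \hat{Q}_T(\theta, h) - \hat{Q}_T(\hat{\theta}, \hat{h}) \Big],

whose asymptotic chi-square distribution (as stated in the abstract) yields confidence sets of the form \{\theta : \mathrm{QLR}_T(\theta) \le \chi^2_{1,\,1-\alpha}\}.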
 

Click here to view the CV.
 
 
 

This seminar will be held virtually via Zoom. A confirmation email with the Zoom details will be sent to the registered email address by 24 October 2021.
 

Xiaohong Chen

Yale University
 
 
Econometric theory
Semi/nonparametric estimation and inference methods
Sieve methods
Nonlinear time series
Semi/nonparametric models
 
 

25 October 2021 (Monday)

 
 

9.00am - 10.30am