We propose an easy-to-implement method for small-sample parametric inference about the root of an estimating equation expressible as a quadratic form in normal random variables. The method is based on saddlepoint approximations to the distribution of the estimating equation whose unique root is a parameter's maximum likelihood estimator (MLE), with conditional MLEs substituted for the remaining (nuisance) parameters. Monotonicity of the estimating equation in its parameter argument allows us to relate these approximations to those for the estimator of interest. The proposed method is equivalent to a parametric bootstrap percentile approach in which Monte Carlo simulation is replaced by saddlepoint approximation. It finds applications in many areas of statistics, including nonlinear regression, time series analysis, inference on ratios of regression parameters in linear models, and calibration. We demonstrate the method on classical examples from nonlinear regression and ratio-of-regression-parameters problems. Simulation results for these show that the proposed method, besides being generally easier to implement, yields confidence intervals whose lengths and coverage probabilities compare favourably with those of several competing methods proposed in the literature over the past half-century.
- Automatic percentile method
- Nonlinear regression
- Parametric bootstrap confidence interval
- Quadratic estimating equation
- Ratio of regression parameters
- Saddlepoint approximation
- Time series
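To make the saddlepoint machinery concrete, the sketch below applies the standard Lugannani-Rice saddlepoint approximation to the distribution of a quadratic form $Q = \sum_i \lambda_i Z_i^2$ in i.i.d. standard normal variables, the type of statistic the abstract refers to. This is a generic illustration of saddlepoint CDF approximation for such quadratic forms, not the paper's full inferential procedure; the function name `saddlepoint_cdf` and the choice of SciPy root-finding are our own assumptions.

```python
import numpy as np
from scipy.optimize import brentq
from scipy.stats import norm

def saddlepoint_cdf(q, lam):
    """Lugannani-Rice saddlepoint approximation to P(Q <= q), where
    Q = sum_i lam[i] * Z_i**2 and the Z_i are i.i.d. N(0, 1).

    Assumes q != E[Q] = sum(lam), so the saddlepoint t_hat is nonzero
    and the Lugannani-Rice formula does not degenerate.
    """
    lam = np.asarray(lam, dtype=float)
    # Cumulant generating function of Q and its first two derivatives;
    # all are valid on the strip where 1 - 2*lam*t > 0 for every lam.
    K  = lambda t: -0.5 * np.sum(np.log1p(-2.0 * lam * t))
    K1 = lambda t: np.sum(lam / (1.0 - 2.0 * lam * t))
    K2 = lambda t: np.sum(2.0 * lam**2 / (1.0 - 2.0 * lam * t)**2)
    # Bracket the saddlepoint strictly inside the domain of K.
    eps = 1e-8
    hi = 1.0 / (2.0 * lam.max()) - eps if lam.max() > 0 else 50.0
    lo = 1.0 / (2.0 * lam.min()) + eps if lam.min() < 0 else -50.0
    # Saddlepoint equation: K'(t_hat) = q.
    t_hat = brentq(lambda t: K1(t) - q, lo, hi)
    w = np.sign(t_hat) * np.sqrt(2.0 * (t_hat * q - K(t_hat)))
    u = t_hat * np.sqrt(K2(t_hat))
    # Lugannani-Rice tail: P(Q >= q) ~ 1 - Phi(w) + phi(w)*(1/u - 1/w).
    tail = 1.0 - norm.cdf(w) + norm.pdf(w) * (1.0 / u - 1.0 / w)
    return 1.0 - tail

# Sanity check against a known case: with lam = [1], Q is chi-squared
# with one degree of freedom, so the result should be close to
# scipy.stats.chi2.cdf(q, 1).
p_hat = saddlepoint_cdf(3.0, [1.0])
```

In the single-eigenvalue case the approximation typically agrees with the exact chi-squared CDF to two or three decimal places, which is the accuracy that motivates replacing Monte Carlo simulation by saddlepoint approximation in the percentile construction described above.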