Abstract: We consider estimation in Cox (1972) regression with missing covariates, focusing on the situation where the observable covariates and surrogates are continuous. To estimate the induced relative risk, we approximate it by a function of the conditional mean and conditional variance of the missing variable, given the observable covariate and surrogate variables in the risk set at each failure time. This approach may be viewed as a higher-order extension of the usual regression calibration method, and hence can be expected to reduce bias, especially when the relative risk or the estimation error from the surrogate variables is large. Because the proposed estimator arises from an approximation, the magnitude of any resulting bias needs to be studied. Asymptotic distribution theory is developed and small-sample performance is examined. We illustrate the method with an example from a medical study.
Key words and phrases: Cox regression, estimating equation, measurement error model, regression calibration, surrogate covariate.