Similar papers 2
October 26, 2020
An important challenge in statistical analysis lies in controlling the bias of estimators due to the ever-increasing data size and model complexity. Approximate numerical methods and data features like censoring and misclassification often result in analytical and/or computational challenges when implementing standard estimators. As a consequence, consistent estimators may be difficult to obtain, especially in complex and/or high dimensional settings. In this paper, we study ...
December 20, 2014
Response-biased sampling, in which samples are drawn from a population according to the values of the response variable, is common in biomedical, epidemiological, economic and social studies. In particular, the complete observations in data with censoring, truncation or missing covariates can be regarded as response-biased sampling under certain conditions. This paper proposes to use transformation models, known as the generalized accelerated failure time model in econome...
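For orientation, the transformation model alluded to here (the generalized accelerated failure time form) is typically written, in a generic textbook formulation that may differ in detail from the paper's,

$$ H(T) = -X^{\top}\beta + \varepsilon, $$

where $H$ is an unknown, strictly increasing transformation and the distribution of the error $\varepsilon$ is left unspecified; taking $H = \log$ with a known error law recovers the classical accelerated failure time model.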
January 27, 2016
Functional covariates are common in many medical, biodemographic, and neuroimaging studies. The aim of this paper is to study functional Cox models with right-censored data in the presence of both functional and scalar covariates. We study the asymptotic properties of the maximum partial likelihood estimator and establish the asymptotic normality and efficiency of the estimator of the finite-dimensional parameter. Under the framework of reproducing kernel Hilbert space, the e...
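As a point of reference, a functional Cox model with scalar covariates $Z$ and a functional covariate $X(\cdot)$ is commonly specified as

$$ \lambda\big(t \mid Z, X\big) = \lambda_0(t)\, \exp\Big( \theta^{\top} Z + \int_{\mathcal{S}} X(s)\,\beta(s)\, ds \Big), $$

where $\lambda_0$ is an unspecified baseline hazard, $\theta$ is the finite-dimensional parameter mentioned above, and the coefficient function $\beta(\cdot)$ is estimated within a reproducing kernel Hilbert space; this is the standard form, which may differ in details from the paper's exact model.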
August 15, 2022
Many epidemiological and clinical studies aim at analyzing a time-to-event endpoint. A common complication is right censoring. In some cases, it arises because subjects are still alive when the study terminates or because they move out of the study area, in which case right censoring is typically treated as independent or non-informative. Such an assumption can be further relaxed to conditionally independent censoring by leveraging possibly time-varying covariate information, if availa...
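For concreteness, with event time $T$, censoring time $C$, and covariate path $Z(\cdot)$, the two assumptions mentioned here are usually formalized as

$$ T \perp C \quad \text{(independent censoring)}, \qquad T \perp C \mid Z(\cdot) \quad \text{(conditionally independent censoring)}, $$

so that under the weaker assumption censoring may depend on the covariates, as long as it carries no further information about $T$ beyond them. (These are standard textbook definitions, stated here for orientation.)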
December 5, 2023
In observational studies, unmeasured confounders present a crucial challenge in accurately estimating desired causal effects. To calculate the hazard ratio (HR) in Cox proportional hazards models, which are relevant for time-to-event outcomes, methods such as Two-Stage Residual Inclusion and Limited Information Maximum Likelihood are typically employed. However, these methods raise concerns, including the potential for biased HR estimates and issues with parameter identificati...
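As a rough illustration of the Two-Stage Residual Inclusion idea mentioned here (a minimal sketch under made-up assumptions, not the paper's method): the first stage regresses the endogenous exposure on an instrument, and the second stage adds the first-stage residual as an extra covariate in the Cox model. The simulated data, column names, and the use of the lifelines library below are all illustrative choices.

```python
# A minimal, hypothetical sketch of Two-Stage Residual Inclusion (2SRI)
# with a Cox model; data, column names, and effect sizes are all made up.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # assumed to be installed

rng = np.random.default_rng(0)
n = 500
z = rng.normal(size=n)                    # instrument
u = rng.normal(size=n)                    # unmeasured confounder
x = 0.8 * z + u + rng.normal(size=n)      # endogenous exposure

# Stage 1: regress the exposure on the instrument and keep the residual,
# which serves as a proxy for the unmeasured confounder.
zmat = np.column_stack([np.ones(n), z])
coef, *_ = np.linalg.lstsq(zmat, x, rcond=None)
resid = x - zmat @ coef

# Event and censoring times from simple exponential hazards.
t = rng.exponential(1.0 / np.exp(0.5 * x + u))
c = rng.exponential(2.0, size=n)
df = pd.DataFrame({
    "time": np.minimum(t, c),
    "event": (t <= c).astype(int),
    "x": x,
    "resid": resid,                       # Stage 2: residual enters the Cox model
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
print(cph.summary[["coef", "se(coef)"]])
```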
May 27, 2020
This paper provides guidance for researchers with some mathematical background on the conduct of time-to-event analysis in observational studies based on intensity (hazard) models. Discussions of basic concepts like time axis, event definition and censoring are given. Hazard models are introduced, with special emphasis on the Cox proportional hazards regression model. We provide checklists that may be useful both when fitting the model and when assessing its goodness of fit and w...
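In the same spirit as the checklists mentioned, here is a minimal sketch of fitting a Cox proportional hazards model and running proportional-hazards diagnostics with the lifelines library; the dataset and the p-value threshold are illustrative choices, not the paper's.

```python
# Minimal Cox proportional hazards fit plus diagnostics with lifelines.
from lifelines import CoxPHFitter
from lifelines.datasets import load_rossi

df = load_rossi()                      # recidivism data shipped with lifelines
cph = CoxPHFitter()
cph.fit(df, duration_col="week", event_col="arrest")
cph.print_summary()                    # hazard ratios with confidence intervals
cph.check_assumptions(df, p_value_threshold=0.05)  # proportional-hazards checks
```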
April 12, 2022
Regression analysis based on many covariates is becoming increasingly common. However, when the number of covariates $p$ is of the same order as the number of observations $n$, maximum likelihood regression becomes unreliable due to overfitting. This typically leads to systematic estimation biases and increased estimator variances. It is crucial for inference and prediction to quantify these effects correctly. Several methods have been proposed in the literature to overcome overf...
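A small simulation sketches the phenomenon (all settings below are illustrative assumptions, not taken from the paper): when $p/n$ is non-negligible, logistic maximum likelihood estimates are systematically inflated relative to the truth, which a least-squares regression of $\hat\beta$ on $\beta$ makes visible.

```python
# Illustrative simulation: coefficient inflation of the logistic MLE
# when p/n is non-negligible (here p/n = 0.1).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, p = 2000, 200
beta = np.zeros(p)
beta[: p // 2] = 2.0 / np.sqrt(p)      # moderate, spread-out signal
X = rng.normal(size=(n, p))
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta)))

beta_hat = sm.Logit(y, X).fit(disp=0).params

# Least-squares slope of beta_hat on beta: values > 1 indicate inflation.
print("inflation factor:", beta @ beta_hat / (beta @ beta))
```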
December 13, 2018
We propose a new likelihood-based approach for estimation, inference and variable selection for parametric cure regression models in time-to-event analysis under random right-censoring. In this context, it often happens that some subjects are "cured", i.e., they will never experience the event of interest. Then, the sample of censored observations is an unlabeled mixture of cured and "susceptible" subjects. Using inverse probability of censoring weighting (IPCW), we propose a li...
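For orientation, the IPCW device referred to here typically weights each uncensored observation by the inverse of its estimated probability of remaining uncensored; in a generic form (not necessarily the paper's exact construction),

$$ w_i = \frac{\Delta_i}{\widehat{G}(Y_i-)}, $$

where $\Delta_i$ is the event indicator (1 if the event is observed), $Y_i$ the observed follow-up time, and $\widehat{G}$ the Kaplan-Meier estimator of the censoring survival function; in expectation the weighted likelihood then mimics the complete-data likelihood.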
December 9, 2013
When observations are subject to right censoring, weighted least squares with appropriate weights (to adjust for censoring) is sometimes used for parameter estimation. With Stute's weighted least squares method, when the largest observation is censored (denoted $Y_{(n)}^+$), it is natural to apply the redistribution-to-the-right algorithm of Efron (1967). However, Efron's redistribution algorithm can lead to bias and inefficiency in estimation. This study explains the issues clearly ...
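To make the setup concrete: Stute's weights are the Kaplan-Meier jumps attached to the ordered observations. The sketch below (illustrative code, not from the paper) computes them and shows that when the largest observation is censored the weights sum to less than one; that stranded mass is what Efron's redistribution-to-the-right algorithm reassigns.

```python
import numpy as np

def stute_weights(y, delta):
    """Kaplan-Meier jump weights (Stute's weights) for the sorted data.

    y: observed times; delta: 1 = event, 0 = censored.
    Returns the weights in sorted order plus the sorting permutation.
    """
    order = np.argsort(y, kind="stable")
    d = delta[order].astype(float)
    n = len(y)
    w = np.empty(n)
    surv = 1.0                            # running Kaplan-Meier survival term
    for i in range(n):
        w[i] = d[i] * surv / (n - i)      # jump of the KM estimator at y[(i)]
        surv *= ((n - i - 1) / (n - i)) ** d[i]
    return w, order

y = np.array([2.0, 5.0, 3.0, 7.0])
delta = np.array([1, 1, 0, 0])            # largest observation is censored
w, order = stute_weights(y, delta)
print(w, w.sum())  # weights sum to < 1: mass stranded at the censored maximum
```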
August 15, 2017
Accurately predicting the time of occurrence of an event of interest is a critical problem in longitudinal data analysis. One of the main challenges in this context is the presence of instances whose event outcomes become unobservable after a certain time point, or that do not experience any event during the monitoring period. Such a phenomenon is called censoring and can be effectively handled using survival analysis techniques. Traditionally, statistical app...