Mathematical Methods in Reliability Research - Lifetime data analysis

Tuesday 22 May 2012
10:00 am - 4:15 pm
Event type: Seminar
Location: John Crank Room 128

Programme

1000-1030      Refreshments

1030-1230      Talks and Discussion

1030-1055      Natural constructions of hazard and intensity functions
                     Prof. David Percy (Salford)

1055-1120      Monitoring using Event History Data
                     Dr Axel Gandy (Imperial, London)

1120-1145      Warranty data analysis: new results and challenges
                     Dr Shaomin Wu (Cranfield)

1145-1210      Multivariate quantile-quantile plots and related tests
                     Dr Subhra Sankar Dhar (Cambridge)

1210-1225      Joint modelling of longitudinal and survival analysis
                     Dr Yanchun Bao (Brunel)

1225-1325      Buffet Lunch and Discussion

1325-1350      Nonparametric predictive inference for reliability of coherent systems
                     Prof. Frank Coolen (Durham)

1350-1415      Optimal design for censored lifetime experiments
                     Dr Alan Kimber (Southampton)

1415-1425      Small sample inference of GPD with application in MTTF and Volatility
                     Mr Zhuo Sheng (Brunel)

1425-1500      Tea break and Discussion

1500-1525      Survival Models and Threshold Crossings
                     Prof. Martin Newby (City)

1525-1550      Accelerated failure time models for censored survival data under referral bias
                     Dr Hongsheng Dai (Brighton)

1550-1600      Brief summary and a reminder of the next two meetings, in Durham and Glasgow.

Abstracts

Natural constructions of hazard and intensity functions
Prof. David Percy  (Salford)

We investigate the reliability distributions for variants of standard hazard functions that model periodicity, impulses, amplifications, damping, orders and extrema. We also consider parallel-series system configurations and the resulting implications for intensity functions of complex systems, concluding with a discussion of prior elicitation and practical applications.
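As a hedged illustration of one such construction (not taken from the talk): a hazard with periodic modulation, h(t) = lam*(1 + a*sin(omega*t)), has a closed-form cumulative hazard, so its survival function follows immediately. The rate, amplitude and frequency below are arbitrary illustrative choices:

```python
import math

def periodic_hazard_survival(t, lam=1.0, a=0.5, omega=2 * math.pi):
    """Survival function S(t) = exp(-H(t)) for the illustrative periodic
    hazard h(t) = lam * (1 + a*sin(omega*t)), 0 <= a < 1, whose cumulative
    hazard integrates in closed form:
    H(t) = lam*t + lam*a*(1 - cos(omega*t)) / omega."""
    H = lam * t + lam * a * (1 - math.cos(omega * t)) / omega
    return math.exp(-H)
```

With omega = 2*pi the modulation averages out over each whole period, so S(1) coincides with the plain exponential value exp(-lam).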

Monitoring using Event History Data
Dr Axel Gandy (Imperial, London)

This talk discusses how survival analysis and event history models can be used for monitoring purposes. In particular, CUSUM charts based on partial likelihood ratios will be discussed. As with most control charts, this method requires an alarm threshold. Calibration of the threshold to achieve a desired in-control property (e.g. average run length, false alarm probability) often ignores the fact that the in-control distribution is usually only estimated. A method to take account of this estimation error when calibrating charts will be suggested.
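A minimal sketch of the CUSUM idea for lifetime data (a simple fully parametric version, not the talk's partial-likelihood charts): monitoring exponential lifetimes for a rate increase from lam0 to lam1, with the threshold h fixed arbitrarily here rather than calibrated as the talk proposes:

```python
import math
import random

def exp_cusum(times, lam0, lam1, h):
    """CUSUM chart for exponential lifetimes, accumulating the
    log-likelihood ratio of out-of-control rate lam1 (> lam0) against
    in-control rate lam0; signals when the statistic reaches h.
    Returns the 1-based index of the first alarm, or None."""
    s = 0.0
    for i, t in enumerate(times, 1):
        llr = math.log(lam1 / lam0) - (lam1 - lam0) * t
        s = max(0.0, s + llr)
        if s >= h:
            return i
    return None

random.seed(1)
# 30 in-control lifetimes (rate 1) followed by 30 degraded ones (rate 3)
times = [random.expovariate(1.0) for _ in range(30)]
times += [random.expovariate(3.0) for _ in range(30)]
print(exp_cusum(times, lam0=1.0, lam1=3.0, h=4.0))
```

The chart resets at zero after favourable evidence, so it reacts quickly once short lifetimes start to dominate.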

Accelerated failure time models for censored survival data under referral bias
Dr Hongsheng Dai (University of Brighton)

The estimation of progression to liver cirrhosis and the identification of its risk factors are often of epidemiological interest in studies of the natural history of hepatitis C. In most hepatitis C cohort studies, patients are recruited to the cohort with referral bias, because clinically the patients with more rapid disease progression are preferentially referred to liver clinics. A pair of correlated event times may be observed for each patient: time to development of cirrhosis and time to referral to a cohort. This paper considers accelerated failure time models to study the effects of covariates on progression to cirrhosis. A new non-parametric estimator is proposed to handle a flexible bivariate distribution of the cirrhosis and referral times and to take the referral bias into account. The asymptotic normality of the proposed coefficient estimator is also established. Numerical studies show that the coefficient estimator and its covariance function estimator perform well.
                  
Warranty data analysis: new results and challenges
Dr Shaomin Wu (Cranfield)

Warranty claims and supplementary data contain useful information about product quality and reliability. Analysing such data can therefore benefit manufacturers in identifying early warnings of abnormalities in their products, providing useful information about failure modes to aid design modification, estimating product reliability for deciding on warranty policy, and forecasting future warranty claims needed for preparing fiscal plans. In the last two decades, considerable research has been conducted in warranty data analysis (WDA) from several different perspectives. This presentation reports our newly developed approaches to warranty forecasting and discusses some existing challenges.
                                    
Multivariate quantile-quantile plots and related tests
Dr Subhra Sankar Dhar (Cambridge University)

The univariate quantile-quantile (Q-Q) plot is a well-known graphical tool for examining whether two data sets are generated from the same distribution. It is also used to determine how well a specified probability distribution fits a given sample. In this talk, we develop and study a multivariate version of the Q-Q plot based on spatial quantiles (see Chaudhuri (1996), JASA). The usefulness of the proposed graphical device will be illustrated on various real and simulated data sets, some of which have fairly large dimensions. We will also develop certain statistical tests related to the proposed multivariate Q-Q plots and study their asymptotic properties. The performance of those tests relative to some other well-known tests for multivariate distributions will also be discussed. This is joint work with Biman Chakraborty and Probal Chaudhuri.
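For orientation, the univariate construction that the talk generalises can be sketched as follows; the nearest-rank quantile rule is one simple illustrative choice, and the spatial-quantile multivariate version is the subject of the talk itself:

```python
def empirical_quantile(sample, p):
    """Nearest-rank empirical quantile of a sample at probability p."""
    s = sorted(sample)
    k = max(0, min(len(s) - 1, int(p * len(s) + 0.5) - 1))
    return s[k]

def qq_pairs(x, y, probs):
    """Matched quantile pairs of two samples; plotting these points and
    comparing them with the line y = x is the classical Q-Q diagnostic."""
    return [(empirical_quantile(x, p), empirical_quantile(y, p))
            for p in probs]
```

If the two samples come from the same distribution, the paired quantiles scatter around the identity line; systematic curvature indicates a distributional difference.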
       
Nonparametric predictive inference for reliability of coherent systems
Prof. Frank Coolen (Durham)

Nonparametric predictive inference (NPI) is a statistical method using relatively few modelling assumptions, enabled through the use of lower and upper probabilities to quantify uncertainty. In this talk, lower and upper survival functions for coherent systems are presented, based on test results in the form of failure times of components exchangeable with those in the system. As it is a data-driven approach, such test failure times must be available for each type of component in the system. It is also shown how partial knowledge of the system structure can be used, which has the advantage of possibly reducing the computational effort in the case of a specific reliability target.
(Joint work with Ahmad Aboalkhair, Abdullah Al-nefaieeh and Tahani Coolen-Maturi)
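As a hedged sketch of the single-component building block (the extension to coherent systems is the subject of the talk): under Hill's A_(n) assumption, NPI gives the following lower and upper probabilities that the next item of an exchangeable type survives beyond a time t not equal to any observation:

```python
def npi_survival_bounds(failure_times, t):
    """NPI lower and upper probabilities that the NEXT component of this
    exchangeable type survives beyond t, from Hill's A_(n) assumption:
    if r of the n observed failure times exceed t, then
    lower = r/(n+1) and upper = (r+1)/(n+1)."""
    n = len(failure_times)
    r = sum(1 for x in failure_times if x > t)
    return r / (n + 1), (r + 1) / (n + 1)
```

The gap of 1/(n+1) between the bounds reflects the imprecision from making no distributional assumption; it shrinks as more test data become available.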

Optimal design for censored lifetime experiments
Dr Alan Kimber (University of Southampton)

Censoring may occur in many industrial or biomedical 'time to event' experiments. Efficient designs for such experiments are needed, but finding them can be problematic, since the statistical models involved will usually be nonlinear, making the optimal design dependent on the unknown parameters. We provide analytical characterisations of locally D- and c-optimal designs for a class of models that includes the natural proportional hazards parameterisation of the exponential regression model, thus substantially reducing the numerical effort of the design search. Links to designs for the semi-parametric Cox proportional hazards model are also discussed.

Small sample inference of GPD with application in MTTF and Volatility
Mr Zhuo Sheng (Brunel)

Exact statistical inference for the generalised Pareto distribution (GPD) under extreme value theory is often subject to small sample sizes. Estimation becomes even more difficult for extremely high quantiles of the GPD, although such quantiles are very useful measures in risk analysis, with applications to MTTF (mean time to failure) and volatility.
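For context, a sketch of the high-quantile calculation that becomes delicate in small samples. The parameter values in the tests are arbitrary, and estimating the scale sigma, shape xi and exceedance probability zeta_u from few exceedances is the hard part such work addresses:

```python
import math

def gpd_quantile(u, sigma, xi, zeta_u, p):
    """Quantile x_p with P(X > x_p) = 1 - p for a GPD fitted to
    exceedances over threshold u, using the standard tail model
    P(X > x) = zeta_u * (1 + xi*(x - u)/sigma)**(-1/xi),
    where zeta_u = N_u/n estimates P(X > u).  Requires 1 - p < zeta_u
    (i.e. a quantile lying above the threshold)."""
    if abs(xi) < 1e-12:                      # xi -> 0: exponential tail
        return u + sigma * math.log(zeta_u / (1 - p))
    return u + (sigma / xi) * ((zeta_u / (1 - p)) ** xi - 1)
```

With only a handful of exceedances, small errors in xi are amplified by the power term, which is why extreme quantiles are so much harder to estimate than central ones.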

Joint modelling of longitudinal and survival analysis
Dr Yanchun Bao (Brunel)

In survival analysis, time-dependent covariates are usually present as longitudinal data collected periodically and measured with error. The longitudinal data can be assumed to follow a linear mixed effects model, and Cox regression models may be used for modelling the survival events. The hazard rate of the survival times depends on the underlying time-dependent covariate measured with error, which may be described by random effects. Most existing methods proposed for such models make a parametric distributional assumption on the random effects and specify a normally distributed error term for the linear mixed effects model. These assumptions may not always be valid in practice. In this paper we propose a new likelihood method for Cox regression models with error-contaminated time-dependent covariates. The proposed method does not require any parametric distributional assumption on the random effects or the random errors. Asymptotic properties of the parameter estimators are provided. Simulation results show that, in certain situations, the proposed methods are more efficient than existing ones.


Page last updated: Friday 11 May 2012