960 results for Right censored data


Relevance:

90.00%

Publisher:

Abstract:

The log-Burr XII regression model for grouped survival data is evaluated in the presence of many ties. The methodology for grouped survival data is based on life tables, where the times are grouped into k intervals, and we fit discrete lifetime regression models to the data. The model parameters are estimated by maximum likelihood and jackknife methods. To detect influential observations in the proposed model, diagnostic measures based on case deletion, so-called global influence, and influence measures based on small perturbations in the data or in the model, referred to as local influence, are used. In addition to these measures, the total local influence and influential estimates are also used. We conduct Monte Carlo simulation studies to assess the finite-sample behavior of the maximum likelihood estimators of the proposed model for grouped survival data. A real data set is analyzed using a regression model for grouped data.
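The life-table construction the abstract describes can be sketched as follows — a minimal, self-contained illustration with made-up interval counts, not the paper's log-Burr XII model. In each of the k intervals, the discrete hazard is deaths over numbers at risk, and survival is the running product of (1 − hazard):

```python
def life_table(deaths, at_risk):
    """Discrete hazard and survival estimates for k grouped intervals."""
    hazards = [d / n for d, n in zip(deaths, at_risk)]
    survival, s = [], 1.0
    for h_j in hazards:
        s *= (1.0 - h_j)          # S(t_j) = prod_{i<=j} (1 - h_i)
        survival.append(s)
    return hazards, survival

# Illustrative counts: deaths and numbers at risk in three grouped intervals
h, S = life_table([5, 8, 4], [100, 90, 70])
```

A regression model like log-Burr XII would parameterise these discrete hazards in terms of covariates rather than estimate them nonparametrically.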

Relevance:

90.00%

Publisher:

Abstract:

"US 84-10/8."

Relevance:

90.00%

Publisher:

Abstract:

The lognormal distribution has abundant applications in various fields. In the literature, most inferences on the two parameters of the lognormal distribution are based on Type-I censored sample data. However, exact measurements are not always attainable, especially when the observation is below or above the detection limits, and only the numbers of measurements falling into predetermined intervals can be recorded instead. Such data are called grouped data. In this paper, we show the existence and uniqueness of the maximum likelihood estimators of the two parameters of the underlying lognormal distribution with Type-I censored data and grouped data. The proof is first established for the normal distribution and then extended to the lognormal distribution through the invariance property. The results are applied to estimate the median and mean of the lognormal population.

Relevance:

90.00%

Publisher:

Abstract:

A compositional multivariate approach is used to analyse regional-scale soil geochemical data obtained as part of the Tellus Project, generated by the Geological Survey Northern Ireland (GSNI). The multi-element total concentration data presented comprise XRF analyses of 6862 rural soil samples collected at 20 cm depth on a non-aligned grid at one site per 2 km². Censored data were imputed using published detection limits. Using these imputed values for 46 elements (including LOI), each soil sample site was assigned to the regional geology map provided by GSNI, initially using the dominant lithology for the map polygon. Northern Ireland includes a diversity of geology representing a stratigraphic record from the Mesoproterozoic up to and including the Palaeogene. However, the advance of ice sheets and their meltwaters over the last 100,000 years has left at least 80% of the bedrock covered by superficial deposits, including glacial till and post-glacial alluvium and peat. The question is to what extent the soil geochemistry reflects the underlying geology or the superficial deposits. To address this, the geochemical data were transformed using centred log ratios (clr) to meet the requirements of compositional data analysis and avoid closure issues. Compositional multivariate techniques, including compositional Principal Component Analysis (PCA) and minimum/maximum autocorrelation factor (MAF) analysis, were then used to determine the influence of the underlying geology on the soil geochemistry signature. PCA showed that 72% of the variation was captured by the first four principal components (PCs), implying “significant” structure in the data. Analysis of variance showed that only 10 PCs were necessary to classify the soil geochemical data. To improve on PCA by using the spatial relationships of the data, a classification based on MAF analysis was undertaken using the first 6 dominant factors.
Understanding the relationship between soil geochemistry and superficial deposits is important for environmental monitoring of fragile ecosystems such as peat. To explore whether peat cover could be predicted from the classification, the lithology designation was adapted to include the presence of peat, based on GSNI superficial deposit polygons, and linear discriminant analysis (LDA) was undertaken. Prediction accuracy for the LDA classification improved from 60.98%, based on PCA with 10 principal components, to 64.73% using MAF based on the 6 most dominant factors. The misclassification of peat may reflect degradation of peat-covered areas since the creation of the superficial deposit classification. Further work will examine the influence of underlying lithologies on elemental concentrations in peat composition and the effect of this on the classification analysis.
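The centred log-ratio (clr) transform used above is straightforward to implement; a minimal sketch on a made-up 4-part composition follows. A clr-transformed row always sums to zero, which is what removes the closure constraint before PCA:

```python
import math

def clr(composition):
    """Centred log-ratio transform of one composition (all parts > 0)."""
    logs = [math.log(x) for x in composition]
    g = sum(logs) / len(logs)     # log of the geometric mean
    return [l - g for l in logs]

# Toy 4-part composition standing in for element concentrations
row = clr([10.0, 25.0, 60.0, 5.0])
```

In practice each soil sample's 46-element vector would be clr-transformed this way before the compositional PCA or MAF analysis.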

Relevance:

90.00%

Publisher:

Abstract:

Survival models are widely applied in the engineering field to model time-to-event data, since censored data are a common issue there. Whether parametric or not, such models may not always represent a good fit in the case of heterogeneous data. The present study relies on critical pump survival data, where traditional parametric regression might be improved in order to obtain better approximations. Accounting for censoring, we used an empirical method to split the data into two subgroups so that separate models could be fitted to our censored data, and mixed two distinct distributions following a mixture-models approach. We conclude that it is a good method for fitting data that do not follow a usual parametric distribution while achieving reliable parameters. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which could be an advantage when comparing with the current maintenance policies to decide whether changes should be introduced.
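A two-component survival mixture, and the constant-cumulative-hazard inspection rule, can be sketched as follows. This is an illustrative Weibull mixture with invented parameters, not the fitted pump model from the study:

```python
import math

def mixture_surv(t, p, k1, l1, k2, l2):
    """Survival of a two-component Weibull mixture: p*S1(t) + (1-p)*S2(t)."""
    s1 = math.exp(-((t / l1) ** k1))
    s2 = math.exp(-((t / l2) ** k2))
    return p * s1 + (1 - p) * s2

def cum_hazard(t, *params):
    """Cumulative hazard H(t) = -ln S(t)."""
    s = mixture_surv(t, *params)
    return float("inf") if s <= 0.0 else -math.log(s)

def inspection_time(target_h, *params, lo=0.0, hi=1e4):
    """Bisection for the time at which H(t) reaches the constant target_h."""
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if cum_hazard(mid, *params) < target_h:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Invented mixture: 60% long-lived component, 40% early-failure component
params = (0.6, 1.5, 1000.0, 0.8, 200.0)
t_star = inspection_time(0.1, *params)   # inspect when H(t) reaches 0.1
```

Because H(t) is strictly increasing, each inspection interval under a constant-cumulative-hazard policy can be found by the same bisection with successive hazard targets.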

Relevance:

80.00%

Publisher:

Abstract:

Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers' expectations by supplying reliable, good-quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and for future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed as a structure of probability, facilitate decision making and cost-effectiveness analyses. This research therefore investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in the available datasets and data flow; a data-capturing algorithm using Bayesian decision-making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, the adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart's signal. This estimate aims to facilitate root-cause analysis in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time frame prior to the signal. This approach yields highly informative estimates of the change point parameters, since the results are based on full probability distributions. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of shifts, compared against a priori known causes, detected by control charts monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The Bayesian change point estimators are then developed further for healthcare surveillance of processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture patient mix, via covariates, through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention which is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also be extended to the industrial and business domains where quality monitoring was initially developed.
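As a toy illustration of Poisson change point estimation — far simpler than the hierarchical MCMC estimators developed in the thesis — the posterior over the change time can be computed exactly on a grid when the pre- and post-change rates are assumed known:

```python
import math

def changepoint_posterior(counts, lam1, lam2):
    """Posterior over the change time tau (uniform prior over 1..n-1),
    assuming known Poisson rates lam1 before and lam2 from tau onwards."""
    def loglik(tau):
        ll = 0.0
        for i, y in enumerate(counts):
            lam = lam1 if i < tau else lam2
            ll += y * math.log(lam) - lam - math.lgamma(y + 1)
        return ll
    lls = [loglik(tau) for tau in range(1, len(counts))]
    m = max(lls)                                   # stabilise before exponentiating
    weights = [math.exp(l - m) for l in lls]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical weekly adverse-event counts with an apparent rate jump after week 3
post = changepoint_posterior([1, 0, 1, 5, 6, 7], 1.0, 6.0)
```

The full models in the thesis place priors on the rates as well and integrate them out by MCMC, but the posterior-over-tau structure is the same.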

Relevance:

80.00%

Publisher:

Abstract:

STUDY DESIGN: Reliability and case-control injury study. OBJECTIVES: 1) To determine if a novel device, designed to measure eccentric knee flexor strength via the Nordic hamstring exercise (NHE), displays acceptable test-retest reliability; 2) to determine normative values for eccentric knee flexor strength derived from the device in individuals without a history of hamstring strain injury (HSI); and 3) to determine if the device could detect weakness in elite athletes with a previous history of unilateral HSI. BACKGROUND: HSIs and reinjuries are the most common cause of lost playing time in a number of sports. Eccentric knee flexor weakness is a major modifiable risk factor for future HSIs; however, there is a lack of easily accessible equipment to assess this strength quality. METHODS: Thirty recreationally active males without a history of HSI completed NHEs on the device on 2 separate occasions. Intraclass correlation coefficients (ICCs), typical error (TE), typical error as a coefficient of variation (%TE), and minimum detectable change at a 95% confidence interval (MDC95) were calculated. Normative strength data were determined using the most reliable measurement. An additional 20 elite athletes with a history of unilateral HSI within the previous 12 months performed NHEs on the device to determine if residual eccentric muscle weakness existed in the previously injured limb. RESULTS: The device displayed high to moderate reliability (ICC = 0.83 to 0.90; TE = 21.7 N to 27.5 N; %TE = 5.8 to 8.5; MDC95 = 76.2 to 60.1 N). Mean ± SD normative eccentric knee flexor strength, based on the uninjured group, was 344.7 ± 61.1 N for the left side and 361.2 ± 65.1 N for the right side.
The previously injured limbs were 15% weaker than the contralateral uninjured limbs (mean difference = 50.3 N; 95% CI = 25.7 to 74.9 N; P < .01), 15% weaker than the normative left-limb data (mean difference = 50.0 N; 95% CI = 1.4 to 98.5 N; P = .04), and 18% weaker than the normative right-limb data (mean difference = 66.5 N; 95% CI = 18.0 to 115.1 N; P < .01). CONCLUSIONS: The experimental device offers a reliable method for determining eccentric knee flexor strength and strength asymmetry, and revealed residual weakness in previously injured elite athletes.
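MDC95 is conventionally derived from the ICC and the between-subject SD; a minimal sketch using one common formula (the paper does not state its exact computation, and the inputs below are the study's reported left-limb SD and the two ICC bounds):

```python
import math

def mdc95(between_subject_sd, icc):
    """SEM = SD * sqrt(1 - ICC); MDC95 = 1.96 * sqrt(2) * SEM."""
    sem = between_subject_sd * math.sqrt(1.0 - icc)
    return 1.96 * math.sqrt(2.0) * sem

# Reported normative SD (61.1 N) with the two ICC bounds from the study
mdc_high_icc = mdc95(61.1, 0.90)
mdc_low_icc = mdc95(61.1, 0.83)
```

This also explains why the reported MDC95 range (76.2 to 60.1 N) runs opposite to the ICC range: a lower ICC means a larger measurement error and hence a larger minimum detectable change.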

Relevance:

80.00%

Publisher:

Abstract:

Clinical trials have shown that weight reduction through lifestyle changes can delay or prevent diabetes and reduce blood pressure. An appropriate definition of obesity using anthropometric measures is useful in predicting diabetes and hypertension at the population level. However, there is debate on which measure of obesity is best or most strongly associated with diabetes and hypertension, and on what the optimal cut-off values for body mass index (BMI) and waist circumference (WC) are in this regard. The aims of the study were 1) to compare the strength of the association of undiagnosed or newly diagnosed diabetes (or hypertension) with anthropometric measures of obesity in people of Asian origin, 2) to detect ethnic differences in the association of undiagnosed diabetes with obesity, 3) to identify ethnic- and sex-specific change point values of BMI and WC for changes in the prevalence of diabetes, and 4) to evaluate the ethnic-specific WC cut-off values for central obesity proposed by the International Diabetes Federation (IDF) in 2005. The study population comprised 28 435 men and 35 198 women, ≥ 25 years of age, from 39 cohorts participating in the DECODA and DECODE studies, including 5 Asian Indian (n = 13 537), 3 Mauritian Indian (n = 4505) and Mauritian Creole (n = 1075), 8 Chinese (n = 10 801), 1 Filipino (n = 3841), 7 Japanese (n = 7934), 1 Mongolian (n = 1991), and 14 European (n = 20 979) studies. The prevalence of diabetes, hypertension and central obesity was estimated using descriptive statistics, and the differences were determined with the χ2 test. The odds ratios (ORs) or β coefficients (from the logistic model) and hazard ratios (HRs, from the Cox model for interval-censored data) for BMI, WC, waist-to-hip ratio (WHR), and waist-to-stature ratio (WSR) were estimated for diabetes and hypertension. The differences between BMI and WC, WHR or WSR were compared, applying paired homogeneity tests (Wald statistics with 1 df).
Hierarchical three-level Bayesian change point analysis, adjusting for age, was applied to identify the most likely cut-off/change point values for BMI and WC in association with previously undiagnosed diabetes. The ORs for diabetes in men (women) with BMI, WC, WHR and WSR were 1.52 (1.59), 1.54 (1.70), 1.53 (1.50) and 1.62 (1.70), respectively, and the corresponding ORs for hypertension were 1.68 (1.55), 1.66 (1.51), 1.45 (1.28) and 1.63 (1.50). For diabetes, the OR for BMI did not differ from that for WC or WHR, but was lower than that for WSR (p = 0.001) in men, while in women the ORs were higher for WC and WSR than for BMI (both p < 0.05). Hypertension was more strongly associated with BMI than with WHR in men (p < 0.001), and more strongly with BMI than with WHR (p < 0.001), WSR (p < 0.01) and WC (p < 0.05) in women. The HRs for incidence of diabetes and hypertension did not differ between BMI and the other three central obesity measures in Mauritian Indians and Mauritian Creoles during follow-ups of 5, 6 and 11 years. The prevalence of diabetes was highest in Asian Indians, lowest in Europeans and intermediate in the others, given the same BMI or WC category. The β coefficients for diabetes on BMI (kg/m²) were (men/women): 0.34/0.28, 0.41/0.43, 0.42/0.61, 0.36/0.59 and 0.33/0.49 for Asian Indian, Chinese, Japanese, Mauritian Indian and European subjects (overall homogeneity test: p > 0.05 in men and p < 0.001 in women). Similar results were obtained for WC (cm). Asian Indian women had lower β coefficients than women of other ethnicities. The change points for BMI were 29.5, 25.6, 24.0, 24.0 and 21.5 kg/m² in men and 29.4, 25.2, 24.9, 25.3 and 22.5 kg/m² in women of European, Chinese, Mauritian Indian, Japanese, and Asian Indian descent. The change points for WC were 100, 85, 79 and 82 cm in men and 91, 82, 82 and 76 cm in women of European, Chinese, Mauritian Indian, and Asian Indian descent.
The prevalence of central obesity using the 2005 IDF definition was higher in Japanese men but lower in Japanese women than in their Asian counterparts. The prevalence of central obesity was 52 times higher in Japanese men but 0.8 times lower in Japanese women compared to the National Cholesterol Education Programme definition. The findings suggest that both BMI and WC predicted diabetes and hypertension equally well in all ethnic groups. At the same BMI or WC level, the prevalence of diabetes was highest in Asian Indians, lowest in Europeans and intermediate in others. Ethnic- and sex-specific change points of BMI and WC should be considered in setting diagnostic criteria for obesity to detect undiagnosed or newly diagnosed diabetes.

Relevance:

80.00%

Publisher:

Abstract:

This paper re-examines the determinants of the fees paid by mutual fund shareholders for management costs and other expenses. There are two novelties with respect to previous studies. First, each type of fee is explained separately. Second, the paper employs a new dataset consisting of Spanish mutual funds, making it the second paper to study mutual fund fees outside the US market. Furthermore, the Spanish market has three interesting characteristics: (i) both distribution and management are highly dominated by banks and savings banks, which points towards potential conflicts of interest; (ii) Spanish mutual fund law imposes caps on all types of fees; and (iii) Spain ranks first in terms of average mutual fund fees among similar countries. We find significant differences in mutual fund fees not explained by the fund’s investment objective. For instance, management companies owned by banks and savings banks charge higher management fees and redemption fees to non-guaranteed funds. Also, investors in older non-guaranteed funds and non-guaranteed funds with a lower average investment are more likely to end up paying higher management fees. Moreover, there is clear evidence that some mutual funds enjoy better conditions from custodial institutions than others. In contrast to evidence from the US market, larger funds are not associated with lower fees, but with higher custody fees for guaranteed funds and higher redemption fees for both types of funds. Finally, fee-setting by mutual funds is not related to funds' before-fee performance.

Relevance:

80.00%

Publisher:

Abstract:

The final publication is available at Springer via http://dx.doi.org/10.1007/s10693-015-0230-1

Relevance:

80.00%

Publisher:

Abstract:

In this paper a multiple-classifier machine learning methodology for Predictive Maintenance (PdM) is presented. PdM is a prominent strategy for dealing with maintenance issues, given the increasing need to minimize downtime and associated costs. One of the challenges with PdM is generating so-called 'health factors', or quantitative indicators of the status of a system associated with a given maintenance issue, and determining their relationship to operating costs and failure risk. The proposed PdM methodology allows dynamic decision rules to be adopted for maintenance management and can be used with high-dimensional and censored data problems. This is achieved by training multiple classification modules with different prediction horizons to provide different performance trade-offs in terms of the frequency of unexpected breaks and unexploited lifetime, and then employing this information in an operating-cost-based maintenance decision system to minimise expected costs. The effectiveness of the methodology is demonstrated using a simulated example and a benchmark semiconductor manufacturing maintenance problem.
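The operating-cost-based decision step can be sketched with a hypothetical cost rule: maintain when the expected cost of an unexpected break, as predicted by a horizon's classifier, exceeds the cost of planned (early) maintenance. The cost model and numbers are invented for illustration, not taken from the paper:

```python
def should_maintain(p_fail, cost_break, cost_maint):
    """Maintain now iff the expected cost of an unexpected break
    exceeds the cost of planned (early) maintenance."""
    return p_fail * cost_break > cost_maint

def next_action(horizon_probs, cost_break, cost_maint):
    """horizon_probs maps each classifier's prediction horizon (in cycles)
    to its predicted failure probability; return the earliest horizon at
    which maintenance is justified, or None to keep running."""
    for h in sorted(horizon_probs):
        if should_maintain(horizon_probs[h], cost_break, cost_maint):
            return h
    return None
```

Training one classifier per horizon is what produces the `horizon_probs` trade-off curve: short horizons avoid unexpected breaks, long horizons avoid unexploited lifetime.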

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE: To describe the distribution of central corneal thickness (CCT), intraocular pressure (IOP), and their determinants and association with glaucoma in Chinese adults. DESIGN: Population-based cross-sectional study. METHODS: Chinese adults aged 50 years and older were identified using cluster random sampling in Liwan District, Guangzhou. CCT (both optical [OCCT] and ultrasound [UCCT]), intraocular pressure (by Tonopen, IOP), refractive error (by autorefractor, RE), radius of corneal curvature (RCC), axial length (AL), and body mass index (BMI) were measured, and history of hypertension and diabetes (DM) was collected by questionnaire. Right eye data were analyzed. RESULTS: The mean values of OCCT, UCCT, and IOP were 512 ± 29.0 μm, 542 ± 31.4 μm, and 15.2 ± 3.1 mm Hg, respectively. In multiple regression models, CCT declined with age (P < .001) and increased with greater RCC (P < .001) and DM (P = .037). IOP was positively associated with greater CCT (P < .001), BMI (P < .001), and hypertension (P < .001). All 25 persons with open-angle glaucoma had IOP <21 mm Hg. CCT did not differ significantly between persons with and without open- or closed-angle glaucoma. Among 65 persons with ocular hypertension (IOP >97.5th percentile), CCT (555 ± 29 μm) was significantly (P = .01) higher than for normal persons. CONCLUSIONS: The distributions of CCT and IOP in this study are similar to those for other Chinese populations, though IOP was lower than for European populations, possibly due to lower BMI and blood pressure. Glaucoma with IOP <21 mm Hg is common in this population. We found no association between glaucoma and CCT, though power (0.3) for this analysis was low.

Relevance:

80.00%

Publisher:

Abstract:

Low concentrations of elements in geochemical analyses have the peculiarity of being compositional data and, for a given level of significance, are likely to be beyond the capability of laboratories to distinguish between minute concentrations and complete absence, thus preventing laboratories from reporting extremely low concentrations of the analyte. Instead, what is reported is the detection limit, which is the minimum concentration that conclusively differentiates between presence and absence of the element. A spatially distributed exhaustive sample is employed in this study to generate unbiased sub-samples, which are further censored to observe the effect that different detection limits and sample sizes have on the inference of population distributions starting from geochemical analyses having specimens below the detection limit (nondetects). The isometric logratio transformation is used to convert the compositional data in the simplex to samples in real space, thus allowing the practitioner to properly borrow from the large body of statistical techniques valid only in real space. The bootstrap method is used to numerically investigate the reliability of inferring several distributional parameters employing different forms of imputation for the censored data. The case study illustrates that, in general, the best results are obtained when imputations are made using the distribution that best fits the readings above the detection limit, and exposes the problems of other more widely used practices. When the sample is spatially correlated, it is necessary to combine the bootstrap with stochastic simulation.
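Imputing nondetects from the distribution best fitting the readings above the detection limit — the practice the study favours — can be sketched as follows. This toy version fits a lognormal to the detects and draws imputed values by inverse-CDF sampling truncated below the limit; the data are invented:

```python
import math
import random
from statistics import NormalDist, mean, stdev

def impute_nondetects(detects, n_nondetects, detection_limit, rng):
    """Impute left-censored readings by inverse-CDF draws from a lognormal
    fitted to the detected values, truncated below the detection limit."""
    logs = [math.log(x) for x in detects]
    fitted = NormalDist(mean(logs), stdev(logs))
    p_dl = fitted.cdf(math.log(detection_limit))   # probability mass below the limit
    draws = [
        math.exp(fitted.inv_cdf(max(rng.random() * p_dl, 1e-15)))
        for _ in range(n_nondetects)
    ]
    return detects + draws

# Toy sample: 5 detected concentrations plus 2 nondetects below DL = 0.5
rng = random.Random(42)
sample = impute_nondetects([1.2, 2.5, 3.1, 0.9, 4.0], 2, 0.5, rng)
```

The bootstrap assessment in the study would repeat this imputation inside each resample before estimating the distributional parameters of interest.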

Relevance:

80.00%

Publisher:

Abstract:

This paper examines a dataset which is modeled well by the Poisson-lognormal process, and by this process mixed with lognormal data, both turned into compositions. This generates compositional data that have zeros without any need for conditional models or the assumption that there are missing or censored data needing adjustment. It also enables us to model dependence on covariates and within the composition.
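Simulating from a Poisson-lognormal process shows how genuine zeros arise in the closed composition without any censoring adjustment; a minimal sketch with arbitrary parameters:

```python
import math
import random

def poisson_lognormal_composition(mu, sigma, n_parts, rng):
    """One simulated composition: Poisson counts with lognormal rates,
    closed to sum to 1. Zero counts yield genuine zeros in the composition."""
    def poisson(lam):
        # Knuth's multiplicative method; adequate for modest rates
        threshold, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1
    counts = [poisson(math.exp(rng.gauss(mu, sigma))) for _ in range(n_parts)]
    total = sum(counts)
    return [c / total for c in counts] if total else [0.0] * n_parts

rng = random.Random(1)
comp = poisson_lognormal_composition(0.0, 1.0, 8, rng)
```

Here a zero in the composition reflects a count of zero from the Poisson layer — a structural feature of the model, not a censored or missing value.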

Relevance:

80.00%

Publisher:

Abstract:

This study examines current and forthcoming measures related to the exchange of data and information in EU Justice and Home Affairs policies, with a focus on the ‘smart borders’ initiative. It argues that there is no reversibility in the growing reliance on such schemes and asks whether current and forthcoming proposals are necessary and original. It outlines the main challenges raised by the proposals, including issues related to the right to data protection, but also to privacy and non-discrimination.