962 results for Gehan-Wilcoxon estimates
Abstract:
Adaptations of weighted rank regression to the accelerated failure time model for censored survival data have been successful in yielding asymptotically normal estimates and flexible weighting schemes that increase statistical efficiency. However, only one simple weighting scheme, the Gehan or Wilcoxon weights, guarantees estimating equations that are monotone in the parameter components, and even then the estimating functions are step functions, requiring the equivalent of linear programming for computation. The lack of smoothness makes standard error or covariance matrix estimation still more difficult. An induced smoothing technique has overcome these difficulties in various problems involving monotone but pure-jump estimating equations, including conventional rank regression. The present paper applies induced smoothing to Gehan-Wilcoxon weighted rank regression for the accelerated failure time model in the more difficult case of survival times subject to censoring, where the inapplicability of permutation arguments necessitates a new method of estimating the null variance of the estimating functions. Smooth monotone parameter estimation and rapid, reliable standard error or covariance matrix estimation are obtained.
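The core idea can be sketched in a few lines. The function below is a hypothetical illustration (not the paper's implementation) of the Gehan-weighted estimating function for the AFT model, where induced smoothing replaces the step indicator I(e_i ≤ e_j) with a normal CDF; the bandwidth argument and data shapes are assumptions for the sketch:

```python
import numpy as np
from scipy.stats import norm

def gehan_estfun(beta, logT, X, delta, smooth=None):
    """Gehan-weighted estimating function for the AFT model (illustrative).

    beta: coefficient vector; logT: log survival times;
    X: covariate matrix; delta: censoring indicators (1 = event observed).
    With smooth=None the indicator is a pure-jump step function; a positive
    bandwidth replaces it with a normal CDF (induced smoothing).
    """
    e = logT - X @ beta                      # residuals on the log scale
    diff_e = e[None, :] - e[:, None]         # e_j - e_i for all pairs
    if smooth is None:
        w = (diff_e >= 0).astype(float)      # I(e_i <= e_j), step function
    else:
        w = norm.cdf(diff_e / smooth)        # smoothed indicator
    diff_x = X[:, None, :] - X[None, :, :]   # x_i - x_j for all pairs
    # sum over uncensored i and all j of delta_i * (x_i - x_j) * weight
    return np.einsum('i,ijk,ij->k', delta.astype(float), diff_x, w)
```

As the bandwidth shrinks toward zero the smoothed function recovers the step version, which is what makes the smoothed root a consistent substitute for the pure-jump estimator.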
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Sustainable water resources management depends on sound information about the impacts of climate change. This information is, however, not easily derived because natural runoff variability interferes with the climate change signal. This study presents a procedure that leads to robust estimates of the magnitude and Time Of Emergence (TOE) of climate-induced hydrological change while accounting for the natural variability contained in the time series. First, the natural variability of 189 mesoscale catchments in Switzerland is sampled for 10 ENSEMBLES scenarios for the control period (1984–2005) and two scenario periods (near future: 2025–2046, far future: 2074–2095) using a bootstrap procedure. Then, the sampling distributions of mean monthly runoff are tested for significant differences with the Wilcoxon-Mann-Whitney test and for effect size with Cliff's delta. Finally, the TOE of a climate-induced hydrological change is recorded when at least eight of the ten hydrological projections differ significantly from natural variability. The results show that the TOE occurs in the near-future period except for high-elevation catchments in late summer. The significant hydrological projections in the near future correspond, however, to only minor runoff changes. In the far future, hydrological change is statistically significant and runoff changes are substantial. Temperature change is the most important factor determining hydrological change in this mountainous region; hydrological change therefore depends strongly on a catchment's mean elevation. The finding that robust hydrological change emerges already in the near future highlights the importance of accounting for these changes in water resources planning.
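The significance-plus-effect-size step described above can be sketched as follows. The runoff samples, thresholds, and sample sizes here are illustrative stand-ins, not the study's bootstrap output:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def cliffs_delta(a, b):
    """Cliff's delta: P(a > b) - P(a < b), a nonparametric effect size in [-1, 1]."""
    a, b = np.asarray(a), np.asarray(b)
    gt = (a[:, None] > b[None, :]).sum()
    lt = (a[:, None] < b[None, :]).sum()
    return (gt - lt) / (len(a) * len(b))

# Hypothetical bootstrap samples of mean monthly runoff (mm) for a control
# and a scenario period; in the study these come from resampling the series.
rng = np.random.default_rng(42)
control  = rng.normal(100, 10, size=200)
scenario = rng.normal(90, 10, size=200)

stat, p = mannwhitneyu(control, scenario, alternative='two-sided')
d = cliffs_delta(scenario, control)
# Flag a change only if it is both significant and at least a medium effect
# (the |d| > 0.33 cutoff is a common convention, assumed here).
significant = (p < 0.05) and (abs(d) > 0.33)
```

Pairing the test with an effect size is what separates "statistically detectable" from "substantial" change, mirroring the near-future versus far-future distinction in the abstract.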
Abstract:
Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests perform at the nominal 5% size, but the F, Score, and Mantel tests exceeded the 5% size confidence limits for one third of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests for late-crossing survival curves and less powerful for early-crossing ones. Guidelines for the appropriate selection of two-sample tests are given.
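The Mantel (log-rank) and Gehan-Wilcoxon statistics compared above share one computation: at each event time, observed minus expected events in one group is accumulated, with Gehan weighting each term by the number at risk (which emphasises early event times). A minimal sketch, assuming two-sided chi-squared(1) reference distributions and no special tie handling beyond the hypergeometric variance:

```python
import numpy as np

def two_sample_survival_tests(time, event, group):
    """Mantel log-rank and Gehan-Wilcoxon chi-squared statistics (illustrative).

    time: observed times; event: 1 if event, 0 if censored;
    group: 0/1 group labels. Returns (logrank, gehan), each ~ chi2(1) under H0.
    """
    order = np.argsort(time)
    time, event, group = time[order], event[order], group[order]
    terms = []  # (O - E, hypergeometric variance, n at risk) per event time
    for t in np.unique(time[event == 1]):
        at_risk = time >= t
        n = at_risk.sum()
        n1 = (at_risk & (group == 1)).sum()
        d = ((time == t) & (event == 1)).sum()
        d1 = ((time == t) & (event == 1) & (group == 1)).sum()
        e1 = d * n1 / n
        v = d * (n1 / n) * (1 - n1 / n) * (n - d) / max(n - 1, 1)
        terms.append((d1 - e1, v, n))
    oe, v, n = (np.array(x, dtype=float) for x in zip(*terms))
    logrank = oe.sum() ** 2 / v.sum()               # Mantel: weight 1
    gehan = (n * oe).sum() ** 2 / (n ** 2 * v).sum()  # Gehan: weight = n at risk
    return logrank, gehan
```

Because Gehan's weight decays with the risk set, the two tests can rank alternatives differently, which is exactly the crossing-curves behaviour the abstract reports.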
Abstract:
Estimates of maintenance and rehabilitation costs are subject to variation due to uncertainties in the input parameters. This paper presents the results of an analysis to identify the input parameters that affect the predicted variation in road deterioration. Road data from 1688 km of a national highway in the tropical northeast of Queensland, Australia were used in the analysis. Data were analysed using a probability-based method, the Monte Carlo simulation technique, together with HDM-4's roughness prediction model. The results indicated that, among the input parameters, the variability of pavement strength, rut depth, annual equivalent axle load and initial roughness affected the variability of the predicted roughness. The second part of the paper presents an analysis assessing the variation in cost estimates due to the variability of these identified critical input parameters.
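The Monte Carlo approach can be sketched as follows. The incremental roughness model and all parameter distributions below are toy stand-ins chosen for illustration; they are not HDM-4's actual model or the study's calibrated inputs:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sim = 10_000

# Sample the uncertain inputs (hypothetical distributions; lognormal keeps
# pavement strength strictly positive).
strength    = rng.lognormal(np.log(5.0), 0.15, n_sim)  # structural number (SNC)
axle_load   = rng.normal(0.5, 0.1, n_sim)              # annual axle load (MESA)
initial_iri = rng.normal(2.5, 0.3, n_sim)              # initial roughness (IRI, m/km)

def roughness_after(years, snc, mesa, iri0):
    """Toy incremental roughness model: traffic damage scaled down by strength."""
    return iri0 + years * 0.6 * mesa / snc ** 1.5

# Propagate input variability through the model and summarise the spread.
iri_10yr = roughness_after(10, strength, axle_load, initial_iri)
mean = iri_10yr.mean()
p5, p95 = np.percentile(iri_10yr, [5, 95])
```

The 5th-95th percentile band of the simulated roughness is what feeds the cost-variation analysis: each simulated deterioration path implies a maintenance cost, so input spread maps directly to cost spread.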
Abstract:
Background: This study provides the latest available relative survival data for Australian childhood cancer patients. Methods: Data from the population-based Australian Paediatric Cancer Registry were used to describe relative survival outcomes, using the period method, for 11 903 children diagnosed with cancer between 1983 and 2006 and prevalent at any time between 1997 and 2006. Results: The overall relative survival was 90.4% after 1 year, 79.5% after 5 years and 74.7% after 20 years. Where information on stage at diagnosis was available (lymphomas, neuroblastoma, renal tumours and rhabdomyosarcomas), survival was significantly poorer for more advanced stages. Survival was lower among infants than among other children for those diagnosed with leukaemia, tumours of the central nervous system and renal tumours, but higher for neuroblastoma. Recent improvements in overall childhood cancer survival over time are mainly due to improvements among leukaemia patients. Conclusion: The high and improving survival prognosis for children diagnosed with cancer in Australia is consistent with various international estimates. However, a 5-year survival estimate of 79% still means that many children diagnosed with cancer will die within 5 years, while others face long-term morbidities and complications associated with their treatment. It is hoped that continued developments in treatment protocols will result in further improvements in survival.
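The quantity being reported is the ratio of observed to expected survival. A toy illustration with invented numbers (not registry data): observed all-cause survival in the cohort is divided by the survival expected for an age- and sex-matched general population from life tables, so a ratio of 1 means no excess mortality attributable to the cancer:

```python
import numpy as np

# Hypothetical cohort survival at 1, 5 and 20 years, and the survival a
# matched general population would be expected to have over the same spans.
observed = np.array([0.92, 0.85, 0.80])
expected = np.array([0.999, 0.995, 0.980])

# Relative survival: observed / expected, the excess-mortality-adjusted measure.
relative = observed / expected
```

Using expected survival as the denominator is what lets registry studies report cancer-attributable survival without cause-of-death data.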
Abstract:
Objective: Because studies of crowding in long-term care settings are lacking, the authors sought to: (1) generate initial estimates of crowding in nursing homes and assisted living facilities; and (2) evaluate two operational approaches to its measurement.
Background: Reactions to density and proximity are complex. Greater density intensifies people's reaction to a situation in the direction (positive or negative) in which they would react if the situation occurred under less dense conditions. People with dementia are especially reactive to the environment.
Methods: Using a cross-sectional correlational design in nursing homes and assisted living facilities involving 185 participants, multiple observations (N = 6,455) of crowding and other environmental variables were made. Crowding, location, and sound were measured three times per observation; ambiance was measured once. Data analyses consisted of descriptive statistics, t-tests, and one-way analysis of variance.
Results: Crowding estimates were higher for nursing homes and in dining and activity rooms. Crowding also varied across settings and locations by time of day. Overall, the interaction of location and time affected crowding significantly (N = 5,559, df [47, 511], F = 105.69, p < .0001); effects were greater within location-by-hour than between location-by-hour, but the interaction explained slightly less variance in Long-Term Care Crowding Index (LTC-CI) estimates (47.41%) than location alone. Crowding had small, direct, and highly significant correlations with sound and with the engaging subscale for ambiance; a similar, though inverse, correlation was seen with the soothing subscale for ambiance.
Conclusions: Crowding fluctuates with routine activities such as meals in long-term care settings. Furthermore, a relationship between crowding and other physical characteristics of the environment was found. The LTC-CI is likely to be more sensitive than simple people counts when seeking to evaluate the effects of crowding on the behavior of elders, particularly those with dementia, in long-term care settings.
Keywords: aging in place.
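The location comparison above rests on a one-way analysis of variance. A minimal sketch with fabricated crowding-index values grouped by location (group names, means, and sample sizes are illustrative, not the study's LTC-CI data):

```python
import numpy as np
from scipy.stats import f_oneway

# Hypothetical crowding-index observations by location.
rng = np.random.default_rng(7)
dining   = rng.normal(6.0, 1.5, 60)   # dining rooms: high crowding at meals
activity = rng.normal(5.5, 1.5, 60)   # activity rooms: also elevated
hallway  = rng.normal(2.0, 1.5, 60)   # hallways: sparse most of the day

# One-way ANOVA: does mean crowding differ across locations?
f_stat, p_value = f_oneway(dining, activity, hallway)
```

A significant F only says the location means differ somewhere; the study's location-by-hour breakdown is the follow-up that localises when and where crowding peaks.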
Abstract:
Decentralised sensor networks typically consist of multiple processing nodes, each supporting one or more sensors, interconnected via wireless communication. Practical applications of Decentralised Data Fusion have generally been restricted to Gaussian-based approaches such as the Kalman or Information Filter. This paper proposes the use of Parzen window estimates as an alternative representation for performing Decentralised Data Fusion. The common information between two nodes must be removed from any received estimate before local data fusion may occur; otherwise, estimates may become overconfident through data incest. A closed-form approximation to the division of two estimates is described to enable conservative assimilation of incoming information at a node in a decentralised data fusion network. A simple example of tracking a moving particle with Parzen density estimates demonstrates how this algorithm allows conservative assimilation of network information.
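The multiply-then-divide structure of the fusion step can be sketched on a grid. This is a simplified illustration, not the paper's closed-form approximation: two nodes hold Parzen estimates of a particle's 1-D position, fusion multiplies the densities, and an assumed estimate of their common information is divided out to avoid double counting:

```python
import numpy as np

def parzen_density(samples, grid, bw):
    """Parzen-window (Gaussian-kernel) density estimate evaluated on a grid."""
    z = (grid[:, None] - samples[None, :]) / bw
    return np.exp(-0.5 * z ** 2).mean(axis=1) / (bw * np.sqrt(2 * np.pi))

# Hypothetical node data: both nodes observe the same particle near 0.6,
# and the broader 'common' density stands in for their shared prior.
rng = np.random.default_rng(3)
grid = np.linspace(-5.0, 5.0, 501)
node_a = parzen_density(rng.normal(0.5, 1.0, 300), grid, bw=0.3)
node_b = parzen_density(rng.normal(0.7, 1.0, 300), grid, bw=0.3)
common = parzen_density(rng.normal(0.6, 1.5, 300), grid, bw=0.5)

# Fuse: multiply the node densities, divide out the common information,
# then renormalise to a proper density on the grid.
dx = grid[1] - grid[0]
fused = node_a * node_b / np.maximum(common, 1e-12)
fused /= fused.sum() * dx
```

Skipping the division step would count the shared prior twice, producing exactly the overconfidence ("data incest") the abstract warns about.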