33 results for Competing risks, Estimation of predator mortality, Overdispersion, Stochastic modeling

in University of Queensland eSpace - Australia


Relevance: 100.00%

Abstract:

We consider a mixture model approach to the regression analysis of competing-risks data. Attention is focused on inference concerning the effects of factors on both the probability of occurrence and the hazard rate conditional on each of the failure types. These two quantities are specified in the mixture model using the logistic model and the proportional hazards model, respectively. We propose a semi-parametric mixture method to estimate the logistic and regression coefficients jointly, whereby the component-baseline hazard functions are completely unspecified. Estimation is carried out by maximum likelihood on the basis of the full likelihood, implemented via an expectation-conditional maximization (ECM) algorithm. Simulation studies are performed to compare the performance of the proposed semi-parametric method with a fully parametric mixture approach. The results show that when the component-baseline hazard is monotonic increasing, the semi-parametric and fully parametric mixture approaches are comparable for mildly and moderately censored samples. When the component-baseline hazard is not monotonic increasing, the semi-parametric method consistently provides less biased estimates than a fully parametric approach and is comparable in efficiency in the estimation of the parameters for all levels of censoring. The methods are illustrated using a real data set of prostate cancer patients treated with different dosages of the drug diethylstilbestrol. Copyright (c) 2003 John Wiley & Sons, Ltd.
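
As a concrete illustration of the mixture formulation, the sketch below fits a fully parametric simplification in Python: logistic mixing probabilities and log-linear exponential component hazards stand in for the paper's unspecified component-baseline hazards, and all data are simulated.

```python
# Fully parametric sketch of a two-component competing-risks mixture:
# logistic mixing probabilities, exponential component hazards standing
# in for the unspecified baselines. Data are simulated, not from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 500
x = rng.normal(size=n)
pi1 = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))            # P(failure type 1)
cause = np.where(rng.uniform(size=n) < pi1, 1, 2)
rate = np.where(cause == 1, 0.8 * np.exp(0.5 * x), 0.3 * np.exp(-0.4 * x))
t = rng.exponential(1.0 / rate)
c = rng.exponential(2.0, size=n)                         # censoring times
obs_cause = np.where(t <= c, cause, -1)                  # -1 = censored
time = np.minimum(t, c)

def negloglik(theta):
    b0, b1, a1, g1, a2, g2 = theta
    p1 = 1.0 / (1.0 + np.exp(-(b0 + b1 * x)))
    r1 = np.exp(a1 + g1 * x)                 # type-1 conditional hazard
    r2 = np.exp(a2 + g2 * x)                 # type-2 conditional hazard
    S1, S2 = np.exp(-r1 * time), np.exp(-r2 * time)
    lik = np.where(obs_cause == 1, p1 * r1 * S1,
          np.where(obs_cause == 2, (1 - p1) * r2 * S2,
                   p1 * S1 + (1 - p1) * S2))
    return -np.sum(np.log(lik))

fit = minimize(negloglik, np.zeros(6), method="Nelder-Mead",
               options={"maxiter": 10000, "fatol": 1e-8})
print(fit.x)  # approx. (0.5, 1.0, log 0.8, 0.5, log 0.3, -0.4)
```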

Relevance: 100.00%

Abstract:

A mixture model for long-term survivors has been adopted in various fields such as biostatistics and criminology where some individuals may never experience the type of failure under study. It is directly applicable in situations where the only information available from follow-up on individuals who will never experience this type of failure is in the form of censored observations. In this paper, we consider a modification to the model so that it still applies in the case where during the follow-up period it becomes known that an individual will never experience failure from the cause of interest. Unless a model allows for this additional information, a consistent survival analysis will not be obtained. A partial maximum likelihood (ML) approach is proposed that preserves the simplicity of the long-term survival mixture model and provides consistent estimators of the quantities of interest. Some simulation experiments are performed to assess the efficiency of the partial ML approach relative to the full ML approach for survival in the presence of competing risks.
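
The modification is easy to see in likelihood terms. Below is a minimal sketch (of the full likelihood with the extra follow-up information, not the paper's partial ML approach), using an exponential failure-time distribution and simulated data: observed failures contribute pi*f(t), censored cases pi*S(t) + (1 - pi), and individuals identified as immune contribute 1 - pi.

```python
# Minimal sketch of the modified long-term-survivor mixture likelihood.
# pi = P(susceptible); exponential failure times stand in for an
# arbitrary proper survival distribution. Data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, true_pi, true_rate = 1000, 0.6, 0.5
susceptible = rng.uniform(size=n) < true_pi
t = rng.exponential(1.0 / true_rate, size=n)
c = rng.exponential(4.0, size=n)
# status: 1 = observed failure, 0 = censored, 2 = identified as immune
status = np.where(susceptible & (t <= c), 1, 0)
# Suppose 30% of the true immunes are identified during follow-up.
status[~susceptible & (rng.uniform(size=n) < 0.3)] = 2
time = np.where(status == 1, t, c)

def negloglik(theta):
    pi = 1.0 / (1.0 + np.exp(-theta[0]))     # logit-parametrized
    rate = np.exp(theta[1])                  # log-parametrized
    f = rate * np.exp(-rate * time)          # failure density
    S = np.exp(-rate * time)                 # survivor function
    lik = np.where(status == 1, pi * f,
          np.where(status == 0, pi * S + (1 - pi), 1 - pi))
    return -np.sum(np.log(lik))

fit = minimize(negloglik, np.zeros(2), method="Nelder-Mead")
print(1 / (1 + np.exp(-fit.x[0])), np.exp(fit.x[1]))   # ~0.6, ~0.5
```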

Relevance: 100.00%

Abstract:

Background From the mid-1980s to mid-1990s, the WHO MONICA Project monitored coronary events and classic risk factors for coronary heart disease (CHD) in 38 populations from 21 countries. We assessed the extent to which changes in these risk factors explain the variation in the trends in coronary-event rates across the populations. Methods In men and women aged 35-64 years, non-fatal myocardial infarction and coronary deaths were registered continuously to assess trends in rates of coronary events. We carried out population surveys to estimate trends in risk factors. Trends in event rates were regressed on trends in risk score and in individual risk factors. Findings Smoking rates decreased in most male populations but trends were mixed in women; mean blood pressures and cholesterol concentrations decreased, body-mass index increased, and overall risk scores and coronary-event rates decreased. The model of trends in 10-year coronary-event rates against risk scores and single risk factors showed a poor fit, but this was improved with a 4-year time lag for coronary events. The explanatory power of the analyses was limited by imprecision of the estimates and homogeneity of trends in the study populations. Interpretation Changes in the classic risk factors seem to partly explain the variation in population trends in CHD. Residual variance is attributable to difficulties in measurement and analysis, including time lag, and to factors that were not included, such as medical interventions. The results support prevention policies based on the classic risk factors but suggest potential for prevention beyond these.
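
A minimal sketch of the trend-on-trend regression with the 4-year lag, on entirely synthetic data (the study's populations and rates are not reproduced here):

```python
# Per-population event-rate trends regressed on risk-score trends from
# a window 4 years earlier. All data below are synthetic.
import numpy as np

rng = np.random.default_rng(2)
n_pop = 38
event_years = np.arange(1985, 1995)        # decade of event registration
risk_years = event_years - 4               # lagged risk-factor window

def slope(x, y):
    return np.polyfit(x, y, 1)[0]

risk_trends, event_trends = [], []
for _ in range(n_pop):
    true_risk_slope = rng.normal(-0.02, 0.01)
    risk = true_risk_slope * (risk_years - 1980) + rng.normal(0, 0.01, 10)
    # event rates follow the lagged risk score with coefficient ~1.5
    events = 1.5 * true_risk_slope * (event_years - 1984) \
             + rng.normal(0, 0.02, 10)
    risk_trends.append(slope(risk_years, risk))
    event_trends.append(slope(event_years, events))

X = np.column_stack([np.ones(n_pop), risk_trends])
beta, *_ = np.linalg.lstsq(X, np.array(event_trends), rcond=None)
print(beta)   # fitted intercept and trend-on-trend coefficient (~1.5)
```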

Relevance: 100.00%

Abstract:

Binning and truncation of data are common in data analysis and machine learning. This paper addresses the problem of fitting mixture densities to multivariate binned and truncated data. The EM approach proposed by McLachlan and Jones (Biometrics, 44(2): 571-578, 1988) for the univariate case is generalized to multivariate measurements. The multivariate solution requires the evaluation of multidimensional integrals over each bin at each iteration of the EM procedure. Naive implementation of the procedure can lead to computationally inefficient results. To reduce the computational cost a number of straightforward numerical techniques are proposed. Results on simulated data indicate that the proposed methods can achieve significant computational gains with no loss in the accuracy of the final parameter estimates. Furthermore, experimental results suggest that with a sufficient number of bins and data points it is possible to estimate the true underlying density almost as well as if the data were not binned. The paper concludes with a brief description of an application of this approach to diagnosis of iron deficiency anemia, in the context of binned and truncated bivariate measurements of volume and hemoglobin concentration from an individual's red blood cells.
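
The sketch below illustrates the binned-data EM idea in the univariate case, using exact truncated-normal moments for the expected sufficient statistics; truncation beyond the outer bin edges is ignored for brevity, and the data are simulated.

```python
# EM for a two-component Gaussian mixture fitted to binned counts.
# The E-step uses bin probabilities and truncated-normal moments
# in place of the unobserved raw observations.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
data = np.concatenate([rng.normal(-2, 1, 3000), rng.normal(3, 1.5, 2000)])
edges = np.linspace(-7, 9, 33)
counts, _ = np.histogram(data, edges)
a, b = edges[:-1], edges[1:]

p  = np.array([0.5, 0.5])                  # mixing proportions
mu = np.array([-1.0, 1.0])
sg = np.array([2.0, 2.0])

for _ in range(200):
    # E-step: per-bin component probabilities and truncated moments
    al = (a[None, :] - mu[:, None]) / sg[:, None]
    be = (b[None, :] - mu[:, None]) / sg[:, None]
    Z  = norm.cdf(be) - norm.cdf(al) + 1e-300
    m  = mu[:, None] + sg[:, None] * (norm.pdf(al) - norm.pdf(be)) / Z
    v  = sg[:, None] ** 2 * (1 + (al * norm.pdf(al) - be * norm.pdf(be)) / Z
                             - ((norm.pdf(al) - norm.pdf(be)) / Z) ** 2)
    w  = p[:, None] * Z
    w /= w.sum(axis=0)                     # posterior P(component | bin)
    # M-step: moments weighted by bin counts
    nk = (counts * w).sum(axis=1)
    p  = nk / counts.sum()
    mu = (counts * w * m).sum(axis=1) / nk
    sg = np.sqrt((counts * w * (v + (m - mu[:, None]) ** 2)).sum(axis=1) / nk)

print(p, mu, sg)   # approaches the generating parameters
```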

Relevance: 100.00%

Abstract:

This article presents Monte Carlo techniques for estimating network reliability. For highly reliable networks, techniques based on graph evolution models perform very well but are known to have significant simulation cost. An existing hybrid scheme (based on partitioning the time space) is available to speed up the simulations; however, there are difficulties with optimizing the important parameter associated with this scheme. To overcome these difficulties, a new hybrid scheme (based on partitioning the edge set) is proposed in this article. The proposed scheme shows orders-of-magnitude performance improvements over the existing techniques for certain classes of networks. It also provides reliability bounds with little overhead.
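
For orientation, here is the crude Monte Carlo baseline for two-terminal reliability that schemes like these improve upon (crude sampling becomes hopeless for highly reliable networks); the graph and edge reliabilities below are hypothetical.

```python
# Crude Monte Carlo estimate of two-terminal network reliability:
# sample edge up/down states, check source-target connectivity by BFS.
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical 6-node network: (u, v, probability edge is up)
edges = [(0, 1, 0.9), (0, 2, 0.9), (1, 2, 0.8), (1, 3, 0.9),
         (2, 4, 0.9), (3, 4, 0.8), (3, 5, 0.9), (4, 5, 0.9)]
source, target, n_nodes = 0, 5, 6

def connected(up):
    """BFS from source over the 'up' edges; is target reachable?"""
    adj = [[] for _ in range(n_nodes)]
    for (u, v, _), ok in zip(edges, up):
        if ok:
            adj[u].append(v); adj[v].append(u)
    seen, stack = {source}, [source]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w); stack.append(w)
    return target in seen

probs = np.array([p for _, _, p in edges])
n_samples = 100_000
hits = sum(connected(rng.uniform(size=len(edges)) < probs)
           for _ in range(n_samples))
print(hits / n_samples)   # estimated source-target reliability
```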

Relevance: 100.00%

Abstract:

Background Smoking is a risk factor for several diseases and has been increasing in many developing countries. Our aim was to estimate global and regional mortality in 2000 caused by smoking, including an analysis of uncertainty. Methods Following the methods of Peto and colleagues, we used lung-cancer mortality as an indirect marker for accumulated smoking risk. Never-smoker lung-cancer mortality was estimated based on the household use of coal with poor ventilation. Relative risks were taken from the American Cancer Society Cancer Prevention Study, phase II, and the retrospective proportional mortality analysis of Liu and colleagues in China. Relative risks were corrected for confounding and extrapolation to other regions. Results We estimated that in 2000, 4.83 (uncertainty range 3.94-5.93) million premature deaths in the world were attributable to smoking; 2.41 (1.80-3.15) million in developing countries and 2.43 (2.13-2.78) million in industrialised countries. 3.84 million of these deaths were in men. The leading causes of death from smoking were cardiovascular diseases (1.69 million deaths), chronic obstructive pulmonary disease (0.97 million deaths), and lung cancer (0.85 million deaths). Interpretation Smoking was an important cause of global mortality in 2000. In view of the expected demographic and epidemiological transitions and current smoking patterns in the developing world, the health loss due to smoking will grow even larger unless effective interventions and policies that reduce smoking among men and prevent increases among women in developing countries are implemented.
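
The attributable-fraction arithmetic underlying such estimates can be sketched as follows; note that the Peto indirect method replaces directly measured smoking prevalence with a lung-cancer-based smoking impact ratio, and all numbers below are hypothetical, not the study's inputs.

```python
# Population attributable fraction (PAF) arithmetic for smoking-
# attributable deaths. All inputs are hypothetical illustrations.
smoking_prevalence = 0.30     # proportion of the population exposed
relative_risk = 2.5           # disease risk, smokers vs never-smokers
deaths_total = 100_000        # deaths from the disease in the population

p, rr = smoking_prevalence, relative_risk
paf = p * (rr - 1) / (p * (rr - 1) + 1)   # Levin's attributable fraction
print(f"PAF = {paf:.3f}, attributable deaths = {paf * deaths_total:,.0f}")
```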

Relevance: 100.00%

Abstract:

Background: Sentinel node biopsy (SNB) is being increasingly used but its place outside randomized trials has not yet been established. Methods: The first 114 sentinel node (SN) biopsies performed for breast cancer at the Princess Alexandra Hospital from March 1999 to June 2001 are presented. In 111 cases axillary dissection was also performed, allowing the accuracy of the technique to be assessed. A standard combination of preoperative lymphoscintigraphy, intraoperative gamma probe and injection of blue dye was used in most cases. Results are discussed in relation to the risk and potential consequences of understaging. Results: Where both probe and dye were used, the SN was identified in 90% of patients. A significant number of patients were treated in two stages and the technique was no less effective in patients who had SNB performed at a second operation after the primary tumour had already been removed. The interval from radioisotope injection to operation was very wide (between 2 and 22 h) and did not affect the outcome. Nodal metastases were present in 42 patients in whom an SN was found, and in 40 of these the SN was positive, giving a false negative rate of 4.8% (2/42), with the overall percentage of patients understaged being 2%. For this particular group as a whole, the increased risk of death due to systemic therapy being withheld as a consequence of understaging (if SNB alone had been employed) is estimated at less than 1/500. The risk for individuals will vary depending on other features of the particular primary tumour. Conclusion: For patients who elect to have the axilla staged using SNB alone, the risk and consequences of understaging need to be discussed. These risks can be estimated by allowing for the specific surgeon's false negative rate for the technique, and considering the likelihood of nodal metastases for a given tumour. There appears to be no disadvantage with performing SNB at a second operation after the primary tumour has already been removed. Clearly, for a large number of patients, SNB alone will be safe, but ideally participation in randomized trials should continue to be encouraged.
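
A back-of-envelope decomposition of the understaging risk reads as follows; the factor values are illustrative assumptions chosen to match the order of magnitude quoted above, not figures from the study.

```python
# Decomposition of the added mortality risk from understaging if SNB
# alone were used. All factor values are hypothetical illustrations.
p_node_positive = 0.40   # chance the tumour has nodal metastases
false_negative = 0.048   # surgeon-specific SNB false negative rate
therapy_benefit = 0.10   # absolute mortality reduction from the systemic
                         # therapy that understaging would withhold

risk = p_node_positive * false_negative * therapy_benefit
print(f"added mortality risk ~ 1 in {1 / risk:,.0f}")   # ~1 in 500
```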

Relevance: 100.00%

Abstract:

Water-sampler equilibrium partitioning coefficients and aqueous boundary layer mass transfer coefficients for atrazine, diuron, hexazinone and fluometuron onto C18 and SDB-RPS Empore disk-based aquatic passive samplers have been determined experimentally under a laminar flow regime (Re = 5400). The method involved accelerating the time to equilibrium of the samplers by exposing them to three water concentrations, decreasing stepwise to 50% and then 25% of the original concentration. Assuming first-order Fickian kinetics across a rate-limiting aqueous boundary layer, both parameters are determined computationally by unconstrained nonlinear optimization. In addition, a method of estimating mass transfer coefficients, and therefore sampling rates, using the dimensionless Sherwood correlation developed for laminar flow over a flat plate is applied. For each of the herbicides, this correlation is validated to within 40% of the experimental data. The study demonstrates that for trace concentrations (below 0.1 µg/L) and these flow conditions, a naked Empore disk performs well as an integrative sampler over short deployments (up to 7 days) for the range of polar herbicides investigated. The SDB-RPS disk allows a longer integrative period than the C18 disk due to its higher sorbent mass and/or its more polar sorbent chemistry. This work also suggests that for certain passive sampler designs, empirical estimation of sampling rates may be possible using correlations that have been available in the chemical engineering literature for some time.
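
A sketch of the flat-plate Sherwood estimate of the mass transfer coefficient, Sh = 0.664 Re^(1/2) Sc^(1/3); the property values and disk dimension below are hypothetical but typical assumptions, not the study's measured values.

```python
# Mass transfer coefficient from the laminar flat-plate Sherwood
# correlation. Property values are hypothetical but typical for a
# polar herbicide in water at ~25 C.
D  = 5.0e-10     # diffusivity of the analyte in water, m^2/s (assumed)
nu = 1.0e-6      # kinematic viscosity of water, m^2/s
L  = 0.047       # characteristic length, m (assumed disk diameter)
Re = 5400        # flow Reynolds number, as in the experiments

Sc = nu / D                                  # Schmidt number
Sh = 0.664 * Re ** 0.5 * Sc ** (1 / 3)       # Sherwood number
k  = Sh * D / L                              # mass transfer coeff., m/s
print(f"Sc = {Sc:.0f}, Sh = {Sh:.0f}, k = {k:.2e} m/s")
```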

Relevance: 100.00%

Abstract:

The water retention curve (WRC) is a hydraulic characteristic of concrete required for advanced modeling of water (and thus solute) transport in variably saturated, heterogeneous concrete. Unfortunately, determination by a direct experimental method (for example, measuring equilibrium moisture levels of large samples stored in constant humidity cells) is a lengthy process, taking over 2 years for large samples. A surrogate approach is presented in which the WRC is conveniently estimated from mercury intrusion porosimetry (MIP) and validated by water sorption isotherms: the well-known Barrett, Joyner and Halenda (BJH) method of estimating the pore size distribution (PSD) from the water sorption isotherm is shown to complement the PSD derived from conventional MIP. This provides a basis for predicting the complete WRC from MIP data alone. The van Genuchten equation is used to model the combined water sorption and MIP results. It is a convenient tool for describing water retention characteristics over the full moisture content range. The van Genuchten parameter estimation based solely on MIP is shown to give a satisfactory approximation to the WRC, with a simple restriction on one of the parameters.
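
A minimal sketch of fitting the van Genuchten model to suction/moisture pairs, as might be derived from MIP data; the data here are synthetic and the parameter values are illustrative.

```python
# Fitting the van Genuchten retention model,
#   theta(h) = theta_r + (theta_s - theta_r) / (1 + (alpha*h)**n)**m,
# with the usual constraint m = 1 - 1/n, to synthetic data.
import numpy as np
from scipy.optimize import curve_fit

def van_genuchten(h, theta_r, theta_s, alpha, n):
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

h = np.logspace(0, 5, 30)          # suction head (hypothetical units)
theta = van_genuchten(h, 0.02, 0.12, 5e-3, 1.6) \
        + np.random.default_rng(5).normal(0, 0.002, h.size)

popt, _ = curve_fit(van_genuchten, h, theta,
                    p0=[0.01, 0.15, 1e-2, 1.5],
                    bounds=([0, 0, 1e-6, 1.01], [0.1, 0.3, 1.0, 10.0]))
print(popt)   # recovers theta_r, theta_s, alpha, n approximately
```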

Relevance: 100.00%

Abstract:

Comparisons are made between experimental measurements and numerical simulations of ionizing flows generated in a superorbital facility. Nitrogen, with a freestream velocity of around 10 km/s, was passed over a cylindrical model, and images were recorded using two-wavelength holographic interferometry. The resulting density, electron concentration, and temperature maps were compared with numerical simulations from the Langley Research Center aerothermodynamic upwind relaxation algorithm. The results showed generally good agreement in shock location and density distributions. Some discrepancies were observed for the electron concentration, possibly because the simulations were of a two-dimensional flow, whereas the experiments were likely to have small three-dimensional effects.

Relevance: 100.00%

Abstract:

Despite its environmental (and financial) importance, there is no agreement in the literature as to which extractant most accurately estimates the phytoavailability of trace metals in soils. A large dataset was taken from the literature, and the effectiveness of various extractants in predicting the phytoavailability of Cd, Zn, Ni, Cu, and Pb was examined across a range of soil types and contamination levels. The data suggest that, in general, total soil trace metal content and trace metal concentrations determined by complexing agents (such as the widely used DTPA and EDTA extractants) or acid extractants (such as 0.1 M HCl and the Mehlich 1 extractant) correlate only poorly with plant phytoavailability. Whilst there is no consensus, it would appear that neutral salt extractants (such as 0.01 M CaCl2 and 0.1 M NaNO3) provide the most useful indication of metal phytoavailability across the range of metals of interest, although further research is required.

Relevance: 100.00%

Abstract:

Bioelectrical impedance analysis (BIA) was used to assess body composition in rats fed either a standard laboratory diet or a high-fat diet designed to induce obesity. BIA predictions of total body water, and thus fat-free mass (FFM), for the group mean values were generally within 5% of the values measured by tritiated water (³H₂O) dilution. The limits of agreement for the procedure were, however, large, at approximately ±25%, limiting the applicability of the technique for measuring body composition in individual animals.
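
The limits-of-agreement calculation behind a figure like ±25% is simple Bland-Altman arithmetic; the paired measurements below are simulated, not the study's data.

```python
# Bland-Altman limits of agreement on percentage differences between
# BIA-predicted and tritiated-water FFM. Data are simulated.
import numpy as np

rng = np.random.default_rng(6)
ffm_tritium = rng.normal(250.0, 30.0, 40)            # reference FFM, g
ffm_bia = ffm_tritium * rng.normal(1.0, 0.12, 40)    # BIA prediction

pct_diff = 100.0 * (ffm_bia - ffm_tritium) / ffm_tritium
bias = pct_diff.mean()
loa = 1.96 * pct_diff.std(ddof=1)                    # half-width of LoA
print(f"bias = {bias:.1f}%, limits of agreement = +/-{loa:.1f}%")
```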

Relevance: 100.00%

Abstract:

A significant problem in the collection of responses to potentially sensitive questions, such as those relating to illegal, immoral or embarrassing activities, is non-sampling error due to refusal to respond or false responses. Eichhorn & Hayre (1983) suggested the use of scrambled responses to reduce this form of bias. This paper considers a linear regression model in which the dependent variable is unobserved, but its sum or product with a scrambling random variable of known distribution is observed. The performance of two likelihood-based estimators is investigated: a Bayesian estimator implemented through a Markov chain Monte Carlo (MCMC) sampling scheme, and a classical maximum-likelihood estimator. These two estimators and an estimator suggested by Singh, Joarder & King (1996) are compared. Monte Carlo results show that the Bayesian estimator outperforms the classical estimators in almost all cases, and the relative performance of the Bayesian estimator improves as the responses become more scrambled.
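
Multiplicative scrambling is easy to illustrate. The sketch below uses a simple moment-corrected least squares estimate (not the paper's Bayesian MCMC or ML estimators, which are more efficient) on simulated data.

```python
# Multiplicative response scrambling (Eichhorn & Hayre): respondents
# report z = y * s, with s drawn from a distribution known to the
# analyst. Since E[z | X] = E[s] * X @ beta, rescaled least squares
# gives an unbiased (but noisy) estimate of beta.
import numpy as np

rng = np.random.default_rng(7)
n = 2000
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([5.0, 2.0])
y = X @ beta_true + rng.normal(0, 1, n)   # sensitive responses (unobserved)

s = rng.gamma(shape=25.0, scale=0.04, size=n)   # scrambler, E[s] = 1
z = y * s                                        # the observed responses

e_s = 1.0                                        # known mean of s
beta_hat, *_ = np.linalg.lstsq(X, z / e_s, rcond=None)
print(beta_hat)   # close to beta_true, noisier than unscrambled OLS
```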

Relevance: 100.00%

Abstract:

Carbon formation on Ni/γ-Al2O3 catalysts and its kinetics during methane reforming with carbon dioxide were studied in the temperature range 500-700 °C using a thermogravimetric analysis technique. The activation energies of methane cracking, carbon gasification in CO2, and carbon deposition in CO2-CH4 reforming were obtained. The results show that the activation energy for carbon gasification is larger than that of carbon formation in methane cracking, and that the activation energy of coking in CO2-CH4 reforming is also larger than that of methane decomposition to carbon. The dependencies of the coking rate on the partial pressures of CH4 and CO2 indicate that methane decomposition is the main route for carbon deposition. A mechanism and kinetic model for carbon deposition are proposed.
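
Extracting an activation energy from rate-temperature data is a linear fit on an Arrhenius plot, ln k = ln A - Ea/(R T); the rates below are hypothetical stand-ins for TGA-derived coking rates.

```python
# Activation energy from an Arrhenius plot: regress ln(rate) on 1/T.
# The rate data are hypothetical, not the study's measurements.
import numpy as np

R = 8.314                                          # J/(mol K)
T = np.array([500, 550, 600, 650, 700]) + 273.15   # temperatures, K
Ea_true, lnA_true = 120e3, 18.0                    # hypothetical values
ln_k = lnA_true - Ea_true / (R * T)                # synthetic ln(rates)

slope, intercept = np.polyfit(1.0 / T, ln_k, 1)    # slope = -Ea/R
print(f"Ea = {-slope * R / 1e3:.1f} kJ/mol, ln A = {intercept:.1f}")
```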