965 results for Cure rate models


Relevance:

30.00%

Publisher:

Abstract:

Suppose we are interested in establishing simple but reliable rules for predicting future t-year survivors via censored regression models. In this article, we present inference procedures for evaluating such binary classification rules based on various prediction precision measures quantified by the overall misclassification rate, sensitivity and specificity, and positive and negative predictive values. Specifically, under various working models we derive consistent estimators for the above measures via substitution and cross-validation estimation procedures. Furthermore, we provide large-sample approximations to the distributions of these nonsmooth estimators without assuming that the working model is correctly specified. Confidence intervals, for example, for the difference of the precision measures between two competing rules can then be constructed. All the proposals are illustrated with two real examples, and their finite-sample properties are evaluated via a simulation study.
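The precision measures listed above have simple empirical forms once a rule's predictions and the observed survival statuses are in hand. A minimal sketch (illustrative only; the paper's estimators additionally handle censoring, which this sketch ignores):

```python
def precision_measures(y_true, y_pred):
    """Basic binary-classification precision measures.

    y_true: observed t-year survival indicators (1 = survivor).
    y_pred: the rule's predicted indicators.
    Note: ignores censoring, which the paper's estimators account for.
    """
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    n = tp + tn + fp + fn
    return {
        "misclassification": (fp + fn) / n,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Toy data: 3 true survivors, 3 non-survivors, two errors.
m = precision_measures([1, 1, 1, 0, 0, 0], [1, 1, 0, 0, 0, 1])
```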

Relevance:

30.00%

Publisher:

Abstract:

In many clinical trials to evaluate treatment efficacy, it is believed that there may exist a latent treatment-effectiveness lag time after which the medical procedure or chemical compound takes full effect. In this article, semiparametric regression models are proposed and studied to estimate the treatment effect while accounting for such latent lag times. The new models take advantage of the invariance property of the additive hazards model in marginalizing over random effects, so the model parameters are easy to estimate and interpret, while the flexibility of leaving the baseline hazard function unspecified is retained. Monte Carlo simulation studies demonstrate the appropriateness of the proposed semiparametric estimation procedure. Data collected in a randomized clinical trial evaluating the effectiveness of biodegradable carmustine polymers for the treatment of recurrent brain tumors are analyzed.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we develop Bayesian hierarchical distributed lag models for estimating associations between daily variations in summer ozone levels and daily variations in cardiovascular and respiratory (CVDRESP) mortality counts for 19 large U.S. cities included in the National Morbidity Mortality Air Pollution Study (NMMAPS) for the period 1987-1994. At the first stage, we define a semi-parametric distributed lag Poisson regression model to estimate city-specific relative rates of CVDRESP mortality associated with short-term exposure to summer ozone. At the second stage, we specify a class of distributions for the true city-specific relative rates to estimate an overall effect by taking into account the variability within and across cities. We perform the calculations with respect to several random-effects distributions (normal, Student-t, and mixture of normals), thus relaxing the common assumption of a two-stage normal-normal hierarchical model. We assess the sensitivity of the results to: 1) the lag structure for ozone exposure; 2) the degree of adjustment for long-term trends; 3) the inclusion of other pollutants in the model; 4) heat waves; 5) random-effects distributions; and 6) prior hyperparameters. On average across cities, we found that a 10 ppb increase in summer ozone level on every day of the previous week is associated with a 1.25% increase in CVDRESP mortality (95% posterior region: 0.47, 2.03). The relative rate estimates are also positive and statistically significant at lags 0, 1, and 2. We found that associations between summer ozone and CVDRESP mortality are sensitive to the confounding adjustment for PM10, but are robust to: 1) the adjustment for long-term trends and other gaseous pollutants (NO2, SO2, and CO); 2) the distributional assumptions at the second stage of the hierarchical model; and 3) the prior distributions on all unknown parameters.
Bayesian hierarchical distributed lag models and their application to the NMMAPS data allow us to estimate an acute health effect associated with exposure to ambient air pollution over the last few days, on average across several locations. The application of these methods and the systematic assessment of the sensitivity of the findings to model assumptions provide important epidemiological evidence for future air-quality regulations.
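A minimal sketch of the distributed-lag idea under a log-linear Poisson model: build a lagged design matrix over the previous week, and compute the implied percent increase in expected mortality for a sustained rise in exposure at every lag. The lag coefficients below are hypothetical, and the paper's semi-parametric confounder adjustment and Bayesian hierarchy are not reproduced:

```python
import math

def lag_matrix(x, max_lag):
    """Rows: days with a complete lag history; columns: exposure at lags 0..max_lag."""
    return [[x[t - l] for l in range(max_lag + 1)] for t in range(max_lag, len(x))]

def percent_increase(betas, delta=10.0):
    """Percent change in expected mortality when exposure rises by `delta`
    (e.g. 10 ppb) on every day of the lag window, under a log-linear
    Poisson model: 100 * (exp(delta * sum(betas)) - 1)."""
    return 100.0 * (math.exp(delta * sum(betas)) - 1.0)

ozone = [30, 32, 35, 40, 38, 36, 41, 44]     # daily ppb, hypothetical
X = lag_matrix(ozone, max_lag=6)             # one week of lags (0..6)
pct = percent_increase([0.0002] * 7)         # hypothetical lag coefficients
```

The sum of the lag coefficients plays the role of the cumulative one-week effect that the abstract's "10 ppb on every day of the previous week" estimate summarizes.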

Relevance:

30.00%

Publisher:

Abstract:

We found a significant positive correlation between local summer air temperature (May-September) and the annual sediment mass accumulation rate (MAR) in Lake Silvaplana (46°N, 9°E, 1800 m a.s.l.) during the twentieth century (r = 0.69, p < 0.001 for decadally smoothed series). Sediment trap data (2001-2005) confirm this relation, with exceptionally high particle yields during 2003, the hottest summer of the last 140 years. On this basis we developed a decadal-scale summer temperature reconstruction back to AD 1580. Surprisingly, the comparison of our reconstruction with two other independent regional summer temperature reconstructions (based on tree rings and documentary data) revealed a significant negative correlation for the pre-1900 data (i.e., the late ‘Little Ice Age’). This demonstrates that the correlation between MAR and summer temperature is not stable in time and that the actualistic principle does not apply in this case. We suggest that different climatic regimes (modern/‘Little Ice Age’) led to changing state conditions in the catchment and thus to considerably different sediment transport mechanisms. Therefore, we calibrated our MAR data against gridded early instrumental temperature series from AD 1760 to 1880 (r = -0.48, p < 0.01 for decadally smoothed series) to properly reconstruct late-LIA climatic conditions. We found exceptionally low temperatures between AD 1580 and 1610 (0.75°C below the twentieth-century mean) and during the late Maunder Minimum from AD 1680 to 1710 (0.5°C below the twentieth-century mean). In general, summer temperatures did not experience major negative departures from the twentieth-century mean during the late ‘Little Ice Age’. This compares well with the two existing independent regional reconstructions, suggesting that the LIA in the Alps was mainly a phenomenon of the cold season.
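The calibration rests on correlating decadally smoothed series. A minimal pure-Python sketch, where the moving-average window is an assumption standing in for the authors' decadal smoothing:

```python
def moving_average(x, window):
    """Simple trailing moving average, a stand-in for the decadal
    smoothing described in the abstract (the authors' exact filter may differ)."""
    return [sum(x[i:i + window]) / window for i in range(len(x) - window + 1)]

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Perfectly linear toy series -> r = 1.
r = pearson_r([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```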

Relevance:

30.00%

Publisher:

Abstract:

High-resolution digital elevation models (DEMs) of Santiaguito and Pacaya volcanoes, Guatemala, were used to estimate volume changes and eruption rates between 1954 and 2001. The DEMs were generated from contour maps and aerial photography, which were analyzed in ArcGIS 9.0®. Because both volcanoes grew substantially over the five-decade period, they provide a good data set for exploring effective methodology for estimating volume changes. The analysis shows that the Santiaguito dome complex grew by 0.78 ± 0.07 km³ (0.52 ± 0.05 m³ s⁻¹) over the 1954-2001 period, with nearly all the growth occurring on the El Brujo (1958-75) and Caliente (1971-2001) domes. Adding information from field data prior to 1954, the total volume extruded from Santiaguito since 1922 is estimated at 1.48 ± 0.19 km³. Santiaguito’s growth rate is lower than that of most other volcanic domes, but it has been sustained over a much longer period and has undergone a change toward more exogenous and progressively slower extrusion with time. At Santiaguito, some of the material added at the dome is subsequently transported downstream by block-and-ash flows, mudflows, and floods, creating channel shifting and areas of aggradation and erosion. At Pacaya volcano, a total volume of 0.21 ± 0.05 km³ was erupted between 1961 and 2001, for an average extrusion rate of 0.17 ± 0.04 m³ s⁻¹. Both the Santiaguito and Pacaya eruption-rate estimates reported here are minima, because they do not include estimates of material transported downslope after eruption or data on ashfall, which may represent significant volumes of material spread over broad areas. Regular analysis of high-resolution DEMs using the methods outlined here would help quantify the effects of fluvial changes on downstream populated areas, as well as assist in tracking hazards related to dome collapse and eruption.
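The arithmetic behind such DEM differencing is straightforward: sum the per-cell elevation differences times the cell area, then divide the volume by the elapsed time for an average rate. A sketch with a hypothetical 2×2 grid (the actual ArcGIS analysis also propagates the uncertainties quoted above):

```python
def volume_change(dem_old, dem_new, cell_area):
    """Net volume change (m^3) between two co-registered DEMs:
    sum of per-cell elevation differences times the cell area (m^2)."""
    return sum(
        (zn - zo) * cell_area
        for row_o, row_n in zip(dem_old, dem_new)
        for zo, zn in zip(row_o, row_n)
    )

def extrusion_rate(volume_m3, years):
    """Average extrusion rate in m^3 s^-1 over `years` years."""
    return volume_m3 / (years * 365.25 * 24 * 3600)

# Hypothetical 2x2 elevation grids (m) with 5 m cells (area 25 m^2).
dv = volume_change([[100.0, 100.0], [100.0, 100.0]],
                   [[105.0, 103.0], [100.0, 102.0]], cell_area=25.0)
```

As a sanity check, 0.78 km³ spread over the 47-year 1954-2001 period gives `extrusion_rate(0.78e9, 47)` ≈ 0.53 m³ s⁻¹, consistent with the rate quoted in the abstract.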

Relevance:

30.00%

Publisher:

Abstract:

The need for a stronger and more durable building material is becoming more important as the structural engineering field expands and challenges the behavioral limits of current materials. One of the demands for stronger material is rooted in the effects that dynamic loading has on a structure. High strain rates on the order of 10¹ s⁻¹ to 10³ s⁻¹, though a small part of the overall range of loading rates that can occur anywhere between 10⁻⁸ s⁻¹ and 10⁴ s⁻¹ at any point in a structure's life, have very important effects when considering dynamic loading on a structure. High strain rates such as these can cause the material and structure to behave differently than at slower strain rates, which necessitates testing materials under such loading to understand their behavior. Ultra-high-performance concrete (UHPC), a relatively new material in the U.S. construction industry, exhibits many enhanced strength and durability properties compared to standard normal-strength concrete. However, the use of this material for high-strain-rate applications requires an understanding of UHPC's dynamic properties under corresponding loads. One such dynamic property is the increase in compressive strength under high-strain-rate load conditions, quantified as the dynamic increase factor (DIF). This factor allows a designer to relate the dynamic compressive strength back to the static compressive strength, which generally is a well-established property. Previous research establishes the relationships for the concept of DIF in design. The generally accepted methodology for obtaining high strain rates to study the enhanced behavior of compressive material strength is the split Hopkinson pressure bar (SHPB). In this research, 83 Cor-Tuf UHPC specimens were tested in dynamic compression using an SHPB at Michigan Technological University.
The specimens were separated into two categories, ambient-cured and thermally treated, with aspect ratios of 0.5:1, 1:1, and 2:1 within each category. There was statistically no significant difference in mean DIF across the aspect ratios and cure regimes considered in this study. DIFs ranged from 1.85 to 2.09. Failure modes were observed to be mostly Type 2, Type 4, or combinations thereof for all specimen aspect ratios when classified according to ASTM C39 fracture-pattern guidelines. The Comité Euro-International du Béton (CEB) model for DIF versus strain rate does not accurately predict the DIF for the UHPC data gathered in this study. Additionally, a measurement system analysis was conducted to observe variance within the measurement system, and a general linear model analysis was performed to examine the interaction and main effects that aspect ratio, cannon pressure, and cure method have on the maximum dynamic stress.
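The DIF itself is simply the ratio of dynamic to static compressive strength. A one-line sketch with hypothetical strengths:

```python
def dynamic_increase_factor(f_dynamic, f_static):
    """DIF relates the measured dynamic compressive strength back to the
    well-established static compressive strength: DIF = f_dyn / f_stat."""
    return f_dynamic / f_static

# Hypothetical SHPB result: 380 MPa dynamic vs. 190 MPa static.
dif = dynamic_increase_factor(380.0, 190.0)  # falls within the 1.85-2.09 range reported
```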

Relevance:

30.00%

Publisher:

Abstract:

In many animals, males congregate in leks that females visit for the sole purpose of mating. We observed male and female behavior on 3 different-sized leks of the bower-building cichlid fish Nyassachromis cf. microcephalus to test predictions of 3 prominent lek models: the "hotshot," "hot spot," and "female preference" models. In this system, we were able to refine these predictions by distinguishing between indirect mate choice, by which females restrict their set of potential mates in the absence of individual male assessment, and direct mate choice, by which females assess males and their territories through dyadic behavioral interactions. On no lek were males holding central territories favored by indirect or direct mate choice, contrary to the prediction of the hotshot model that leks form because inferior males establish territories surrounding hotshot males preferred by females. Average female encounter rate of males increased with lek size, a pattern typically interpreted as evidence that leks form through female preference for lekking males, rather than because males congregate in hot spots of high female density. Female propensity to engage in premating behavior once courted did not increase with lek size, suggesting female preference for males on larger leks operated through indirect choice rather than direct choice based on individual assessment. The frequency of male-male competitive interactions increased with lek size, whereas their foraging rate decreased, implying a cost to males maintaining territories on larger leks. Together these data most strongly support the female preference model, where females may benefit through indirect mate choice for males able to meet the competitive cost of occupying larger leks.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, two models for the simulation of glucose-insulin metabolism in children with Type 1 diabetes are presented. The models are based on the combined use of Compartmental Models (CMs) and artificial Neural Networks (NNs). Data from children with Type 1 diabetes, stored in a database, have been used as input to the models. The data are taken from four children with Type 1 diabetes and contain information about glucose levels from a continuous glucose monitoring system, insulin intake, and food intake, along with the corresponding times. The influence of administered insulin on plasma insulin concentration, as well as the effect of food intake on glucose input into the blood from the gut, is estimated by the CMs. The outputs of the CMs, along with previous glucose measurements, are fed to a NN, which provides short-term prediction of glucose values. For comparison, two different NN architectures have been tested: a Feed-Forward NN (FFNN) trained with the back-propagation algorithm with adaptive learning rate and momentum, and a Recurrent NN (RNN) trained with the Real Time Recurrent Learning (RTRL) algorithm. The results indicate that the best prediction performance is achieved by the RNN.
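As a deliberately simple, hypothetical stand-in for the short-term prediction step (the paper's FFNN/RNN models, which additionally consume the CM outputs), a linear one-step-ahead predictor over recent glucose values:

```python
def predict_next(history, weights, bias=0.0):
    """One-step-ahead prediction from the last len(weights) glucose values.
    A linear autoregressive stand-in for the paper's neural networks;
    the weights here are illustrative, not fitted."""
    recent = history[-len(weights):]
    return bias + sum(w * g for w, g in zip(weights, recent))

# Weights [0, -1, 2] extrapolate a linear trend: 2*x[t] - x[t-1].
glucose = [110.0, 118.0, 126.0, 134.0]   # mg/dL, equally spaced samples
pred = predict_next(glucose, weights=[0.0, -1.0, 2.0])
```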

Relevance:

30.00%

Publisher:

Abstract:

Context: Planet formation models have been developed over the past years to try to reproduce what has been observed of both the solar system and extrasolar planets. Some of these models have partially succeeded, but they focus on massive planets and, for the sake of simplicity, exclude planets belonging to planetary systems. However, more and more planets are now found in planetary systems. This tendency, which results from radial velocity, transit, and direct imaging surveys, seems to be even more pronounced for low-mass planets. These new observations require improving planet formation models, including new physics, and considering the formation of systems. Aims: In a recent series of papers, we presented improvements in the physics of our models, focussing in particular on the internal structure of forming planets and on the computation of the excitation state of planetesimals and their resulting accretion rate. In this paper, we focus on the concurrent formation of more than one planet in the same protoplanetary disc and show the effect of this multiplicity on the architecture and composition of the resulting systems. Methods: We used an N-body calculation, including collision detection, to compute the orbital evolution of a planetary system. Moreover, we describe the effect of competition for the accretion of gas and solids, as well as the effect of gravitational interactions between planets. Results: We show that the masses and semi-major axes of planets are modified by both the effect of competition and gravitational interactions. We also present the effect of the assumed number of forming planets in the same system (a free parameter of the model), as well as the effect of inclination and eccentricity damping. We find that the fraction of ejected planets increases from nearly 0 to 8% as the number of planetary embryos we seed the system with increases from 2 to 20.
Moreover, our calculations show that, when considering planets more massive than ~5 M⊕, simulations with 10 or 20 planetary embryos statistically give the same results in terms of mass function and period distribution.

Relevance:

30.00%

Publisher:

Abstract:

The objective of this longitudinal study, conducted in a neonatal intensive care unit, was to characterize the response to pain of high-risk very low birth weight infants (<1,500 g) from 23 to 38 weeks post-menstrual age (PMA) by measuring heart rate variability (HRV). Heart period data were recorded before, during, and after a blood draw by heel lance or wrist venipuncture for routine clinical evaluation. The pain response to the blood draw procedure and age-related changes of HRV in the low-frequency and high-frequency bands were modeled with linear mixed-effects models. HRV in both bands decreased during pain, followed by a recovery to near-baseline levels. Venipuncture and mechanical ventilation were factors that attenuated the HRV response to pain. Baseline HRV increased with post-menstrual age, but the growth rate of high-frequency power was reduced in mechanically ventilated infants. There was some evidence that the low-frequency HRV response to pain improved with advancing PMA.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Little is known about the effects of hypothermia therapy and subsequent rewarming on the PQRST intervals and heart rate variability (HRV) in term newborns with hypoxic-ischemic encephalopathy (HIE). OBJECTIVES: This study describes the changes in the PQRST intervals and HRV during rewarming to normal core body temperature in 2 newborns with HIE after hypothermia therapy. METHODS: Within 6 h after birth, 2 newborns with HIE were cooled to a core body temperature of 33.5 degrees C for 72 h using a cooling blanket, followed by gradual rewarming (0.5 degrees C per hour) until the body temperature reached 36.5 degrees C. Custom instrumentation recorded the electrocardiogram from the leads used for clinical monitoring of vital signs. Generalized linear mixed models were fitted to estimate temperature-related changes in the PQRST intervals and HRV. RESULTS: For every 1 degree C increase in body temperature, the heart rate increased by 9.2 bpm (95% CI 6.8-11.6), the QTc interval decreased by 21.6 ms (95% CI 17.3-25.9), and low- and high-frequency HRV decreased by 0.480 dB (95% CI 0.052-0.907) and 0.938 dB (95% CI 0.460-1.416), respectively. CONCLUSIONS: Hypothermia-induced changes in the electrocardiogram should be monitored carefully in future studies.
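The QTc interval in the results is a rate-corrected QT. A sketch using the common Bazett correction (the abstract does not state which correction formula was used, so this choice is an assumption):

```python
def qtc_bazett(qt_ms, heart_rate_bpm):
    """Bazett-corrected QT interval (ms): QTc = QT / sqrt(RR), with the
    RR interval in seconds derived from the heart rate."""
    rr_s = 60.0 / heart_rate_bpm
    return qt_ms / rr_s ** 0.5

qtc = qtc_bazett(qt_ms=400.0, heart_rate_bpm=60.0)  # RR = 1 s, so QTc equals QT
```

At faster rates the correction matters: the same 300 ms QT at 120 bpm (RR = 0.5 s) corrects to about 424 ms.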

Relevance:

30.00%

Publisher:

Abstract:

The prognosis for lung cancer patients remains poor: five-year survival rates have been reported to be 15%. Studies have shown that dose escalation to the tumor can lead to better local control and subsequently better overall survival. However, the dose to a lung tumor is limited by normal-tissue toxicity, the most prevalent thoracic toxicity being radiation pneumonitis. To determine a safe dose that can be delivered to the healthy lung, researchers have turned to mathematical models predicting the rate of radiation pneumonitis. However, these models rely on simple metrics based on the dose-volume histogram and are not yet accurate enough to be used for dose escalation trials. The purpose of this work was to improve the fit of predictive risk models for radiation pneumonitis and to show the dosimetric benefit of using the models to guide patient treatment planning. The study was divided into 3 specific aims, the first two of which focused on improving the fit of the predictive model. In Specific Aim 1 we incorporated information about the spatial location of the lung dose distribution into a predictive model. In Specific Aim 2 we incorporated ventilation-based functional information into a predictive pneumonitis model. In the third specific aim, a proof-of-principle virtual simulation was performed in which a model-determined limit was used to scale the prescription dose. The data showed that, for our patient cohort, the fit of the model to the data was not improved by incorporating spatial information. Although we were not able to achieve a significant improvement in model fit using pre-treatment ventilation, we show some promising results indicating that ventilation imaging can provide useful information about lung function in lung cancer patients.
The virtual simulation trial demonstrated that using a personalized lung dose limit derived from a predictive model results in a different prescription than what was achieved with the clinically used plan, thus demonstrating the utility of a normal-tissue toxicity model in personalizing the prescription dose.

Relevance:

30.00%

Publisher:

Abstract:

Models of DNA sequence evolution and methods for estimating evolutionary distances are needed for studying the rate and pattern of molecular evolution and for inferring the evolutionary relationships of organisms or genes. In this dissertation, several new models and methods are developed.

Rate variation among nucleotide sites: To obtain unbiased estimates of evolutionary distances, the rate heterogeneity among the nucleotide sites of a gene should be considered. Commonly, it is assumed that the substitution rate varies among sites according to a gamma distribution (gamma model) or, more generally, an invariant+gamma model, which includes some invariable sites. A maximum likelihood (ML) approach was developed for estimating the shape parameter of the gamma distribution (α) and/or the proportion of invariable sites (θ). Computer simulation showed that (1) under the gamma model, α can be well estimated from 3 or 4 sequences if the sequences are long; and (2) the distance estimate is unbiased and robust against violations of the assumptions of the invariant+gamma model. However, this ML method requires a huge amount of computational time and is useful only for fewer than 6 sequences. Therefore, I developed a fast method for estimating α that is easy to implement and requires no knowledge of the tree. A computer program was developed for estimating α and evolutionary distances that can handle as many as 30 sequences.

Evolutionary distances under the stationary, time-reversible (SR) model: The SR model is a general model of nucleotide substitution which assumes (i) stationary nucleotide frequencies and (ii) time-reversibility. It can be extended to the SRV model, which allows rate variation among sites. I developed a method for estimating distances under the SR or SRV model, as well as the variance-covariance matrix of the distances. Computer simulation showed that the SR method is better than a simpler method when the sequence length L > 1,000 bp and is robust against deviations from time-reversibility. As expected, when the rate varies among sites, the SRV method is much better than the SR method.

Evolutionary distances under nonstationary nucleotide frequencies: The statistical properties of the paralinear and LogDet distances under nonstationary nucleotide frequencies were studied. First, I developed formulas for correcting the estimation biases of the paralinear and LogDet distances; the performance of these formulas and of the formulas for sampling variances was examined by computer simulation. Second, I developed a method for estimating the variance-covariance matrix of the paralinear distance, so that statistical tests of phylogenies can be conducted when nucleotide frequencies are nonstationary. Third, a new method for testing the molecular-clock hypothesis was developed for the nonstationary case.
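The effect of among-site rate variation on distance estimation can be illustrated with the standard gamma-corrected Jukes-Cantor distance, in which strong rate variation (small α) inflates the estimated distance relative to the equal-rates formula. A minimal sketch (this is the classical correction, not the dissertation's SR/SRV estimators):

```python
import math

def jc_distance(p):
    """Jukes-Cantor distance assuming equal substitution rates across sites.
    p is the observed proportion of differing sites between two sequences."""
    return -0.75 * math.log(1.0 - 4.0 * p / 3.0)

def jc_gamma_distance(p, alpha):
    """Gamma-corrected Jukes-Cantor distance: rates vary across sites
    following a gamma distribution with shape `alpha`.
    d = (3*alpha/4) * ((1 - 4p/3)**(-1/alpha) - 1)."""
    return 0.75 * alpha * ((1.0 - 4.0 * p / 3.0) ** (-1.0 / alpha) - 1.0)

d_equal = jc_distance(0.2)
d_gamma = jc_gamma_distance(0.2, alpha=0.5)  # strong rate variation -> larger distance
```

As α grows large the gamma correction converges to the equal-rates formula, which is why a reliable estimate of α matters for unbiased distances.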