965 results for Cure rate models
Abstract:
Doctorate in Management
Abstract:
Doctorate in Economics.
Abstract:
116 p.
Abstract:
This study focuses on multiple linear regression models relating six climate indices (temperature-humidity index THI, environmental stress index ESI, equivalent temperature index ETI, heat load index HLI, modified HLI (HLI new), and respiratory rate predictor RRP) to three main components of cow's milk (yield, fat, and protein) for cows in Iran. The least absolute shrinkage and selection operator (LASSO) and the Akaike information criterion (AIC) are applied to select the best model for the milk predictands with the smallest number of climate predictors. Uncertainty is estimated by bootstrap resampling, and cross-validation is used to avoid over-fitting. The climatic parameters are calculated from the NASA-MERRA global atmospheric reanalysis. Milk data for the months April to September, 2002 to 2010, are used. The best linear regression models are found in spring, with milk yield as the predictand and THI, ESI, ETI, HLI, and RRP as predictors (p-value < 0.001; R² of 0.50 and 0.49, respectively). In summer, milk yield with THI, ETI, and ESI as independent variables shows the strongest relationship (p-value < 0.001; R² of 0.69). For fat and protein the results are only marginal. This method is suggested for studies of the impact of climate variability/change in agriculture and food science when short time series or data with large uncertainty are available.
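For readers who want to reproduce the general workflow, the following is a minimal sketch combining cross-validated LASSO selection with bootstrap resampling of the coefficients; the synthetic data, the column names, and the use of scikit-learn's LassoCV are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch: LASSO predictor selection with bootstrap uncertainty.
# The data frame below is synthetic; the study used NASA-MERRA-derived
# climate indices and Iranian milk records.
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame(rng.normal(size=(n, 5)),
                  columns=["THI", "ESI", "ETI", "HLI", "RRP"])
df["milk_yield"] = 0.6 * df["THI"] + 0.3 * df["ESI"] + rng.normal(scale=0.5, size=n)

X = df[["THI", "ESI", "ETI", "HLI", "RRP"]].to_numpy()
y = df["milk_yield"].to_numpy()

# Cross-validated LASSO chooses the penalty and shrinks weak predictors to zero.
lasso = LassoCV(cv=5).fit(X, y)
print("selected coefficients:", dict(zip(df.columns[:5], lasso.coef_.round(3))))

# Bootstrap resampling gives a rough uncertainty band for each coefficient.
boot = np.array([LassoCV(cv=5).fit(X[idx], y[idx]).coef_
                 for idx in (rng.integers(0, n, n) for _ in range(200))])
print("2.5/97.5 percentiles:\n", np.percentile(boot, [2.5, 97.5], axis=0).round(3))
```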
Abstract:
Cardiovascular disease is one of the leading causes of death around the world. Resting heart rate has been shown to be a strong and independent risk marker for adverse cardiovascular events and mortality, and yet its role as a predictor of risk is somewhat overlooked in clinical practice. With the aim of highlighting its prognostic value, the role of resting heart rate as a risk marker for death and other adverse outcomes was further examined in a number of different patient populations. A systematic review of studies that previously assessed the prognostic value of resting heart rate for mortality and other adverse cardiovascular outcomes was presented. New analyses of nine clinical trials were carried out. Both the original Cox model and the extended Cox model, which allows for the analysis of time-dependent covariates, were used to evaluate and compare the predictive value of baseline and time-updated heart rate measurements for adverse outcomes in the CAPRICORN, EUROPA, PROSPER, PERFORM, BEAUTIFUL and SHIFT populations. Pooled individual patient meta-analyses of the CAPRICORN, EPHESUS, OPTIMAAL and VALIANT trials, and the BEAUTIFUL and SHIFT trials, were also performed. The discrimination and calibration of the models applied were evaluated using Harrell's C-statistic and likelihood ratio tests, respectively. Finally, following on from the systematic review, meta-analyses of the relation between baseline and time-updated heart rate, and the risk of death from any cause and from cardiovascular causes, were conducted. Both elevated baseline and time-updated resting heart rates were found to be associated with an increase in the risk of mortality and other adverse cardiovascular events in all of the populations analysed. In some cases, elevated time-updated heart rate was associated with risk of events where baseline heart rate was not. Time-updated heart rate also contributed additional information about the risk of certain events beyond that provided by baseline heart rate or previous heart rate measurements. Adding resting heart rate to the models where it was found to be associated with the risk of outcome improved both discrimination and calibration, and in general the models including time-updated heart rate along with the baseline or previous heart rate measurement had the highest (and similar) C-statistics, and thus the greatest discriminative ability. The meta-analyses demonstrated that a 5 bpm higher baseline heart rate was associated with a 7.9% and an 8.0% increase in the risk of all-cause and cardiovascular death, respectively (both p < 0.001). Additionally, a 5 bpm higher time-updated heart rate (adjusted for baseline heart rate in eight of the ten studies included in the analyses) was associated with a 12.8% (p < 0.001) and a 10.9% (p < 0.001) increase in the risk of all-cause and cardiovascular death, respectively. These findings may motivate health care professionals to routinely assess resting heart rate in order to identify individuals at a higher risk of adverse events. The fact that the addition of time-updated resting heart rate improved the discrimination and calibration of models for certain outcomes, even if only modestly, strengthens the case for adding it to traditional risk models. The findings are, however, of particular importance, and have greater implications, for the clinical management of patients with pre-existing disease.
An elevated, or increasing, resting heart rate over time could be used as a tool, potentially alongside other established risk scores, to help doctors identify deteriorating patients or those at higher risk, who might benefit from more intensive monitoring or treatment re-evaluation. Further exploration of the role of continuous recording of resting heart rate, say, when patients are at home, would be informative. In addition, investigation into the cost-effectiveness and optimal frequency of resting heart rate measurement is required. One of the most vital areas for future research is establishing an objective cut-off value for defining a high resting heart rate.
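As a rough illustration of the difference between baseline and time-updated analyses, the sketch below fits an ordinary Cox model and an extended (counting-process) Cox model with the lifelines package on fabricated toy data; the column layout and package choice are assumptions for illustration, not the trial analyses described above.

```python
# Minimal sketch: baseline vs time-updated heart rate in Cox models.
# All numbers are fabricated; lifelines' long (start/stop) format is used
# for the time-updated covariate.
import pandas as pd
from lifelines import CoxPHFitter, CoxTimeVaryingFitter

# One row per subject for the baseline model ...
base = pd.DataFrame({
    "time": [5.0, 3.2, 6.1, 2.4, 4.8, 1.9],
    "event": [1, 0, 1, 1, 0, 1],
    "hr_baseline": [72, 88, 65, 95, 70, 90],
})
CoxPHFitter().fit(base, duration_col="time", event_col="event").print_summary()

# ... and one row per interval for the extended (time-updated) model.
long = pd.DataFrame({
    "id":    [1, 1, 2, 2, 3, 4, 5, 6],
    "start": [0, 2, 0, 1, 0, 0, 0, 0],
    "stop":  [2, 5.0, 1, 3.2, 6.1, 2.4, 4.8, 1.9],
    "event": [0, 1, 0, 0, 1, 1, 0, 1],
    "hr_updated": [72, 80, 88, 84, 65, 95, 70, 90],
})
CoxTimeVaryingFitter().fit(long, id_col="id", event_col="event",
                           start_col="start", stop_col="stop").print_summary()
```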
Abstract:
Survival models are widely applied in engineering to model time-to-event data, since censored observations are a common issue in this field. Whether parametric or not, such models may not always fit heterogeneous data well. The present study relies on survival data for critical pumps, where traditional parametric regression can be improved in order to obtain better approximations. Taking the censoring into account and using an empirical method to split the data into two subgroups, so that separate models could be fitted, we mixed two distinct distributions following a mixture-model approach. We concluded that this is a good method for fitting data that do not follow a usual parametric distribution and for obtaining reliable parameter estimates. A constant cumulative hazard rate policy was also used to determine optimum inspection times from the fitted mixture model, which can be compared with the current maintenance policies to decide whether changes should be introduced.
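A minimal sketch of the mixture idea, assuming a two-component Weibull mixture fitted to right-censored failure times by direct maximum likelihood; the simulated data and the choice of Weibull components are illustrative, not the study's pump data or its final model.

```python
# Minimal sketch: two-component Weibull mixture for right-censored data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

rng = np.random.default_rng(1)
t = np.concatenate([weibull_min.rvs(1.5, scale=100, size=150, random_state=2),
                    weibull_min.rvs(3.0, scale=400, size=150, random_state=3)])
censor = rng.uniform(50, 600, size=t.size)
obs = np.minimum(t, censor)          # observed times
delta = (t <= censor).astype(float)  # 1 = failure observed, 0 = right-censored

def negloglik(theta):
    w, k1, s1, k2, s2 = theta
    pdf = w * weibull_min.pdf(obs, k1, scale=s1) + (1 - w) * weibull_min.pdf(obs, k2, scale=s2)
    sf = w * weibull_min.sf(obs, k1, scale=s1) + (1 - w) * weibull_min.sf(obs, k2, scale=s2)
    # Failures contribute the mixture density, censored units the mixture survival.
    return -np.sum(delta * np.log(pdf) + (1 - delta) * np.log(sf))

res = minimize(negloglik, x0=[0.5, 1.0, 150.0, 2.0, 300.0],
               bounds=[(0.01, 0.99), (0.1, 10), (1, 1e4), (0.1, 10), (1, 1e4)],
               method="L-BFGS-B")
print("weight and (shape, scale) of each component:", res.x.round(3))
```

Under the constant cumulative hazard rate policy mentioned above, inspection times would then be spaced so that the fitted mixture's cumulative hazard increases by the same amount between consecutive inspections.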
Abstract:
Both compressible and incompressible porous medium models are used in the literature to describe the mechanical aspects of living tissues. Using a stiff pressure law, it is possible to build a link between these two different representations. In the incompressible limit, compressible models generate free boundary problems where saturation holds in the moving domain. Our work aims at investigating the stiff pressure limit of reaction-advection-porous medium equations motivated by tumor development. Our first study concerns the analysis and numerical simulation of a model including the effect of nutrients. A coupled system of equations describes the cell density and the nutrient concentration, and the derivation of the pressure equation in the stiff limit was an open problem for which the strong compactness of the pressure gradient is needed. To establish it, we use two new ideas: an L³ version of the celebrated Aronson–Bénilan estimate, and a sharp uniform L⁴ bound on the pressure gradient. We further investigate the sharpness of this bound through a finite difference upwind scheme, which we prove to be stable and asymptotic-preserving. Our second study is centered around porous medium equations including convective effects. We are able to extend the techniques developed for the nutrient case, hence finding the complementarity relation on the limit pressure. Moreover, we provide an estimate of the convergence rate at the incompressible limit. Finally, we study a multi-species system. In particular, we account for phenotypic heterogeneity, including a structured variable in the problem. In this case, a cross-(degenerate)-diffusion system describes the evolution of the phenotypic distributions. Adapting methods recently developed in the context of two-species systems, we prove existence of weak solutions and we pass to the incompressible limit. Furthermore, we prove new regularity results on the total pressure, which is related to the total density by a power-law equation of state.
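For orientation, a commonly used prototype of a reaction-porous-medium system with nutrients and a stiff pressure law is written below; the exact reaction terms and notation in the thesis may differ.

```latex
% Prototype system (cell density n, nutrient concentration c, pressure p);
% the reaction terms G and \Psi stand in for whatever the thesis actually uses.
\[
\begin{aligned}
&\partial_t n - \nabla\cdot\big(n\,\nabla p\big) = n\,G(p,c),
\qquad p = p_m(n) = \frac{m}{m-1}\,n^{\,m-1},\\
&\partial_t c - \Delta c = -\,n\,\Psi(c).
\end{aligned}
\]
% In the stiff (incompressible) limit m -> \infty one formally obtains a free
% boundary problem with the complementarity relation
\[
p_\infty\,\big(\Delta p_\infty + G(p_\infty, c_\infty)\big) = 0,
\qquad 0 \le n_\infty \le 1,
\qquad p_\infty\,(1 - n_\infty) = 0 .
\]
```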
Abstract:
Amphibians have been declining worldwide and the comprehension of the threats that they face could be improved by using mark-recapture models to estimate vital rates of natural populations. Recently, the consequences of marking amphibians have been under discussion and the effects of toe clipping on survival are debatable, although it is still the most common technique for individually identifying amphibians. The passive integrated transponder (PIT tag) is an alternative technique, but comparisons among marking techniques in free-ranging populations are still lacking. We compared these two marking techniques using mark-recapture models to estimate apparent survival and recapture probability of a neotropical population of the blacksmith tree frog, Hypsiboas faber. We tested the effects of marking technique and number of toe pads removed while controlling for sex. Survival was similar among groups, although it decreased slightly from individuals with one toe pad removed to those with two and three toe pads removed, and finally to PIT-tagged individuals. No sex differences were detected. Recapture probability slightly increased with the number of toe pads removed and was lowest for PIT-tagged individuals. Sex was an important predictor of recapture probability, with males being nearly five times more likely to be recaptured. Potential negative effects of both techniques may include reduced locomotion and high stress levels. We recommend the use of covariates in models to better understand the effects of marking techniques on frogs. The effect of the technique on the results should be accounted for, because most techniques may reduce survival. Based on our results, but also on the logistical and cost issues associated with PIT tagging, we suggest the use of toe clipping with anurans like the blacksmith tree frog.
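A minimal sketch of how apparent survival and recapture probability can be estimated from capture histories under a simple Cormack-Jolly-Seber model; the capture histories below are fabricated, and the marking-technique and sex covariates used in the study are omitted.

```python
# Minimal sketch: Cormack-Jolly-Seber estimation of apparent survival (phi)
# and recapture probability (p) by maximum likelihood on fabricated histories.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

histories = np.array([
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 0],
    [0, 1, 0, 1],
    [0, 1, 1, 0],
    [1, 0, 0, 0],
] * 20)  # replicated to mimic a larger sample

def negloglik(theta):
    phi, p = expit(theta)              # keep both probabilities in (0, 1)
    K = histories.shape[1]
    # chi[t] = Pr(an animal alive at occasion t is never seen again)
    chi = np.ones(K)
    for t in range(K - 2, -1, -1):
        chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
    ll = 0.0
    for h in histories:
        seen = np.flatnonzero(h)
        first, last = seen[0], seen[-1]
        for t in range(first + 1, last + 1):   # occasions between first and last sighting
            ll += np.log(phi) + np.log(p if h[t] else 1 - p)
        ll += np.log(chi[last])                # never recaptured after the last sighting
    return -ll

res = minimize(negloglik, x0=[0.0, 0.0], method="BFGS")
print("phi, p =", expit(res.x).round(3))
```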
Abstract:
Prosopis rubriflora and Prosopis ruscifolia are important species in the Chaquenian regions of Brazil. Because the physiognomies in which they occur are both restricted and frequent, they are excellent models for conservation genetics studies. The use of microsatellite markers (Simple Sequence Repeats, SSRs) has become increasingly important in recent years and has proven to be a powerful tool for both ecological and molecular studies. In this study, we present the development and characterization of 10 new markers for P. rubriflora and 13 new markers for P. ruscifolia. The genotyping was performed using 40 P. rubriflora samples and 48 P. ruscifolia samples from the Chaquenian remnants in Brazil. The polymorphism information content (PIC) of the P. rubriflora markers ranged from 0.073 to 0.791, and no null alleles or deviations from Hardy-Weinberg equilibrium (HW) were detected. The PIC values for the P. ruscifolia markers ranged from 0.289 to 0.883, but a departure from HW and null alleles were detected for certain loci; however, this departure may have resulted from anthropogenic activities, such as the presence of livestock, which is very common in the remnant areas. In this study, we describe novel polymorphic SSR markers that may be helpful in future genetic studies of P. rubriflora and P. ruscifolia.
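As a reference for the statistic reported above, here is a short sketch of the polymorphism information content (PIC; Botstein et al. 1980) computed from hypothetical allele frequencies at a single locus.

```python
# Minimal sketch: PIC from allele frequencies at one locus (frequencies are made up).
from itertools import combinations

def pic(freqs):
    """PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2 (Botstein et al. 1980)."""
    hom = sum(p * p for p in freqs)
    pairs = sum(2 * (pi ** 2) * (pj ** 2) for pi, pj in combinations(freqs, 2))
    return 1 - hom - pairs

print(round(pic([0.5, 0.3, 0.2]), 3))  # a hypothetical triallelic locus
```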
Abstract:
Since insect species are poikilothermic organisms, they generally exhibit different growth patterns depending on the temperature at which they develop. This factor is important in forensic entomology, especially for estimating the postmortem interval (PMI) when it is based on the developmental time of the insects developing in decomposing bodies. This study aimed to estimate the rates of development, viability, and survival of immatures of Sarcophaga (Liopygia) ruficornis (Fabricius 1794) and Microcerella halli (Engel 1931) (Diptera: Sarcophagidae) reared at different temperatures: 10, 15, 20, 25, 30, and 35 ± 1 °C. Bovine raw ground meat was offered as food to all experimental groups, each consisting of four replicates, in the proportion of 2 g/larva. To measure the evolution of growth, ten specimens from each group were randomly chosen and weighed every 12 h, from the initial feeding larval stage to pupation, and then discarded. Considering the records of weight gain, survival rates, and stability of growth rates, the optimum temperature range for the development of S. (L.) ruficornis is between 20 and 35 °C, and that of M. halli is between 20 and 25 °C. For both species, the longest development times occurred at the lowest temperatures. The survival rate at the extreme temperatures (10 and 35 °C) was lower in both species. Biological data such as those obtained in this study are of great importance for achieving a more accurate estimate of the PMI.
Abstract:
In acquired immunodeficiency syndrome (AIDS) studies it is quite common to observe viral load measurements collected irregularly over time. Moreover, these measurements can be subject to upper and/or lower detection limits, depending on the quantification assays. A complication arises when these continuous repeated measures have a heavy-tailed behavior. For such data structures, we propose a robust censored linear model based on the multivariate Student's t-distribution. To compensate for the autocorrelation among irregularly observed measures, a damped exponential correlation structure is employed. An efficient expectation-maximization-type algorithm is developed for computing the maximum likelihood estimates, obtaining as by-products the standard errors of the fixed effects and the log-likelihood function. The proposed algorithm uses closed-form expressions at the E-step that rely on formulas for the mean and variance of a truncated multivariate Student's t-distribution. The methodology is illustrated through an application to a Human Immunodeficiency Virus/AIDS (HIV/AIDS) study and several simulation studies.
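A minimal sketch of the damped exponential correlation (DEC) structure mentioned above, which handles irregularly spaced measurement times; the times and parameter values are illustrative.

```python
# Minimal sketch: damped exponential correlation for irregular measurement times.
import numpy as np

def dec_corr(times, phi1, phi2):
    """Corr(y_j, y_k) = phi1 ** (|t_j - t_k| ** phi2), with 0 < phi1 < 1 and phi2 > 0.
    phi2 = 1 recovers a continuous-time AR(1); smaller phi2 damps the decay."""
    times = np.asarray(times, dtype=float)
    lag = np.abs(times[:, None] - times[None, :])
    return phi1 ** (lag ** phi2)

print(dec_corr([0.0, 0.5, 2.0, 4.5], phi1=0.8, phi2=0.5).round(3))
```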
Abstract:
To investigate factors associated with the onset of diabetes in women aged over 49 years. A cross-sectional, population-based study using self-reports with 622 women. The dependent variable was the age at occurrence of diabetes, analysed using the life table method. Cox multiple regression models were fitted to analyse the onset of diabetes according to the predictor variables. Sociodemographic, clinical and behavioural factors were evaluated. Of the 622 women interviewed, 22.7% had diabetes. The mean age at onset was 56 years. The factors associated with the age of occurrence of diabetes were self-rated health (very good, good) (coefficient = -0.792; SE of the coefficient = 0.215; p = 0.0001), more than two individuals living in the household (coefficient = 0.656; SE of the coefficient = 0.223; p = 0.003), and body mass index (BMI) (kg/m²) at 20-30 years of age (coefficient = 0.056; SE of the coefficient = 0.023; p = 0.014). Self-rated health considered good or very good was associated with a higher rate of survival without diabetes. Sharing a home with two or more other people and a weight increase at 20-30 years of age were associated with the onset of type 2 diabetes.
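A minimal sketch of the actuarial life-table calculation behind "survival without diabetes"; the interval counts below are fabricated for illustration and are not the study's data.

```python
# Minimal sketch: actuarial life table for the probability of remaining diabetes-free.
import numpy as np

intervals = [(49, 54), (54, 59), (59, 64), (64, 69)]
entered = np.array([622, 560, 470, 360])   # women at risk at the start of each interval
events = np.array([40, 55, 60, 45])        # diabetes onsets within the interval
censored = np.array([22, 35, 50, 40])      # withdrawn/censored within the interval

# Standard actuarial adjustment: censored subjects count as half an exposure.
at_risk = entered - censored / 2.0
cond_surv = 1.0 - events / at_risk
cum_surv = np.cumprod(cond_surv)

for (a, b), s in zip(intervals, cum_surv):
    print(f"{a}-{b} years: cumulative probability of remaining diabetes-free = {s:.3f}")
```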
Abstract:
Often in biomedical research, we deal with continuous (clustered) proportion responses ranging between zero and one that quantify the disease status of the cluster units. Interestingly, the study population might also consist of relatively disease-free as well as highly diseased subjects, contributing to proportion values in the interval [0, 1]. Regression on a variety of parametric densities with support lying in (0, 1), such as beta regression, can assess important covariate effects. However, such models are deemed inappropriate due to the presence of zeros and/or ones. To circumvent this, we introduce a class of general proportion densities, and further augment the probabilities of zero and one to this general proportion density, controlling for the clustering. Our approach is Bayesian and presents a computationally convenient framework amenable to available freeware. Bayesian case-deletion influence diagnostics based on q-divergence measures are automatic from the Markov chain Monte Carlo output. The methodology is illustrated using both simulation studies and an application to a real dataset from a clinical periodontology study.
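A minimal sketch of the zero-and-one augmentation described above, using a beta kernel as a stand-in for the general proportion density; the full model is Bayesian and accounts for clustering and covariates, which this toy density does not.

```python
# Minimal sketch: a zero-and-one-augmented proportion density.
import numpy as np
from scipy.stats import beta

def zoab_pdf(y, p0, p1, a, b):
    """Point mass p0 at 0, point mass p1 at 1, and (1 - p0 - p1) * Beta(a, b) on (0, 1)."""
    y = np.asarray(y, dtype=float)
    return np.where(y == 0.0, p0,
                    np.where(y == 1.0, p1,
                             (1.0 - p0 - p1) * beta.pdf(y, a, b)))

print(zoab_pdf([0.0, 0.25, 0.8, 1.0], p0=0.10, p1=0.05, a=2.0, b=5.0))
```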
Abstract:
A method using the ring-oven technique for pre-concentration on filter paper discs and near-infrared hyperspectral imaging is proposed to identify four detergent and dispersant additives, and to determine their concentration in gasoline. Different approaches were used to select the best image data processing in order to gather the relevant spectral information. This was attained by selecting the pixels of the region of interest (ROI), using a pre-calculated threshold value of the PCA scores arranged as histograms, to select the spectra set; summing up the selected spectra to achieve representativeness; and compensating for the superimposed filter paper spectral information, also supported by score histograms for each individual sample. The best classification model was achieved using linear discriminant analysis and a genetic algorithm (LDA/GA), whose correct classification rate in the external validation set was 92%. Prior classification of the type of additive present in the gasoline is necessary to define the PLS model required for its quantitative determination. Considering that two of the additives studied present high spectral similarity, a single PLS regression model was constructed to predict their content in gasoline, while two additional models were used for the remaining additives. The results of the external validation of these regression models showed a mean percentage error of prediction ranging from 5 to 15%.
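A minimal sketch of the quantification step, assuming simulated spectra and scikit-learn's PLSRegression; the study's actual models were built on ring-oven pre-concentrated NIR hyperspectral data after LDA/GA classification, which is not reproduced here.

```python
# Minimal sketch: PLS regression of an additive's concentration on NIR-like spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n_samples, n_wavelengths = 80, 200
conc = rng.uniform(50, 500, size=n_samples)          # hypothetical concentrations
band = np.exp(-0.5 * ((np.arange(n_wavelengths) - 120) / 15.0) ** 2)
spectra = conc[:, None] * band + rng.normal(scale=0.5, size=(n_samples, n_wavelengths))

X_tr, X_te, y_tr, y_te = train_test_split(spectra, conc, test_size=0.25, random_state=0)
pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()

mean_pct_err = np.mean(np.abs(pred - y_te) / y_te) * 100  # external-validation style error
print(f"mean percentage error of prediction: {mean_pct_err:.1f}%")
```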
Abstract:
Atmospheric carbon dioxide records indicate that the land surface has acted as a strong global carbon sink over recent decades, with a substantial fraction of this sink probably located in the tropics, particularly in the Amazon. Nevertheless, it is unclear how the terrestrial carbon sink will evolve as climate and atmospheric composition continue to change. Here we analyse the historical evolution of the biomass dynamics of the Amazon rainforest over three decades using a distributed network of 321 plots. While this analysis confirms that Amazon forests have acted as a long-term net biomass sink, we find a long-term decreasing trend of carbon accumulation. Rates of net increase in above-ground biomass declined by one-third during the past decade compared to the 1990s. This is a consequence of growth rate increases levelling off recently, while biomass mortality persistently increased throughout, leading to a shortening of carbon residence times. Potential drivers for the mortality increase include greater climate variability, and feedbacks of faster growth on mortality, resulting in shortened tree longevity. The observed decline of the Amazon sink diverges markedly from the recent increase in terrestrial carbon uptake at the global scale, and is contrary to expectations based on models.