881 results for Regression-based decomposition.


Relevance: 30.00%

Abstract:

In epidemiological work, outcomes are frequently non-normal, sample sizes may be large, and effects are often small. To relate health outcomes to geographic risk factors, fast and powerful methods for fitting spatial models, particularly for non-normal data, are required. We focus on binary outcomes, with the risk surface a smooth function of space. We compare penalized likelihood models, including the penalized quasi-likelihood (PQL) approach, and Bayesian models based on fit, speed, and ease of implementation. A Bayesian model using a spectral basis representation of the spatial surface provides the best tradeoff of sensitivity and specificity in simulations, detecting real spatial features while limiting overfitting, and it is computationally more efficient than other Bayesian approaches. One of the contributions of this work is further development of this underused representation. The spectral basis model outperforms the penalized likelihood methods, which are prone to overfitting, but is slower to fit and not as easily implemented. Conclusions based on a real dataset of cancer cases in Taiwan are similar, albeit less definitive, with respect to comparing the approaches. The success of the spectral basis with binary data, and similar results with count data, suggest that it may be generally useful in spatial models and more complicated hierarchical models.
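As a rough illustration of the spectral basis idea, the sketch below represents a smooth spatial logit surface with tensor-product Fourier features and fits a penalized (ridge) logistic regression. This is a loose stand-in under stated assumptions, not the paper's Bayesian model: the basis size, penalty and simulated surface are all illustrative choices.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 2000
coords = rng.uniform(0, 1, size=(n, 2))            # spatial locations in [0, 1]^2

# Hypothetical smooth risk surface and binary outcomes
logit = 1.5 * np.sin(2 * np.pi * coords[:, 0]) * np.cos(2 * np.pi * coords[:, 1])
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

def spectral_basis(xy, K=4):
    """Tensor-product sine/cosine features up to frequency K."""
    feats = []
    for kx in range(K + 1):
        for ky in range(K + 1):
            phase = 2 * np.pi * (kx * xy[:, 0] + ky * xy[:, 1])
            feats.append(np.cos(phase))
            if kx or ky:                           # sine of the zero frequency is useless
                feats.append(np.sin(phase))
    return np.column_stack(feats)

X = spectral_basis(coords)
model = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)  # ridge penalty limits overfitting
risk = model.predict_proba(X)[:, 1]                         # fitted risk surface at the data locations
```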

Relevance: 30.00%

Abstract:

The construction of a reliable, practically useful prediction rule for future responses depends heavily on the "adequacy" of the fitted regression model. In this article, we consider the absolute prediction error, the expected value of the absolute difference between the future and predicted responses, as the model evaluation criterion. This prediction error is easier to interpret than the average squared error and is equivalent to the misclassification error for binary outcomes. We show that the distributions of the apparent error and its cross-validation counterparts are approximately normal even under a misspecified fitted model. When the prediction rule is "unsmooth", the variance of the above normal distribution can be estimated well via a perturbation-resampling method. We also show how to approximate the distribution of the difference of the estimated prediction errors from two competing models. With two real examples, we demonstrate that the resulting interval estimates for prediction errors provide much more information about model adequacy than the point estimates alone.
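A minimal sketch, on synthetic data and under an assumed linear working model, of the two quantities discussed: the apparent absolute prediction error and its cross-validation counterpart.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import KFold

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -0.5, 0.2]) + rng.normal(size=200)

# Apparent error: mean absolute prediction error on the training data
fit = LinearRegression().fit(X, y)
apparent = np.mean(np.abs(y - fit.predict(X)))

# Cross-validated counterpart
errs = []
for train, test in KFold(n_splits=10, shuffle=True, random_state=1).split(X):
    m = LinearRegression().fit(X[train], y[train])
    errs.append(np.abs(y[test] - m.predict(X[test])))
cv_error = np.mean(np.concatenate(errs))
print(apparent, cv_error)
```

The paper's interval estimates would then come from the approximate normality of such statistics, with the variance estimated by perturbation resampling.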

Relevance: 30.00%

Abstract:

The Receiver Operating Characteristic (ROC) curve is a prominent tool for characterizing the accuracy of a continuous diagnostic test. To account for factors that might influence test accuracy, various ROC regression methods have been proposed. However, as in any regression analysis, when the assumed models do not fit the data well, these methods may yield invalid and misleading results. To date, practical model-checking techniques suitable for validating existing ROC regression models are not available. In this paper, we develop cumulative-residual-based procedures to graphically and numerically assess the goodness of fit of some commonly used ROC regression models, and show how specific components of these models can be examined within this framework. We derive asymptotic null distributions for the residual process and discuss resampling procedures to approximate these distributions in practice. We illustrate our methods with a dataset from the Cystic Fibrosis registry.
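The cumulative-residual idea can be sketched in a simpler setting than ROC regression: fit a working model, order the residuals by a covariate, and inspect the cumulative residual process for systematic drift away from zero (the formal test compares its supremum with resampled realizations of the null process). The data and models below are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
p = 1 / (1 + np.exp(-(0.5 + x + 0.5 * x**2)))    # true model is quadratic in x
y = rng.binomial(1, p)

work = LogisticRegression().fit(x[:, None], y)   # working model: linear in x
resid = y - work.predict_proba(x[:, None])[:, 1]

# Cumulative residual process over the covariate; drift suggests lack of fit
order = np.argsort(x)
W = np.cumsum(resid[order]) / np.sqrt(n)
print("sup |W| =", np.abs(W).max())
```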

Relevance: 30.00%

Abstract:

Suppose that we are interested in establishing simple but reliable rules for predicting future t-year survivors via censored regression models. In this article, we present inference procedures for evaluating such binary classification rules based on various prediction precision measures quantified by the overall misclassification rate, sensitivity and specificity, and positive and negative predictive values. Specifically, under various working models we derive consistent estimators for the above measures via substitution and cross-validation estimation procedures. Furthermore, we provide large-sample approximations to the distributions of these nonsmooth estimators without assuming that the working model is correctly specified. Confidence intervals, for example, for the difference of the precision measures between two competing rules can then be constructed. All the proposals are illustrated with two real examples, and their finite-sample properties are evaluated via a simulation study.
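A minimal sketch of the precision measures in question, assuming every subject's t-year status is fully observed; the paper's estimators additionally handle censoring through the working model.

```python
import numpy as np

def precision_measures(pred_high_risk, died_by_t):
    """Misclassification rate, sensitivity, specificity, PPV and NPV of a
    binary classification rule (no censoring handled here)."""
    pred = np.asarray(pred_high_risk, dtype=bool)
    d = np.asarray(died_by_t, dtype=bool)
    tp, fp = np.sum(pred & d), np.sum(pred & ~d)
    fn, tn = np.sum(~pred & d), np.sum(~pred & ~d)
    return {
        "misclassification": (fp + fn) / d.size,
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Two competing rules would be compared by differencing these measures,
# with confidence intervals obtained by resampling.
print(precision_measures([1, 1, 0, 0], [1, 0, 0, 0]))
```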

Relevance: 30.00%

Abstract:

We propose a new method for fitting proportional hazards models with error-prone covariates. Regression coefficients are estimated by solving an estimating equation that is the average of the partial likelihood scores based on imputed true covariates. For the purpose of imputation, a linear spline model is assumed on the baseline hazard. We discuss consistency and asymptotic normality of the resulting estimators, and propose a stochastic approximation scheme to obtain the estimates. The algorithm is easy to implement, and reduces to the ordinary Cox partial likelihood approach when the measurement error has a degenerate distribution. Simulations indicate high efficiency and robustness. We consider the special case where error-prone replicates are available on the unobserved true covariates. As expected, increasing the number of replicates for the unobserved covariates increases efficiency and reduces bias. We illustrate the practical utility of the proposed method with an Eastern Cooperative Oncology Group clinical trial in which a genetic marker, c-myc expression level, is subject to measurement error.
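The sketch below is a crude stand-in, not the authors' stochastic-approximation scheme: it simply averages the error-prone replicates as a naive imputation of the true covariate and fits an ordinary Cox model with the lifelines package. All data are simulated.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 300
z = rng.normal(size=n)                                   # unobserved true covariate
w1 = z + rng.normal(0, 0.5, n)                           # error-prone replicate 1
w2 = z + rng.normal(0, 0.5, n)                           # error-prone replicate 2
T = rng.exponential(np.exp(-0.7 * z))                    # true event times

q = np.quantile(T, 0.8)                                  # administrative censoring
E = (T < q).astype(int)
T = np.minimum(T, q)

df = pd.DataFrame({"T": T, "E": E, "z_hat": (w1 + w2) / 2})
cph = CoxPHFitter().fit(df, duration_col="T", event_col="E")
print(cph.params_)   # averaging replicates reduces (but does not remove) attenuation
```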

Relevance: 30.00%

Abstract:

Approximate entropy (ApEn) of blood pressure (BP) can easily be measured with software analysing 24-h ambulatory BP monitoring (ABPM), but the clinical value of this measure is unknown. In a prospective study, we investigated whether ApEn of BP predicts, in addition to the average and variability of BP, the risk of hypertensive crisis. In 57 patients with known hypertension, we measured ApEn and the average and variability of systolic and diastolic BP based on 24-h ABPM. Eight of these 57 patients developed hypertensive crisis during follow-up (mean follow-up duration 726 days). In bivariate regression analysis, ApEn of systolic BP (P<0.01), average systolic BP (P=0.02) and average diastolic BP (P=0.03) were significant predictors of hypertensive crisis. The incidence rate ratio of hypertensive crisis was 14.0 (95% confidence interval (CI) 1.8, 631.5; P<0.01) for high ApEn of systolic BP compared to low values. In multivariable regression analysis, ApEn of systolic BP (P=0.01) and average diastolic BP (P<0.01) were independent predictors of hypertensive crisis. A combination of these two measures had a positive predictive value of 75% and a negative predictive value of 91%. ApEn, combined with other measures of 24-h ABPM, is a potentially powerful predictor of hypertensive crisis. If confirmed in independent samples, these findings have major clinical implications, since measures predicting the risk of hypertensive crisis identify patients requiring intensive follow-up and intensified therapy.
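Approximate entropy itself is easy to compute; a minimal implementation of the standard definition (Pincus, 1991), with the common tolerance choice r = 0.2 × SD, might look like this:

```python
import numpy as np

def apen(x, m=2, r=None):
    """Approximate entropy ApEn(m, r) of a 1-D series."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()                                  # common default tolerance
    def phi(m):
        n = len(x) - m + 1
        emb = np.array([x[i:i + m] for i in range(n)])     # embedded m-vectors
        d = np.abs(emb[:, None, :] - emb[None, :, :]).max(-1)  # Chebyshev distance
        c = (d <= r).mean(axis=1)                          # match frequencies (self-match included)
        return np.log(c).mean()
    return phi(m) - phi(m + 1)

# e.g. apen(sbp) on the sequence of systolic readings from a 24-h ABPM recording
print(apen(np.sin(np.linspace(0, 20, 200))))
```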

Relevance: 30.00%

Abstract:

This paper presents a novel variable decomposition approach for pose recovery of the distal locking holes using a single calibrated fluoroscopic image. The problem is formulated as a model-based optimal fitting process, where the control variables are decomposed into two sets: (a) the angle between the nail axis and its projection on the imaging plane, and (b) the translation and rotation of the geometrical model of the distal locking hole around the nail axis. By using an iterative algorithm to find the optimal values of the latter set of variables for any given value of the former variable, we reduce the multi-dimensional model-based optimal fitting problem to a one-dimensional search along a finite interval. We report the results of our in vitro experiments, which demonstrate that the accuracy of our approach is adequate for successful distal locking of intramedullary nails.
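The decomposition can be sketched generically: a bounded one-dimensional search over the angle variable, with an inner iterative fit of the remaining pose variables at each candidate angle. The cost function below is a hypothetical placeholder for the model-to-image fitting error of the hole contour.

```python
import numpy as np
from scipy.optimize import minimize, minimize_scalar

def fit_cost(angle, pose):
    # Placeholder: in practice, project the hole model at (angle, pose) onto
    # the fluoroscopic image and measure the contour mismatch.
    return (angle - 0.3) ** 2 + np.sum((pose - np.array([1.0, 0.5, 0.2])) ** 2)

def inner_fit(angle):
    """Optimal fitting error over the remaining pose variables at a fixed angle."""
    return minimize(lambda p: fit_cost(angle, p), x0=np.zeros(3)).fun

# The multi-dimensional fit reduces to a 1-D search over a finite interval
best = minimize_scalar(inner_fit, bounds=(-np.pi / 2, np.pi / 2), method="bounded")
print("estimated angle:", best.x)
```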

Relevance: 30.00%

Abstract:

BACKGROUND: Blood lipid abnormalities in patients on highly active antiretroviral therapy (HAART) have been associated with exposure to protease inhibitors (PIs), particularly ritonavir. First therapy with a non-nucleoside reverse transcriptase inhibitor (NNRTI) leads to relatively favourable lipid profiles. We report on medium-term lipid profiles (up to 5 years) for antiretroviral-naive patients starting NNRTI- and PI-based HAART in the Swiss HIV Cohort Study. METHODS: Since April 2000, blood samples taken at visits scheduled every 6 months have been analysed for cholesterol and triglyceride concentrations. For 1065 antiretroviral-naive patients starting HAART after April 2000, we estimated changes in concentration over time using multivariate linear regression, with adjustment for baseline covariates, use of lipid-lowering drugs and whether the sample was taken in a fasting state. RESULTS: Non-HDL (high-density lipoprotein) cholesterol levels increase with increasing exposure to either PI- or NNRTI-based therapy; HDL cholesterol levels increase and triglyceride levels decrease with increasing exposure to NNRTI-based therapy, whereas triglyceride levels increase with increasing exposure to PI-based therapy. Between NNRTI-based therapies, there is a slight difference in triglyceride levels, which tend to increase with increasing exposure to efavirenz and to decrease with increasing exposure to nevirapine. Of the three common PI-based therapies, nelfinavir appears to have a relatively favourable lipid profile, with little change with increasing exposure. Of the other two PI therapies, lopinavir with ritonavir has a more favourable profile than indinavir with ritonavir, with smaller increases in both non-HDL cholesterol and triglycerides and an increase in HDL cholesterol. Increasing exposure to abacavir is associated with a decrease in triglyceride levels. CONCLUSION: In general, NNRTI-based therapy is associated with a more favourable lipid profile than PI-based therapy, but different PI-based therapies are associated with very different lipid profiles. Nelfinavir appears to have a relatively favourable lipid profile. Of the two boosted PI therapies, lopinavir appears to have a more favourable lipid profile than indinavir.
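A hedged sketch of the kind of adjusted linear model described. It is cross-sectional and on synthetic data, whereas the study modelled repeated measures over time; all variable names and coefficients are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
n = 500
df = pd.DataFrame({
    "years_on_pi": rng.uniform(0, 5, n),       # cumulative exposure to PI therapy
    "years_on_nnrti": rng.uniform(0, 5, n),    # cumulative exposure to NNRTI therapy
    "age": rng.normal(40, 10, n),
    "fasting": rng.integers(0, 2, n),          # fasting state at sampling
})
df["triglycerides"] = (1.5 + 0.25 * df["years_on_pi"] - 0.10 * df["years_on_nnrti"]
                       + 0.01 * df["age"] + rng.normal(0, 0.5, n))

fit = smf.ols("triglycerides ~ years_on_pi + years_on_nnrti + age + fasting",
              data=df).fit()
print(fit.params)   # opposite signs for PI vs NNRTI exposure, as in the findings
```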

Relevance: 30.00%

Abstract:

BACKGROUND: Body fat changes are common in patients with HIV. For patients on protease inhibitor (PI)-based highly active antiretroviral therapy (HAART), these changes have been associated with increasing exposure to therapy in general and to stavudine in particular. Our objective is to assess whether such associations are more or less likely for patients on non-nucleoside reverse transcriptase inhibitor (NNRTI)-based HAART. METHODS: We included all antiretroviral-naive patients in the Swiss HIV Cohort Study starting HAART after April 2000 who had had body weight, CD4 cell count and plasma HIV RNA measured between 6 months before and 3 months after starting HAART, and at least one assessment of body fat changes after starting HAART. At visits scheduled every 6 months, fat loss or fat gain is recorded by agreement between patient and physician. We estimate the association between reported body fat changes and both time on therapy and time on stavudine, using conditional logistic regression. RESULTS: Body fat changes were reported for 85 (9%) out of 925 patients at their first assessment; a further 165 had only one assessment. Of the remaining 675 patients, body fat changes were reported for 156 patients, at a rate of 13.2 changes per 100 patient-years. Body fat changes are more likely with increasing age [odds ratio (OR) 1.18 (1.00-1.38) per 10 years], with increasing BMI [OR 1.06 (1.01-1.11)] and in those with a lower baseline CD4 cell count [OR 0.91 (0.83-1.01) per 100 cells/microl]. There is only weak evidence that body fat changes are more likely with increasing time on HAART [OR 1.16 (0.93-1.46)]. After adjusting for time on HAART, fat loss is more likely with increasing stavudine use [OR 1.70 (1.34-2.15)]. There is no evidence of an association between reported fat changes and time on NNRTI therapy relative to PI therapy in those patients who used either one therapy or the other [OR 0.98 (0.56-1.63)]. CONCLUSION: Fat loss is more likely to be reported with increasing exposure to stavudine. We find no evidence of major differences between PI and NNRTI therapy in the risk of reported body fat changes.
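Conditional logistic regression of grouped binary outcomes can be sketched with the ConditionalLogit class from statsmodels; the grouping, covariates and coefficients below are simulated stand-ins for the study's repeated within-patient assessments.

```python
import numpy as np
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(5)
n_groups, per = 100, 4
groups = np.repeat(np.arange(n_groups), per)             # e.g. one group per patient
x = rng.normal(size=(n_groups * per, 2))                 # e.g. time on HAART, time on stavudine
alpha = np.repeat(rng.normal(size=n_groups), per)        # group-specific intercepts
p = 1 / (1 + np.exp(-(alpha + x @ np.array([0.15, 0.5]))))
y = rng.binomial(1, p)

res = ConditionalLogit(y, x, groups=groups).fit()
print(res.params)   # the group intercepts are conditioned out of the likelihood
```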

Relevance: 30.00%

Abstract:

Anthropogenic activities continue to drive atmospheric CO2 and O3 concentrations to levels higher than during the pre-industrial era. Accumulating evidence indicates that both elevated CO2 and elevated O3 could modify the quantity and biochemistry of woody plant biomass. Anatomical properties of woody plants are largely influenced by the activity of the cambium and the growth characteristics of wood cells, which are in turn influenced by a range of environmental factors. Hence, alterations in the concentrations of atmospheric CO2 and/or O3 could also affect wood anatomical properties. Many fungi derive their metabolic resources for growth from plant litter, including woody tissue; therefore, modifications in the quantity, biochemistry and anatomical properties of woody plants in response to elevated CO2 and/or O3 could affect the community of wood-decaying fungi and rates of wood decomposition. Consequently, carbon and nutrient cycling and the productivity of terrestrial ecosystems could also be affected. Alterations in the structure and biochemistry of woody plants could also affect wood density and, subsequently, wood quality. This dissertation examined the long-term effects of elevated CO2 and/or O3 on the wood anatomical properties, wood density, wood-decaying fungi and wood decomposition of northern hardwood tree species at the Aspen Free-Air CO2 and O3 Enrichment (Aspen FACE) project near Rhinelander, WI, USA. Anatomical properties of wood varied significantly with species, aspen genotype and radial position within the stem. Elevated CO2 did not have significant effects on wood anatomical properties in trembling aspen, paper birch or sugar maple, except for marginally increasing (P < 0.1) the number of vessels per square millimeter. Elevated O3 marginally or significantly altered vessel lumen diameter, cell wall area and vessel lumen area proportions, depending on species and radial position. In line with the modifications in anatomical properties, elevated CO2 and O3 alone significantly modified wood density, but the effects were species- and/or genotype-specific. However, the effects of elevated CO2 and O3 alone on wood anatomical properties and density were ameliorated when the two were combined. Tree species had a much greater impact on the wood-decaying fungal community and the initial wood decomposition rate than did growth or decomposition of wood in elevated CO2 and/or O3. Polyporales, Agaricales and Russulales were the dominant orders of fungi isolated. Based on the current results, future higher levels of CO2 and O3 may have moderate effects on the wood quality of northern hardwoods, but for utilization purposes these may not be considered significant. However, the wood-decaying fungal community composition and the decomposition of northern hardwoods may be altered via shifts in species and/or genotype composition under future higher levels of CO2 and O3.

Relevance: 30.00%

Abstract:

Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute largely to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual-tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by the measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models were fitted for each species: one ignoring the measurement error (the "naïve" approach), and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where the measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make models less generalizable. The evaluation of the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, under different spatial patterns and diameter distributions revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that were clustered. Results show a systematic bias even when all the assumptions made by the authors are satisfied. I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of the variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with the traditional logistic regression model as a tool to predict individual-tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and living trees for all species. In conclusion, I show that the proposed techniques do increase the accuracy of individual-tree mortality models and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.
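A minimal sketch of the RC idea in a logistic mortality model: replace the error-prone competition variable with its calibrated expectation before fitting. The error variance is assumed known here, whereas the thesis evaluates the Stage and Wykoff (1998) estimator of it; all data are simulated.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 2000
comp = rng.gamma(2.0, 1.0, n)                    # true competition index
w = comp + rng.normal(0, 0.6, n)                 # error-prone measurement
p = 1 / (1 + np.exp(-(-2.0 + 0.8 * comp)))       # true mortality model
dead = rng.binomial(1, p)

# Naive fit: ignore the measurement error
naive = LogisticRegression().fit(w[:, None], dead)

# RC fit: substitute the calibrated value E[comp | w]
var_u = 0.6 ** 2                                 # assumed-known error variance
mu, var_w = w.mean(), w.var()
comp_hat = mu + (1 - var_u / var_w) * (w - mu)   # linear RC (exact under joint normality)
rc = LogisticRegression().fit(comp_hat[:, None], dead)
print(naive.coef_, rc.coef_)                     # the RC slope is less attenuated
```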

Relevance: 30.00%

Abstract:

BACKGROUND: The aim of this study was to determine the performance of a new, objective, 3D-monitor-based stereotest in children under the age of four. METHODS: Random-dot circles (diameter 10 cm, crossed disparity of 0.34 degrees) randomly changing their position were presented on a 3D monitor while eye movements were monitored by infrared photo-oculography. If ≥3 consecutive stimuli were seen, a positive response was assumed. One hundred thirty-four normal children aged 2 months to 4 years (average 17 ± 15.3 months) were examined. RESULTS: Below the age of 12 months, we were not able to obtain a response to the 3D stimulus. For older children, the following rates of positive responses were found: 12-18 months, 25%; 18-24 months, 10%; 24-30 months, 16%; 30-36 months, 57%; 36-42 months, 100%; and 42-48 months, 91%. Multiple logistic regression showed a significant influence on stimulus recognition of the explanatory variables age (p<0.00001) and child cooperation (p<0.001), but not of gender (p>0.1). CONCLUSIONS: This 3D-monitor-based stereotest allows an objective measurement of random-dot stereopsis in younger children. It might open new ways to screen children for visual abnormalities and to study the development of stereovision. However, the current experimental setting does not allow determination of random-dot stereopsis in children younger than 12 months.

Relevance: 30.00%

Abstract:

This thesis develops high-performance real-time signal processing modules for direction of arrival (DOA) estimation for localization systems. It proposes highly parallel algorithms for subspace decomposition and polynomial rooting, which are traditionally implemented as sequential algorithms. The proposed algorithms address the emerging need for real-time localization in a wide range of applications. As the antenna array size increases, so does the complexity of the signal processing algorithms, making it increasingly difficult to satisfy real-time constraints. This thesis addresses real-time implementation by proposing parallel algorithms that offer considerable improvement over traditional algorithms, especially for systems with a larger number of antenna array elements. Singular value decomposition (SVD) and polynomial rooting are the two computationally complex steps and act as the bottlenecks to achieving real-time performance. The proposed algorithms are suitable for implementation on field-programmable gate arrays (FPGAs), single-instruction multiple-data (SIMD) hardware or application-specific integrated circuits (ASICs), which offer large numbers of processing elements that can be exploited for parallel processing. The designs proposed in this thesis are modular, easily expandable and easy to implement. First, the thesis proposes a fast-converging SVD algorithm. The proposed method reduces the number of iterations needed to converge to the correct singular values, thus coming closer to real-time performance. A general algorithm and a modular system design are provided, making it easy for designers to replicate and extend the design to larger matrix sizes. Moreover, the method is highly parallel, which can be exploited on the various hardware platforms mentioned earlier. A fixed-point implementation of the proposed SVD algorithm is presented. The FPGA design is pipelined to the maximum extent to increase the maximum achievable operating frequency. The system was developed with the objective of achieving high throughput, and the various modern cores available in FPGAs were used to maximize performance; these modules are described in detail. Finally, a parallel polynomial rooting technique based on Newton's method, applicable exclusively to root-MUSIC polynomials, is proposed. Unique characteristics of the root-MUSIC polynomial's complex dynamics were exploited to derive this rooting method. The technique exhibits parallelism and converges to the desired roots within a fixed number of iterations, making it suitable for rooting polynomials of large degree. We believe this is the first time that the complex dynamics of the root-MUSIC polynomial have been analyzed to propose an algorithm. In all, the thesis addresses two major bottlenecks in a direction of arrival estimation system by providing simple, high-throughput, parallel algorithms.
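The rooting step can be sketched with Newton's method applied independently to each starting guess, which is what makes it parallelizable across processing elements. The polynomial and guesses below are illustrative, not a root-MUSIC polynomial from real array data (whose signal roots would lie near the unit circle, motivating guesses placed on a circle).

```python
import numpy as np

def newton_root(coeffs, z0, iters=20):
    """Refine one root of the polynomial `coeffs` (highest degree first)
    from the complex starting guess z0."""
    dcoeffs = np.polyder(coeffs)
    z = z0
    for _ in range(iters):
        z = z - np.polyval(coeffs, z) / np.polyval(dcoeffs, z)
    return z

coeffs = np.array([1.0, -0.5, 1.0, -0.5])          # roots: 0.5 and +/- i
guesses = 1.1 * np.exp(2j * np.pi * (np.arange(8) + 0.5) / 8)
roots = [newton_root(coeffs, g) for g in guesses]  # each guess refined independently
print(np.round(roots, 4))
```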

Relevance: 30.00%

Abstract:

Traditional transportation fuel, petroleum, is limited and nonrenewable, and it also causes pollution. Hydrogen is considered one of the best alternative fuels for transportation. The key issue for using hydrogen as a transportation fuel is hydrogen storage. Lithium nitride (Li3N) is an important material that can be used for hydrogen storage. The decompositions of lithium amide (LiNH2) and lithium imide (Li2NH) are important steps for hydrogen storage in Li3N. The effect of anions (e.g. Cl-) on the decomposition of LiNH2 has never been studied. Li3N can react with LiBr to form lithium nitride bromide, Li13N4Br, which has been proposed as a solid electrolyte for batteries. The decompositions of LiNH2 and Li2NH, with and without promoter, were investigated using temperature-programmed decomposition (TPD) and X-ray diffraction (XRD) techniques. It was found that the decomposition of LiNH2 produced Li2NH and NH3 via two steps: LiNH2 into a stable intermediate species (Li1.5NH1.5), and then into Li2NH. The decomposition of Li2NH produced Li, N2 and H2 via two steps: Li2NH into an intermediate species, Li4NH, and then into Li. Kinetic analysis of the Li2NH decomposition showed that the activation energies are 533.6 kJ/mol for the first step and 754.2 kJ/mol for the second step. Furthermore, XRD demonstrated that the Li4NH generated in the decomposition of Li2NH formed a solid solution with Li2NH. In the solid solution, Li4NH possesses a cubic structure similar to that of Li2NH. The lattice parameter of the cubic Li4NH is 0.5033 nm. The decompositions of LiNH2 and Li2NH can be promoted by the chloride ion (Cl-). The introduction of Cl- into LiNH2 resulted in a new NH3 peak at a lower temperature of 250 °C, in addition to the original NH3 peak at 330 °C, in the TPD profiles. Furthermore, Cl- can decrease the decomposition temperature of Li2NH by about 110 °C. The degradation of Li3N was systematically investigated with XRD, Fourier transform infrared (FT-IR) spectroscopy and UV-visible spectroscopy. It was found that O2 does not affect Li3N at room temperature. However, H2O in air can cause the degradation of Li3N via the reaction of H2O with Li3N to form LiOH. The LiOH produced can further react with CO2 in air to form Li2CO3 at room temperature. Furthermore, it was revealed that α-Li3N is more stable in air than β-Li3N. The chemical stability of Li13N4Br in air was investigated by XRD, TPD-MS and UV-vis absorption as a function of time. The aging process ultimately degrades Li13N4Br into Li2CO3 and lithium bromite (LiBrO2), with the release of gaseous NH3. A reaction order of n = 2.43 gives the best fit for the degradation of Li13N4Br in air. The energy gap of Li13N4Br was calculated to be 2.61 eV.
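As an illustration of how such a reaction order can be estimated, the sketch below fits the rate constant k and order n of the nth-order rate law dα/dt = k(1−α)^n to synthetic degradation data; the time scale and noise level are assumptions, with the true order set near the reported 2.43.

```python
import numpy as np
from scipy.optimize import curve_fit

def nth_order(t, k, n):
    """Integrated nth-order rate law (n != 1): alpha(t) for d(alpha)/dt = k (1 - alpha)^n."""
    return 1.0 - (1.0 + (n - 1.0) * k * t) ** (-1.0 / (n - 1.0))

t = np.linspace(0, 100, 25)                          # aging time (arbitrary units)
alpha_obs = nth_order(t, 0.05, 2.43) + np.random.default_rng(7).normal(0, 0.01, t.size)

(k_fit, n_fit), _ = curve_fit(nth_order, t, alpha_obs, p0=(0.01, 2.0),
                              bounds=([1e-4, 1.1], [1.0, 5.0]))
print(f"k = {k_fit:.3f}, n = {n_fit:.2f}")           # recovers n close to 2.43
```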