922 results for truncated regression
Abstract:
We develop fast fitting methods for generalized functional linear models. An undersmooth of the functional predictor is obtained by projecting on a large number of smooth eigenvectors and the coefficient function is estimated using penalized spline regression. Our method can be applied to many functional data designs including functions measured with and without error, sparsely or densely sampled. The methods also extend to the case of multiple functional predictors or functional predictors with a natural multilevel structure. Our approach can be implemented using standard mixed effects software and is computationally fast. Our methodology is motivated by a diffusion tensor imaging (DTI) study. The aim of this study is to analyze differences between various cerebral white matter tract property measurements of multiple sclerosis (MS) patients and controls. While the statistical developments proposed here were motivated by the DTI study, the methodology is designed and presented in generality and is applicable to many other areas of scientific research. An online appendix provides R implementations of all simulations.
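A minimal sketch of the projection-then-penalize idea on simulated data. A ridge-type penalty on eigenvector scores stands in for the paper's penalized spline step, and the simulated curves, penalty level, and all variable names are illustrative assumptions (the abstract's own implementations are provided in R; this sketch is Python):

```python
import numpy as np

# Sketch: represent each curve by its scores on a large number of smooth
# eigenvectors (via SVD / functional PCA), then estimate the coefficient
# function with a ridge-type penalty standing in for the penalized spline step.
rng = np.random.default_rng(0)
n, p, k = 200, 100, 30                     # curves, grid points, eigenvectors kept

# Smooth random curves from an exponential covariance kernel (illustrative).
dist = np.abs(np.subtract.outer(np.arange(p), np.arange(p)))
X = rng.standard_normal((n, p)) @ np.linalg.cholesky(np.exp(-dist / 10.0)).T

beta_true = np.sin(np.linspace(0, np.pi, p))          # true coefficient function
y = X @ beta_true / p + rng.standard_normal(n) * 0.1  # scalar response

Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:k].T                     # projection on k smooth eigenvectors

lam = 1.0                                  # penalty level, chosen ad hoc here
coef = np.linalg.solve(scores.T @ scores + lam * np.eye(k),
                       scores.T @ (y - y.mean()))
beta_hat = Vt[:k].T @ coef                 # back-transform to a function estimate
print("corr(beta_hat, beta_true):", np.corrcoef(beta_hat, beta_true)[0, 1])
```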
Abstract:
BACKGROUND: Surfactant protein D (SP-D) deficient mice develop emphysema-like pathology associated with focal accumulations of foamy alveolar macrophages, an excess of surfactant phospholipids in the alveolar space, and both hypertrophy and hyperplasia of alveolar type II cells. These findings are associated with a chronic inflammatory state. Treatment of SP-D deficient mice with a truncated recombinant fragment of human SP-D (rfhSP-D) has been shown to decrease the lipidosis and alveolar macrophage accumulation as well as the production of proinflammatory chemokines. The aim of this study was to investigate whether rfhSP-D treatment reduces the structural abnormalities in parenchymal architecture and type II cells characteristic of SP-D deficiency. METHODS: SP-D knock-out mice aged 3, 6 and 9 weeks were treated with rfhSP-D for 9, 6 and 3 weeks, respectively. All mice were sacrificed at age 12 weeks and compared to both PBS-treated SP-D deficient and wild-type groups. Lung structure was quantified by design-based stereology at the light and electron microscopic level. Emphasis was put on quantification of emphysema, type II cell changes and intracellular surfactant. Data were analysed with the two-sided non-parametric Mann-Whitney U-test. MAIN RESULTS: After 3 weeks of treatment, alveolar number was higher and mean alveolar size was smaller compared to saline-treated SP-D knock-out controls. There was no significant difference in these indices of pulmonary emphysema among the rfhSP-D treated groups. Type II cell number and size were smaller as a consequence of treatment. The total volume of lamellar bodies per type II cell and per lung was smaller after 6 weeks of treatment. CONCLUSION: Treatment of SP-D deficient mice with rfhSP-D leads to a reduction in the degree of emphysema and a correction of type II cell hyperplasia and hypertrophy. This supports the concept that rfhSP-D might become a therapeutic option in diseases that are characterized by decreased SP-D levels in the lung.
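For reference, the two-sided Mann-Whitney U-test used for the group comparisons can be sketched as follows; the group values here are fabricated stand-ins, not study data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Illustrative only: two small groups standing in for rfhSP-D treated vs.
# saline-treated stereological estimates (e.g., alveolar number, in 10^6).
rfhspd_group = np.array([4.1, 4.5, 3.9, 4.8, 4.3])
saline_group = np.array([3.2, 3.0, 3.6, 2.9, 3.4])

stat, p_value = mannwhitneyu(rfhspd_group, saline_group, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.4f}")
```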
Abstract:
A combination of oral zidovudine (250 mg twice daily) and subcutaneous interferon-alpha (10 × 10^6 units daily) was evaluated for clinical, antiretroviral, and immunological efficacy and for side effects in 17 patients with AIDS-related Kaposi's sarcoma. Fifteen patients were evaluable. During the study period of 12 weeks, tumor responses were complete in two patients and partial in two patients (27% major response rate). Minimal responses were seen in two patients (40% overall response rate). An anti-HIV effect (reduction of serum p24 antigen by 70% or more) was observed in seven of ten evaluable patients who were initially antigenemic. CD4 lymphocyte counts remained unchanged. In six patients who had either a tumor response or a marked decline of HIV antigenemia, the treatment was continued between 12 and 59 weeks beyond the study period. Two of four patients with tumor regression at 12 weeks had an additional tumor response in this period despite prior dose reduction of interferon due to toxicity. Late progression of KS was eventually observed in four of six patients on prolonged treatment. The responsiveness of Kaposi's sarcoma seen in this study in patients with low CD4 counts and prior constitutional symptoms (fever, weight loss) was unexpected and needs further confirmation by larger patient groups. Dose-limiting toxicities were bone marrow depression (severe anemia in four and neutropenia with anemia in two patients), subjective adverse experiences (fever, fatigue, myalgia; four patients) and both (two patients). (ABSTRACT TRUNCATED AT 250 WORDS)
Abstract:
Background mortality is an essential component of any forest growth and yield model. Forecasts of mortality contribute largely to the variability and accuracy of model predictions at the tree, stand and forest level. In the present study, I implement and evaluate state-of-the-art techniques to increase the accuracy of individual tree mortality models, similar to those used in many of the current variants of the Forest Vegetation Simulator, using data from North Idaho and Montana. The first technique addresses methods to correct for bias induced by measurement error typically present in competition variables. The second implements survival regression and evaluates its performance against the traditional logistic regression approach. I selected the regression calibration (RC) algorithm as a good candidate for addressing the measurement error problem. Two logistic regression models were fitted for each species: one ignoring the measurement error (the “naïve” approach) and the other applying RC. The models fitted with RC outperformed the naïve models in terms of discrimination when the competition variable was found to be statistically significant. The effect of RC was more obvious where the measurement error variance was large and for more shade-intolerant species. The process of model fitting and variable selection revealed that past emphasis on DBH as a predictor variable for mortality, while producing models with strong metrics of fit, may make models less generalizable. Evaluating the error variance estimator developed by Stage and Wykoff (1998), which is core to the implementation of RC, across different spatial patterns and diameter distributions revealed that the Stage and Wykoff estimate notably overestimated the true variance in all simulated stands except those that were clustered. Results show a systematic bias even when all the assumptions made by the authors hold. I argue that this is the result of the Poisson-based estimate ignoring the overlapping area of potential plots around a tree. The effects of the variance estimate, especially in the application phase, justify future efforts to improve its accuracy. The second technique implemented and evaluated is a survival regression model that accounts for the time-dependent nature of variables, such as diameter and competition variables, and the interval-censored nature of data collected from remeasured plots. The performance of the model is compared with the traditional logistic regression model as a tool to predict individual tree mortality. Validation of both approaches shows that the survival regression approach discriminates better between dead and alive trees for all species. In conclusion, I showed that the proposed techniques do increase the accuracy of individual tree mortality models, and are a promising first step towards the next generation of background mortality models. I have also identified the next steps to undertake in order to advance mortality models further.
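A hedged sketch of the regression calibration (RC) idea on simulated data: the error-prone competition measurement W is replaced by an estimate of E[X | W] before the logistic mortality model is fitted. This is a generic illustration of RC, calibrated here from an assumed validation subsample, not the thesis's exact estimator:

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

# Simulate a mortality indicator driven by a true competition variable X,
# observed only through the error-prone measurement W.
rng = np.random.default_rng(1)
n = 5000
x = rng.normal(0.0, 1.0, n)                  # true competition variable
w = x + rng.normal(0.0, 0.8, n)              # error-prone measurement
p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.5 * x)))  # true mortality probabilities
dead = rng.binomial(1, p)

# Naive fit: ignore the measurement error.
naive = LogisticRegression().fit(w.reshape(-1, 1), dead)

# RC fit: estimate E[X | W] from a validation subsample where X is known,
# then refit the logistic model on the calibrated covariate.
val = rng.choice(n, size=500, replace=False)
calib = LinearRegression().fit(w[val].reshape(-1, 1), x[val])
x_hat = calib.predict(w.reshape(-1, 1))
rc = LogisticRegression().fit(x_hat.reshape(-1, 1), dead)

print("naive slope:", naive.coef_[0, 0], " RC slope:", rc.coef_[0, 0])
```

On data like these, the naive slope is attenuated toward zero by the measurement error, while the RC slope moves back toward the true coefficient (1.5 here).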
Abstract:
In this thesis, we consider Bayesian inference on the detection of variance change-point models with scale mixtures of normal (SMN) distributions. This class of distributions is symmetric and heavy-tailed and includes the Gaussian, Student-t, contaminated normal, and slash distributions as special cases. The proposed models provide greater flexibility for analyzing practical data, which often show heavy tails and may not satisfy the normality assumption. For the Bayesian analysis, we specify prior distributions for the unknown parameters of the variance change-point models with SMN distributions. Due to the complexity of the joint posterior distribution, we propose an efficient Gibbs-type sampling algorithm with Metropolis-Hastings steps for posterior Bayesian inference. Thereafter, following the idea of [1], we consider the problems of single and multiple change-point detection. The performance of the proposed procedures is illustrated and analyzed through simulation studies. A real application to closing price data from the U.S. stock market is analyzed for illustrative purposes.
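A minimal sketch of such a sampler for a single variance change point, restricted to the Gaussian special case of the SMN family, where both full conditionals are conjugate and no Metropolis-Hastings step is needed (the heavier-tailed members would add MH or augmentation updates for the mixing variables). Priors, data, and run length are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, tau_true = 200, 120
y = np.concatenate([rng.normal(0, 1.0, tau_true), rng.normal(0, 3.0, n - tau_true)])

a0, b0 = 2.0, 1.0                 # inverse-gamma prior on each regime variance
csum = np.cumsum(y ** 2)          # cumulative sums of squares for fast updates
tau, draws = n // 2, []
for _ in range(2000):
    # variances | tau: conjugate inverse-gamma draws (as reciprocals of gammas)
    s1 = 1.0 / rng.gamma(a0 + tau / 2, 1.0 / (b0 + 0.5 * csum[tau - 1]))
    s2 = 1.0 / rng.gamma(a0 + (n - tau) / 2,
                         1.0 / (b0 + 0.5 * (csum[-1] - csum[tau - 1])))
    # tau | variances: discrete full conditional over all admissible split points
    t = np.arange(1, n)
    logp = (-0.5 * t * np.log(s1) - 0.5 * csum[t - 1] / s1
            - 0.5 * (n - t) * np.log(s2) - 0.5 * (csum[-1] - csum[t - 1]) / s2)
    prob = np.exp(logp - logp.max())
    tau = int(rng.choice(t, p=prob / prob.sum()))
    draws.append(tau)

print("posterior mode of the change point:", np.bincount(draws[500:]).argmax())
```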
Abstract:
This morning Dr. Battle will introduce descriptive statistics and linear regression and show how to apply these concepts in mathematical modeling. You will also learn how to use a spreadsheet to help with statistical analysis and to create graphs.
Abstract:
OBJECTIVES: This paper is concerned with checking goodness-of-fit of binary logistic regression models. For practitioners of data analysis, the broad classes of procedures for checking goodness-of-fit available in the literature are described. The challenges of model checking in the context of binary logistic regression are reviewed. As a viable solution, a simple graphical procedure for checking goodness-of-fit is proposed. METHODS: The graphical procedure proposed relies on pieces of information available from any logistic analysis; the focus is on combining and presenting these in an informative way. RESULTS: The information gained using this approach is presented with three examples. In the discussion, the proposed method is put into context and compared with other graphical procedures for checking goodness-of-fit of binary logistic models available in the literature. CONCLUSION: A simple graphical method can significantly improve the understanding of any logistic regression analysis and help to prevent faulty conclusions.
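One common graphical goodness-of-fit display of this general kind, as a sketch: group observations by deciles of fitted probability and plot observed event fractions against mean fitted probabilities. This generic calibration plot is assumed for illustration and is not necessarily the specific display the paper proposes:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.linear_model import LogisticRegression

# Simulated binary outcome with a known logistic relationship (illustrative).
rng = np.random.default_rng(3)
x = rng.normal(size=(1000, 2))
p = 1 / (1 + np.exp(-(0.5 + x @ np.array([1.0, -0.7]))))
y = rng.binomial(1, p)

fit = LogisticRegression().fit(x, y)
phat = fit.predict_proba(x)[:, 1]

# Decile groups of fitted probability; compare observed vs. fitted per group.
edges = np.quantile(phat, np.linspace(0, 1, 11))
idx = np.clip(np.digitize(phat, edges[1:-1]), 0, 9)
obs = [y[idx == g].mean() for g in range(10)]
pred = [phat[idx == g].mean() for g in range(10)]

plt.plot(pred, obs, "o-", label="model")
plt.plot([0, 1], [0, 1], "--", label="perfect calibration")
plt.xlabel("mean fitted probability"); plt.ylabel("observed event fraction")
plt.legend(); plt.show()
```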
Abstract:
A combinatorial protocol (CP) is introduced here and interfaced with multiple linear regression (MLR) for variable selection. The efficiency of CP-MLR rests primarily on restricting the entry of correlated variables into the model development stage. It has been used to analyze the Selwood et al. data set [16], and the resulting models are compared with those reported from the GFA [8] and MUSEUM [9] approaches. For this data set, CP-MLR identified three highly independent models (27, 28 and 31) with Q² values in the range of 0.632-0.518; these models are divergent and unique. Even though the present study does not share any models with the GFA [8] and MUSEUM [9] results, several descriptors are common to all these studies, including the present one. A simulation is also carried out on the same data set to explain model formation in CP-MLR. The results demonstrate that the proposed method should be able to offer solutions for data sets with 50 to 60 descriptors in a reasonable time frame. By carefully selecting the inter-parameter correlation cutoff values in CP-MLR, one can identify divergent models and handle data sets larger than the present one without excessive computer time.
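A sketch of the core CP-MLR restriction on made-up data: descriptor subsets are admitted to MLR fitting only if no pair of descriptors exceeds a correlation cutoff, and candidate models are ranked by leave-one-out Q². The exhaustive two-descriptor search below is a simplified stand-in for the full combinatorial protocol:

```python
import numpy as np
from itertools import combinations

# Fabricated descriptor matrix and response (not the Selwood et al. data).
rng = np.random.default_rng(4)
n, m = 30, 10
X = rng.normal(size=(n, m))
y = X[:, 0] - 0.8 * X[:, 3] + rng.normal(0, 0.5, n)

cutoff = 0.5                               # inter-parameter correlation cutoff
R = np.corrcoef(X, rowvar=False)

def q2_loo(cols):
    """Leave-one-out cross-validated Q^2 for an OLS model on the given columns."""
    press = 0.0
    for i in range(n):
        tr = np.delete(np.arange(n), i)
        A = np.column_stack([np.ones(len(tr)), X[tr][:, cols]])
        b, *_ = np.linalg.lstsq(A, y[tr], rcond=None)
        press += (y[i] - np.concatenate([[1.0], X[i, cols]]) @ b) ** 2
    return 1.0 - press / np.sum((y - y.mean()) ** 2)

ranked = []
for cols in combinations(range(m), 2):
    if abs(R[cols[0], cols[1]]) >= cutoff:  # correlated pair: barred from entry
        continue
    ranked.append((q2_loo(list(cols)), cols))
print(sorted(ranked, reverse=True)[:3])     # top candidate models by Q^2
```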
Abstract:
Sequential studies of osteopenic bone disease in small animals require the availability of non-invasive, accurate and precise methods to assess bone mineral content (BMC) and bone mineral density (BMD). Dual-energy X-ray absorptiometry (DXA), which is currently used in humans for this purpose, can also be applied to small animals by means of adapted software. Precision and accuracy of DXA was evaluated in 10 rats weighing 50-265 g. The rats were anesthetized with a mixture of ketamine-xylazine administered intraperitoneally. Each rat was scanned six times consecutively in the antero-posterior incidence after repositioning, using the rat whole-body software for determination of whole-body BMC and BMD (Hologic QDR 1000, software version 5.52). Scan duration was 10-20 min depending on rat size. After the last measurement, rats were sacrificed and soft tissues were removed by dermestid beetles. Skeletons were then scanned in vitro (ultra high resolution software, version 4.47). Bones were subsequently ashed and dissolved in hydrochloric acid and total body calcium directly assayed by atomic absorption spectrophotometry (TBCa[chem]). Total body calcium was also calculated from the DXA whole-body in vivo measurement (TBCa[DXA]) and from the ultra high resolution measurement (TBCa[UH]) under the assumption that calcium accounts for 40.5% of the BMC expressed as hydroxyapatite. Precision error for whole-body BMC and BMD (mean +/- S.D.) was 1.3% and 1.5%, respectively. Simple regression analysis between TBCa[DXA] or TBCa[UH] and TBCa[chem] revealed tight correlations (r = 0.991 and 0.996, respectively), with slopes and intercepts significantly different from 1 and 0, respectively. (ABSTRACT TRUNCATED AT 250 WORDS)
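An illustrative version of the regression check reported above: regress the chemically assayed total body calcium on the DXA-derived value and test whether the slope and intercept differ from 1 and 0. All numbers below are fabricated stand-ins for the rat data:

```python
import numpy as np
from scipy import stats

# Fabricated paired measurements (grams of calcium), not the study's data.
tbca_dxa = np.array([1.8, 2.1, 2.6, 3.0, 3.3, 3.9, 4.2, 4.8, 5.1, 5.6])
tbca_chem = 0.15 + 1.08 * tbca_dxa + np.random.default_rng(5).normal(0, 0.05, 10)

res = stats.linregress(tbca_dxa, tbca_chem)
t_slope = (res.slope - 1.0) / res.stderr        # H0: slope = 1
t_inter = res.intercept / res.intercept_stderr  # H0: intercept = 0
df = len(tbca_dxa) - 2
print(f"r = {res.rvalue:.3f}")
print(f"slope p = {2 * stats.t.sf(abs(t_slope), df):.4f}, "
      f"intercept p = {2 * stats.t.sf(abs(t_inter), df):.4f}")
```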
Abstract:
Truncated distributions of the exponential family have great influence in simulation models. This paper discusses the truncated Weibull distribution specifically. The truncated distribution is fitted by the maximum likelihood estimation method, either alone or combined with the expectation and variance expressions. After the distribution is fitted, goodness-of-fit tests (the chi-square test and the Kolmogorov-Smirnov test) are run to rule out rejected hypotheses. Finally, the distributions are integrated into various simulation models, e.g. a shipment consolidation model, to compare the influence of the truncated and original versions of the Weibull distribution on the model.
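A sketch of the fit-then-test workflow the abstract describes, assuming a Weibull truncated below at a known point: maximum likelihood estimation of the truncated density followed by a Kolmogorov-Smirnov check. The truncation point, simulated data, and starting values are illustrative:

```python
import numpy as np
from scipy import stats, optimize

# Simulate a left-truncated Weibull sample (illustrative parameters).
rng = np.random.default_rng(6)
t0 = 1.0                                               # known truncation point
raw = stats.weibull_min.rvs(c=1.8, scale=3.0, size=5000, random_state=rng)
data = raw[raw >= t0]                                  # observed truncated sample

def nll(params):
    """Negative log-likelihood of a Weibull truncated below at t0."""
    c, scale = params
    if c <= 0 or scale <= 0:
        return np.inf
    logpdf = stats.weibull_min.logpdf(data, c, scale=scale)
    logsurv = stats.weibull_min.logsf(t0, c, scale=scale)  # renormalization
    return -np.sum(logpdf - logsurv)

res = optimize.minimize(nll, x0=[1.0, 2.0], method="Nelder-Mead")
c_hat, scale_hat = res.x

def trunc_cdf(x):
    """CDF of the fitted Weibull conditional on exceeding t0."""
    sf0 = stats.weibull_min.sf(t0, c_hat, scale=scale_hat)
    return (stats.weibull_min.cdf(x, c_hat, scale=scale_hat)
            - stats.weibull_min.cdf(t0, c_hat, scale=scale_hat)) / sf0

# Note: the KS p-value is optimistic here because the parameters were
# estimated from the same sample being tested.
print(stats.kstest(data, trunc_cdf))
```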