953 results for Linear multivariate methods


Relevance: 30.00%

Abstract:

BACKGROUND & AIMS: Whether early parenteral lipids improve postnatal growth of preterm neonates remains unclear. We aimed to assess the effects of parenteral lipids on growth velocity in extremely-low-birth-weight infants. METHODS: This retrospective cohort study included 121 extremely-low-birth-weight infants. The associations between parenteral lipids (cumulative intakes during the first week and delays in their introduction) and growth velocities (weight, head circumference and length) up to 28 days of life and to 36 weeks of corrected age were analysed using uni- and multivariate linear regression. RESULTS: Univariate analyses showed a significant positive association between the cumulative intakes of parenteral lipids during the first week and i) weight gain up to day 28; ii) weight gain up to 36 weeks of corrected age; iii) head circumference growth up to day 28. There was a negative correlation between the delay in parenteral lipid introduction and weight gain up to day 28. In multivariate analyses, the association between the cumulative intakes of parenteral lipids and weight gain up to 28 days was independent of gestational age at birth, birth weight, sex, smallness for gestational age, and enteral intakes (regression coefficient: 0.19; 95% CI: 0.01-0.38) and, up to 36 weeks, independent of gestational age, birth weight, sex, smallness for gestational age and parenteral glucose and amino acids (0.16; 95% CI: 0.04-0.27). CONCLUSIONS: Parenteral lipids during the first week were positively associated with weight gain in extremely-low-birth-weight infants and could improve early nutritional support of preterm neonates.
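
A minimal sketch of the kind of multivariable linear model described above, assuming a hypothetical data file and hypothetical column names; it is not the authors' analysis code:

```python
# Hedged sketch: multivariable linear regression with statsmodels.
# The CSV file and every column name below are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("elbw_cohort.csv")  # hypothetical dataset of ELBW infants

# Weight gain up to day 28 regressed on cumulative parenteral lipid intake,
# adjusted for the covariates listed in the abstract.
model = smf.ols(
    "weight_gain_d28 ~ lipid_week1 + gest_age + birth_weight + sex"
    " + small_for_ga + enteral_intake",
    data=df,
).fit()

# The coefficient of lipid_week1 plays the role of the reported
# regression coefficient (0.19; 95% CI 0.01-0.38).
print(model.params["lipid_week1"])
print(model.conf_int().loc["lipid_week1"])
```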

Relevance: 30.00%

Abstract:

The method of instrumental variables (referred to as Mendelian randomization when the instrument is a genetic variant) was initially developed to infer a causal effect of a risk factor on an outcome of interest in a linear model. Adapting this method to nonlinear models, however, is known to be problematic. In this paper, we consider the simple case in which the genetic instrument, the risk factor, and the outcome are all binary. We compare via simulations the usual two-stage estimate of a causal odds-ratio and its adjusted version with a recently proposed estimate developed in the context of a clinical trial with noncompliance. In contrast to the former two, we confirm that the latter is (under some conditions) a valid estimate of a causal odds-ratio defined in the subpopulation of compliers, and we propose its use in the context of Mendelian randomization. By analogy with a clinical trial with noncompliance, compliers are those individuals for whom the presence/absence of the risk factor X is determined by the presence/absence of the genetic variant Z (i.e., for whom we would observe X = Z whatever the alleles randomly received at conception). We also recall and illustrate the huge variability of instrumental variable estimates when the instrument is weak (i.e., when the percentage of compliers is low, as is typically the case with genetic instruments, for which this proportion is frequently smaller than 10%): in our simulations, the inter-quartile range of the estimates was up to 18 times larger than with a conventional (e.g., intention-to-treat) approach. We thus conclude that the need to find stronger instruments is probably as important as the need to develop a methodology that allows consistent estimation of a causal odds-ratio.
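
A toy simulation, under illustrative parameter values not taken from the paper, of the two-stage instrumental-variable estimate with a binary instrument, risk factor and outcome and a weak instrument (about 10% compliers):

```python
# Hedged sketch: two-stage IV estimate of a causal odds-ratio with binary
# Z, X and Y. All parameter values are illustrative, not from the paper.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 50_000

z = rng.binomial(1, 0.3, n)                  # genetic instrument
complier = rng.binomial(1, 0.1, n)           # weak instrument: ~10% compliers
x = np.where(complier == 1, z, rng.binomial(1, 0.5, n))  # binary risk factor
logit_p = -1.0 + 0.7 * x                     # assumed effect of X on Y
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

# Stage 1: predict X from Z; stage 2: logistic regression of Y on fitted X.
x_hat = sm.OLS(x, sm.add_constant(z)).fit().fittedvalues
stage2 = sm.Logit(y, sm.add_constant(x_hat)).fit(disp=0)
print("two-stage causal odds-ratio estimate:", np.exp(stage2.params[1]))
```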

Relevance: 30.00%

Abstract:

We present MBIS (Multivariate Bayesian Image Segmentation tool), a clustering tool based on the mixture of multivariate normal distributions model. MBIS supports multichannel bias field correction based on a B-spline model. A second methodological novelty is the inclusion of graph-cuts optimization for the stationary anisotropic hidden Markov random field model. Along with MBIS, we release an evaluation framework that contains three different experiments on multi-site data. We first validate the accuracy of segmentation and the estimated bias field for each channel. MBIS outperforms a widely used segmentation tool in a cross-comparison evaluation. The second experiment demonstrates the robustness of results on atlas-free segmentation of two image sets from scan-rescan protocols on 21 healthy subjects. Multivariate segmentation is more replicable than the monospectral counterpart on T1-weighted images. Finally, we provide a third experiment to illustrate how MBIS can be used in a large-scale study of tissue volume change with increasing age in 584 healthy subjects. This last result is meaningful as multivariate segmentation performs robustly without the need for prior knowledge.
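
A minimal sketch of the core mixture-of-multivariate-normals clustering step using scikit-learn on placeholder multichannel intensities; MBIS itself additionally performs B-spline bias-field correction and graph-cuts optimization of the MRF model, which are not shown here:

```python
# Hedged sketch: clustering multichannel voxel intensities with a mixture of
# multivariate normal distributions. This illustrates only the mixture model;
# bias-field correction and the graph-cuts MRF step of MBIS are omitted.
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical data: N voxels x C channels (e.g., T1w, T2w, PD intensities).
rng = np.random.default_rng(0)
voxels = rng.random((10_000, 3))

gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0)
labels = gmm.fit_predict(voxels)        # hard tissue label per voxel
posteriors = gmm.predict_proba(voxels)  # soft tissue memberships
```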

Relevance: 30.00%

Abstract:

OBJECTIVE: The Healthy Heart Kit (HHK) is a risk management and patient education kit for the prevention of cardiovascular disease (CVD) and the promotion of cardiovascular health. There are currently no published data examining predictors of HHK use by physicians. The main objective of this study was to examine the association between physicians' characteristics (socio-demographic, cognitive, and behavioural) and use of the HHK. METHODS: All registered family physicians in Alberta (n=3068) were invited to participate in the "Healthy Heart Kit" Study. Consenting physicians (n=153) received the Kit and were asked to use it for two months. At the end of this period, a questionnaire collected data on the frequency of Kit use as well as on socio-demographic, cognitive, and behavioural variables pertaining to the physicians. RESULTS: The questionnaire was returned by 115 physicians (follow-up rate = 75%). On a scale ranging from 0 to 100, the mean score of Kit use was 61 [SD=26]. A multiple linear regression showed that "agreement with the Kit" and the degree of "confidence in using the Kit" were strongly associated with Kit use, together explaining 46% of the variability in Kit use. Time since graduation was inversely associated with Kit use, and a trend was observed for smaller practices to be associated with lower use. CONCLUSION: Given these findings, future research and practice should explore innovative strategies to gain initial agreement among physicians to employ such clinical tools. Participation of older physicians and solo practitioners in this process should be emphasized.
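
A short sketch, with hypothetical variable names and simulated data, of how the "explaining 46% of the variability" statement corresponds to the R-squared of a multiple linear regression of Kit use on the two cognitive predictors:

```python
# Hedged sketch: R-squared of a multiple linear regression. Variable names
# and data are hypothetical, not from the study.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
agreement = rng.random(115) * 100
confidence = rng.random(115) * 100
kit_use = 0.4 * agreement + 0.3 * confidence + rng.normal(0, 15, 115)

X = np.column_stack([agreement, confidence])
model = LinearRegression().fit(X, kit_use)
print("variance explained (R^2):", round(model.score(X, kit_use), 2))
```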

Relevance: 30.00%

Abstract:

PURPOSE: The longitudinal relaxation rate (R1) measured in vivo depends on the local microstructural properties of the tissue, such as macromolecular, iron, and water content. Here, we use whole-brain multiparametric in vivo data and a general linear relaxometry model to describe the dependence of R1 on these components. We explore a) the validity of having a single fixed set of model coefficients for the whole brain and b) the stability of the model coefficients in a large cohort. METHODS: Maps of magnetization transfer (MT) and the effective transverse relaxation rate (R2*) were used as surrogates for macromolecular and iron content, respectively. Spatial variations in these parameters reflected variations in the underlying tissue microstructure. A linear model was applied to the whole brain, including gray/white matter and deep brain structures, to determine the global model coefficients. Synthetic R1 values were then calculated using these coefficients and compared with the measured R1 maps. RESULTS: The model's validity was demonstrated by the correspondence between the synthetic and measured R1 values and by the high stability of the model coefficients across a large cohort. CONCLUSION: A single set of global coefficients can be used to relate R1, MT, and R2* across the whole brain. Our population study demonstrates the robustness and stability of the model. Magn Reson Med 73:1309-1314, 2015. © 2014 The Authors. Magnetic Resonance in Medicine published by Wiley Periodicals, Inc.
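
A minimal sketch of the general linear relaxometry model, fitting R1 as a linear combination of MT and R2* across voxels and then synthesising R1 from the global coefficients; the arrays are random placeholders for the co-registered parameter maps:

```python
# Hedged sketch: fit R1 ~ b0 + b_MT * MT + b_R2s * R2star by least squares
# across brain voxels, then synthesise R1 from the fitted global coefficients.
# Arrays are random placeholders for flattened, brain-masked parameter maps.
import numpy as np

rng = np.random.default_rng(0)
n_vox = 100_000
MT = rng.random(n_vox)       # magnetization transfer map
R2star = rng.random(n_vox)   # effective transverse relaxation rate map
R1 = 0.3 + 0.8 * MT + 0.5 * R2star + rng.normal(0, 0.02, n_vox)  # measured R1

design = np.column_stack([np.ones(n_vox), MT, R2star])
coef, *_ = np.linalg.lstsq(design, R1, rcond=None)  # global model coefficients

R1_synthetic = design @ coef   # compare against the measured R1 map
```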

Relevance: 30.00%

Abstract:

Objective: To report a single-center experience treating patients with squamous-cell carcinoma of the anal canal using helical tomotherapy (HT) and concurrent chemotherapy (CT). Materials/Methods: From October 2007 to February 2011, 55 patients were treated with HT and concurrent CT (5-fluorouracil/capecitabine and mitomycin) for anal squamous-cell carcinoma. All patients underwent computed-tomography-based treatment planning, with pelvic and inguinal nodes receiving 36 Gy in 1.8 Gy/fraction. Following a planned 1-week break, the primary tumor site and involved nodes were boosted to a total dose of 59.4 Gy in 1.8 Gy/fraction. Dose-volume histograms of several organs at risk (OAR: bladder, small intestine, rectum, femoral heads, penile bulb, external genitalia) were assessed in terms of conformal avoidance. All toxicity was scored according to CTCAE v.3.0. HT plans and treatment were implemented using the TomoTherapy, Inc. software and hardware. For dosimetric comparisons, 3D RT and/or IMRT plans were also computed for some of the patients using the CMS planning system, for treatment with 6-18 MV photons and/or electrons of suitable energies from a Siemens Primus linear accelerator equipped with a multileaf collimator. Locoregional control and survival curves were compared with the log-rank test, and multivariate analysis was performed with the Cox model. Results: With beam projection over a full 360 degrees, HT has an advantage over other RT techniques (3D or 5-field step-and-shoot IMRT). There is significant improvement over 3D or 5-field IMRT plans in terms of dose conformity around the PTV, and dose gradients are steeper outside the target volume, resulting in reduced doses to OARs. Using HT, acute toxicity was acceptable and seemed to be better than historical standards. Conclusions: Our results suggest that HT combined with concurrent CT for anal cancer is effective and tolerable. Compared to 3D RT or 5-field step-and-shoot IMRT, there is better conformity around the PTV and better OAR sparing.
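
A minimal sketch of the survival comparisons mentioned above (log-rank test and a multivariate Cox model) using the lifelines package; the data file and covariate names are hypothetical:

```python
# Hedged sketch: log-rank test and Cox proportional-hazards model with
# lifelines. The CSV file and all column names are hypothetical; covariates
# are assumed to be numerically coded.
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("anal_ca_cohort.csv")  # hypothetical follow-up data

# Log-rank comparison of locoregional control between two treatment groups.
a, b = df[df.group == "HT"], df[df.group == "3DRT"]
lr = logrank_test(a.time, b.time,
                  event_observed_A=a.event, event_observed_B=b.event)
print(lr.p_value)

# Multivariate Cox model on a few (hypothetical) covariates.
cph = CoxPHFitter()
cph.fit(df[["time", "event", "age", "stage", "group_ht"]],
        duration_col="time", event_col="event")
cph.print_summary()
```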

Relevance: 30.00%

Abstract:

Automatic environmental monitoring networks supported by wireless communication technologies now provide large and ever-increasing volumes of data. The use of this information in natural hazard research is an important issue. Particularly useful for risk assessment and decision making are spatial maps of hazard-related parameters produced from point observations and available auxiliary information. The purpose of this article is to present and explore appropriate tools to process large amounts of available data and produce predictions at fine spatial scales. These are the algorithms of machine learning, which are aimed at robust non-parametric modelling of non-linear dependencies from empirical data. The computational efficiency of these data-driven methods allows prediction maps to be produced in real time, which makes them superior to physical models for operational use in risk assessment and mitigation. This situation arises in particular in the spatial prediction of climatic variables (topo-climatic mapping). In the complex topography of mountainous regions, meteorological processes are strongly influenced by the relief. The article shows how these relations, possibly regionalized and non-linear, can be modelled from data using information from digital elevation models. The developed methodology is illustrated by mapping temperatures (including Föhn and temperature-inversion situations) from measurements taken by the Swiss meteorological monitoring network. The range of methods used in the study includes data-driven feature selection, support vector algorithms and artificial neural networks.
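
A minimal sketch, with placeholder station data and DEM-derived features, of a support-vector-regression topo-climatic interpolation of the kind described above:

```python
# Hedged sketch: support vector regression of station temperatures on terrain
# features derived from a digital elevation model. Feature names, station
# data and hyperparameters are placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
# One row per station: x, y, elevation, slope, curvature (hypothetical features).
X_stations = rng.random((110, 5))
t_stations = 15 - 6.5 * X_stations[:, 2] + rng.normal(0, 0.5, 110)  # toy lapse rate

model = make_pipeline(StandardScaler(), SVR(C=10.0, epsilon=0.1))
model.fit(X_stations, t_stations)

# Predict on a dense grid of DEM-derived features to obtain the temperature map.
X_grid = rng.random((10_000, 5))
t_map = model.predict(X_grid)
```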

Relevance: 30.00%

Abstract:

The objectives of this work were to estimate the genetic and phenotypic parameters and to predict the genetic and genotypic values of selection candidates obtained from intraspecific crosses in Panicum maximum, as well as the performance of the hybrid progeny of existing and projected crosses. Seventy-nine intraspecific hybrids obtained from artificial crosses among five apomictic and three sexual autotetraploid individuals were evaluated in a clonal test with two replications and ten plants per plot. Green matter yield, total and leaf dry matter yields and leaf percentage were evaluated in five cuts per year over three years. Genetic parameters were estimated, and breeding and genotypic values were predicted, using the restricted maximum likelihood/best linear unbiased prediction procedure (REML/BLUP). The dominance genetic variance was estimated by adjusting for the effect of full-sib families. Individual narrow-sense heritabilities (0.02-0.05), individual broad-sense heritabilities (0.14-0.20) and repeatabilities measured on an individual basis (0.15-0.21) were of low magnitude. Dominance effects for all evaluated traits indicated that breeding strategies that exploit heterosis must be adopted. An increase of less than 5% in repeatability was obtained for a three-year evaluation period, which may serve as the criterion for determining the maximum number of years of evaluation to adopt without compromising the gain per cycle of selection. The identification of hybrid candidates for future cultivars, and of those that can be incorporated into the breeding program, was based on the genotypic and breeding values, respectively. The prediction of the performance of the hybrid progeny, based on the breeding values of the progenitors, permitted identification of the best crosses and indicated the best parents to use in crosses.
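
A simplified sketch of the REML/BLUP idea using a family-level mixed model in statsmodels; a real genetic evaluation would use a pedigree-based model in specialised software, and all names and data here are placeholders:

```python
# Hedged sketch: variance components estimated by REML and random family
# effects recovered as BLUPs, on a simplified family model. Data and column
# names are placeholders, not the Panicum maximum trial.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_fam, n_per = 15, 20
df = pd.DataFrame({
    "family": np.repeat(np.arange(n_fam), n_per),
    "block": np.tile([1, 2], n_fam * n_per // 2),
})
fam_eff = rng.normal(0, 1.0, n_fam)
df["leaf_dm_yield"] = 10 + fam_eff[df["family"]] + rng.normal(0, 2.0, len(df))

model = smf.mixedlm("leaf_dm_yield ~ C(block)", df, groups=df["family"])
result = model.fit(reml=True)
print(result.cov_re)            # REML estimate of the family variance component
blups = result.random_effects   # predicted family effects (BLUPs)
```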

Relevance: 30.00%

Abstract:

Polynomial constraint solving plays a prominent role in several areas of hardware and software analysis and verification, e.g., termination proving, program invariant generation and hybrid system verification, to name a few. In this paper we propose a new method for solving non-linear constraints based on encoding the problem into an SMT problem over linear arithmetic only. Unlike other existing methods, our method focuses on proving satisfiability of the constraints rather than unsatisfiability, which is more relevant in several applications, as we illustrate with examples. Nevertheless, we also present new techniques, based on the analysis of unsatisfiable cores, that allow one to efficiently prove unsatisfiability as well for a broad class of problems. The power of our approach is demonstrated by means of extensive experiments comparing our prototype with state-of-the-art tools on benchmarks taken from both the academic and the industrial world.
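
A minimal illustration, using the z3 SMT solver rather than the authors' tool, of the general idea of turning a non-linear constraint into linear arithmetic by bounding one variable of each product to a finite domain and case-splitting:

```python
# Hedged sketch of the general linearization idea, not the paper's
# implementation: bound x to a finite domain and replace the product x*y by a
# fresh variable p constrained through linear case splits, then solve the
# resulting linear-arithmetic problem with z3.
from z3 import Int, Solver, And, Or, sat

x, y, p = Int("x"), Int("y"), Int("p")   # p stands for the product x*y
s = Solver()

# Finite domain for x, and linear case splits defining p = x*y.
s.add(x >= 0, x <= 3)
s.add(Or([And(x == k, p == k * y) for k in range(4)]))

# The original non-linear constraint x*y + x > 7 becomes linear in p and x.
s.add(p + x > 7, y >= 0, y <= 5)

if s.check() == sat:
    print(s.model())   # a satisfying assignment, if one exists
```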

Relevance: 30.00%

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer disease (AD) detection, and it focuses on evaluating the suitability of a new approach for early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects have been analyzed for feature selection: spontaneous speech and emotional response. Not only linear features but also non-linear ones, such as fractal dimension, have been explored. The approach is non-invasive, low-cost and free of side effects. The experimental results obtained were very satisfactory and promising for early diagnosis and classification of AD patients.
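
One example of the kind of non-linear feature mentioned above is the fractal dimension of a signal; the sketch below computes Katz's fractal dimension on a synthetic waveform (in the study, such features would be computed on recorded speech segments):

```python
# Hedged sketch: Katz's fractal dimension of a 1-D signal, one common
# non-linear feature. The waveform here is synthetic, not patient speech.
import numpy as np

def katz_fd(signal: np.ndarray) -> float:
    """Katz fractal dimension of a 1-D signal."""
    steps = np.abs(np.diff(signal))
    L = steps.sum()                          # total length of the curve
    d = np.abs(signal - signal[0]).max()     # maximum distance from first point
    n = len(steps)
    return np.log10(n) / (np.log10(n) + np.log10(d / L))

t = np.linspace(0, 1, 8000)
waveform = np.sin(2 * np.pi * 220 * t) \
    + 0.1 * np.random.default_rng(0).normal(size=t.size)
print(katz_fd(waveform))
```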

Relevance: 30.00%

Abstract:

The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, in the same way that spatial independence is used for source separation. In this paper we propose the use of a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified when the number of samples is small, where gradient-based methods fail because of the poor estimation of statistics.
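
A toy sketch of the approach: a genetic algorithm searches for an inverse FIR filter by maximising the absolute kurtosis of its output, one common way of exploiting the temporal independence of the source; population size, filter length and mutation scale are illustrative choices:

```python
# Hedged sketch: GA-based blind inversion of a linear channel by maximising
# the non-Gaussianity (absolute kurtosis) of the deconvolved output. The
# channel, source and GA settings are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

# Unknown i.i.d. source passed through an unknown FIR channel.
source = rng.choice([-1.0, 1.0], size=2000)
channel = np.array([1.0, 0.6, -0.3])
observed = np.convolve(source, channel, mode="full")[: len(source)]

def kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

def fitness(w):
    # Correct inversion restores a strongly non-Gaussian (here sub-Gaussian) output.
    deconvolved = np.convolve(observed, w, mode="full")[: len(observed)]
    return abs(kurtosis(deconvolved))

pop = rng.normal(size=(40, 8))                  # 40 candidate inverse filters, length 8
for _ in range(100):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-20:]]     # selection: keep the fittest half
    kids = parents.copy()                       # single-point crossover between pairs
    kids[::2, 4:], kids[1::2, 4:] = parents[1::2, 4:], parents[::2, 4:]
    kids += rng.normal(scale=0.05, size=kids.shape)   # Gaussian mutation
    pop = np.vstack([parents, kids])

best = pop[np.argmax([fitness(w) for w in pop])]     # estimated inverse filter
```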

Relevance: 30.00%

Abstract:

Artifacts are present in most electroencephalography (EEG) recordings, making the data difficult to interpret or analyze. In this paper, a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to the raw EEG data. A synchrony measure is then applied to the raw and the cleaned data in order to compare the improvement in classification rate. Two classifiers are used: linear discriminant analysis and neural networks. In both cases, the classification rate improves by about 20%.
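
A minimal sketch of the classification comparison, with placeholder synchrony features standing in for those computed on raw and cleaned EEG (the multivariate EMD cleaning step itself is not shown):

```python
# Hedged sketch: comparing linear discriminant analysis and a small neural
# network on raw-vs-cleaned feature sets. The features here are random
# placeholders; the multivariate EMD artifact removal is not implemented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X_raw = rng.random((200, 10))     # synchrony features from raw EEG (placeholder)
X_clean = rng.random((200, 10))   # same features after cleaning (placeholder)
y = rng.integers(0, 2, 200)

for name, X in [("raw", X_raw), ("clean", X_clean)]:
    for clf in (LinearDiscriminantAnalysis(),
                MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)):
        acc = cross_val_score(clf, X, y, cv=5).mean()
        print(name, type(clf).__name__, round(acc, 3))
```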

Relevance: 30.00%

Abstract:

A simple method using liquid chromatography-linear ion trap mass spectrometry for the simultaneous determination of testosterone glucuronide (TG), testosterone sulfate (TS), epitestosterone glucuronide (EG) and epitestosterone sulfate (ES) in urine samples was developed. For validation purposes, a urine containing no detectable amount of TG, TS and EG was selected and fortified with steroid conjugate standards. Quantification was performed using deuterated testosterone conjugates to correct for ion suppression/enhancement during ESI. Assay validation was performed in terms of lower limit of detection (1-3 ng/mL), recovery (89-101%), intraday precision (2.0-6.8%), interday precision (3.4-9.6%) and accuracy (101-103%). Application of the method to short-term stability testing of urine samples at temperatures ranging from 4 to 37 °C over one week of storage led to the conclusion that the addition of sodium azide (10 mg/mL) is required for preservation of the analytes.
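
A small worked sketch, with illustrative numbers not taken from the paper, of quantification against a deuterated internal standard: the analyte/internal-standard peak-area ratio is read against a linear calibration line to correct for ion suppression/enhancement:

```python
# Hedged sketch: internal-standard calibration and back-calculation of an
# unknown concentration. All peak areas and concentrations are illustrative;
# "d3-TG" is a hypothetical deuterated internal standard.
import numpy as np

# Calibration: known TG concentrations (ng/mL) vs. measured area ratios.
conc_std = np.array([1, 5, 10, 25, 50, 100], dtype=float)
ratio_std = np.array([0.04, 0.21, 0.41, 1.02, 2.05, 4.10])  # area(TG)/area(d3-TG)

slope, intercept = np.polyfit(conc_std, ratio_std, 1)   # linear calibration

# Unknown urine sample: measured peak-area ratio -> concentration.
ratio_sample = 0.83
conc_sample = (ratio_sample - intercept) / slope
print(f"TG in sample: {conc_sample:.1f} ng/mL")
```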

Relevance: 30.00%

Abstract:

The objectives of this work were to evaluate the genotype x environment (GxE) interaction for popcorn and to compare two multivariate analysis methods. Nine popcorn cultivars were sown on four dates one month apart in each of the agricultural years 1998/1999 and 1999/2000. The experiments were carried out in randomized block designs with four replicates. The cv. Zélia contributed the least to the GxE interaction. The cv. Viçosa performed similarly to cv. Rosa-claro. Optimal performance was obtained from cv. CMS 42 in a favorable mega-environment and from cv. CMS 43 in an unfavorable environment. The multivariate analyses supported the results from the method of Eberhart & Russell. The graphic analysis of the Additive Main effects and Multiplicative Interaction (AMMI) model was simple, allowing conclusions to be drawn about stability, genotypic performance, genetic divergence between cultivars, and the environments that optimize cultivar performance. The graphic analysis of the Genotype main effects and Genotype x Environment interaction (GGE) method added to the AMMI information on environmental stratification, defining mega-environments and the cultivars that optimized performance in those mega-environments. Both methods are adequate for explaining the genotype x environment interaction.
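
A minimal sketch of the core of an AMMI analysis: additive main effects are removed by double centring the genotype x environment table of means, and the multiplicative interaction terms come from a singular value decomposition of the residual; the yield table below is a random placeholder, not the popcorn data:

```python
# Hedged sketch: AMMI-style decomposition of a genotype x environment table.
# The table of means is a random placeholder.
import numpy as np

rng = np.random.default_rng(0)
Y = rng.random((9, 8)) * 2 + 4   # 9 cultivars x 8 environments (placeholder means)

# Remove additive genotype and environment main effects (double centring).
interaction = (Y - Y.mean(axis=1, keepdims=True)
                 - Y.mean(axis=0, keepdims=True) + Y.mean())

# Multiplicative interaction terms: the first two components give the usual
# AMMI biplot scores for genotypes and environments.
U, s, Vt = np.linalg.svd(interaction, full_matrices=False)
genotype_scores = U[:, :2] * np.sqrt(s[:2])
environment_scores = Vt[:2].T * np.sqrt(s[:2])
```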

Relevance: 30.00%

Abstract:

This report evaluates the use of remotely sensed images in implementing the Iowa DOT linear referencing system (LRS), which is currently in the system-architecture stage. The Iowa Department of Transportation is investing a significant amount of time and resources in the creation of the LRS. A significant portion of the implementation effort will be the creation of a datum, which includes geographically locating anchor points and then measuring the anchor-section distances between those anchor points. Currently, system architecture and the evaluation of different data collection methods to establish the LRS datum are being performed for the DOT by an outside consulting team.
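
A minimal sketch of the basic linear-referencing computation behind an LRS datum: an event is located by interpolating a measured distance along an anchor section between two geographically located anchor points (all coordinates are illustrative):

```python
# Hedged sketch: locating an offset along an anchor section between two
# anchor points. Coordinates are illustrative projected values in metres.
import numpy as np

anchor_a = np.array([441_000.0, 4_650_000.0])
anchor_b = np.array([443_500.0, 4_651_200.0])

section_length = np.linalg.norm(anchor_b - anchor_a)  # anchor-section distance

def locate(offset_m: float) -> np.ndarray:
    """Point located offset_m metres along the anchor section from anchor_a."""
    return anchor_a + (offset_m / section_length) * (anchor_b - anchor_a)

print(locate(800.0))
```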