946 results for Linear analysis


Relevance:

30.00%

Publisher:

Abstract:

The objectives of this work were to estimate genetic and phenotypic parameters and to predict the breeding and genotypic values of selection candidates obtained from intraspecific crosses in Panicum maximum, as well as the performance of the hybrid progeny of existing and projected crosses. Seventy-nine intraspecific hybrids obtained from artificial crosses among five apomictic and three sexual autotetraploid individuals were evaluated in a clonal test with two replications and ten plants per plot. Green matter yield, total and leaf dry matter yields, and leaf percentage were evaluated in five cuts per year over three years. Genetic parameters were estimated and breeding and genotypic values were predicted using the restricted maximum likelihood/best linear unbiased prediction procedure (REML/BLUP). The dominance genetic variance was estimated by fitting the effect of full-sib families. Individual narrow-sense heritabilities (0.02-0.05), individual broad-sense heritabilities (0.14-0.20), and repeatabilities measured on an individual basis (0.15-0.21) were all of low magnitude. Dominance effects for all evaluated traits indicated that breeding strategies that exploit heterosis should be adopted. The repeatability parameter increased by less than 5% over a three-year evaluation period; this may serve as a criterion for setting the maximum number of evaluation years without compromising gain per selection cycle. Hybrids that are candidates for future cultivars were identified from their genotypic values, and those to be incorporated into the breeding program from their breeding values. Predicting the performance of hybrid progeny from the breeding values of the progenitors permitted the identification of the best crosses and indicated the best parents to use in future crosses.
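A minimal sketch of how the reported heritabilities follow from variance components under REML/BLUP-style partitioning. The component values below are hypothetical, chosen only to fall inside the study's reported ranges; they are not estimates from the study.

```python
# Illustrative only: heritability from assumed variance components.
# va = additive, vd = dominance, ve = residual (all hypothetical values).
def narrow_sense_h2(va, vd, ve):
    # h^2 = additive variance / total phenotypic variance
    return va / (va + vd + ve)

def broad_sense_H2(va, vd, ve):
    # H^2 = (additive + dominance) / total phenotypic variance
    return (va + vd) / (va + vd + ve)

va, vd, ve = 0.4, 1.6, 8.0  # hypothetical components
print(round(narrow_sense_h2(va, vd, ve), 2))  # 0.04, inside the reported 0.02-0.05
print(round(broad_sense_H2(va, vd, ve), 2))   # 0.2, inside the reported 0.14-0.20
```

The large gap between the two ratios is exactly what motivates the abstract's point: most of the genetic variance here is non-additive, so strategies that exploit heterosis pay off.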

Introduction: Women with Chagas disease receiving treatment with nifurtimox are discouraged from breast feeding. Many patients who would receive nifurtimox live in extreme poverty, have limited access to resources such as clean water and infant formula, and may not have safe alternatives to breast milk. Aim: We aimed to estimate, using the limited available pharmacokinetic data, potential infant exposure to nifurtimox through breast milk. Methods: Original nifurtimox plasma concentrations were obtained from published studies. Pharmacokinetic parameters were estimated using non-linear mixed-effect modelling with NONMEM V.VI. A total of 1000 nifurtimox plasma-concentration profiles were simulated and used to calculate the amount of drug an infant would be exposed to if breastfed 150 ml/kg/day. Results: Breast milk concentrations were estimated on the basis of peak plasma levels (1361 ng/ml) and the milk-plasma ratio. We calculated the nifurtimox exposure of a breastfed infant whose mother was treated with this drug to be below 10% of the maternal weight-adjusted dose, even if the milk-plasma ratio were overestimated. Simulation led to similar estimates. Discussion: The risk of significant infant exposure to nifurtimox through breast milk appears small, and below the exposure of infants with Chagas disease who themselves receive nifurtimox treatment. This potential degree of exposure may not justify discontinuation of breast feeding.
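The exposure arithmetic behind the abstract can be sketched as a relative-infant-dose calculation. The peak plasma level (1361 ng/ml) and milk intake (150 ml/kg/day) come from the abstract; the milk-plasma ratio and maternal dose below are hypothetical placeholders, not values from the study.

```python
# Relative infant dose (RID) sketch: percent of the maternal
# weight-adjusted dose that reaches a breastfed infant.
def relative_infant_dose(cmax_ng_ml, mp_ratio, milk_intake_ml_kg_day,
                         maternal_dose_mg_kg_day):
    milk_conc_ng_ml = cmax_ng_ml * mp_ratio  # estimated milk concentration
    # ng/ml * ml/kg/day = ng/kg/day; 1e-6 converts ng to mg
    infant_mg_kg_day = milk_conc_ng_ml * milk_intake_ml_kg_day * 1e-6
    return 100 * infant_mg_kg_day / maternal_dose_mg_kg_day

# 1361 ng/ml is the study's peak plasma level; the M/P ratio (0.5) and
# maternal dose (10 mg/kg/day) are assumptions for illustration only.
rid = relative_infant_dose(1361, mp_ratio=0.5, milk_intake_ml_kg_day=150,
                           maternal_dose_mg_kg_day=10)
print(round(rid, 1))  # 1.0 (percent), well under the conventional 10% threshold
```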

The choice network revenue management (RM) model incorporates customer purchase behavior as customers purchasing products with certain probabilities that are a function of the offered assortment of products, and is the appropriate model for airline and hotel network revenue management, dynamic sales of bundles, and dynamic assortment optimization. The underlying stochastic dynamic program is intractable, and even its certainty-equivalence approximation, in the form of a linear program called the Choice Deterministic Linear Program (CDLP), is difficult to solve in most cases. The separation problem for CDLP is NP-complete for MNL with just two segments when their consideration sets overlap; the affine approximation of the dynamic program is NP-complete for even a single-segment MNL. This is in contrast to the independent-class (perfect-segmentation) case, where even the piecewise-linear approximation has been shown to be tractable. In this paper we investigate the piecewise-linear approximation for network RM under a general discrete-choice model of demand. We show that the gap between the CDLP and the piecewise-linear bounds is within a factor of at most 2. We then show that the piecewise-linear approximation is polynomial-time solvable for a fixed consideration set size, bringing it into the realm of tractability for small consideration sets; small consideration sets are a reasonable modeling tradeoff in many practical applications. Our solution relies on showing that, for any discrete-choice model, the separation problem for the linear program of the piecewise-linear approximation can be solved exactly by a Lagrangian relaxation. We give modeling extensions and show by numerical experiments the improvements from using piecewise-linear approximation functions.
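As background for the demand side of the model, the assortment-dependent purchase probabilities under MNL can be sketched as follows. The preference weights and the no-purchase weight below are made-up illustrations, not parameters from the paper.

```python
# MNL choice probabilities given an offered assortment S.
# v[j] is product j's preference weight; v0 is the no-purchase weight
# (all values here are hypothetical).
def mnl_probs(v, S, v0=1.0):
    # P(buy j | S) = v_j / (v0 + sum_{k in S} v_k)
    denom = v0 + sum(v[j] for j in S)
    return {j: v[j] / denom for j in S}

v = {0: 2.0, 1: 1.0}        # hypothetical product weights
p = mnl_probs(v, S={0, 1})
print(round(p[0], 2), round(p[1], 2))  # 0.5 0.25 (remaining 0.25 is no-purchase)
```

Note how removing a product from S shifts probability mass onto the remaining products and the no-purchase option; it is this assortment dependence that makes the CDLP column (separation) problem hard when segments' consideration sets overlap.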

The objective of this study was to evaluate the efficiency of spatial statistical analysis in the selection of genotypes in a plant breeding program and, particularly, to demonstrate the benefits of the approach when experimental observations are not spatially independent. The basic material of this study was a yield trial of soybean lines, with five check varieties (fixed effects) and 110 test lines (random effects), in an augmented block design. The spatial analysis used a random field linear model (RFLM), with a covariance function estimated from the residuals of an analysis assuming independent errors. Results showed residual autocorrelation of significant magnitude and extent (range), which allowed better discrimination among genotypes when the spatial analysis was applied: greater power of statistical tests, smaller standard errors of estimates and predictors, and a wider amplitude of predictor values. Furthermore, the spatial analysis led to a different ranking of the genetic materials than the non-spatial analysis, and yielded a selection less influenced by local variation effects.
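The kind of residual autocorrelation the study detects is commonly diagnosed with an empirical semivariogram of the residuals. The sketch below uses synthetic smoothed noise on a 1-D transect; it is not the study's RFLM fit, only an illustration of the diagnostic.

```python
import numpy as np

# Empirical semivariogram: gamma(h) = 0.5 * mean[(z(i+h) - z(i))^2].
# Rising semivariance over short lags signals spatial autocorrelation.
def semivariogram(res, max_lag):
    return [0.5 * np.mean((res[h:] - res[:-h]) ** 2)
            for h in range(1, max_lag + 1)]

rng = np.random.default_rng(0)
noise = rng.normal(size=200)
# synthetic spatially correlated "residuals": a 5-point moving average
res = np.convolve(noise, np.ones(5) / 5, mode="valid")
gam = semivariogram(res, max_lag=10)
# semivariance grows with lag until the correlation range is exceeded
print(gam[0] < gam[-1])  # True
```

In an independent-errors trial the semivariogram would be flat; a rising curve like this one is what justifies switching to a spatial covariance model.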

Polynomial constraint solving plays a prominent role in several areas of hardware and software analysis and verification, e.g., termination proving, program invariant generation, and hybrid system verification, to name a few. In this paper we propose a new method for solving non-linear constraints, based on encoding the problem into an SMT problem over linear arithmetic only. Unlike other existing methods, our method focuses on proving satisfiability of the constraints rather than unsatisfiability, which is more relevant in several applications, as we illustrate with several examples. Nevertheless, we also present new techniques, based on the analysis of unsatisfiable cores, that allow one to efficiently prove unsatisfiability as well for a broad class of problems. The power of our approach is demonstrated by means of extensive experiments comparing our prototype with state-of-the-art tools on benchmarks from both academia and industry.
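To make the "focus on satisfiability" point concrete, here is a toy model finder for a polynomial constraint over a small bounded integer domain. This brute-force search stands in for the paper's actual technique, which encodes the problem into linear-arithmetic SMT; the constraint and bounds are illustrative.

```python
from itertools import product

# Toy satisfiability check: enumerate a bounded integer box looking for
# a model of a polynomial constraint. Returning a model proves SAT;
# exhausting the box proves nothing outside it (unlike the paper's
# unsat-core techniques, which can prove unsatisfiability).
def find_model(constraint, names, lo=-5, hi=5):
    for vals in product(range(lo, hi + 1), repeat=len(names)):
        env = dict(zip(names, vals))
        if constraint(env):
            return env      # a satisfying assignment (witness)
    return None             # no model in this bounded domain

# Non-linear constraint: x^2 + y^2 == 25 and x*y > 0
m = find_model(lambda e: e["x"] ** 2 + e["y"] ** 2 == 25 and e["x"] * e["y"] > 0,
               ["x", "y"])
print(m is not None)  # True: a witness such as x=-4, y=-3 exists
```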

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer's disease (AD) detection, and it focuses on evaluating the suitability of a new approach to early AD diagnosis by non-invasive methods. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients, in order to contribute to improving the diagnosis of AD and the assessment of its degree of severity. To this end, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD patients and control subjects). Two speech-related dimensions have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low-cost, and free of side effects. The experimental results obtained were very satisfactory and promising for the early diagnosis and classification of AD patients.
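One simple non-linear feature of the kind the abstract mentions is a fractal dimension of the speech waveform. The sketch below computes the Katz fractal dimension of a 1-D signal; the study's exact feature set and ANN configuration are not reproduced here.

```python
import math

# Katz fractal dimension of a 1-D signal:
#   FD = log10(n) / (log10(n) + log10(d / L))
# where L is the total path length, d the maximum distance from the
# first sample, and n the number of steps. A straight ramp has FD = 1;
# more irregular signals score higher.
def katz_fd(x):
    n = len(x) - 1
    L = sum(abs(x[i + 1] - x[i]) for i in range(n))       # path length
    d = max(abs(x[i] - x[0]) for i in range(1, len(x)))   # max spread
    return math.log10(n) / (math.log10(n) + math.log10(d / L))

line = [0.1 * i for i in range(100)]   # a straight ramp
print(round(katz_fd(line), 2))  # 1.0
```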

The relationship between source separation and blind deconvolution is well known: if a filtered version of an unknown i.i.d. signal is observed, temporal independence between samples can be used to retrieve the original signal, in the same manner as spatial independence is used for source separation. In this paper we propose the use of a Genetic Algorithm (GA) to blindly invert linear channels. The use of a GA is justified when the number of samples is small, where gradient-like methods fail because of poor estimation of the statistics.
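A minimal GA for blind deconvolution can be sketched as a search over inverse-filter taps that maximizes a non-Gaussianity measure of the output (here |excess kurtosis|, one common i.i.d.-recovery criterion). The channel, population size, and GA operators below are illustrative choices, not the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)
s = rng.choice([-1.0, 1.0], size=500)      # unknown i.i.d. (binary) source
x = np.convolve(s, [1.0, 0.5])[:500]       # observed channel-filtered signal

def fitness(w):
    # deconvolve with candidate taps w and score |excess kurtosis|
    y = np.convolve(x, w)[:500]
    y = y / y.std()                        # kurtosis is scale-invariant
    return abs(np.mean(y ** 4) - 3.0)

pop = rng.uniform(-2, 2, size=(30, 2))     # population of 2-tap filters
for _ in range(40):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]  # keep the 10 fittest (elitism)
    # children: cloned elites perturbed by Gaussian mutation
    children = elite[rng.integers(0, 10, 20)] + rng.normal(0, 0.2, (20, 2))
    pop = np.vstack([elite, children])

best = max(pop, key=fitness)
# the evolved inverse filter should beat the trivial identity filter
print(fitness(best) > fitness(np.array([1.0, 0.0])))  # True
```

For a binary source, perfect equalization restores |excess kurtosis| to 2, so the fitness landscape rewards filters close to the true channel inverse.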

Artifacts are present in most electroencephalography (EEG) recordings, making the data difficult to interpret or analyze. In this paper, a cleaning procedure based on a multivariate extension of empirical mode decomposition is used to improve the quality of the data. This is achieved by applying the cleaning method to raw EEG data. A synchrony measure is then applied to the raw and the cleaned data in order to compare the improvement in classification rate. Two classifiers are used: linear discriminant analysis and neural networks. In both cases, the classification rate improves by about 20%.
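One widely used EEG synchrony measure is the phase-locking value (PLV) between two channels; the sketch below shows it on synthetic sinusoids. The paper's specific synchrony measure and its multivariate EMD cleaning step are not reproduced here.

```python
import numpy as np

# Phase-locking value: |mean(exp(i * phase difference))|.
# PLV is 1 for a constant phase relation, near 0 for unrelated phases.
def plv(a, b):
    def phase(x):
        # analytic-signal phase via an FFT-based Hilbert transform
        n = len(x)
        X = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1
        h[1:n // 2] = 2
        h[n // 2] = 1
        return np.angle(np.fft.ifft(X * h))
    return abs(np.mean(np.exp(1j * (phase(a) - phase(b)))))

t = np.linspace(0, 1, 512, endpoint=False)
a = np.sin(2 * np.pi * 10 * t)
b = np.sin(2 * np.pi * 10 * t + 0.5)   # same rhythm, fixed phase lag
print(plv(a, b) > 0.99)  # True: a constant phase difference gives PLV ~ 1
```

Artifacts corrupt the estimated phases, which is why cleaning the recordings before computing such a measure can raise the downstream classification rate.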

To estimate the possible direct effect of birth weight on blood pressure, it is conventional to condition on the mediator, current weight. Such conditioning can induce bias. Our aim was to assess the potential biasing effect of U, an unmeasured common cause of current weight and blood pressure, on the estimate of the controlled direct effect of birth weight on blood pressure, with the help of sensitivity analyses. We used data from a school-based study conducted in Switzerland in 2005-2006 (n = 3,762; mean age = 12.3 years). A small negative association was observed between birth weight and systolic blood pressure (linear regression coefficient βbw = -0.3 mmHg/kg, 95% confidence interval: -0.9, 0.3). The association was strengthened upon adjustment for current weight (βbw|C = -1.5 mmHg/kg, 95% confidence interval: -2.1, -0.9). Sensitivity analyses revealed that the negative conditional association was explained by U only if U was relatively strongly associated with blood pressure and if there was a large difference in the prevalence of U between low-birth weight and normal-birth weight children. This weakens the hypothesis that the negative relationship between birth weight and blood pressure arises only from collider-stratification bias induced by conditioning on current weight.
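The sensitivity-analysis arithmetic can be sketched with a simple difference-in-prevalence bias term for a binary U. The effect size and prevalences below are hypothetical, chosen only to show how strongly U must act to produce a conditional association of -1.5 mmHg/kg.

```python
# Bias on the adjusted coefficient induced by an unmeasured binary U:
# approximately (effect of U on blood pressure) x (difference in the
# prevalence of U between low- and normal-birth-weight children).
def bias_from_u(effect_u_on_bp, prev_u_low_bw, prev_u_normal_bw):
    return effect_u_on_bp * (prev_u_low_bw - prev_u_normal_bw)

# Hypothetical: U lowers blood pressure by 7.5 mmHg and is 20 percentage
# points more prevalent among low-birth-weight children.
print(round(bias_from_u(-7.5, 0.45, 0.25), 2))  # -1.5
```

As the abstract notes, only such a strong and unevenly distributed U could account for the whole conditional association, which weakens the pure collider-stratification explanation.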

The major objective of this work was to evaluate the potential of image analysis for characterizing air voids in Portland Cement Concrete (PCC), voids and constituents of Asphalt Concrete (AC), and aggregate gradation in AC. Images for analysis were obtained from a scanning electron microscope (SEM). Sample preparation techniques are presented that enhance signal differences so that backscattered electron (BSE) imaging, which is sensitive to atomic number changes, can be employed effectively. Work with PCC and AC pavement core samples has shown that the low-vacuum scanning electron microscope (LVSEM) is better suited to rapid analyses. The conventional high-vacuum SEM can also be used for AC and PCC analyses, but some distortion within the sample matrix will occur. Images with improved resolution can be obtained from SEM BSE micrographs. In a BSE image, voids filled with barium sulfate/resin yield excellent contrast in both PCC and AC. There is a good correlation between percent air determined by image analysis and by linear traverse.
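The core image-analysis step, estimating percent air from a high-contrast BSE image, reduces to thresholding and counting pixels. The image and threshold below are synthetic illustrations; a real analysis would calibrate the grey-level threshold against the barium-filled voids.

```python
import numpy as np

# Synthetic BSE-style image: bright (barium sulfate/resin filled) voids
# on a darker cement/asphalt matrix, segmented by a grey-level threshold.
img = np.full((100, 100), 80, dtype=np.uint8)  # matrix grey level
img[10:30, 10:30] = 220                        # one bright 20x20 "void"

void_mask = img > 150                          # contrast threshold (assumed)
percent_air = 100 * void_mask.mean()           # void pixels / total pixels
print(round(percent_air, 2))  # 4.0 (400 of 10,000 pixels)
```

It is this pixel-fraction estimate that the study correlates against the traditional linear-traverse measurement.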

The computer simulation of reaction dynamics has nowadays reached a remarkable degree of accuracy. Triatomic elementary reactions can be studied rigorously and in great detail using a considerable variety of Quantum Dynamics computational tools available to the scientific community. In our contribution we compare the performance of two quantum scattering codes in the computation of reaction cross sections for a triatomic benchmark reaction, the gas-phase reaction Ne + H2+ → NeH+ + H. The computational codes are selected as representative of time-dependent (Real Wave Packet [ ]) and time-independent (ABC [ ]) methodologies. The main conclusion to be drawn from our study is that the two strategies are, to a great extent, not competing but complementary. While time-dependent calculations offer advantages with respect to the energy range that can be covered in a single simulation, time-independent approaches provide much more detailed information from each single-energy calculation. Further details, such as the calculation of reactivity at very low collision energies and the computational effort required to account for the Coriolis couplings, are analyzed in this paper.
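Both families of codes ultimately assemble a reaction cross section from J-resolved reaction probabilities through the standard partial-wave sum. The probabilities and wavenumber below are hypothetical, not Ne + H2+ results.

```python
import math

# Partial-wave sum: sigma(E) = (pi / k^2) * sum_J (2J+1) * P_J(E),
# with k the initial wavenumber and P_J the J-resolved reaction
# probability (values here are made up for illustration).
def cross_section(k, probs):
    return (math.pi / k ** 2) * sum((2 * J + 1) * p
                                    for J, p in enumerate(probs))

sigma = cross_section(k=2.0, probs=[0.4, 0.3, 0.1])  # J = 0, 1, 2
print(round(sigma, 3))  # 1.414 (in units of 1/k^2, i.e. length squared)
```

A time-dependent wave packet yields P_J over a whole energy range per propagation, while a time-independent run yields all state-to-state detail at one energy, which is why the two approaches complement each other in filling this sum.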

OBJECTIVES: This study aimed to measure the lipophilicity and ionization constants of diastereoisomeric dipeptides, to interpret them in terms of conformational behavior, and to develop statistical models to predict them. METHODS: A series of 20 dipeptides of general structure NH(2)-L-X-(L or D)-His-OMe was designed and synthesized. Their experimental ionization constants (pK(1), pK(2) and pK(3)) and lipophilicity parameters (log P(N) and log D(7.4)) were measured by potentiometry. Molecular modeling in three media (vacuum, water, and chloroform) was used to explore and sample their conformational space and, for each stored conformer, to calculate the radius of gyration, virtual log P (preferably written as log P(MLP), meaning obtained by the molecular lipophilicity potential (MLP) method) and polar surface area (PSA). Means and ranges were calculated for these properties, as was their sensitivity (i.e., the ratio between property range and number of rotatable bonds). RESULTS: Marked differences between diastereoisomers were seen in their experimental ionization constants and lipophilicity parameters. These differences are explained by molecular flexibility, configuration-dependent differences in intramolecular interactions, and the accessibility of functional groups. Multiple linear equations correlated the experimental lipophilicity parameters and ionization constants with PSA range and other calculated parameters. CONCLUSION: This study documents the differences in lipophilicity and ionization constants between diastereoisomeric dipeptides. Such configuration-dependent differences are shown to depend markedly on differences in conformational behavior and to be amenable to multiple linear regression. Chirality 24:566-576, 2012. © 2012 Wiley Periodicals, Inc.
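The link between log P and the pH-dependent log D(7.4) can be sketched for the simplified case of a monoprotic base; the dipeptides here carry three ionizable groups, so the study's pK(1)-pK(3) would all enter a full treatment. The log P and pKa values below are hypothetical.

```python
import math

# For a monoprotic base: logD(pH) = logP - log10(1 + 10^(pKa - pH)).
# The correction term accounts for the ionized (less lipophilic) fraction.
def log_d_base(log_p, pka, ph=7.4):
    return log_p - math.log10(1 + 10 ** (pka - ph))

# Hypothetical compound: logP = 1.0, pKa = 6.4 -> mostly neutral at pH 7.4
print(round(log_d_base(log_p=1.0, pka=6.4), 2))  # 0.96
```

Configuration-dependent shifts in pKa therefore translate directly into shifts in log D(7.4), which is one reason the diastereoisomers separate on both scales.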

Experimental research has identified many putative agents of amphibian decline, yet the population-level consequences of these agents remain unknown, owing to lack of information on compensatory density dependence in natural populations. Here, we investigate the relative importance of intrinsic (density-dependent) and extrinsic (climatic) factors impacting the dynamics of a tree frog (Hyla arborea) population over 22 years. A combination of log-linear density dependence and rainfall (with a 2-year time lag corresponding to development time) explain 75% of the variance in the rate of increase. Such fluctuations around a variable return point might be responsible for the seemingly erratic demography and disequilibrium dynamics of many amphibian populations.
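The model form described, log-linear density dependence plus rainfall with a 2-year lag, can be written as r_t = a + b·log(N_t) + c·R_{t-2} and fitted by least squares. The data and coefficients below are synthetic; only the model structure follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(2)
years = 22
a, b, c = 1.2, -0.3, 0.4          # made-up "true" coefficients

# simulate a Gompertz-type population with lagged rainfall forcing
logN = np.empty(years)
logN[0] = 4.0
rain = rng.normal(0, 1, years)
for t in range(years - 1):
    r = a + b * logN[t] + (c * rain[t - 2] if t >= 2 else 0.0)
    logN[t + 1] = logN[t] + r + rng.normal(0, 0.05)

# least-squares fit of r_t = a + b*log(N_t) + c*R_{t-2}
t = np.arange(2, years - 1)
X = np.column_stack([np.ones_like(t, float), logN[t], rain[t - 2]])
y = logN[t + 1] - logN[t]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.allclose(coef, [a, b, c], atol=0.5))  # True: coefficients recovered
```

With b < 0 the population fluctuates around a variable return point, exactly the kind of dynamics the abstract invokes to explain seemingly erratic amphibian demography.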

Global positioning systems (GPS) offer a cost-effective and efficient method to input and update transportation data. The spatial locations of objects provided by GPS are easily integrated into geographic information systems (GIS), and the storage, manipulation, and analysis of spatial data are relatively simple in a GIS. However, many data storage and reporting methods at transportation agencies rely on linear referencing methods (LRMs); consequently, GPS data must be able to link with linear referencing. Unfortunately, the two systems are fundamentally incompatible in the way data are collected, integrated, and manipulated. For spatial data collected using GPS to be integrated into a linear referencing system or shared among LRMs, a number of issues need to be addressed. This report documents and evaluates several of those issues and offers recommendations. To evaluate the issues associated with integrating GPS data with an LRM, a pilot study was created. For the pilot study, point features, a linear datum, and a spatial representation of an LRM were created for six test roadway segments located within the boundaries of the pilot study conducted by the Iowa Department of Transportation linear referencing system project team. Various issues in integrating point features with an LRM, or between LRMs, are discussed and recommendations provided. The accuracy of GPS is discussed, including issues such as point features mapping to the wrong segment. Another topic is the loss of spatial information that occurs when a three-dimensional or two-dimensional spatial point feature is converted to a one-dimensional representation on an LRM; recommendations such as storing point features as spatial objects where necessary, or preserving information such as coordinates and elevation, are suggested. The lack of spatial accuracy characteristic of much of the cartography on which LRMs are often based is another topic discussed. The associated issues include linear and horizontal offset error. The final topic discussed is issues in transferring point feature data between LRMs.
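The core conversion the report discusses, snapping a GPS point to a route and returning its measure (distance along the line), can be sketched as follows. Planar coordinates are assumed (i.e., already projected out of latitude/longitude), and the route geometry is made up; note how the elevation and offset of the point are discarded, which is the information loss the report warns about.

```python
import math

# Snap point p to the nearest position on a polyline route and return
# its linear-reference measure (distance along the route to that position).
def linear_measure(route, p):
    best = (float("inf"), 0.0)   # (snap distance, measure)
    run = 0.0                    # distance accumulated along the route
    for (x1, y1), (x2, y2) in zip(route, route[1:]):
        dx, dy = x2 - x1, y2 - y1
        seg = math.hypot(dx, dy)
        # parameter of the perpendicular foot, clamped to the segment
        t = max(0.0, min(1.0, ((p[0] - x1) * dx + (p[1] - y1) * dy) / seg ** 2))
        fx, fy = x1 + t * dx, y1 + t * dy
        d = math.hypot(p[0] - fx, p[1] - fy)
        if d < best[0]:
            best = (d, run + t * seg)
        run += seg
    return best[1]

route = [(0, 0), (100, 0), (100, 50)]        # hypothetical roadway centerline
print(round(linear_measure(route, (40, 3)), 1))  # 40.0: snaps to first segment
```

A GPS error of a few meters can flip which segment is nearest, which is exactly the "point features mapping to the wrong segment" problem described above.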

Assessment of image quality for digital x-ray mammography systems used in European screening programs relies mainly on contrast-detail CDMAM phantom scoring, which requires the acquisition and analysis of many images in order to reduce variability in threshold detectability. Part II of this study proposes an alternative method based on the detectability index (d') calculated for a non-prewhitened model observer with an eye filter (NPWE). The detectability index was calculated from the normalized noise power spectrum and image contrast, both measured from an image of a 5 cm poly(methyl methacrylate) phantom containing a 0.2 mm thick aluminium square, and from the pre-sampling modulation transfer function. This was performed as a function of air kerma at the detector for 11 different digital mammography systems. The calculated d' values were compared against threshold gold thickness (T) results measured with the CDMAM test object and against derived theoretical relationships. A simple relationship was found between T and d' as a function of detector air kerma, and a linear relationship was found between d' and contrast-to-noise ratio. The values of threshold thickness used to specify acceptable performance in the European Guidelines for 0.10 and 0.25 mm diameter discs were equivalent to calculated threshold detectability indices of 1.05 and 6.30, respectively. The NPWE method is a validated alternative to CDMAM scoring for use in the image quality specification, quality control, and optimization of digital x-ray systems for screening mammography.
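Since the study reports a linear relation between d' and contrast-to-noise ratio, the CNR measurement itself is the practical workhorse; it can be sketched on a synthetic phantom image as below. The grey levels and noise are made up, and a fitted system-specific slope would be needed to convert CNR to d'.

```python
import numpy as np

# CNR between two regions of a synthetic phantom image: the PMMA
# background and the aluminium-square region (nominal contrast 10,
# noise sigma 5, so the nominal CNR is 2).
rng = np.random.default_rng(3)
background = rng.normal(100, 5, 10_000)   # PMMA region pixels
square = rng.normal(110, 5, 10_000)       # aluminium square pixels

cnr = (square.mean() - background.mean()) / background.std()
print(1.5 < cnr < 2.5)  # True: measured CNR sits near the nominal value 2
```

In the NPWE framework, the same signal and noise information enters through the MTF and normalized noise power spectrum instead of region statistics, which is what makes d' transferable across systems in a way raw CNR is not.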