945 results for Multi-trait analysis
Abstract:
Relativistic multi-configuration Dirac-Fock wavefunctions, coupled to good angular momentum J, have been calculated for low-lying states of Ba I and Ba II. The resulting electronic factors show good agreement with data derived from recent high-resolution laser spectroscopy experiments and with results from a comparison of muonic and optical data.
Abstract:
An evaluation of major feed resources was conducted in four crop-livestock mixed farming systems of central southern Ethiopia, involving 90 farmers selected using multi-stage purposive and random sampling methods. Discussions were held with focus groups and key informants to identify feeds by vernacular name, followed by feed sampling to analyse chemical composition (CP, ADF and NDF) and in-vitro dry matter digestibility (IVDMD) and to correlate these with indigenous technical knowledge (ITK). Native pastures, crop residues (CR) and multi-purpose trees (MPT) are the major feed resources and showed great variation in seasonality, chemical composition and IVDMD. The average CP, NDF and IVDMD values for grasses were 83.8 (range: 62.9–190), 619 (range: 357–877) and 572 (range: 317–743) g kg^(−1) DM, respectively. Likewise, the average CP, NDF and IVDMD values for CR were 58 (range: 20–90), 760 (range: 340–931) and 461 (range: 285–637) g kg^(−1) DM, respectively. Generally, the MPT and non-conventional feeds (NCF, Ensete ventricosum and Ipomoea batatas) had higher CP (range: 155–164 g kg^(−1) DM) and IVDMD (611–657 g kg^(−1) DM) values and lower NDF (331–387 g kg^(−1) DM) and ADF (321–344 g kg^(−1) DM) values. The MPT and NCF were ranked by ITK as the most nutritious feeds, while crop residues were ranked lowest. This study indicates that there are remarkable variations within and among forage resources in terms of chemical composition. There were also complementarities between ITK and feed laboratory results, and thus ITK needs to be taken into consideration in the evaluation of local feed resources.
Abstract:
There are numerous text documents available in electronic form. More and more are becoming available every day. Such documents represent a massive amount of information that is easily accessible. Seeking value in this huge collection requires organization; much of the work of organizing documents can be automated through text classification. The accuracy and our understanding of such systems greatly influences their usefulness. In this paper, we seek 1) to advance the understanding of commonly used text classification techniques, and 2) through that understanding, improve the tools that are available for text classification. We begin by clarifying the assumptions made in the derivation of Naive Bayes, noting basic properties and proposing ways for its extension and improvement. Next, we investigate the quality of Naive Bayes parameter estimates and their impact on classification. Our analysis leads to a theorem which gives an explanation for the improvements that can be found in multiclass classification with Naive Bayes using Error-Correcting Output Codes. We use experimental evidence on two commonly-used data sets to exhibit an application of the theorem. Finally, we show fundamental flaws in a commonly-used feature selection algorithm and develop a statistics-based framework for text feature selection. Greater understanding of Naive Bayes and the properties of text allows us to make better use of it in text classification.
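A minimal sketch of the combination this abstract analyses: multinomial Naive Bayes wrapped in Error-Correcting Output Codes for multiclass text classification. The scikit-learn pipeline and the 20 Newsgroups corpus are illustrative assumptions, not the authors' implementation or data.

```python
# Sketch: multinomial Naive Bayes + Error-Correcting Output Codes for
# multiclass text classification (illustrative setup, not the paper's).
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.multiclass import OutputCodeClassifier
from sklearn.pipeline import make_pipeline
from sklearn.metrics import accuracy_score

train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")

# Bag-of-words counts feed the Naive Bayes learner; the ECOC wrapper trains
# one binary NB per code bit and decodes predictions with error correction.
model = make_pipeline(
    CountVectorizer(stop_words="english"),
    OutputCodeClassifier(MultinomialNB(), code_size=2.0, random_state=0),
)
model.fit(train.data, train.target)
print("accuracy:", accuracy_score(test.target, model.predict(test.data)))
```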
Abstract:
Although the concept of wisdom has been widely studied by experts in fields such as philosophy, religion and psychology, it still faces limitations regarding its definition and assessment. This work therefore aims to formulate a definition of wisdom that supports a proposal for assessing the concept as a competence in managers. To this end, a qualitative documentary analysis was carried out. Various texts on the history, definitions and methodologies for assessing both wisdom and competences were analysed, differentiating wisdom from other constructs and examining the difference between general and managerial competences, in order then to define wisdom as a managerial competence. As a result of this analysis, a test prototype named SAPIENS-O was developed, which seeks to assess wisdom as a managerial competence. The scope of the instrument includes the possibility of measuring wisdom as a competence in managers, of offering a new perspective on the theoretical and empirical difficulties surrounding wisdom, and of facilitating the study of wisdom in real settings, more specifically in organizational settings.
Abstract:
Recurrent spontaneous abortion (RSA) is defined as the loss of three or more consecutive pregnancies during the first trimester of embryonic intrauterine development. This kind of human infertility is frequent in the general population, affecting 1 to 5% of women. In half of the cases the etiology remains unelucidated. In the present study, we used interspecific recombinant congenic mouse strains (IRCS) with the aim of identifying genes responsible for embryonic lethality. Applying a cartographic approach based on genotype/phenotype association, we identified a minimal QTL region of about 6 Mb on chromosome 1 responsible for a high rate of embryonic death (~30%). Genetic analysis suggests that the observed phenotype is linked to uterine dysfunction. Transcriptomic analysis of the uterine tissue revealed a preferential deregulation of genes in this region compared to the rest of the genome. Some genes from the QTL region are associated with VEGF signaling, mTOR signaling and ubiquitin/proteasome protein-degradation pathways. This work may contribute to elucidating the molecular basis of a multifactorial and complex human disorder such as RSA.
Abstract:
Background: Genetic and epigenetic factors interacting with the environment over time are the main causes of complex diseases such as autoimmune diseases (ADs). Among the environmental factors are organic solvents (OSs), which are chemical compounds used routinely in commercial industries. Since controversy exists over whether ADs are caused by OSs, a systematic review and meta-analysis were performed to assess the association between OSs and ADs. Methods and Findings: The systematic search was done in the PubMed, SCOPUS, SciELO and LILACS databases up to February 2012. Any type of study that used accepted classification criteria for ADs and had information about exposure to OSs was selected. Out of a total of 103 articles retrieved, 33 were finally included in the meta-analysis. The final odds ratios (ORs) and 95% confidence intervals (CIs) were obtained by the random-effects model. A sensitivity analysis confirmed that results were not sensitive to restrictions on the data included. Publication bias was trivial. Exposure to OSs was associated with systemic sclerosis, primary systemic vasculitis and multiple sclerosis individually, and also with all the ADs evaluated taken together as a single trait (OR: 1.54; 95% CI: 1.25-1.92; p-value < 0.001). Conclusion: Exposure to OSs is a risk factor for developing ADs. As a corollary, individuals with non-modifiable risk factors (i.e., familial autoimmunity or carrying genetic factors) should avoid any exposure to OSs in order to avoid increasing their risk of ADs.
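As a hedged illustration of the pooling step reported above, the sketch below computes a random-effects (DerSimonian-Laird) pooled odds ratio and 95% CI from per-study ORs and CIs. The study values are invented placeholders, and this is a standard random-effects estimator, not necessarily the exact procedure used in the review.

```python
# Sketch: DerSimonian-Laird random-effects pooling of odds ratios.
# The per-study ORs and confidence intervals are invented placeholders.
import numpy as np

def pool_random_effects(ors, ci_lows, ci_highs):
    y = np.log(ors)                                          # log odds ratios
    se = (np.log(ci_highs) - np.log(ci_lows)) / (2 * 1.96)   # SE from 95% CI
    w = 1.0 / se**2                                          # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                       # Cochran's Q
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)                  # between-study variance
    w_star = 1.0 / (se**2 + tau2)                            # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(y_re), (np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re))

# Placeholder study results, purely for illustration:
print(pool_random_effects(
    ors=np.array([1.4, 1.8, 1.2, 2.1]),
    ci_lows=np.array([1.0, 1.1, 0.8, 1.3]),
    ci_highs=np.array([1.96, 2.9, 1.8, 3.4]),
))
```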
Abstract:
The main objective of this work is to carry out a theoretical review of studies that have analysed emotional intelligence in relation to the capacity to cope with stress-generating situations. The various studies show that high levels of emotional intelligence are associated with coping strategies based on analysis and conflict resolution, whereas low levels of emotional intelligence are associated with coping strategies based on avoidance, superstition and resistance to change. The evidence from these studies indicates that emotional intelligence is fundamental to emotional self-control and to individuals' ability to adapt when facing stress-generating situations.
Abstract:
The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
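A minimal numeric sketch of the accumulation step described above: given variance components estimated for each sampling stage, the rough variogram value at a lag is the sum of the components from the shortest lag up to that stage. The separating distances and component values below are invented, and this is not the Fortran or REML code referred to in the abstract.

```python
# Sketch of the accumulation step: variance components per sampling stage,
# ordered from the shortest separating distance upward, are summed to give
# a rough variogram. Values are illustrative placeholders, not survey data.
from itertools import accumulate

lags = [10, 30, 90, 270]                 # separating distances (m), shortest first
components = [0.12, 0.25, 0.18, 0.30]    # variance components from hierarchical ANOVA

# Semivariance at each lag = cumulative sum of components up to that stage.
rough_variogram = list(accumulate(components))
for h, gamma in zip(lags, rough_variogram):
    print(f"lag {h:>4} m  gamma ~ {gamma:.2f}")
```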
Abstract:
Excavations on the multi-period settlement at Old Scatness, Shetland have uncovered a number of Iron Age structures with compacted, floor-like layers. Thin section analysis was undertaken in order to investigate and compare the characteristics of these layers. The investigation also draws on earlier analyses of the Iron Age agricultural soil around the settlement and the midden deposits that accumulated within the settlement, to create a 'joined-up' analysis which considers the way material from the settlement was used and then recycled as fertiliser for the fields. Peat was collected from the nearby uplands and was used for fuel and possibly also for flooring. It is suggested that organic-rich floors from the structures were periodically removed and the material was spread onto the fields as fertilisers. More organic-rich material may have been used selectively for fertiliser, while the less organic peat ash was allowed to accumulate in middens. Several of the structures may have functioned as byres, which suggests a prehistoric plaggen system.
Abstract:
The technique of linear responsibility analysis is used for a retrospective case study of a private industrial development consisting of an engineering factory and offices. A multi-disciplinary professional practice was used to manage and design the project. The organizational structure adopted on the project is analysed using concepts from systems theory which are included in Walker's theoretical model of the structure of building project organizations (Walker, 1981). This model proposes that the process of building provision can be viewed as systems and sub-systems which are differentiated from each other at decision points. Further to this, the sub-system analysis of the relationships between the contributors gives a quantitative assessment of the efficiency of the organizational structure used. There was a high level of satisfaction with the completed project, and this is reflected in the way in which the organizational structure corresponded to the model's proposition. However, the project was subject to strong environmental forces which the project organization was not capable of entirely overcoming.
Abstract:
The modelled El Niño-mean state-seasonal cycle interactions in 23 coupled ocean-atmosphere GCMs, including the recent IPCC AR4 models, are assessed and compared to observations and theory. The models show a clear improvement over previous generations in simulating the tropical Pacific climatology. Systematic biases still include too-strong mean and seasonal cycles of the trade winds. El Niño amplitude is shown to be an inverse function of the mean trade winds, in agreement with the observed shift of 1976 and with theoretical studies. El Niño amplitude is further shown to be an inverse function of the relative strength of the seasonal cycle: when most of the energy is within the seasonal cycle, little is left for interannual signals, and vice versa. An interannual coupling strength (ICS) is defined and its relation with the modelled El Niño frequency is compared to that predicted by theoretical models. An assessment of the modelled El Niño in terms of SST mode (S-mode) or thermocline mode (T-mode) shows that most models are locked into an S-mode and that only a few models exhibit a hybrid mode, as in observations. It is concluded that several basic El Niño-mean state-seasonal cycle relationships proposed by either theory or analysis of observations seem to be reproduced by CGCMs. This is especially true for the amplitude of El Niño and is less clear for its frequency. Most of these relationships, first established for the pre-industrial control simulations, hold for the double and quadruple CO2 stabilized scenarios. The models that exhibit the largest El Niño amplitude change in these greenhouse gas (GHG) increase scenarios are those that exhibit a mode change towards a T-mode (either from S-mode to hybrid or from hybrid to T-mode). This follows the observed 1976 climate shift in the tropical Pacific, and supports the still-debated finding of studies that associated this shift with increased GHGs. In many respects, these models are also among those that best simulate the tropical Pacific climatology (ECHAM5/MPI-OM, GFDL-CM2.0, GFDL-CM2.1, MRI-CGM2.3.2, UKMO-HadCM3). Results from this large subset of models suggest the likelihood of increased El Niño amplitude in a warmer climate, though there is considerable spread of El Niño behaviour among the models, and the changes in subsurface thermocline properties that may be important for El Niño change could not be assessed. There are no clear indications of an El Niño frequency change with increased GHGs.
Abstract:
Locality to other nodes on a peer-to-peer overlay network can be established by means of a set of landmarks shared among the participating nodes. Each node independently collects a set of latency measures to the landmark nodes, which are used as a multi-dimensional feature vector. Each peer node uses the feature vector to generate a unique scalar index which is correlated to its topological locality. A popular dimensionality reduction technique is the space-filling Hilbert curve, as it possesses good locality-preserving properties. However, there exists little comparison between Hilbert's curve and other techniques for dimensionality reduction. This work carries out a quantitative analysis of their properties. Linear and non-linear techniques for scaling the landmark vectors to a single dimension are investigated. Hilbert's curve, Sammon's mapping and Principal Component Analysis have been used to generate a 1D space with locality-preserving properties. This work provides empirical evidence to support the use of Hilbert's curve in the context of locality preservation when generating peer identifiers by means of landmark vector analysis. A comparative analysis is carried out with an artificial 2D network model and with a realistic network topology model with a typical power-law distribution of node connectivity in the Internet. Nearest-neighbour analysis confirms Hilbert's curve to be very effective in both artificial and realistic network topologies. Nevertheless, the results in the realistic network model show that there is scope for improvement, and better techniques to preserve locality information are required.
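A small sketch of the idea being compared here: a node's landmark latency vector is quantized onto a grid and mapped to a scalar, locality-preserving identifier via a Hilbert curve. For simplicity it assumes only two landmarks (a 2D curve), an invented latency range and curve order, and a textbook xy-to-Hilbert-index routine; it is not the authors' implementation and omits the Sammon's mapping and PCA comparisons.

```python
# Sketch: map a 2-landmark latency vector to a scalar, locality-preserving
# peer identifier with a 2D Hilbert curve (latency range and order invented).
def _rot(n, x, y, rx, ry):
    """Rotate/flip a quadrant (helper for the Hilbert index mapping)."""
    if ry == 0:
        if rx == 1:
            x, y = n - 1 - x, n - 1 - y
        x, y = y, x
    return x, y

def xy2d(n, x, y):
    """Hilbert index of grid cell (x, y) on an n x n grid, n a power of two."""
    d, s = 0, n // 2
    while s > 0:
        rx = 1 if (x & s) else 0
        ry = 1 if (y & s) else 0
        d += s * s * ((3 * rx) ^ ry)
        x, y = _rot(n, x, y, rx, ry)
        s //= 2
    return d

def peer_id(latencies_ms, max_latency_ms=500.0, order=8):
    """Quantize two landmark latencies to a 2^order grid, return Hilbert index."""
    n = 1 << order
    x, y = (min(n - 1, int(l / max_latency_ms * (n - 1))) for l in latencies_ms)
    return xy2d(n, x, y)

# Nodes with similar latency vectors receive nearby identifiers.
print(peer_id([40.0, 120.0]), peer_id([42.0, 118.0]), peer_id([300.0, 20.0]))
```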
Abstract:
Discrepancies between recent global Earth albedo anomaly data obtained from climate models, space observations and ground observations call for a new and better Earth reflectance measurement technique. The SALEX (Space Ashen Light Explorer) instrument is a space-based visible and IR instrument for precise estimation of the global Earth albedo by measuring the ashen light reflected off the night side of the Moon from low Earth orbit. The instrument consists of a conventional 2-mirror telescope and a pair of instruments: a 3-mirror visible imager and an IR bolometer. The performance of this unique multi-channel optical system is sensitive to stray light contamination because of the complex optical train incorporating several reflecting and refracting elements, their associated mounts and the payload mechanical enclosure. This can be further aggravated by the very bright and extended observation target (i.e. the Moon). In this paper, we report the details of an extensive stray light analysis, including ghosts and cross-talk, leading to the optimum set of stray light precautions for the highest attainable signal-to-noise ratio.