988 results for Selection tool


Relevance:

30.00%

Publisher:

Abstract:

The increasing emphasis on energy efficiency is starting to yield results in the reduction of greenhouse gas emissions; however, the effort is still far from sufficient. Therefore, new technical solutions that enhance the efficiency of power generation systems are required to maintain a sustainable growth rate without damaging the environment. A reduction in greenhouse gas emissions is only possible with new low-carbon technologies that enable high efficiencies. The development of rotating electrical machines plays a significant role in the reduction of global emissions, since a high proportion of the electrical energy produced and consumed is related to electrical machines. One technical solution that enables high system efficiency on both the energy production and consumption sides is the high-speed electrical machine. This type of machine offers high overall system efficiency, a small footprint, and a high power density compared with conventional machines. Therefore, high-speed electrical machines are favoured by manufacturers of, for example, microturbines, compressors, gas compression systems, and air blowers. High-speed machine technology is challenging from the design point of view, and considerable research on solution development is in progress in both academia and industry. A solid technical basis is important for making an industrial impact in the context of climate change. This doctoral dissertation describes multidisciplinary design principles and material development for high-speed electrical machines. First, high-speed permanent magnet synchronous machines with six slots, two poles, and tooth-coil windings are discussed. These machines have unique features that help in solving rotordynamic problems and in reducing manufacturing costs. Second, the materials for high-speed machines are discussed. Materials are among the key limiting factors in electrical machines, and overcoming this limit requires an in-depth analysis of material properties and behaviour. Moreover, high-speed machines sometimes operate in a harsh environment because, to fully exploit their advantages, they need to be as close as possible to the rotating tool. This sets extra requirements for the materials applied.

Relevance:

30.00%

Publisher:

Abstract:

Embedded systems are usually designed for a single task or a specified set of tasks. This specificity means that the system design, as well as its hardware/software development, can be highly optimized. Embedded software must meet requirements such as highly reliable operation on resource-constrained platforms, real-time constraints, and rapid development. This necessitates the adoption of static machine-code analysis tools running on a host machine for the validation and optimization of embedded system code, which can help meet all of these goals. This could significantly improve software quality and is still a challenging field.

This dissertation contributes an architecture-oriented code validation, error localization, and optimization technique that assists the embedded system designer in software debugging, making the static analysis of machine code more effective at the early detection of software bugs that are otherwise hard to detect. The focus of this work is to develop methods that automatically localize faults as well as optimize the code, and thus improve both the debugging process and the quality of the code.

Validation is done with the help of rules of inference formulated for the target processor. The rules govern the occurrence of illegitimate or out-of-place instructions and code sequences for executing the computational and integrated peripheral functions. The stipulated rules are encoded in propositional logic formulae, and their compliance is tested individually in all possible execution paths of the application programs. An incorrect sequence of machine-code patterns is identified using slicing techniques on the control flow graph generated from the machine code.

An algorithm is proposed to assist the compiler in eliminating redundant bank-switching code and deciding on the optimum data allocation to banked memory, resulting in the minimum number of bank-switching instructions in embedded system software. A relation matrix and a state transition diagram formed for the active memory bank state transitions corresponding to each bank-selection instruction are used for the detection of redundant code. Instances of code redundancy based on the stipulated rules for the target processor are identified.

This validation and optimization tool can be integrated into the system development environment. It is a novel approach, independent of compiler and assembler, and applicable to a wide range of processors once appropriate rules are formulated. Program states are identified mainly by machine-code pattern, which drastically reduces state-space creation, contributing to improved model checking. Though the technique described is general, the implementation is architecture oriented, and hence the feasibility study was conducted on PIC16F87X microcontrollers. The proposed tool will be very useful in steering novices towards correct use of difficult microcontroller features in developing embedded systems.
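The core of the redundancy detection sketched above is tracking the active-bank state along an execution path and flagging bank-select instructions that do not change it. A minimal sketch, using an invented instruction tuple format rather than actual PIC16F87X machine code:

```python
# Sketch: detecting redundant bank-switching instructions by tracking the
# active memory bank along a straight-line code path, in the spirit of the
# state-transition approach described above. The instruction encoding here
# is a hypothetical placeholder, not real machine code.

def find_redundant_bank_switches(instructions):
    """Return indices of bank-select instructions that are redundant
    because the requested bank is already active."""
    active_bank = None  # bank state is unknown at path entry
    redundant = []
    for i, instr in enumerate(instructions):
        op, *args = instr
        if op == "BANKSEL":
            bank = args[0]
            if bank == active_bank:
                redundant.append(i)  # no state change: redundant switch
            active_bank = bank
        # other instructions leave the bank state unchanged
    return redundant

code = [
    ("BANKSEL", 1),
    ("MOVWF", "TRISA"),
    ("BANKSEL", 1),   # redundant: bank 1 is already active
    ("MOVWF", "PIE1"),
    ("BANKSEL", 0),
]
print(find_redundant_bank_switches(code))  # → [2]
```

A full implementation would run this state tracking over every path of the control flow graph rather than a single straight-line sequence.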

Relevance:

30.00%

Publisher:

Abstract:

High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. Thus, we have developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA on the basis of the perfect-match (PM) probe signal were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus; P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed > 68 % of the available PM probes from the analysis but retained >96 % of available A. thaliana probe-sets. Ninety-nine of these genes were then identified as significantly regulated under P stress in B. oleracea, including the homologues of P stress responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity thresholds up to 500 for probe-selection increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. 
Our open-source software to create probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
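The probe-mask idea described above can be sketched as follows: retain only perfect-match probes whose genomic-DNA hybridisation signal exceeds a chosen threshold, then keep any probe-set that still has at least one usable probe. The data structures and the minimum-probes rule here are illustrative assumptions, not the actual .cel file parser script:

```python
# Sketch of probe masking for cross-species GeneChip analysis: keep PM
# probes whose gDNA hybridisation intensity exceeds a threshold, and drop
# probe-sets left with no usable probes. Simplified stand-in for the
# published parser; probe-set IDs below are arbitrary examples.

def build_probe_mask(gdna_signal, threshold, min_probes=1):
    """gdna_signal: {probe_set_id: [PM probe intensities]}.
    Returns {probe_set_id: [indices of retained probes]}."""
    mask = {}
    for ps, intensities in gdna_signal.items():
        kept = [i for i, x in enumerate(intensities) if x > threshold]
        if len(kept) >= min_probes:
            mask[ps] = kept
    return mask

signals = {
    "At1g01010": [120, 850, 430, 90],
    "At1g01020": [60, 75, 30, 55],   # no probe passes: probe-set dropped
}
print(build_probe_mask(signals, threshold=400))  # → {'At1g01010': [1, 2]}
```

This mirrors the trade-off reported in the abstract: raising the threshold removes more probes per probe-set while most probe-sets survive with at least one probe.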

Relevance:

30.00%

Publisher:

Abstract:

Restricted breeding seasons in beef cattle lead to censoring of reproductive data. In this paper, the age at first conception (AFC) of Nellore females exposed to the sires for the first time between 11 and 16 months of age was studied, aiming to verify the possibility of genetically advancing sexual precocity using a survival model. The final data set contained 6699 records of AFC in days. Records of females that did not calve in the year following exposure to the sire were considered censored (77.5% of the total). The model used was a Weibull mixed survival model including the effects of contemporary group and period (fixed) and animal (random). The effect of contemporary group on AFC was important (p < 0.01). Heritabilities were 0.51 and 0.76 on the logarithmic and original scales, respectively. The results indicate that it is possible to genetically advance sexual precocity using the outcome of a survival analysis of AFC as the selection criterion. They also suggest that improvements to the environment could advance sexual precocity as well, so that an adequate pregnancy rate for farmers could be achieved quickly.
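The censoring mechanics behind this analysis can be illustrated with a plain maximum-likelihood fit of a Weibull model to right-censored data: uncensored records contribute the density, censored records the survival function. This is only a sketch of the censoring idea; the dissertation's Weibull *mixed* model with contemporary-group and animal effects is not reproduced here, and all numbers below are simulated:

```python
# Minimal sketch: Weibull MLE with right censoring, via a profile
# likelihood (closed-form scale for each shape) and a crude grid search.
import math, random

def fit_weibull_censored(times, observed):
    """times: follow-up in days; observed[i] is False for censored records.
    For each shape k, the scale MLE is lam = (sum t_i^k / d)^(1/k),
    with d = number of uncensored records."""
    d = sum(observed)

    def loglik(k):
        lam = (sum(t ** k for t in times) / d) ** (1 / k)
        ll = 0.0
        for t, obs in zip(times, observed):
            z = (t / lam) ** k                      # cumulative hazard
            if obs:   # event: log density
                ll += math.log(k / lam) + (k - 1) * math.log(t / lam) - z
            else:     # censored: log survival
                ll += -z
        return ll

    k_hat = max((k / 20 for k in range(40, 161)), key=loglik)  # k in [2, 8]
    lam_hat = (sum(t ** k_hat for t in times) / d) ** (1 / k_hat)
    return k_hat, lam_hat

random.seed(1)
censor_at = 600.0                                   # end of observation window
latent = [random.weibullvariate(500, 4) for _ in range(2000)]  # true AFC
times = [min(t, censor_at) for t in latent]
observed = [t <= censor_at for t in latent]

k_hat, lam_hat = fit_weibull_censored(times, observed)
print(3.5 < k_hat < 4.5, 450 < lam_hat < 550)  # → True True
```

The estimates recover the simulated shape 4 and scale 500 despite roughly 13% of the records being censored, which is the point of using survival analysis rather than discarding censored animals.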

Relevance:

30.00%

Publisher:

Abstract:

This work aims to implement an intelligent computational tool to identify non-technical losses and to select their most relevant features, using information from a database of industrial consumer profiles of a power company. The solution to this problem is neither trivial nor of merely regional relevance: minimizing non-technical losses safeguards investments in product quality and in the maintenance of power systems, in the competitive environment introduced after the period of privatization on the national scene. This work applies the WEKA software to the proposed objective, comparing various classification techniques and optimization through intelligent algorithms; in this way, applications on smart grids can be automated. © 2012 IEEE.

Relevance:

30.00%

Publisher:

Abstract:

As the methodologies available for the detection of positive selection from genomic data vary in terms of assumptions and execution, weak correlations are expected among them. However, if a given signal is consistently supported across different methodologies, that is strong evidence that the locus has been under past selection. In this paper, a straightforward frequentist approach based on the Stouffer method to combine P-values across different tests for evidence of recent positive selection in common variants, together with strategies for extracting biological information from the detected signals, is described and applied to high-density single nucleotide polymorphism (SNP) data generated from dairy and beef cattle (taurine and indicine). The ancestral Bovinae allele state of over 440,000 SNPs is also reported. Using this combination of methods, highly significant (P < 3.17×10⁻⁷) population-specific sweeps pointing to candidate genes and pathways that may be involved in beef and dairy production were identified. The most significant signal was found in the Cornichon homolog 3 gene (CNIH3) in Brown Swiss (P = 3.82×10⁻¹²) and may be involved in the regulation of the pre-ovulatory luteinizing hormone surge. Other putative pathways under selection are glycolysis/gluconeogenesis, the transcription machinery, and chemokine/cytokine activity in Angus; the calpain-calpastatin system and ribosome biogenesis in Brown Swiss; and ganglioside deposition in milk fat globules in Gyr. The composite method, combined with the strategies applied to retrieve functional information, may be a useful tool for surveying genome-wide selective sweeps and providing insights into the sources of selection.
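The Stouffer method named above converts each test's P-value into a standard-normal Z-score, averages the scores, and converts back to a combined P-value. A minimal sketch with equal weights (the study's weighting scheme may differ):

```python
# Stouffer's Z-score method for combining one-sided P-values from
# independent tests, as used per locus in the composite approach above.
from math import sqrt
from statistics import NormalDist

def stouffer_combine(pvalues):
    """Combine one-sided P-values via Stouffer's method (equal weights)."""
    nd = NormalDist()
    z = sum(nd.inv_cdf(1 - p) for p in pvalues) / sqrt(len(pvalues))
    return 1 - nd.cdf(z)  # combined one-sided P-value

# three tests agreeing on a moderate signal give a much stronger
# combined signal than any single test
print(stouffer_combine([0.01, 0.02, 0.03]) < 0.001)  # → True
```

This is why loci supported consistently across methodologies can reach the genome-wide thresholds quoted in the abstract even when no individual test is extreme on its own.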

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Chimpanzees have been the traditional referential models for investigating human evolution and stone tool use by hominins. We enlarge this comparative scenario by describing the normative use of hammer stones and anvils in two wild groups of bearded capuchin monkeys (Cebus libidinosus) over one year. We found that most individuals habitually use stones and anvils to crack nuts and other encased food items. Further, we found that in adults (1) males use stone tools more frequently than females, (2) males crack high-resistance nuts more frequently than females, (3) efficiency at opening a food by percussive tool use varies according to the resistance of the encased food, (4) heavier individuals are more efficient at cracking high-resistance nuts than lighter individuals, and (5) to crack open encased foods, both sexes select hammer stones on the basis of material and weight. These findings confirm and extend previous experimental evidence concerning tool selectivity in wild capuchin monkeys (Visalberghi et al., 2009b; Fragaszy et al., 2010b). Male capuchins use tools more frequently than females, and body mass is the best predictor of efficiency, but the sexes do not differ in terms of efficiency. We argue that the contrasting pattern of sex differences in capuchins compared with chimpanzees, in which females use tools more frequently and more skillfully than males, may have arisen from the degree of sexual dimorphism in body size of the two species, which is larger in capuchins than in chimpanzees. Our findings show the importance of taking sex and body mass into account as separate variables to assess their roles in tool use. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

Appreciation of objects' affordances and planning is a hallmark of human technology. Archeological evidence suggests that Pliocene hominins selected raw material for tool making [1, 2]. Stone pounding has been considered a precursor to tool making [3, 4], and tool use by living primates provides insight into the origins of material selection by human ancestors. No study has experimentally investigated selectivity of stone tools in wild animals, although chimpanzees appear to select stones according to properties of different nut species [5, 6]. We recently discovered that wild capuchins with terrestrial habits [7] use hammers to crack open nuts on anvils [8-10]. As for chimpanzees, examination of anvil sites suggests stone selectivity [11], but indirect evidence cannot prove it. Here, we demonstrate that capuchins, which last shared a common ancestor with humans 35 million years ago, faced with stones differing in functional features (friability and weight) choose, transport, and use the effective stone to crack nuts. Moreover, when weight cannot be judged by visual attributes, capuchins act to gain information to guide their selection. Thus, planning actions and intentional selection of tools is within the ken of monkeys and similar to the tool activities of hominins and apes.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a creative and practical approach to dealing with the problem of selection bias. Selection bias may be the most vexing problem in program evaluation, or in any line of research that attempts to assert causality. Some of the greatest minds in economics and statistics have scrutinized the problem of selection bias, and the resulting approaches, Rubin's Potential Outcome Approach (Rosenbaum and Rubin, 1983; Rubin, 1991, 2001, 2004) and Heckman's selection model (Heckman, 1979), are widely accepted and used as the best fixes. These solutions to the bias that arises, in particular, from self-selection are imperfect, and many researchers, when feasible, reserve their strongest causal inference for data from experimental rather than observational studies. The innovative aspect of this thesis is to propose a data transformation that allows measuring and testing, in an automatic and multivariate way, the presence of selection bias. The approach involves the construction of a multi-dimensional conditional space of the X matrix in which the bias associated with the treatment assignment has been eliminated. Specifically, we propose the use of a partial dependence analysis of the X-space as a tool for investigating the dependence relationship between a set of observable pre-treatment categorical covariates X and a treatment indicator variable T, in order to obtain a measure of bias according to their dependence structure. The measure of selection bias is then expressed in terms of the inertia due to the dependence between X and T that has been eliminated. Given the measure of selection bias, we propose a multivariate test of imbalance to check whether the detected bias is significant, using the asymptotic distribution of the inertia due to T (Estadella et al., 2005) and preserving the multivariate nature of the data.

Further, we propose the use of a clustering procedure as a tool to find groups of comparable units on which to estimate local causal effects, and the use of the multivariate test of imbalance as a stopping rule in choosing the best cluster solution. The method is non-parametric: it does not call for modeling the data based on some underlying theory or assumption about the selection process, but instead uses the existing variability within the data and lets the data speak. The idea of proposing this multivariate approach to measuring selection bias and testing balance comes from the observation that, in applied research, all aspects of multivariate balance not represented in the univariate variable-by-variable summaries are ignored. The first part contains an introduction to evaluation methods as part of public and private decision processes and a review of the literature on evaluation methods. Attention is focused on Rubin's Potential Outcome Approach, matching methods, and, briefly, Heckman's selection model. The second part focuses on some resulting limitations of conventional methods, with particular attention to the problem of how to test balance correctly. The third part contains the original contribution, a simulation study that checks the performance of the method for a given dependence setting, and an application to a real data set. Finally, we discuss, conclude, and outline our future perspectives.
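The intuition of measuring selection bias as dependence ("inertia") between the covariate profiles X and the treatment indicator T can be sketched with a simple chi-squared-based inertia on the contingency table of joint X profiles versus T. This is a simplified stand-in for the partial dependence analysis developed in the thesis, not its actual estimator:

```python
# Toy inertia measure: chi-squared / n on the table of joint covariate
# profile vs. treatment assignment. Zero means the profiles are perfectly
# balanced across arms (no detectable selection on these covariates).
from collections import Counter

def inertia(profiles, treatment):
    """profiles: list of covariate tuples; treatment: list of 0/1 labels."""
    n = len(profiles)
    joint = Counter(zip(profiles, treatment))
    row = Counter(profiles)
    col = Counter(treatment)
    chi2 = 0.0
    for x in row:
        for t in col:
            expected = row[x] * col[t] / n
            chi2 += (joint.get((x, t), 0) - expected) ** 2 / expected
    return chi2 / n  # 0 when X and T are independent

# perfectly balanced design: every profile appears equally in both arms
profiles = [("m", "young"), ("m", "young"), ("f", "old"), ("f", "old")]
treatment = [0, 1, 0, 1]
print(inertia(profiles, treatment))  # → 0.0
```

In the thesis this dependence is decomposed multivariately and tested against its asymptotic distribution; the toy version only conveys why zero inertia corresponds to balance.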

Relevance:

30.00%

Publisher:

Abstract:

In the last decade, the reverse vaccinology approach shifted the paradigm of vaccine discovery from conventional culture-based methods to high-throughput genome-based approaches for the development of recombinant protein-based vaccines against pathogenic bacteria. Besides reaching its main goal of identifying new vaccine candidates, this new procedure also produced a huge amount of molecular knowledge about them. In the present work, we explored this knowledge in a species-independent way and performed a systematic in silico molecular analysis of more than 100 protective antigens, looking at their sequence similarity, domain composition, and protein architecture in order to identify possible common molecular features. This meta-analysis revealed that, despite low sequence similarity, most of the known bacterial protective antigens share structural/functional Pfam domains as well as specific protein architectures. Based on this, we formulated the hypothesis that the occurrence of these molecular signatures can be predictive of the protective properties of other proteins in different bacterial species. We tested this hypothesis in Streptococcus agalactiae and identified four new protective antigens. Moreover, to provide a second proof of concept for our approach, we used Staphylococcus aureus as a second pathogen and identified five new protective antigens. This new knowledge-driven selection process, named MetaVaccinology, represents the first in silico vaccine discovery tool based on conserved and predictive molecular and structural features of bacterial protective antigens, not dependent upon the prediction of their sub-cellular localization.
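The signature-matching idea behind this kind of selection can be sketched as flagging proteins whose Pfam domain architecture matches one observed among known protective antigens. The domain names, architectures, and protein IDs below are invented placeholders for illustration, not the study's actual signatures:

```python
# Toy sketch of architecture-based candidate selection: a protein is
# flagged when its ordered Pfam domain architecture matches one seen in
# known protective antigens. All names below are hypothetical examples.

def candidate_antigens(proteins, protective_architectures):
    """proteins: {name: tuple of Pfam domains in N->C order}."""
    return [name for name, arch in proteins.items()
            if arch in protective_architectures]

known = {("LRR", "LPXTG_anchor"), ("Big_2", "Big_2", "LPXTG_anchor")}
proteome = {
    "SAG0032": ("LRR", "LPXTG_anchor"),   # matches a known architecture
    "SAG0771": ("ABC_tran",),             # no match
}
print(candidate_antigens(proteome, known))  # → ['SAG0032']
```

A real pipeline would derive the signature set from domain-annotation tools over the full panel of protective antigens rather than from a hand-written list.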

Relevance:

30.00%

Publisher:

Abstract:

Bone metastases are responsible for different clinical complications, defined as skeletal-related events (SREs), such as pathologic fractures, spinal cord compression, hypercalcaemia, bone marrow infiltration, and severe bone pain requiring palliative radiotherapy. The general aim of this three-year research period was to improve the management of patients with bone metastases through two different approaches of translational research. First, in vitro preclinical tests were conducted on breast cancer cells and on an indirect co-culture of cancer cells and osteoclasts to evaluate bone-targeted therapy singly and in combination with conventional chemotherapy. The study suggests that zoledronic acid has antitumor activity in breast cancer cell lines. Its mechanism of action involves the decrease of RAS and RHO, as in osteoclasts. Repeated treatment enhances antitumor activity compared with non-repeated treatment. Furthermore, the combination of zoledronic acid and cisplatin induced high antitumoral activity in the two triple-negative lines MDA-MB-231 and BRC-230. The p21, pMAPK, and mTOR pathways were regulated by this combined treatment, particularly at lower cisplatin doses. A co-culture system to test the activity of bone-targeted molecules on monocytes conditioned by breast cancer cells was also developed. Another critical issue in the treatment of breast cancer patients is the selection of patients who will benefit from bone-targeted therapy in the adjuvant setting. A retrospective case-control study on breast cancer patients was performed to find new predictive markers of bone metastases in the primary tumors. Eight markers were evaluated, and TFF1 and CXCR4 were found to discriminate between patients with relapse to bone and patients with no evidence of disease. In particular, TFF1 was the most accurate marker, reaching a sensitivity of 63% and a specificity of 79%. This marker could be a useful tool for clinicians to select patients who could benefit from bone-targeted therapy in the adjuvant setting.
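The marker performance quoted above (TFF1: sensitivity 63%, specificity 79%) follows from a standard confusion-matrix calculation, sketched here on made-up counts chosen only to illustrate the formulas:

```python
# Sensitivity and specificity from confusion-matrix counts. The counts
# below are hypothetical round numbers, not the study's actual data.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# e.g. 63 of 100 bone-relapse cases marker-positive,
# 79 of 100 relapse-free controls marker-negative
sens, spec = sensitivity_specificity(tp=63, fn=37, tn=79, fp=21)
print(sens, spec)  # → 0.63 0.79
```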

Relevance:

30.00%

Publisher:

Abstract:

In 2003, the QUADAS tool for systematic reviews of diagnostic accuracy studies was developed. Experience, anecdotal reports, and feedback suggested areas for improvement; therefore, QUADAS-2 was developed. This tool comprises 4 domains: patient selection, index test, reference standard, and flow and timing. Each domain is assessed in terms of risk of bias, and the first 3 domains are also assessed in terms of concerns regarding applicability. Signalling questions are included to help judge risk of bias. The QUADAS-2 tool is applied in 4 phases: summarize the review question, tailor the tool and produce review-specific guidance, construct a flow diagram for the primary study, and judge bias and applicability. This tool will allow for more transparent rating of bias and applicability of primary diagnostic accuracy studies.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Many preschool children have wheeze or cough, but only some have asthma later. Existing prediction tools are difficult to apply in clinical practice or exhibit methodological weaknesses.

OBJECTIVE: We sought to develop a simple and robust tool for predicting asthma at school age in preschool children with wheeze or cough.

METHODS: From a population-based cohort in Leicestershire, United Kingdom, we included 1- to 3-year-old subjects seeing a doctor for wheeze or cough and assessed the prevalence of asthma 5 years later. We considered only noninvasive predictors that are easy to assess in primary care: demographic and perinatal data, eczema, upper and lower respiratory tract symptoms, and family history of atopy. We developed a model using logistic regression, avoided overfitting with the least absolute shrinkage and selection operator penalty, and then simplified it to a practical tool. We performed internal validation and assessed its predictive performance using the scaled Brier score and the area under the receiver operating characteristic curve.

RESULTS: Of 1226 symptomatic children with follow-up information, 345 (28%) had asthma 5 years later. The tool consists of 10 predictors yielding a total score between 0 and 15: sex, age, wheeze without colds, wheeze frequency, activity disturbance, shortness of breath, exercise-related and aeroallergen-related wheeze/cough, eczema, and parental history of asthma/bronchitis. The scaled Brier scores for the internally validated model and tool were 0.20 and 0.16, and the areas under the receiver operating characteristic curves were 0.76 and 0.74, respectively.

CONCLUSION: This tool represents a simple, low-cost, and noninvasive method to predict the risk of later asthma in symptomatic preschool children, which is ready to be tested in other populations.
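The two performance metrics reported in this abstract, the scaled Brier score and the area under the ROC curve, can be computed directly from predicted risks and observed outcomes. A minimal sketch on invented data (not the cohort's results):

```python
# Scaled Brier score (1 - Brier / Brier_max, with Brier_max taken at the
# outcome prevalence) and AUC (probability that a random case outranks a
# random non-case), computed from scratch on made-up example data.

def scaled_brier(y, p):
    n = len(y)
    brier = sum((pi - yi) ** 2 for yi, pi in zip(y, p)) / n
    prev = sum(y) / n
    return 1 - brier / (prev * (1 - prev))

def auc(y, p):
    pos = [pi for yi, pi in zip(y, p) if yi == 1]
    neg = [pi for yi, pi in zip(y, p) if yi == 0]
    wins = sum((a > b) + 0.5 * (a == b) for a in pos for b in neg)
    return wins / (len(pos) * len(neg))

y = [1, 0, 1, 0, 0, 1, 0, 0]                    # asthma at follow-up
p = [0.8, 0.3, 0.4, 0.6, 0.2, 0.7, 0.5, 0.1]   # predicted risks
print(round(scaled_brier(y, p), 2), round(auc(y, p), 2))  # → 0.34 0.87
```

A scaled Brier score of 0 would mean no improvement over always predicting the prevalence, and an AUC of 0.5 would mean no discrimination, which puts the reported 0.16-0.20 and 0.74-0.76 in context.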