927 results for Negative Selection Algorithm
Abstract:
The objective of the present work was to evaluate 27 progenies of cocoa crosses for agronomic traits and to select F1 plants within the superior crosses. The experiment was installed in March 2005 at the Experimental Station Joaquim Bahiana (ESJOB), in Itajuipe, Bahia, on an area of approximately 3 ha with a total of 3,240 plants. Thirteen evaluations of vegetative brooms, five of cushion brooms and fifteen of the number of pods per plant were carried out. Thirty pollinations were made for each selected plant to test for self-compatibility. Production, based on the number of pods per plant, and resistance to witches' broom indicated CEPEC 94 x CCN 10, RB 39 x CCN 51 and CCN 10 x VB 1151 as superior progenies. All selections tested were self-compatible. The analysis of progeny and individual-tree data, combined with visual field observations, allowed the selection of 17 plants, which were included in a network of regional trials to determine their phenotypic stability.
Abstract:
The objectives of this study were to evaluate the possibility of selecting anthracnose-resistant common bean plants using detached primary leaves in the partially controlled environment of a greenhouse and to identify differences in the reaction of genotypes to anthracnose. The common bean cultivars Ouro Negro, Ouro Vermelho, Manteigão Fosco 11, Rudá, Rudá-R, VP8, BRSMG Madrepérola, Pérola, Meia Noite and BRSMG Talismã were characterized for resistance to races 65, 81 and 453 of Colletotrichum lindemuthianum, and the detached-primary-leaf method was compared with the traditional inoculation of plants at the phenological stage V2. The lines Rudá, Rudá-R and Pérola were inoculated with races 65 and 453 of C. lindemuthianum to assess the agreement of anthracnose severity ratings between the two inoculation methods. In general, the two methods gave similar results for the reaction of the cultivars. The use of detached primary leaves of common bean plants in a partially controlled environment was feasible for the selection of plants resistant to anthracnose and has the advantages of requiring little infrastructure and reducing resources, space and time.
Abstract:
Quantitative analysis of cine cardiac magnetic resonance (CMR) images for the assessment of global left ventricular morphology and function remains a routine task in clinical cardiology practice. To date, this process requires user interaction, which prolongs the examination (and hence its cost) and introduces observer variability. In this study, we sought to validate the feasibility, accuracy, and time efficiency of a novel framework for the automatic quantification of left ventricular global function in a clinical setting.
Abstract:
INTRODUCTION: The correct identification of the underlying cause of death and its precise assignment to a code of the International Classification of Diseases are essential for accurate and universally comparable mortality statistics. These factors, among others, led to the development of computer programs to identify the underlying cause of death automatically. OBJECTIVE: This work compares the underlying causes of death processed by the Automated Classification of Medical Entities (ACME) and the "Sistema de Seleção de Causa Básica de Morte" (SCB) programs. MATERIAL AND METHOD: The comparison used the ACME input data file covering deaths that occurred in the State of S. Paulo from June to December 1993, totalling 129,104 death certificate records. The differences between the underlying causes selected by the ACME and SCB systems in the month of June, when treated as SCB errors, were used to correct and improve the SCB processing logic and its decision tables. RESULTS: Processing the underlying causes of death with the ACME and SCB systems produced 3,278 differences, which were analysed and attributed to unanswered dialogue boxes during processing, to deaths due to human immunodeficiency virus (HIV) disease, for which neither system had a specific provision, to coding and/or keying errors, and to actual problems. A detailed analysis of the latter showed that most of the underlying causes processed by the SCB system were correct, that each system interpreted some mortality coding rules differently, that some particular problems could not be explained with the available documentation, and that a smaller proportion of problems were actual SCB errors. CONCLUSION: These results, disclosing a very low and insignificant number of actual problems, support the use of this version of the SCB system for the Ninth Revision of the International Classification of Diseases and assure the continuity of the work being undertaken for the Tenth Revision version.
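As an illustration of the kind of comparison described above, the following Python sketch tallies agreements and disagreements between the underlying-cause codes selected by two systems for the same death records; the record structure and the ICD-9 codes shown are hypothetical, not data from the study.

```python
# Hedged sketch: tally disagreements between underlying-cause codes
# selected by two coding systems for the same death records.
# The record structure and code values are illustrative only.
from collections import Counter

def compare_underlying_causes(acme_codes, scb_codes):
    """Return the count of agreements and a tally of differing code pairs."""
    assert len(acme_codes) == len(scb_codes)
    agreements = 0
    differences = Counter()
    for acme, scb in zip(acme_codes, scb_codes):
        if acme == scb:
            agreements += 1
        else:
            differences[(acme, scb)] += 1  # pair of diverging selections
    return agreements, differences

# Illustrative ICD-9 codes (hypothetical records, not real data)
acme = ["410.9", "162.9", "042", "410.9"]
scb  = ["410.9", "162.9", "042.9", "414.0"]
agree, diff = compare_underlying_causes(acme, scb)
print(agree, "agreements;", sum(diff.values()), "differences:", dict(diff))
```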
Abstract:
Reclaimed water from small wastewater treatment facilities in the rural areas of the Beira Interior region (Portugal) may constitute an alternative water source for aquifer recharge. A 21-month monitoring period in a constructed wetland treatment system has shown that 21,500 m³ year⁻¹ of treated wastewater (reclaimed water) could be used for aquifer recharge. A GIS-based multi-criteria analysis was performed, combining ten thematic maps with economic, environmental and technical criteria, in order to produce a suitability map for the location of reclaimed water infiltration sites. The areas chosen for aquifer recharge with infiltration basins consist mainly of anthrosol more than 1 m deep with a fine sand texture, which allows an average infiltration velocity of up to 1 m d⁻¹. These characteristics provide a final polishing treatment of the reclaimed water after infiltration (soil aquifer treatment, SAT), suitable for the removal of the residual load (trace organics, nutrients, heavy metals and pathogens). The risk of groundwater contamination is low, since the water table in the anthrosol areas lies between 10 m and 50 m deep. On the other hand, these depths guarantee an unsaturated zone suitable for SAT. An area of 13,944 ha was selected for study, but only 1,607 ha are suitable for reclaimed water infiltration. Approximately 1,280 m² were considered enough to set up four infiltration basins operating in flooding and drying cycles.
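The GIS-based multi-criteria analysis combines weighted thematic maps into a single suitability surface. The Python sketch below illustrates a weighted overlay of this kind on toy rasters; the layer names, weights and threshold are assumptions for illustration, not the criteria actually used in the study.

```python
# Hedged sketch of a weighted-overlay multi-criteria suitability analysis.
# Each thematic map is a raster of scores in [0, 1]; the weights below are
# illustrative placeholders, not the criteria used in the study.
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)  # toy raster grid

# Hypothetical thematic layers (already normalized to [0, 1])
layers = {
    "soil_infiltration": rng.random(shape),
    "depth_to_water_table": rng.random(shape),
    "distance_to_treatment_plant": rng.random(shape),
}
weights = {
    "soil_infiltration": 0.5,
    "depth_to_water_table": 0.3,
    "distance_to_treatment_plant": 0.2,
}

# Weighted linear combination of the thematic maps
suitability = sum(weights[name] * layer for name, layer in layers.items())

# Keep only cells above an arbitrary suitability threshold
suitable_mask = suitability >= 0.7
print(f"{suitable_mask.sum()} of {suitability.size} cells flagged as suitable")
```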
Abstract:
In this paper we present results on the optimization of device architectures for colour and imaging applications, using a device with a TCO/pinpi'n/TCO configuration. The effect of the applied voltage on the colour selectivity is discussed. The results show that the spectral response curves exhibit good separation between the red, green and blue basic colours. By combining the information obtained under positive and negative applied bias, a colour image is acquired without colour filters or pixel architecture. A low-level image processing algorithm is used for the colour image reconstruction.
Abstract:
5th European Congress on Computational Methods in Applied Sciences and Engineering (ECCOMAS 2008); 8th World Congress on Computational Mechanics (WCCM8)
Abstract:
This paper presents an algorithm to efficiently generate the state space of systems specified with the IOPT Petri-net modeling formalism. IOPT nets are a non-autonomous Petri-net class, based on Place-Transition nets, with an extended set of features designed to allow the rapid prototyping and synthesis of system controllers through an existing hardware-software co-design framework. To obtain coherent and deterministic operation, IOPT nets use a maximal-step execution semantics in which, in a single execution step, all enabled transitions fire simultaneously. This increases the resulting state-space complexity and can cause an arc "explosion" effect: real-world applications with several million states can reach a number of arcs an order of magnitude higher, leading to the need for high-performance state-space generation algorithms. The proposed algorithm applies a compilation approach, reading a PNML file containing one IOPT model and automatically generating an optimized C program that computes the corresponding state space.
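The maximal-step semantics is the key point: in each step every enabled transition fires at once. The Python sketch below illustrates breadth-first state-space generation under that semantics for a plain Place-Transition net; it omits the IOPT-specific features (input/output signals, guards, priorities, conflict resolution) and the PNML-to-C compilation step, and all net elements are hypothetical.

```python
# Hedged sketch: breadth-first state-space generation for a Place-Transition
# net under a maximal-step semantics (all enabled transitions fire at once).
# Conflict resolution is omitted; the toy net below is conflict-free.
from collections import deque

# Net structure: transition -> (input place weights, output place weights)
transitions = {
    "t1": ({"p1": 1}, {"p2": 1}),
    "t2": ({"p2": 1}, {"p3": 1}),
    "t3": ({"p1": 1}, {"p3": 1}),
}
initial_marking = {"p1": 2, "p2": 0, "p3": 0}

def enabled(marking):
    """Transitions whose input places hold enough tokens."""
    return [t for t, (pre, _) in transitions.items()
            if all(marking.get(p, 0) >= w for p, w in pre.items())]

def fire_step(marking, step):
    """Fire a whole step (set of transitions) simultaneously.
    Assumes the step is conflict-free (enough tokens for all of them)."""
    new = dict(marking)
    for t in step:
        pre, post = transitions[t]
        for p, w in pre.items():
            new[p] -= w
        for p, w in post.items():
            new[p] = new.get(p, 0) + w
    return new

# Breadth-first exploration of reachable markings
states, arcs = set(), []
queue = deque([tuple(sorted(initial_marking.items()))])
while queue:
    frozen = queue.popleft()
    if frozen in states:
        continue
    states.add(frozen)
    marking = dict(frozen)
    step = enabled(marking)
    if step:
        succ = tuple(sorted(fire_step(marking, step).items()))
        arcs.append((frozen, tuple(step), succ))
        queue.append(succ)
print(len(states), "states,", len(arcs), "arcs")
```

In the actual framework the generator is compiled C code rather than interpreted Python, which is what makes multi-million-state spaces tractable.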
Abstract:
In recent years the use of several new resources in power systems, such as distributed generation, demand response and, more recently, electric vehicles, has increased significantly. Power systems aim at lowering operational costs, which requires adequate energy resource management. In this context, load consumption management plays an important role, making it necessary to use optimization strategies to adjust consumption to the supply profile. These optimization strategies can be integrated in demand response programs. The control of the energy consumption of an intelligent house aims to optimize the load consumption. This paper presents a genetic algorithm approach to manage the consumption of a residential house making use of a SCADA system developed by the authors. Consumption management is done by reducing or curtailing loads to keep the power consumption at, or below, a specified energy consumption limit. This limit is determined according to the consumer strategy, taking into account the renewable-based micro generation, energy price, supplier solicitations, and consumer preferences. The proposed approach is compared with a mixed-integer non-linear approach.
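The following Python sketch illustrates one possible genetic-algorithm formulation of this kind of load management: a binary genome decides which loads stay on, and the fitness rewards consumer preference while penalizing consumption above the limit. The loads, preference weights and GA parameters are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a genetic algorithm for residential load curtailment:
# each gene decides whether a load stays on (1) or is curtailed (0); the goal
# is to keep total consumption under a limit while preserving the loads the
# consumer values most. Loads, preferences and GA parameters are illustrative.
import random

loads = [1.2, 0.8, 2.0, 0.5, 1.5, 0.3]        # kW per controllable load
preference = [0.9, 0.4, 0.7, 0.2, 0.8, 0.1]   # consumer priority weights
limit = 3.5                                    # consumption limit in kW

def fitness(genome):
    power = sum(l for l, g in zip(loads, genome) if g)
    value = sum(p for p, g in zip(preference, genome) if g)
    penalty = 10.0 * max(0.0, power - limit)   # penalize limit violations
    return value - penalty

def evolve(pop_size=30, generations=100, mutation_rate=0.05):
    pop = [[random.randint(0, 1) for _ in loads] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(loads))
            child = a[:cut] + b[cut:]           # one-point crossover
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]            # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print("schedule:", best, "fitness:", round(fitness(best), 2))
```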
Abstract:
To maintain a power system within its operation limits, level-ahead planning requires competitive techniques to solve the optimal power flow (OPF). OPF is a non-linear, large combinatorial problem. The Ant Colony Search (ACS) optimization algorithm is inspired by the organized natural movement of real ants and has been successfully applied to different large combinatorial optimization problems. This paper presents an implementation of Ant Colony optimization to solve the OPF in an economic dispatch context. The proposed methodology was developed to be used for maintenance and repair planning performed 48 to 24 hours in advance. The main advantage of this method is its low execution time, which allows the use of OPF when a large set of scenarios has to be analyzed. The paper includes a case study using the IEEE 30-bus network. The results are compared with other well-known methodologies presented in the literature.
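As a rough illustration of ant colony optimization in an economic dispatch context, the Python sketch below lets ants pick discrete output levels for a few generators, guided by pheromone trails reinforced by the cheapest balanced dispatch found so far; the cost curves, demand and parameters are assumptions, not the IEEE 30-bus case study.

```python
# Hedged sketch of ant colony optimization for a discretized economic
# dispatch: each ant assigns an output level to every generator, guided by
# pheromone trails; the cheapest feasible dispatch reinforces the trails.
# Cost curves, demand and ACO parameters are illustrative only.
import random

levels = [0.0, 25.0, 50.0, 75.0, 100.0]                     # MW steps
cost_coeff = [(0.02, 2.0), (0.0175, 1.75), (0.0625, 1.0)]   # a*P^2 + b*P
demand = 150.0                                              # MW to supply

def dispatch_cost(outputs):
    cost = sum(a * p * p + b * p for (a, b), p in zip(cost_coeff, outputs))
    cost += 50.0 * abs(sum(outputs) - demand)   # penalty for power imbalance
    return cost

def aco(n_ants=20, iterations=200, evaporation=0.1):
    # one pheromone value per (generator, level) pair
    pher = [[1.0] * len(levels) for _ in cost_coeff]
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        for _ in range(n_ants):
            outputs = [levels[random.choices(range(len(levels)),
                                             weights=pher[g])[0]]
                       for g in range(len(cost_coeff))]
            cost = dispatch_cost(outputs)
            if cost < best_cost:
                best, best_cost = outputs, cost
        # evaporate, then reinforce the best-so-far dispatch
        pher = [[(1 - evaporation) * tau for tau in row] for row in pher]
        for g, p in enumerate(best):
            pher[g][levels.index(p)] += 1.0 / best_cost
    return best, best_cost

best, cost = aco()
print("dispatch (MW):", best, "cost:", round(cost, 2))
```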
Abstract:
This paper presents a Unit Commitment model with reactive power compensation, solved by Genetic Algorithm (GA) optimization techniques. The GA has been developed as a computational tool programmed in MATLAB. The main objective is to find the best generation schedule, with minimal active power losses and the required reactive power compensation, subject to the power system technical constraints: the full AC power flow equations and the active and reactive power generation limits. All constraints represented in the objective function are weighted with penalty factors. The IEEE 14-bus system has been used as a test case to demonstrate the effectiveness of the proposed algorithm. Results and conclusions are duly drawn.
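The penalty-factor construction of the objective can be illustrated with a short Python sketch: constraint violations are added to the base objective multiplied by weighting factors, so infeasible candidates score poorly inside the GA. The constraint names and numerical values below are hypothetical.

```python
# Hedged sketch of a penalty-weighted objective: constraint violations
# (power balance, reactive limits, voltage limits) are added to the base
# objective with weighting factors. All numbers are illustrative.
def penalized_objective(active_losses, violations, penalties):
    """violations and penalties are dicts keyed by constraint name."""
    return active_losses + sum(penalties[name] * max(0.0, violations[name])
                               for name in violations)

violations = {               # hypothetical violations of one GA candidate
    "power_balance": 0.02,   # p.u. mismatch in the AC power flow equations
    "q_gen_limits": 0.0,     # reactive generation within limits
    "voltage_limits": 0.01,  # p.u. bus voltage excursion
}
penalties = {"power_balance": 1000.0, "q_gen_limits": 500.0,
             "voltage_limits": 800.0}

print(penalized_objective(active_losses=0.05,
                          violations=violations, penalties=penalties))
```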
Abstract:
Electricity market players operating in a liberalized environment require access to adequate decision support tools that allow them to consider all business opportunities and take strategic decisions. Ancillary services represent a good negotiation opportunity that must be considered by market players, so decision support tools must include ancillary market simulation. This paper proposes two different methods (Linear Programming and Genetic Algorithm approaches) for ancillary services dispatch. The methodologies are implemented in MASCEM, a multi-agent based electricity market simulator. A test case concerning the dispatch of Regulation Down, Regulation Up, Spinning Reserve and Non-Spinning Reserve services is included in this paper.
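A minimal Python sketch of the Linear Programming approach to ancillary services dispatch is given below: for each service, the cheapest combination of bids covering the required capacity is selected with scipy's linprog. The bid prices, capacities and requirements are illustrative, not MASCEM data.

```python
# Hedged sketch of a linear-programming dispatch of ancillary-service bids:
# for each service, accept the cheapest combination of bids that covers the
# required capacity. Bid prices, capacities and requirements are illustrative.
from scipy.optimize import linprog

services = {
    # service: (list of (price $/MW, capacity MW) bids, required MW)
    "Regulation Up":        ([(12.0, 30), (15.0, 50), (9.0, 20)], 60),
    "Regulation Down":      ([(8.0, 40), (11.0, 25)], 45),
    "Spinning Reserve":     ([(20.0, 60), (18.0, 30), (25.0, 40)], 80),
    "Non-Spinning Reserve": ([(5.0, 70), (7.0, 50)], 90),
}

for name, (bids, requirement) in services.items():
    prices = [p for p, _ in bids]
    caps = [(0, c) for _, c in bids]
    # minimize total cost s.t. accepted quantities sum to the requirement
    res = linprog(c=prices,
                  A_eq=[[1.0] * len(bids)], b_eq=[requirement],
                  bounds=caps, method="highs")
    print(name, [round(x, 1) for x in res.x], "cost:", round(res.fun, 1))
```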
Abstract:
Distributed generation, unlike centralized electrical generation, aims to generate electrical energy on a small scale as close as possible to the load centers, interchanging electric power with the network. This work presents a probabilistic methodology conceived to assist electric system planning engineers in selecting the location of distributed generation, taking into account the hourly load changes, or the daily load cycle. The hourly load centers, for each of the different hourly load scenarios, are calculated deterministically. These location points, weighted according to their load magnitude, are used to fit the best-fitting probability distribution. This distribution is then used to determine the maximum-likelihood perimeter of the area where each distributed generation point should preferably be located by the planning engineers. This takes into account, for example, the availability and cost of land lots, factors of special relevance in urban areas, as well as several obstacles important for the final selection of candidate distributed generation sites. The proposed methodology has been applied to a real case, assuming three different bivariate probability distributions: the Gaussian distribution, a bivariate version of Freund's exponential distribution and the Weibull probability distribution. The methodology has been programmed in MATLAB. Results for the application to a realistic case are presented and discussed, demonstrating the ability of the proposed methodology to efficiently determine the best location of distributed generation and the corresponding distribution networks.
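For the bivariate Gaussian case, the methodology can be sketched as fitting a load-weighted mean and covariance to the hourly load-center coordinates and describing the maximum-likelihood region as the ellipse containing a chosen probability mass. The Python sketch below illustrates this; the coordinates, load weights and the 95% level are assumptions, not data from the real case.

```python
# Hedged sketch of the bivariate Gaussian case: fit a weighted mean and
# covariance to hourly load-center coordinates, then describe the
# maximum-likelihood region as the ellipse containing a chosen probability
# mass. Coordinates and load magnitudes are illustrative only.
import numpy as np
from scipy.stats import chi2

# Hourly load centers (x, y in km) and their load magnitudes (MW)
points = np.array([[2.1, 3.0], [2.4, 3.2], [1.9, 2.8],
                   [2.6, 3.5], [2.2, 3.1], [2.0, 2.9]])
weights = np.array([10.0, 14.0, 9.0, 16.0, 12.0, 8.0])

# Weighted maximum-likelihood estimates of mean and covariance
mean = np.average(points, axis=0, weights=weights)
diffs = points - mean
cov = (weights[:, None] * diffs).T @ diffs / weights.sum()

# 95% likelihood region of a bivariate Gaussian: squared Mahalanobis
# distance below the chi-square quantile with 2 degrees of freedom
threshold = chi2.ppf(0.95, df=2)
eigvals, _ = np.linalg.eigh(cov)
semi_axes = np.sqrt(eigvals * threshold)   # semi-axes of the 95% ellipse

print("center:", mean.round(2), "ellipse semi-axes (km):", semi_axes.round(2))
```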
Abstract:
The link between an RFID tag and a terminal has always been weak from a security standpoint, since RFID tags themselves provide no security mechanisms. Many recent studies have tried to protect this link, but the physical limitations of RFID, namely low power consumption and high-speed operation, prevent conventional techniques from being applied. At present, RFID security methods rely on a security server, a security policy and a security module. The best known of these is the security module, which provides authentication and encryption functions. In this paper, we designed and implemented a modified version of the original SEED cipher, reduced to 8 rounds and a 64-bit block, for use in tags.
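To give a rough idea of what a reduced 8-round, 64-bit block cipher looks like, the Python sketch below implements a toy Feistel structure; its round function and key schedule are deliberately simple placeholders and are not the SEED round function described in the paper.

```python
# Hedged sketch of the general idea only: a toy 8-round Feistel cipher on a
# 64-bit block. The round function below is a placeholder and is NOT the
# SEED round function; it merely illustrates how a reduced round count and
# block size lower the computational footprint for a constrained tag.
MASK32 = 0xFFFFFFFF

def round_function(half, subkey):
    """Placeholder mixing step (not cryptographically strong)."""
    x = (half ^ subkey) & MASK32
    x = ((x << 7) | (x >> 25)) & MASK32      # 32-bit rotation
    return (x * 0x9E3779B1 + subkey) & MASK32

def feistel_encrypt(block64, subkeys):
    left, right = block64 >> 32, block64 & MASK32
    for k in subkeys:                         # 8 rounds in the reduced cipher
        left, right = right, left ^ round_function(right, k)
    return (left << 32) | right

def feistel_decrypt(block64, subkeys):
    left, right = block64 >> 32, block64 & MASK32
    for k in reversed(subkeys):
        left, right = right ^ round_function(left, k), left
    return (left << 32) | right

subkeys = [0x0F1E2D3C + i for i in range(8)]  # hypothetical key schedule
plain = 0x0123456789ABCDEF
cipher = feistel_encrypt(plain, subkeys)
assert feistel_decrypt(cipher, subkeys) == plain
print(hex(cipher))
```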
Abstract:
Bone is constantly being molded and shaped by the action of osteoclasts and osteoblasts. A proper equilibrium between the metabolic activities of both cell types is required to ensure an adequate skeletal tissue structure, involving the resorption of old bone and the formation of new bone tissue. Treatment with antiepileptic drugs (AEDs) has been reported to elicit alterations in skeletal structure, in particular in bone mineral density. Nevertheless, knowledge regarding the effects of AEDs on bone cells is still scarce, particularly regarding osteoclastic behaviour. In this context, the aim of this study was to investigate the effects of five different AEDs on human osteoclastic cells. Osteoclastic cell cultures were established from precursor cells isolated from human peripheral blood and were maintained in the absence (control) or in the presence of 10⁻⁸ to 10⁻⁴ M of different AEDs (valproate, carbamazepine, gabapentin, lamotrigine and topiramate). Cell cultures were characterized over a 21-day period for tartrate-resistant acid phosphatase (TRAP) activity, number of TRAP+ multinucleated cells, presence of cells with actin rings and expressing vitronectin and calcitonin receptors, and apoptosis rate. The involvement of several signaling pathways in the cellular response was also addressed. All the tested drugs affected osteoclastic cell development, although with different profiles in their modulation of osteoclastogenesis; globally, the tendency was to inhibit the process. Furthermore, the signaling pathways involved also appeared to be differentially affected by the AEDs, suggesting that the different drugs may affect osteoclastogenesis through different mechanisms. In conclusion, the present study showed that the different AEDs can negatively modulate the osteoclastogenesis process, shedding new light on how these drugs can affect bone tissue.