882 results for Branch and bound algorithms
Abstract:
The ramification and distribution of the right and left phrenic nerves were studied in the diaphragm muscles of 30 adult domestic cats of unknown breed, 7 males and 23 females, fixed in 10% buffered formalin solution. After fixation and dissection, it was observed that the phrenic nerves ramified to the respective muscular parts of the diaphragm (pars lumbalis, costalis and sternalis) and most frequently terminated as a dorsolateral trunk and a ventral branch (63.33%). The following arrangements were also observed: dorsal, ventral and lateral branches (25.0%); dorsal branch and ventrolateral trunk (6.66%); dorsolateral and ventrolateral trunks (3.33%); dorsolateral trunk with lateral and ventral branches (1.66%). The phrenic nerves were distributed symmetrically in 11 samples (36.66%), all showing termination as a dorsolateral trunk and a ventral branch. The dorsal branches supplied the pars lumbalis (73.33% on the right and 56.66% on the left) and the pars costalis (13.33% on the right and 10.0% on the left). The right dorsal branch supplied the crus mediale dexter of the right pillar (100.0%), and the left dorsal branch supplied the crus mediale sinister of the right pillar and the left pillar (100.0%). The lateral branches supplied the pars lumbalis (23.33% on the right and 33.33% on the left), the pars costalis (96.66% on the right and 100.0% on the left) and the pars sternalis (3.33%, right side only). The ventral branches supplied the ventral region of the pars costalis (46.66% on the right and 43.33% on the left) and the pars sternalis (96.66% on the right and 100.0% on the left). Four female animals (13.33%) showed crossing fibers proceeding from the left ventral branch to the right antimere; in one of these samples (3.33%) there was a connection between the left and right ventral branches.
Abstract:
In spatial electric load forecasting, determining future land use is one of the most important tasks, and one of the most difficult, because of the stochastic nature of city growth. This paper proposes a fast and efficient algorithm to determine the future land use of vacant land in the utility service area, using ideas from knowledge extraction and evolutionary algorithms. The methodology was implemented in full simulation software for spatial electric load forecasting and showed a high success rate when its results were compared with information gathered from specialists. The importance of this methodology lies in the reduced set of data needed to perform the task and its simplicity of implementation, a great advantage for the many electric utilities without specialized tools for this planning activity. © 2008 IEEE.
Abstract:
This paper evaluates meta-heuristic approaches to a soft drink industry problem. The problem is motivated by a real situation found in soft drink companies, where the lot sizing and scheduling of raw materials in tanks and of products on lines must be determined simultaneously. Tabu search, threshold accepting and genetic algorithms are used to solve the problem at hand. The methods are evaluated on a set of instances already available for this problem, and a new set of complex instances is also proposed. Computational results comparing these approaches are reported. © 2008 IEEE.
Abstract:
In this paper, a methodology based on an Unconstrained Binary Programming (UBP) model and Genetic Algorithms (GAs) is proposed for estimating fault sections in automated distribution substations. The UBP model, established using parsimonious set covering theory, seeks the match between the relays' protective alarms reported by the SCADA system and their expected states. The GA is developed to minimize the UBP model and estimate the fault sections in a swift and reliable manner. The proposed methodology is tested on a real-life automated distribution substation. The control parameters of the GA are tuned to achieve maximum computational efficiency and reduced processing time. Results show the potential and efficiency of the methodology for estimating fault sections in real time at Distribution Control Centers. © 2009 IEEE.
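The abstract does not reproduce the paper's UBP objective or GA settings; purely as a hedged illustration of the general technique it names, a minimal genetic algorithm minimizing an arbitrary objective over binary vectors (toy fitness function, hypothetical parameter values) could be sketched as:

```python
import random

def genetic_min(fitness, n_bits, pop_size=40, generations=100,
                crossover_rate=0.8, mutation_rate=0.02, seed=0):
    """Minimize `fitness` over binary vectors with a simple GA."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        def select():
            # Tournament selection: keep the fitter of two random individuals.
            a, b = rng.choice(pop), rng.choice(pop)
            return a if fitness(a) <= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < crossover_rate:
                cut = rng.randrange(1, n_bits)  # one-point crossover
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            # Bit-flip mutation.
            child = [b ^ 1 if rng.random() < mutation_rate else b for b in child]
            children.append(child)
        pop = children
        gen_best = min(pop, key=fitness)
        if fitness(gen_best) < fitness(best):
            best = gen_best
    return best

# Toy objective: number of mismatches against a target bit pattern.
target = [1, 0, 1, 1, 0, 0, 1, 0]
solution = genetic_min(lambda x: sum(a != b for a, b in zip(x, target)),
                       len(target))
```

The real fault-section objective would instead score how well a candidate section hypothesis explains the observed relay alarms; the skeleton above only shows the selection/crossover/mutation loop.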
Abstract:
Since Sharir and Pnueli, algorithms for context-sensitivity have been defined in terms of 'valid' paths in an interprocedural flow graph. The definition of valid paths requires atomic call and ret statements and encapsulated procedures. The resulting algorithms are therefore not directly applicable when behavior similar to call and ret instructions may be realized using non-atomic statements, or when procedures do not have rigid boundaries, as in programs in low-level languages such as assembly or RTL. We present a framework for context-sensitive analysis that requires neither atomic call and ret instructions nor encapsulated procedures. The framework decouples the transfer-of-control semantics and the context-manipulation semantics of statements. A new definition of context-sensitivity, called stack contexts, is developed. A stack context, defined using trace semantics, is more general than Sharir and Pnueli's interprocedural-path-based calling context. An abstract-interpretation-based framework is developed to reason about stack contexts and to derive analogues of calling-context-based algorithms using stack contexts. The framework is suitable for deriving algorithms for analyzing binary programs, such as malware, that employ obfuscations with the deliberate intent of defeating automated analysis. It is used to create a context-sensitive version of Venable et al.'s algorithm for analyzing x86 binaries without requiring that a binary conform to a standard compilation model for maintaining procedures, calls, and returns. Experimental results show that a context-sensitive analysis using stack contexts performs just as well for programs where Sharir and Pnueli's calling context produces correct approximations. However, if those programs are transformed to use call obfuscations, a context-sensitive analysis using stack contexts still provides the same correct results, without any additional overhead.
© Springer Science+Business Media, LLC 2011.
Abstract:
In this paper, some new constraints and an extended formulation are presented for a lot sizing and scheduling model proposed in the literature. In the production process considered, a key material is prepared and transformed into different final items. The sequencing decisions concern the order in which the materials are processed, while the lot sizing decisions concern the production of the final items. The mathematical formulation considers sequence-dependent setup costs and times. Results of computational tests executed with the software Cplex 10.0 showed that the performance of the branch-and-cut method can be improved by the proposed a priori reformulation.
Abstract:
The performance of muon reconstruction in CMS is evaluated using a large data sample of cosmic-ray muons recorded in 2008. Efficiencies of various high-level trigger, identification, and reconstruction algorithms have been measured for a broad range of muon momenta and were found to be in good agreement with expectations from Monte Carlo simulation. The relative momentum resolution for muons crossing the barrel part of the detector is better than 1% at 10 GeV/c and about 8% at 500 GeV/c, the latter being only a factor of two worse than expected with ideal alignment conditions. Muon charge misassignment ranges from less than 0.01% at 10 GeV/c to about 1% at 500 GeV/c. © 2010 IOP Publishing Ltd and SISSA.
Abstract:
We consider the two-level network design problem with intermediate facilities. This problem consists of designing a minimum-cost network that respects certain requirements, usually described in terms of the network topology or of a desired flow of commodities between source and destination vertices. Each selected link must receive one of two types of edge facilities, and connecting different edge facilities requires a costly and capacitated vertex facility. We propose a hybrid decomposition approach that heuristically obtains tentative solutions for the number and location of the vertex facilities and uses these solutions to limit the computational burden of a branch-and-cut algorithm. We test our method on instances of the power system secondary distribution network design problem. The results show that the method is efficient in terms of both solution quality and computational time. © 2010 Elsevier Ltd.
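The branch-and-cut algorithm itself is problem-specific and not given in the abstract. As a hedged, generic sketch of the branch-and-bound pruning idea that branch-and-cut builds on (illustrated on a toy 0/1 knapsack with a greedy fractional-relaxation bound, not the paper's model):

```python
def knapsack_bb(values, weights, capacity):
    """Branch and bound for 0/1 knapsack; bound = fractional relaxation."""
    n = len(values)
    # Sort items by value density so the fractional bound is tight.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    v = [values[i] for i in order]
    w = [weights[i] for i in order]

    def bound(i, value, room):
        # Optimistic estimate: fill the remaining room fractionally.
        for j in range(i, n):
            if w[j] <= room:
                room -= w[j]
                value += v[j]
            else:
                return value + v[j] * room / w[j]
        return value

    best = 0
    def branch(i, value, room):
        nonlocal best
        if i == n:
            best = max(best, value)
            return
        if bound(i, value, room) <= best:
            return                            # prune: relaxation cannot beat incumbent
        if w[i] <= room:                      # branch: take item i
            branch(i + 1, value + v[i], room - w[i])
        branch(i + 1, value, room)            # branch: skip item i

    branch(0, 0, capacity)
    return best
```

In branch-and-cut the relaxation bound is additionally tightened at each node by generated cutting planes; the hybrid approach in the paper further limits the tree by fixing tentative vertex-facility decisions heuristically.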
Abstract:
In this paper we describe the development of a low-cost, high-accuracy Galileo code receiver, user application software and positioning algorithms for land management applications, implemented using a dedicated FPGA board and a dual-frequency Galileo E5/L1 radio-frequency front-end. The current situation of rural property surveying in Brazil is described, and the use of code measurements from the new Galileo E5 AltBOC signals combined with E1 MBOC for land management applications is explored. We explain how such an approach is expected to deliver an absolute positioning solution that could bridge the gap between high-cost, high-complexity, high-accuracy receivers based on carrier phase and lower-cost, lower-accuracy receivers based on pseudorange observables. The system is presented together with a detailed description of its main components: the code receiver and the application software. The work presented is part of an ongoing European-Brazilian consortium effort, sponsored by the GNSS Supervisory Authority (GSA), to explore the use of the new Galileo signals for land management applications in Brazil.
Abstract:
Detecting misbehavior (such as the transmission of false information) in vehicular ad hoc networks (VANETs) is a very important problem with a wide range of implications, including for safety-related and congestion avoidance applications. We discuss several limitations of existing misbehavior detection schemes (MDS) designed for VANETs. Most MDS are concerned with detecting malicious nodes. In most situations, however, vehicles send wrong information for the selfish reasons of their owners, e.g. to gain access to a particular lane. It is therefore more important to detect false information than to identify misbehaving nodes. We introduce the concept of data-centric misbehavior detection and propose algorithms that detect false alert messages and misbehaving nodes by observing their actions after the alert messages are sent out. With the data-centric MDS, each node can decide whether received information is correct or false, based on the consistency of recent messages and new alerts with reported and estimated vehicle positions. No voting or majority decision is needed, making our MDS resilient to Sybil attacks. After misbehavior is detected, we do not revoke all the secret credentials of misbehaving nodes, as most schemes do. Instead, we impose fines on misbehaving nodes (administered by the certification authority), discouraging them from acting selfishly. This reduces the computation and communication costs involved in revoking all the secret credentials of misbehaving nodes. © 2011 IEEE.
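The abstract does not specify the detection rules. A minimal, hypothetical sketch of the data-centric idea (all names, units and thresholds invented for illustration): after a vehicle sends a "road blocked" alert, its own subsequent position beacons should be consistent with a blockage near the reported location, so a sender that keeps driving well past it contradicts its own alert.

```python
from dataclasses import dataclass

@dataclass
class Beacon:
    time: float      # seconds
    position: float  # 1-D position along the road, metres

def alert_consistent(alert_pos, later_beacons, stop_radius=50.0, min_obs=3):
    """Data-centric check: does the sender's post-alert behaviour match
    a 'road blocked at alert_pos' claim? Returns False (likely false
    alert) if at least `min_obs` beacons place it well past the spot."""
    beacons = sorted(later_beacons, key=lambda b: b.time)
    if len(beacons) < min_obs:
        return True  # too little evidence; do not accuse
    # Beacons far beyond the alleged obstruction contradict the alert.
    passed = [b for b in beacons if b.position > alert_pos + stop_radius]
    return len(passed) < min_obs

# Sender reported a blockage at 1000 m but then drove on to 1300 m.
suspicious = alert_consistent(1000.0,
                              [Beacon(1, 1100.0), Beacon(2, 1200.0),
                               Beacon(3, 1300.0)])
```

A real scheme would fuse estimated (not just reported) positions and several message types, but the core decision is local and per-message, which is what makes it independent of voting.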
Abstract:
Increased accessibility to high-performance computing resources has created a demand for user support through performance evaluation tools like the iSPD (iconic Simulator for Parallel and Distributed systems), a simulator based on iconic modelling for distributed environments such as computer grids. It was developed to make it easier for general users to create their grid models, including allocation and scheduling algorithms. This paper describes how schedulers are managed by iSPD and how users can easily adopt the scheduling policy that improves the system being simulated. A thorough description of iSPD is given, detailing its scheduler manager. Some comparisons between iSPD and Simgrid simulations, including runs of the simulated environment in a real cluster, are also presented. © 2012 IEEE.
Abstract:
HLA-G has an important role in modulating the maternal immune system during pregnancy, and evidence that balancing selection acts on the promoter and 3′UTR regions has been reported previously. To determine whether selection acts on the HLA-G coding region in the Amazon Rainforest, exons 2, 3 and 4 were analyzed in a sample of 142 Amerindians from nine villages of five isolated tribes that inhabit the Central Amazon. Six previously described single-nucleotide polymorphisms (SNPs) were identified, and the Expectation-Maximization (EM) and PHASE algorithms were used to computationally reconstruct SNP haplotypes (HLA-G alleles). A new HLA-G allele, which originated in Amerindian populations by a crossing-over event between two widespread HLA-G alleles, was identified in 18 individuals. Neutrality tests showed that natural selection plays a complex part in the HLA-G coding region. Although balancing selection is the type of selection that shapes variability at the local level (Native American populations), we have also shown that purifying selection may occur on a worldwide scale. Moreover, balancing selection does not seem to act on the coding region as strongly as it acts on the flanking regulatory regions, and the coding signature may actually reflect a hitchhiking effect. Genes and Immunity advance online publication, 3 October 2013; doi:10.1038/gene.2013.47.
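The study uses the EM and PHASE algorithms as published. Purely as a hedged illustration of EM haplotype reconstruction, a minimal two-SNP version (toy counts; simplified input in which only double heterozygotes are phase-ambiguous, carrying either AB/ab or Ab/aB) could look like:

```python
def em_haplotype_freqs(unambiguous, n_double_het, iters=200):
    """Estimate two-SNP haplotype frequencies [AB, Ab, aB, ab] by EM.
    `unambiguous` are haplotype counts resolved directly from genotypes."""
    total = sum(unambiguous) + 2 * n_double_het
    f = [0.25, 0.25, 0.25, 0.25]  # uniform starting frequencies
    for _ in range(iters):
        # E-step: probability a double heterozygote has phase AB/ab
        # rather than Ab/aB, under the current frequency estimates.
        denom = f[0] * f[3] + f[1] * f[2]
        r = f[0] * f[3] / denom if denom > 0 else 0.5
        # Expected haplotype counts, splitting the ambiguous individuals.
        c = [unambiguous[0] + r * n_double_het,
             unambiguous[1] + (1 - r) * n_double_het,
             unambiguous[2] + (1 - r) * n_double_het,
             unambiguous[3] + r * n_double_het]
        # M-step: re-estimate frequencies from the expected counts.
        f = [x / total for x in c]
    return f

# Toy data: coupling haplotypes dominate, so EM should assign most
# double heterozygotes the AB/ab phase.
freqs = em_haplotype_freqs([40, 10, 10, 40], n_double_het=10)
```

With more SNPs the ambiguous phase configurations multiply, which is why dedicated tools (EM implementations, PHASE) are used in practice; the E/M alternation is the same.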
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Pós-graduação em Ciências Sociais - FFC