941 results for combinatorial optimisation
Abstract:
Dissertation presented to obtain the Master's degree in Molecular Genetics and Biomedicine
Abstract:
Dissertation presented to obtain the Master's degree in Computational Logic
Abstract:
Combinatorial optimization problems occur in a wide variety of contexts and are generally NP-hard. At a corporate level, solving these problems is of great importance, since they contribute to the optimization of operational costs. In this thesis we propose to solve the Public Transport Bus Assignment problem considering a heterogeneous fleet and line exchanges, a variant of the Multi-Depot Vehicle Scheduling Problem in which additional constraints are enforced to model a real-life scenario. The number of constraints involved and the large number of variables make it impractical to solve to optimality using complete search techniques. Therefore, we explore metaheuristics, which sacrifice optimality to produce solutions in feasible time. More concretely, we focus on the development of algorithms based on a sophisticated metaheuristic, Ant Colony Optimization (ACO), which relies on a stochastic learning mechanism. For complex problems with a considerable number of constraints, sophisticated metaheuristics may fail to produce quality solutions in a reasonable amount of time. Thus, we developed parallel shared-memory (SM) synchronous ACO algorithms; however, synchronism gives rise to the straggler problem. We therefore propose three SM asynchronous algorithms that break the original algorithm semantics and differ in the degree of concurrency allowed while manipulating the learned information. Our results show that our sequential ACO algorithms produced better solutions than a Restarts metaheuristic, that the ACO algorithms were able to learn, and that better solutions were achieved by increasing the amount of cooperation (number of search agents). Regarding the parallel algorithms, our asynchronous ACO algorithms outperformed the synchronous ones in terms of speedup and solution quality, achieving speedups of 17.6x. The cooperation scheme imposed by asynchronism also achieved a better learning rate than the original one.
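As an illustration of the pheromone-based construction and update scheme that ACO relies on, the following minimal sketch applies a generic ACO loop to a toy assignment problem; the encoding, cost function and parameters are hypothetical and are not the thesis's bus-assignment model.

```python
# Minimal, generic Ant Colony Optimization sketch (illustrative only).
# The assignment encoding, cost function and parameters are hypothetical,
# not the thesis's actual bus-assignment formulation.
import random

N_TASKS, N_VEHICLES = 12, 4              # e.g. trips to cover and buses available
ALPHA, RHO, N_ANTS, N_ITER = 1.0, 0.1, 20, 100

def cost(assignment):
    # Hypothetical objective: number of distinct vehicles used (to be minimized).
    return len(set(assignment))

# pheromone[task][vehicle]: learned desirability of assigning task -> vehicle
pheromone = [[1.0] * N_VEHICLES for _ in range(N_TASKS)]

def construct_solution():
    # Each ant assigns every task to a vehicle, sampling proportionally
    # to the pheromone trail (stochastic solution construction).
    assignment = []
    for t in range(N_TASKS):
        weights = [pheromone[t][v] ** ALPHA for v in range(N_VEHICLES)]
        assignment.append(random.choices(range(N_VEHICLES), weights)[0])
    return assignment

best, best_cost = None, float("inf")
for _ in range(N_ITER):
    solutions = [construct_solution() for _ in range(N_ANTS)]
    iter_best = min(solutions, key=cost)
    if cost(iter_best) < best_cost:
        best, best_cost = iter_best, cost(iter_best)
    # Evaporate all trails, then reinforce the components of the iteration-best solution.
    for t in range(N_TASKS):
        for v in range(N_VEHICLES):
            pheromone[t][v] *= (1.0 - RHO)
    for t, v in enumerate(iter_best):
        pheromone[t][v] += 1.0 / (1.0 + cost(iter_best))

print(best_cost, best)
```

Increasing N_ANTS corresponds to the abstract's notion of adding cooperation: more search agents reinforcing the shared pheromone matrix per iteration.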
Abstract:
The performance of parts produced by Free Form Extrusion (FFE), an increasingly popular additive manufacturing technique, depends mainly on their dimensional accuracy, surface quality and mechanical performance. These attributes are strongly influenced by the evolution of the filament temperature and deformation during deposition and solidification. Consequently, the availability of adequate process modelling software would offer a powerful tool to support efficient process set-up and optimisation. This work examines the contribution to the overall heat transfer of the various thermal phenomena developing during the manufacturing sequence, including convection and radiation with the environment, conduction with the support and between adjacent filaments, radiation between adjacent filaments and convection with entrapped air. The magnitude of the mechanical deformation is also studied. Once this exercise is completed, it is possible to select the material properties, process variables and thermal phenomena that should be taken into account for effective numerical modelling of FFE.
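As a hedged illustration of how the listed phenomena typically enter a filament cooling model (the notation below is assumed, not taken from the paper), a lumped energy balance over a filament cross-section of area A could take the form

\[
\rho\, c_p\, A\, \frac{dT}{dt}
 = -\, h\, P_{\mathrm{conv}}\,(T - T_{\infty})
   -\, \sigma \varepsilon\, P_{\mathrm{rad}}\,\bigl(T^{4} - T_{\infty}^{4}\bigr)
   -\, \sum_{i} h_{i}\, P_{i}\,(T - T_{i}),
\]

where P_conv, P_rad and P_i denote the portions of the filament perimeter exposed to convection, radiation and the i-th contact (support, adjacent filaments, entrapped air), and each h_i lumps the corresponding contact conductance.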
Abstract:
The monitoring data collected during tunnel excavation can be used in inverse analysis procedures to identify more realistic geomechanical parameters and thereby increase the knowledge about the formations of interest. These more realistic parameters can be used in real time to adapt the design to the actual in situ behaviour of the structure. However, monitoring plans are normally designed for safety assessment and not specifically for the purpose of inverse analysis. In fact, there is a lack of knowledge about the types and quantity of measurements needed to successfully identify the parameters of interest. The optimisation algorithm chosen for the identification procedure may also be important in this respect. In this work, the problem is addressed using a theoretical case with which a thorough parametric study was carried out using two optimisation algorithms based on different calculation paradigms, namely a conventional gradient-based algorithm and an evolution strategy algorithm. Calculations were carried out for different sets of parameters to be identified and for several combinations of types and amounts of monitoring data. The results clearly show the high importance of the available monitoring data and of the chosen algorithm for the success rate of the inverse analysis process.
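To make the inverse-analysis loop concrete, a minimal sketch is given below, fitting two hypothetical geomechanical parameters to monitoring data with a gradient-based optimiser; the forward model and the measurement set are stand-ins for the actual tunnel simulation and are not taken from the paper.

```python
# Minimal inverse-analysis sketch (illustrative; the forward model below is a
# hypothetical stand-in for a tunnel excavation simulation).
import numpy as np
from scipy.optimize import minimize

def forward_model(params, positions):
    # Hypothetical response: predicted displacements (mm) as a function of two
    # geomechanical parameters (e.g. a stiffness-like and an amplitude-like term).
    E, k0 = params
    return k0 * np.exp(-positions / E)

positions = np.linspace(1.0, 10.0, 8)              # monitoring point locations
measured = forward_model((5.0, 20.0), positions)   # synthetic "monitoring data"

def misfit(params):
    # Least-squares error between computed and monitored displacements.
    return np.sum((forward_model(params, positions) - measured) ** 2)

# Gradient-based identification (an evolution strategy could be used instead).
result = minimize(misfit, x0=np.array([2.0, 10.0]), method="L-BFGS-B",
                  bounds=[(0.1, 50.0), (1.0, 100.0)])
print("identified parameters:", result.x)
```

Replacing the optimiser with an evolution strategy (a population-based search over the same misfit function) reproduces the algorithmic comparison the abstract describes.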
Abstract:
[Excerpt] The purine core is a privileged scaffold in medicinal chemistry, and the biological relevance of purine derivatives makes them attractive targets in the preparation of combinatorial libraries.1,2 In particular, there is great interest in the synthesis of 8-substituted purines due to their potential as antiviral and anticancer agents.3 Reports on 8-aminopurines are limited, and general methods to obtain these purine derivatives are still needed.4 Cyclic amines and hydrazines are key structural motifs in various bioactive agents.5 Here we report a novel, efficient and inexpensive method for the synthesis of 6,8-diaminopurines 4 incorporating cycloalkylamino substituents at the N3 position of the purine ring. (...)
Abstract:
For a given self-map f of M, a closed, smooth, connected and simply-connected manifold of dimension m ≥ 4, we provide an algorithm for estimating the values of the topological invariant D^m_r[f], which equals the minimal number of r-periodic points in the smooth homotopy class of f. Our results are based on the combinatorial scheme for computing D^m_r[f] introduced by G. Graff and J. Jezierski [J. Fixed Point Theory Appl. 13 (2013), 63–84]. An open-source implementation of the algorithm, programmed in C++, is publicly available at http://www.pawelpilarczyk.com/combtop/.
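Following the abstract's description (precise conventions hedged; see Graff and Jezierski for the formal definition), the invariant can be written as

\[
D^{m}_{r}[f] \;=\; \min\bigl\{\, \#\operatorname{Fix}(g^{r}) \;:\; g \ \text{smooth},\ g \simeq f \,\bigr\},
\]

i.e. the smallest number of r-periodic points realisable by a smooth map in the homotopy class of f.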
Abstract:
PhD thesis in Health Sciences
Abstract:
PhD thesis in Health Sciences
Abstract:
Spinal cord injury (SCI) is a central nervous system (CNS) related disorder for which there is as yet no successful treatment. Within the past several years, cell-based therapies have been explored for SCI repair, including the use of pluripotent human stem cells and a number of adult-derived stem and mature cells such as mesenchymal stem cells, olfactory ensheathing cells, and Schwann cells. Although promising, cell transplantation is often hampered by poor cell survival in the treatment of spinal cord injuries. Alternatively, the therapeutic role of different cells has been used in tissue engineering approaches by engrafting cells with biomaterials. The latter have the advantages of physically mimicking the CNS tissue, while promoting a more permissive environment for cell survival, growth, and differentiation. The roles of both cell- and biomaterial-based therapies as single therapeutic approaches for SCI repair will be discussed in this review. Moreover, as the multifactorial inhibitory environment of an SCI suggests that combinatorial approaches would be more effective, the importance of using biomaterials as cell carriers will be highlighted herein, as well as the recent advances and achievements of these promising tools for neural tissue regeneration.
Abstract:
Fluorescence in situ hybridization (FISH) is based on the use of fluorescent staining dyes; however, the signal intensity of the images obtained by microscopy is seldom quantified accurately by the researcher. The development of innovative digital image processing programs and tools has sought to overcome this problem; nevertheless, the determination of fluorescence intensity in microscopy images still suffers from a lack of precision in the results and from the complexity of existing software. This work presents FISHji, a set of new ImageJ methods for the automated quantification of fluorescence in images obtained by epifluorescence microscopy. To validate the methods, results obtained by FISHji were compared with results obtained by flow cytometry. The mean correlation between FISHji and flow cytometry was high and significant, showing that the imaging methods are able to accurately assess the signal intensity of fluorescence images. FISHji is available for non-commercial use at http://paginas.fe.up.pt/nazevedo/.
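As a hedged illustration of this type of quantification (not the actual FISHji macros), mean per-object fluorescence can be estimated from a single-channel epifluorescence image roughly as follows; the thresholding choice, size filter and file name are assumptions made for the sketch.

```python
# Illustrative fluorescence quantification sketch (not the FISHji ImageJ methods):
# segment bright objects by thresholding and report their mean intensity.
import numpy as np
from skimage import io, filters, measure

image = io.imread("fish_field.tif")              # hypothetical single-channel image
threshold = filters.threshold_otsu(image)        # separate signal from background
labels = measure.label(image > threshold)        # label connected cells/spots

for region in measure.regionprops(labels, intensity_image=image):
    if region.area > 20:                         # ignore tiny noise specks
        print(region.label, region.mean_intensity)
```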
Abstract:
Power network planning, liberalisation, optimisation, game theory, distribution network, dispersed generation, modelling, simulation, energy market
Abstract:
Value-added process, Process optimisation, Simultaneous Engineering, Concurrent Engineering, Process Factors, Process modelling
Abstract:
v.72:no.1(1977)
Abstract:
Markowitz portfolio theory (1952) has induced research into the efficiency of portfolio management. This paper studies existing nonparametric efficiency measurement approaches for single-period portfolio selection from a theoretical perspective and generalises currently used efficiency measures to the full mean-variance space. To this end, we introduce the efficiency improvement possibility function (a variation on the shortage function), study its axiomatic properties in the context of the Markowitz efficient frontier, and establish a link to the indirect mean-variance utility function. This framework allows distinguishing between portfolio efficiency and allocative efficiency. Furthermore, it permits retrieving information about the revealed risk aversion of investors. The efficiency improvement possibility function thus provides a more general framework for gauging the efficiency of portfolio management using nonparametric frontier envelopment methods based on quadratic optimisation.
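As a hedged sketch of the kind of measure involved (notation assumed here, not necessarily the paper's): for a portfolio with expected return E and variance V, and a direction vector g = (g_E, g_V) ≥ 0, a shortage-type efficiency improvement possibility function can be written as

\[
S(E, V; g) \;=\; \sup\bigl\{\, \delta \ge 0 \;:\; (E + \delta g_E,\; V - \delta g_V) \ \text{is attainable by some feasible portfolio} \,\bigr\},
\]

so that S = 0 characterises a mean-variance efficient portfolio, while S > 0 quantifies the simultaneous return increase and risk reduction still available; checking attainability amounts to the quadratic optimisation mentioned in the abstract.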