37 results for Many-to-many-assignment problem
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
The Generalized Assignment Problem consists in assigning a set of tasks to a set of agents at minimum cost. Each agent has a limited amount of a single resource, and each task must be assigned to one and only one agent, consuming a certain amount of that agent's resource. We present new metaheuristics for the generalized assignment problem based on hybrid approaches. One metaheuristic is a MAX-MIN Ant System (MMAS), an improved version of the Ant System recently proposed by Stutzle and Hoos for combinatorial optimization problems; it can be seen as an adaptive sampling algorithm that takes into consideration the experience gathered in earlier iterations of the algorithm. This heuristic is further combined with local search and tabu search heuristics to improve the search. A greedy randomized adaptive search procedure (GRASP) is also proposed. Several neighborhoods are studied, including one based on ejection chains that produces good moves without increasing the computational effort. We present computational results on the comparative performance of these methods, followed by concluding remarks and ideas on future research in generalized assignment related problems.
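For illustration only, the sketch below shows the kind of greedy randomized construction a GRASP uses on the Generalized Assignment Problem; the instance data and the restricted-candidate-list parameter alpha are hypothetical assumptions, not taken from the paper.

```python
import random

def grasp_construct(cost, demand, capacity, alpha=0.3, seed=0):
    """Greedy randomized construction for the Generalized Assignment Problem.

    cost[i][j]   : cost of assigning task j to agent i
    demand[i][j] : resource consumed by task j on agent i
    capacity[i]  : resource available to agent i
    Returns a list 'assign' with assign[j] = chosen agent, or None if stuck.
    """
    rng = random.Random(seed)
    n_agents, n_tasks = len(cost), len(cost[0])
    remaining = list(capacity)
    assign = [None] * n_tasks
    for j in range(n_tasks):                       # assign tasks one at a time
        feasible = [i for i in range(n_agents) if demand[i][j] <= remaining[i]]
        if not feasible:
            return None                            # construction failed; restart outside
        costs = [cost[i][j] for i in feasible]
        c_min, c_max = min(costs), max(costs)
        threshold = c_min + alpha * (c_max - c_min)
        rcl = [i for i in feasible if cost[i][j] <= threshold]   # restricted candidate list
        i = rng.choice(rcl)                        # randomized greedy choice
        assign[j] = i
        remaining[i] -= demand[i][j]
    return assign

# Hypothetical 2-agent, 3-task instance
cost = [[4, 2, 5], [3, 6, 1]]
demand = [[3, 2, 4], [2, 3, 2]]
capacity = [5, 6]
print(grasp_construct(cost, demand, capacity))
```

In a full GRASP, each constructed assignment would then be improved by local search (e.g. the ejection-chain neighborhood mentioned above) before the next restart.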
Abstract:
This paper proposes a heuristic for the scheduling of capacity requests and the periodic assignment of radio resources in geostationary (GEO) satellite networks with star topology, using the Demand Assigned Multiple Access (DAMA) protocol in the link layer, and Multi-Frequency Time Division Multiple Access (MF-TDMA) and Adaptive Coding and Modulation (ACM) in the physical layer.
Abstract:
The Drivers Scheduling Problem (DSP) consists of selecting a set of duties for vehicle drivers, for example bus, train, plane or boat drivers or pilots, for the transportation of passengers or goods. This is a complex problem because it involves several constraints related to labour and company rules and can also present different evaluation criteria and objectives. Being able to develop an adequate model that represents the real problem as closely as possible is an important research area. The main objective of this research work is to present new mathematical models for the DSP that capture all the complexity of the drivers scheduling problem, and to demonstrate that the solutions of these models can be easily implemented in real situations. This issue has been recognized by several authors as an important problem in public transportation. The most well-known and general formulation for the DSP is a Set Partitioning/Set Covering model (SPP/SCP). However, to a large extent these models simplify some of the specific business aspects and issues of real problems. This makes it difficult to use them in automatic planning systems, because the schedules obtained must be modified manually before they can be implemented in real situations. Based on extensive passenger transportation experience with bus companies in Portugal, we propose new alternative models to formulate the DSP. These models are also based on Set Partitioning/Covering models; however, they take into account bus operator issues and the perspective, opinions and environment of the user. We follow the steps of the Operations Research methodology, which consist of: identify the problem; understand the system; formulate a mathematical model; verify the model; select the best alternative; present the results of the analysis; and implement and evaluate. All of these steps are carried out with the close participation and involvement of the final users from different transportation companies. The planners' opinions and main criticisms are used to improve the proposed model in a continuous enrichment process. The final objective is to have a model that can be incorporated into an information system and used as an automatic tool to produce driver schedules. Therefore, the criterion for evaluating the models is their capacity to generate real and useful schedules that can be implemented without many manual adjustments or modifications. We have considered the following measures of model quality: simplicity, solution quality and applicability. We tested the alternative models with a set of real data obtained from several different transportation companies and analyzed the optimal schedules obtained with respect to the applicability of the solution to the real situation. To do this, the schedules were analyzed by the planners to determine their quality and applicability. The main result of this work is the proposal of new mathematical models for the DSP that better represent the realities of passenger transportation operators and lead to better schedules that can be implemented directly in real situations.
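As a minimal sketch of the Set Partitioning formulation named above (not the authors' alternative models), the following Python/PuLP snippet selects a minimum-cost set of duties so that each piece of work is covered exactly once; the duties, costs and modelling choices are illustrative assumptions.

```python
import pulp

# Hypothetical instance: each candidate duty covers a subset of the pieces of
# work (tasks) and has a cost; the classic SPP asks for a minimum-cost set of
# duties covering every piece of work exactly once.
tasks = ["t1", "t2", "t3", "t4"]
duties = {
    "d1": (["t1", "t2"], 8),
    "d2": (["t3", "t4"], 7),
    "d3": (["t1", "t3"], 6),
    "d4": (["t2", "t4"], 6),
    "d5": (["t1", "t2", "t3", "t4"], 15),
}

prob = pulp.LpProblem("DSP_set_partitioning", pulp.LpMinimize)
x = {d: pulp.LpVariable(f"x_{d}", cat="Binary") for d in duties}
prob += pulp.lpSum(cost * x[d] for d, (_, cost) in duties.items())
for t in tasks:
    # each piece of work is covered by exactly one selected duty
    prob += pulp.lpSum(x[d] for d, (cover, _) in duties.items() if t in cover) == 1

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [d for d in duties if x[d].value() > 0.5]
print(chosen, pulp.value(prob.objective))
```

Relaxing the equality constraints to ">= 1" turns the same model into the Set Covering variant also mentioned in the abstract.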
Abstract:
Despite the huge increase in processor and interprocessor network performance, many computational problems remain unsolved due to the lack of critical resources such as sustained floating-point performance, memory bandwidth, and so on. Examples of these problems are found in climate research, biology, astrophysics, high energy physics (Monte Carlo simulations) and artificial intelligence, among other areas. For some of these problems, the computing resources of a single supercomputing facility can be one or two orders of magnitude short of the resources needed to solve them. Supercomputer centers face an increasing demand for processing performance, with the direct consequence of an increasing number of processors and systems, resulting in more difficult administration of HPC resources and the need for more physical space, higher electrical power consumption and improved air conditioning, among other problems. Some of these problems cannot be easily solved, so grid computing, understood as a technology enabling the aggregation and consolidation of computing power, can help in solving large-scale supercomputing problems. In this document, we describe how two supercomputing facilities in Spain joined their resources to solve a problem of this kind. The objectives of this experience were, among others, to demonstrate that such cooperation can enable the solution of larger problems and to measure the efficiency that could be achieved. We show some preliminary results of this experience and discuss to what extent these objectives were achieved.
Abstract:
This article studies how product introduction decisions relate to profitability and uncertainty in the context of multi-product firms and product differentiation. These two features, common to many modern industries, have not received much attention in the literature compared to the classical problem of firm entry, even though the determinants of firm and product entry are quite different. The theoretical predictions about the sign of the impact of uncertainty on product entry are not conclusive. Therefore, an econometric model relating firms' product introduction decisions to profitability and profit uncertainty is proposed. Firms' estimated profits are obtained from a structural model of product demand and supply, and uncertainty is proxied by the variance of profits. The empirical analysis is carried out using data on the Spanish car industry for the period 1990-2000. The results show a positive relationship between product introduction and profitability, and a negative one with respect to profit variability. Interestingly, the degree of uncertainty appears to be a stronger driving force of entry than profitability, suggesting that the product proliferation process in the Spanish car market may have been mainly a consequence of lower uncertainty rather than the result of a more profitable market. Keywords: product introduction, entry, uncertainty, multiproduct firms, automobile. JEL codes: L11, L13.
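A minimal sketch of the kind of discrete-choice regression described above, assuming a logit specification and simulated data; the paper's actual structural model, dataset and variable names are not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data standing in for a firm-product panel: estimated profit
# (pi_hat) and profit variance (sigma2) for candidate products.
rng = np.random.default_rng(0)
n = 500
pi_hat = rng.normal(1.0, 0.5, n)          # estimated profitability of a candidate product
sigma2 = rng.gamma(2.0, 0.5, n)           # proxy for profit uncertainty (variance)
latent = 0.8 * pi_hat - 1.2 * sigma2 + rng.logistic(size=n)
entry = (latent > 0).astype(int)          # 1 = product is introduced

X = sm.add_constant(np.column_stack([pi_hat, sigma2]))
res = sm.Logit(entry, X).fit(disp=0)
print(res.params)   # expect a positive sign on pi_hat and a negative sign on sigma2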
Abstract:
During the fifteenth century Sardinia, thanks to its geographical position at the centre of the Mediterranean, played a very important role in the international trade network. This economic and cultural activity fostered an extensive artistic production, characterized by a strong dependence on the presence of Catalan workshops and artists on the island. To this day, very little is known about the presence and characteristics of late Gothic painting in Sardinia. The difficulties encountered by researchers in reconstructing the history of Sardinian art of this period are numerous and hard to resolve. The problem stems from the lack of documentary sources (direct and indirect), the wide dispersal of a great many works, the inaccessibility of some of the most interesting ones, the almost complete displacement of the works from their original locations, and the total anonymity of most of the artists. Under these conditions, the only way to obtain new scholarly evidence is to integrate the knowledge derived from documentary sources, broaden the methodological practice (multidisciplinarity), and connect the Sardinian corpus (painting, illumination, engravings) with the international context. The final objective is to place Sardinian painting in a broader context in order to uncover the system of models and artistic relationships that connect it with the Mediterranean artistic world. The work of this first year has been particularly intense and complicated by the difficulties of engaging with a subject as vast as the painting and figurative models of the second half of the fifteenth century. Main objectives: survey all the known bibliography on the subject and extend the research whenever evidence of possible Sardinian authors and works was found, and begin cataloguing the pictorial works (both known and recently discovered).
Abstract:
In recent years the lower reach of the Ebro river has undergone very important changes in its fluvial ecosystem: the increase in water transparency has led to a massive proliferation of macrophytes, which has caused changes in the trophic structure and in the composition of the biological communities and poses a serious threat to endangered species such as Margaritifera auricularia. In addition to the ecological problem, the macrophytes are causing many socio-economic problems, interfering with water intakes (nuclear and hydroelectric power plants and irrigation), hindering river navigation, and favouring the proliferation of nuisance species such as the black fly (Simulium erythrocephalum). Among the possible causes of these changes in the ecosystem are the decrease in dissolved phosphorus, flow regulation and the reduction of discharge, and the arrival of introduced species such as the zebra mussel. The changes are most likely the combined effect of several of these causes, but they need to be analysed in order to determine which have the greatest influence and thus to propose management measures for the ecological problems affecting the lower Ebro. This thesis project (part of the R&D project on the effects of improved water quality and of altered flow regimes on the biological communities of the lower Ebro) will study the macrophyte community and its associated macroinvertebrates in order to determine their role in the changes that have taken place in the river in recent years.
Abstract:
Our essay aims at studying suitable statistical methods for the clustering of compositional data in situations where observations are constituted by trajectories of compositional data, that is, by sequences of composition measurements along a domain. Observed trajectories are known as "functional data" and several methods have been proposed for their analysis. In particular, methods for clustering functional data, known as Functional Cluster Analysis (FCA), have been applied by practitioners and scientists in many fields. To our knowledge, FCA techniques have not been extended to cope with the problem of clustering compositional data trajectories. In order to extend FCA techniques to the analysis of compositional data, FCA clustering techniques have to be adapted by using a suitable compositional algebra. The present work centres on the following question: given a sample of compositional data trajectories, how can we formulate a segmentation procedure giving homogeneous classes? To address this problem we follow the steps described below. First of all, we adapt the well-known spline smoothing techniques in order to cope with the smoothing of compositional data trajectories. In fact, an observed curve can be thought of as the sum of a smooth part plus some noise due to measurement errors. Spline smoothing techniques are used to isolate the smooth part of the trajectory; clustering algorithms are then applied to these smooth curves. The second step consists in building suitable metrics for measuring the dissimilarity between trajectories: we propose a metric that accounts for differences in both shape and level, and a metric accounting for differences in shape only. A simulation study is performed in order to evaluate the proposed methodologies, using both hierarchical and partitional clustering algorithms. The quality of the obtained results is assessed by means of several indices.
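The following Python sketch illustrates the overall pipeline described above under simplifying assumptions: compositions are mapped with a centred log-ratio transform (one standard choice of compositional algebra), each transformed trajectory is spline-smoothed, a shape-and-level L2 dissimilarity is computed between the smoothed curves, and hierarchical clustering is applied. The data, transform and metric details are illustrative, not the authors' exact choices.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def clr(comp):
    """Centred log-ratio transform of compositions (rows sum to 1, entries > 0)."""
    logc = np.log(comp)
    return logc - logc.mean(axis=1, keepdims=True)

def smooth_trajectory(t, comp, grid, s=0.1):
    """Spline-smooth each clr coordinate of one compositional trajectory and
    return its values on a common evaluation grid."""
    z = clr(comp)                                   # shape (n_times, n_parts)
    return np.column_stack(
        [UnivariateSpline(t, z[:, k], s=s)(grid) for k in range(z.shape[1])]
    )

# Hypothetical sample: 6 trajectories of 3-part compositions observed at 20 times.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 20)
grid = np.linspace(0, 1, 50)
trajectories = []
for i in range(6):
    drift = 0.5 if i < 3 else -0.5                  # two underlying groups
    raw = np.exp(np.column_stack([drift * t, np.zeros_like(t), -drift * t])
                 + rng.normal(0, 0.05, (20, 3)))
    trajectories.append(raw / raw.sum(axis=1, keepdims=True))

curves = [smooth_trajectory(t, c, grid) for c in trajectories]
# "Shape and level" dissimilarity: L2 distance between smoothed clr curves.
n = len(curves)
D = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        D[i, j] = D[j, i] = np.sqrt(np.mean((curves[i] - curves[j]) ** 2))

labels = fcluster(linkage(squareform(D), method="average"), t=2, criterion="maxclust")
print(labels)                                       # expect the two groups to separate
```

A shape-only variant would centre each smoothed curve (subtract its mean level) before computing the same distance.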
Abstract:
Delivery context-aware adaptive heterogeneous systems. Currently, the set of devices that have gained access to the network is large and diverse. Their different capabilities and characteristics, in addition to the different characteristics and preferences of users, have generated a new goal to overcome: how to adapt content taking into account this heterogeneity, known as the "delivery context". The concepts of adaptation and accessibility have been widely discussed and have resulted in many proposals, standards and techniques designed to solve the problem, making it necessary to refine the analysis of the issues to be considered in the adaptation process. We present a tour of the various proposals and standards that have shaped work in the area of heterogeneous systems, and of other work that has addressed real-time interaction through agent-based platforms, all aimed at a common goal: the delivery context.
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast number of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has impelled the development of many strategies to build protein structures from their sequences, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, knowledge-based potentials, and combinations of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores to predict near-native structures. Our strategy allows circumventing the problem of defining the reference state. In this approach we give the proof for a simple and linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score we obtained predictions similar to those of state-of-the-art methods. Besides, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores have detected near-native structures as accurately as state-of-the-art methods and have been successful in identifying wrongly modeled regions of many near-native conformations.
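As a rough illustration of combining Z-scores of split scoring terms, the sketch below uses an equal-weight sum (a simplification; the abstract mentions that the combination can be optimized) and assumes lower energies indicate more native-like models. The decoy scores are invented for the example.

```python
import numpy as np

def composite_zscore(term_scores):
    """Combine several knowledge-based scoring terms into one composite Z-score.

    term_scores : array of shape (n_models, n_terms), raw energy of each term
                  for every candidate model of the same target.
    Each term is standardised over the model set (a Z-score), then the
    standardised terms are summed with equal weights. Lower = better here.
    """
    z = (term_scores - term_scores.mean(axis=0)) / term_scores.std(axis=0)
    return z.sum(axis=1)

# Hypothetical decoy set: 5 candidate models scored by 3 energy terms.
scores = np.array([
    [-120.0, -35.0, -10.0],
    [-110.0, -30.0,  -8.0],
    [-150.0, -40.0, -14.0],   # best in every term
    [-100.0, -28.0,  -7.0],
    [-130.0, -33.0, -11.0],
])
composite = composite_zscore(scores)
print(np.argsort(composite))   # index 2 should rank first (lowest composite)
```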
Abstract:
Isotopic and isotonic chains of superheavy nuclei are analyzed to search for spherical double shell closures beyond Z=82 and N=126 within the new effective field theory model of Furnstahl, Serot, and Tang for the relativistic nuclear many-body problem. We take into account several indicators to identify the occurrence of possible shell closures, such as two-nucleon separation energies, two-nucleon shell gaps, average pairing gaps, and the shell correction energy. The effective Lagrangian model predicts N=172 and Z=120 and N=258 and Z=120 as spherical doubly magic superheavy nuclei, whereas N=184 and Z=114 show some magic character depending on the parameter set. The magicity of a particular neutron (proton) number in the analyzed mass region is found to depend on the number of protons (neutrons) present in the nucleus.
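The first two indicators named above have standard definitions, reproduced here for reference in the neutron case (the proton case is analogous with Z in place of N); the pairing-gap and shell-correction indicators are model-dependent and are not reproduced.

```latex
% B(N,Z) denotes the binding energy of the nucleus with N neutrons and Z protons.
\begin{align}
  S_{2n}(N,Z)      &= B(N,Z) - B(N-2,Z), \\
  \delta_{2n}(N,Z) &= S_{2n}(N,Z) - S_{2n}(N+2,Z)
                    = 2B(N,Z) - B(N-2,Z) - B(N+2,Z).
\end{align}
```

A pronounced peak of the two-neutron shell gap \(\delta_{2n}\) as a function of N at fixed Z signals a neutron shell closure.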
Abstract:
The performance of density-functional theory in solving the exact, nonrelativistic, many-electron problem for magnetic systems has been explored in a new implementation imposing space and spin symmetry constraints, as in ab initio wave function theory. Calculations on selected systems representative of organic diradicals, molecular magnets and antiferromagnetic solids, carried out with and without these constraints, lead to contradictory results, which provide a numerical illustration of this usually overlooked problem. It is concluded that the present exchange-correlation functionals provide reasonable numerical results, although for the wrong physical reasons, thus evidencing the need for a continued search for more accurate expressions.
Abstract:
This paper seeks to address the problem of the empirical identification of housing market segmentation, once we assume that submarkets exist. The typical difficulty in identifying housing submarkets when dealing with many locations is the vast number of potential solutions; in such cases, the use of the Chow test for hedonic functions is not a practical solution. Here, we solve this problem by undertaking an identification process with a heuristic for spatially constrained clustering, the "Housing Submarket Identifier" (HouSI). The solution is applied to the housing market in the city of Barcelona (Spain), where we estimate a hedonic model for fifty thousand dwellings aggregated into ten groups. In order to determine the utility of the procedure, we verify whether the final solution provided by the heuristic is comparable with the division of the city into ten administrative districts.
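HouSI itself is not reproduced here; as a sketch of the general idea, the snippet below uses scikit-learn's contiguity-constrained Ward clustering as a stand-in for a spatially constrained clustering heuristic and then fits one hedonic regression per candidate submarket, on simulated data. All variable names and parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import AgglomerativeClustering

# Hypothetical dwellings: coordinates, attributes and log prices.
rng = np.random.default_rng(0)
n = 2000
coords = rng.uniform(0, 10, (n, 2))            # dwelling locations
floor_area = rng.normal(80, 20, n)
age = rng.integers(0, 100, n)
log_price = (10 + 0.01 * floor_area - 0.002 * age
             + 0.05 * coords[:, 0] + rng.normal(0, 0.1, n))

# Spatially constrained clustering: dwellings may only be merged with their
# spatial neighbours, so each group is a contiguous candidate submarket.
connectivity = kneighbors_graph(coords, n_neighbors=8, include_self=False)
groups = AgglomerativeClustering(n_clusters=10, connectivity=connectivity,
                                 linkage="ward").fit_predict(coords)

# One hedonic regression (log price on attributes) per candidate submarket.
X = np.column_stack([np.ones(n), floor_area, age])
for g in range(10):
    m = groups == g
    beta, *_ = np.linalg.lstsq(X[m], log_price[m], rcond=None)
    print(f"submarket {g}: n={m.sum()}, implicit price of floor area={beta[1]:.4f}")
```

Comparing the per-group coefficients (or testing their equality) is the step for which the abstract notes that an exhaustive Chow-test search over all partitions would be impractical.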