535 results for GRASP
Abstract:
This paper presents an Optimised Search Heuristic that combines a tabu search method with the verification of violated valid inequalities. The solution delivered by the tabu search is partially destroyed by a randomised greedy procedure, and then the valid inequalities are used to guide the reconstruction of a complete solution. An application of the new method to the Job-Shop Scheduling problem is presented.
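The destroy-and-reconstruct scheme summarised above can be sketched in a few lines. The toy below minimises total completion time on a single machine: a randomised destruction step removes jobs from the incumbent, and a plain greedy best-insertion rebuilds the sequence. The greedy step stands in for the paper's inequality-guided reconstruction; every name and the toy objective are illustrative assumptions, not the authors' implementation.

```python
import random

def total_completion_time(seq, proc):
    # Sum of completion times of a single-machine job sequence.
    t = total = 0
    for j in seq:
        t += proc[j]
        total += t
    return total

def destroy_and_reconstruct(proc, destroy_k=2, iters=100, seed=1):
    # Incumbent: jobs in index order; each iteration destroys part of it and rebuilds.
    rng = random.Random(seed)
    best = list(range(len(proc)))
    for _ in range(iters):
        partial = best[:]
        # Randomised destruction: drop destroy_k jobs from the incumbent.
        removed = [partial.pop(rng.randrange(len(partial))) for _ in range(destroy_k)]
        for j in removed:
            # Greedy reconstruction: reinsert each job at its cheapest position
            # (the paper guides this step with violated valid inequalities instead).
            pos = min(range(len(partial) + 1),
                      key=lambda p: total_completion_time(
                          partial[:p] + [j] + partial[p:], proc))
            partial.insert(pos, j)
        if total_completion_time(partial, proc) < total_completion_time(best, proc):
            best = partial
    return best
```

On a small instance such as `proc = [5, 2, 8, 1, 3]`, the loop tends to converge to the shortest-processing-time order, which is optimal for this toy objective.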
Abstract:
Public transportation is gaining importance every year, basically due to population growth, environmental policies, and route and street congestion. To enable efficient management of all the resources related to public transportation, several techniques from different areas are being applied, and several projects in Transportation Planning Systems are being developed in different countries. In this work, we present the GIST Planning Transportation Systems, a Portuguese project involving two universities and six public transportation companies. We describe in detail one of the most relevant modules of this project, the crew-scheduling module. The crew-scheduling module is based on the application of metaheuristics, in particular GRASP, tabu search and genetic algorithms, to solve the bus-driver-scheduling problem. The metaheuristics have been successfully incorporated into the GIST Planning Transportation Systems and are currently used by several companies.
Abstract:
Aim This study used data from temperate forest communities to assess: (1) five different stepwise selection methods with generalized additive models, (2) the effect of weighting absences to ensure a prevalence of 0.5, (3) the effect of limiting absences beyond the environmental envelope defined by presences, (4) four different methods for incorporating spatial autocorrelation, and (5) the effect of integrating an interaction factor defined by a regression tree on the residuals of an initial environmental model. Location State of Vaud, western Switzerland. Methods Generalized additive models (GAMs) were fitted using the grasp package (generalized regression analysis and spatial predictions, http://www.cscf.ch/grasp). Results Model selection based on cross-validation appeared to be the best compromise between model stability and performance (parsimony) among the five methods tested. Weighting absences returned models that perform better than models fitted with the original sample prevalence. This appeared to be mainly due to the impact of very low prevalence values on evaluation statistics. Removing zeroes beyond the range of presences on main environmental gradients changed the set of selected predictors, and potentially their response curve shape. Moreover, removing zeroes slightly improved model performance and stability when compared with the baseline model on the same data set. Incorporating a spatial trend predictor improved model performance and stability significantly. Even better models were obtained when including local spatial autocorrelation. A novel approach to include interactions proved to be an efficient way to account for interactions between all predictors at once. 
Main conclusions Models and spatial predictions of 18 forest communities were significantly improved by using either: (1) cross-validation as a model selection method, (2) weighted absences, (3) limited absences, (4) predictors accounting for spatial autocorrelation, or (5) a factor variable accounting for interactions between all predictors. The final choice of model strategy should depend on the nature of the available data and the specific study aims. Statistical evaluation is useful in searching for the best modelling practice. However, one should not neglect to consider the shapes and interpretability of response curves, as well as the resulting spatial predictions in the final assessment.
Abstract:
The Generalized Assignment Problem consists in assigning a set of tasks to a set of agents at minimum cost. Each agent has a limited amount of a single resource, and each task must be assigned to one and only one agent, requiring a certain amount of that agent's resource. We present new metaheuristics for the generalized assignment problem based on hybrid approaches. One metaheuristic is a MAX-MIN Ant System (MMAS), an improved version of the Ant System recently proposed by Stützle and Hoos for combinatorial optimization problems; it can be seen as an adaptive sampling algorithm that takes into consideration the experience gathered in earlier iterations of the algorithm. This heuristic is further combined with local search and tabu search heuristics to improve the search. A greedy randomized adaptive search procedure (GRASP) is also proposed. Several neighborhoods are studied, including one based on ejection chains that produces good moves without increasing the computational effort. We present computational results on the comparative performance, followed by concluding remarks and ideas for future research on generalized assignment and related problems.
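To make the GRASP ingredient concrete, here is a minimal sketch, under assumed toy data, of greedy-randomised construction with a restricted candidate list (RCL) followed by a single-reassignment local search for the generalized assignment problem. The data layout and function names are illustrative, not the authors' code.

```python
import random

def grasp_gap(cost, demand, capacity, iters=200, alpha=0.3, seed=0):
    # cost[a][t]: cost of giving task t to agent a; demand[a][t]: resource
    # task t consumes on agent a; capacity[a]: agent a's resource budget.
    rng = random.Random(seed)
    n_agents, n_tasks = len(cost), len(cost[0])

    def construct():
        # Greedy randomised construction: hardest tasks first, agent drawn
        # from the restricted candidate list (within alpha of the best cost).
        load, assign = [0] * n_agents, [None] * n_tasks
        order = sorted(range(n_tasks),
                       key=lambda t: -max(demand[a][t] for a in range(n_agents)))
        for t in order:
            feas = [a for a in range(n_agents)
                    if load[a] + demand[a][t] <= capacity[a]]
            if not feas:
                return None  # infeasible construction: restart
            cmin = min(cost[a][t] for a in feas)
            cmax = max(cost[a][t] for a in feas)
            rcl = [a for a in feas if cost[a][t] <= cmin + alpha * (cmax - cmin)]
            a = rng.choice(rcl)
            assign[t], load[a] = a, load[a] + demand[a][t]
        return assign, load

    def local_search(assign, load):
        # Repeatedly move a task to a cheaper feasible agent until no move helps.
        improved = True
        while improved:
            improved = False
            for t in range(n_tasks):
                cur = assign[t]
                for a in range(n_agents):
                    if (a != cur and load[a] + demand[a][t] <= capacity[a]
                            and cost[a][t] < cost[cur][t]):
                        load[cur] -= demand[cur][t]
                        load[a] += demand[a][t]
                        assign[t], cur, improved = a, a, True
        return assign

    best, best_cost = None, float("inf")
    for _ in range(iters):
        built = construct()
        if built is None:
            continue
        assign = local_search(*built)
        c = sum(cost[assign[t]][t] for t in range(n_tasks))
        if c < best_cost:
            best, best_cost = assign[:], c
    return best, best_cost
```

Setting `alpha` near 0 makes the construction purely greedy, while values near 1 make it nearly uniform random; tuning this balance is the usual GRASP trade-off between intensification and diversification.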
Abstract:
A new direction of research in Competitive Location theory incorporates theories of Consumer Choice Behavior in its models. Following this direction, this paper studies the importance of consumer behavior with respect to distance or transportation costs for the optimality of locations obtained by traditional Competitive Location models. To do this, it considers different ways of defining a key parameter in the basic Maximum Capture model (MAXCAP). This parameter reflects various ways of taking distance into account, based on several Consumer Choice Behavior theories. For each model, the optimal locations are computed, along with the deviation in captured demand when the optimal locations of the other models are used instead of the true ones. A metaheuristic based on GRASP and tabu search procedures is presented to solve all the models. Computational experience and an application to a 55-node network are also presented.
Abstract:
While markets are often decentralized, in many other cases agents in one role can only negotiate with a proper subset of the agents in the complementary role. There may be proximity issues or restricted communication flows. For example, information may be transmitted only through word-of-mouth, as is often the case for job openings, business opportunities, and confidential transactions. Bargaining can be considered to occur over a network that summarizes the structure of linkages among people. We conduct an alternating-offer bargaining experiment using separate simple networks, which are then joined during the session by an additional link. The results diverge sharply depending on how this connection is made. Payoffs can be systematically affected even for agents who are not connected by the new link. We use a graph-theoretic analysis to show that any two-sided network can be decomposed into simple networks of three types, so that our result can be generalized to more complex bargaining environments. Participants appear to grasp the essential characteristics of the networks and we observe a rather consistently high level of bargaining efficiency.
Abstract:
This paper presents a simple Optimised Search Heuristic for the Job Shop Scheduling problem that combines a GRASP heuristic with a branch-and-bound algorithm. The proposed method is compared with similar approaches and leads to better results in terms of solution quality and computing times.
Abstract:
We present new metaheuristics for solving real crew scheduling problems in a public transportation bus company. Since the crews of these companies are drivers, we designate the problem the bus-driver scheduling problem. Crew scheduling problems are well known, and several mathematical-programming-based techniques have been proposed to solve them, in particular using the set-covering formulation. In practice, however, there is a need for improvement in terms of computational efficiency and the capacity to solve large-scale instances. Moreover, the real bus-driver scheduling problems that we consider can present variant aspects of set covering, such as a different objective function, implying that alternative solution methods have to be developed. We propose metaheuristics based on the following approaches: GRASP (greedy randomized adaptive search procedure), tabu search and genetic algorithms. These metaheuristics also present some innovative features, based on the structure of the crew scheduling problem, that guide the search efficiently and enable them to find good solutions. Some of these new features can also be applied in the development of heuristics for other combinatorial optimization problems. A summary of computational results with real-data problems is presented.
Abstract:
Mirror therapy, which provides the visual illusion of a functional paretic limb by using the mirror reflection of the non-paretic arm, is used in the rehabilitation of hemiparesis after stroke in adults. We tested the effectiveness and feasibility of mirror therapy in children with hemiplegia by performing a pilot crossover study in ten participants (aged 6-14 y; five males, five females; Manual Ability Classification System levels: one at level I, two at level II, four at level III, three at level IV) randomly assigned to 15 minutes of daily bimanual training with and without a mirror for 3 weeks. Assessments of maximal grasp and pinch strengths, and upper limb function measured by the Shriner's Hospital Upper Extremity Evaluation were performed at weeks 0 (baseline), 3, 6 (intervention), and 9 (wash-out). Testing of grasp strength behind the mirror improved performance by 15% (p=0.004). Training with the mirror significantly improved grasp strength (with mirror +20.4%, p=0.033; without +5.9%, p>0.1) and upper limb dynamic position (with mirror +4.6%, p=0.044; without +1.2%, p>0.1), while training without a mirror significantly improved pinch strength (with mirror +6.9%, p>0.1; without +21.9%, p=0.026). This preliminary study demonstrates the feasibility of mirror therapy in children with hemiplegia and that it may improve strength and dynamic function of the paretic arm.
Abstract:
The educational sphere has an internal function on which social scientists broadly agree. The contribution that educational systems make to society (i.e., their social function), however, does not enjoy the same degree of consensus. Against this theoretical background, the present article proposes an analytical schema for grasping the social function of education from a sociological perspective. Starting from the assumption that there is an intrinsic relationship between the internal and social functions of social systems, we suggest that particular stratification determinants modify the internal pedagogical function of education and impact its social function by creating simultaneous conditions of equity and differentiation. Throughout the paper this social function is considered a paradoxical mechanism. We highlight how this paradoxical dynamic is deployed at different structural levels of the educational sphere. Additionally, we discuss possible consequences of this paradoxical social function for the inclusion possibilities that educational systems offer to individuals.
Abstract:
This article aims to grasp the stabilization, deterioration or improvement of conjugal intimacy over five years, based on a representative sample of couples living in Switzerland. The dynamics develop in different ways depending on the degree of autonomy of the partners, the gendering of household tasks, conjugal openness, and the coping strategies of the couples.
Abstract:
With the advancement of high-throughput sequencing and the dramatic increase of available genetic data, statistical modeling has become an essential part of the field of molecular evolution. Statistical modeling has led to many interesting discoveries in the field, from the detection of highly conserved or diverse regions in a genome to phylogenetic inference of species' evolutionary history. Among the different types of genome sequences, protein coding regions are particularly interesting due to their impact on proteins. The building blocks of proteins, i.e. amino acids, are coded by triplets of nucleotides, known as codons. Accordingly, studying the evolution of codons leads to a fundamental understanding of how proteins function and evolve. Current codon models can be classified into three principal groups: mechanistic codon models, empirical codon models and hybrid ones. The mechanistic models attract particular attention due to the clarity of their underlying biological assumptions and parameters. However, they suffer from simplifying assumptions that are required to overcome the burden of computational complexity. The main assumptions applied in current mechanistic codon models are that (a) double and triple substitutions of nucleotides within codons are negligible, (b) there is no mutation variation among the nucleotides of a single codon, and (c) the HKY nucleotide model is sufficient to capture the essence of transition-transversion rates at the nucleotide level. In this thesis, I develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the above assumptions. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one, which holds all the assumptions, to the most general one, which relaxes all of them.
The models derived from the proposed framework allow me to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the best model aligned with the underlying characteristics of the data sets.
In this thesis, I pursue two main objectives. The first is to develop a framework of mechanistic codon models, named the KCM-based model family framework, based on holding or relaxing the assumptions mentioned above. Accordingly, eight different models are proposed from the eight combinations of holding or relaxing the assumptions, from the simplest one, which holds all the assumptions, to the most general one, which relaxes all of them. The models derived from the proposed framework allow us to investigate the biological plausibility of the three simplifying assumptions on real data sets, as well as to find the best model aligned with the underlying characteristics of the data sets. Our experiments show that in none of the real data sets is holding all three assumptions realistic; using simple models that hold these assumptions can therefore be misleading and result in inaccurate parameter estimates. The second objective is to develop a generalized mechanistic codon model that relaxes all three simplifying assumptions while remaining computationally efficient, using a matrix operation called the Kronecker product. Our experiments show that, on randomly chosen data sets, the proposed generalized mechanistic codon model outperforms the other codon models with respect to the AICc metric in about half of the data sets. Furthermore, I show through several experiments that the proposed general model is biologically plausible.
Abstract:
This study carries out an empirical investigation comparing the difficulties arising from the use of fair value (FV) and historical cost (HC) in the agricultural sector. It also analyses the reliability of both valuation methods for the interpretation of information and for decision-making by the agents operating in the sector. Through an experiment conducted with students, farmers and accountants working in the agricultural sector, we find that they have more difficulties, make more errors and interpret accounting information worse when it is prepared at HC than at FV. In-depth interviews with farmers and agricultural accountants reveal defective accounting practices arising from the need to apply HC in the sector in Spain. Given the complexity of computing the cost of biological assets and the predominance of small farms in the sector in advanced Western countries, the study concludes that FV accounting constitutes a better basis for the use and development of accounting in the sector than HC accounting. Likewise, HC conveys a worse representation of the real situation of farms.
Abstract:
In this article, we offer an overview of the compared quantitative importance of biotransformation reactions in the metabolism of drugs and other xenobiotics, based on a meta-analysis of current research interests. We also assess the relative significance of the enzyme (super)families or categories catalysing these reactions. We put the facts unveiled by the analysis into a drug discovery context and draw some implications. The results confirm the primary role of cytochrome P450-catalysed oxidations and UDP-glucuronosyltransferase-catalysed glucuronidations, but they also document the marked significance of several other reactions. Thus, many drug discovery scientists need a better grasp of the variety of drug metabolism reactions and enzymes and of their consequences.
Abstract:
Ecological speciation and its hallmark, adaptive radiation, are processes from which most current biodiversity derives. As ecological opportunity allows species to colonise unoccupied niches, natural selection drives adaptive phenotypic change. In this thesis, I begin by describing how this evolutionary process acted on the evolution of the clownfishes. Early in its history, this iconic group of coral reef fishes developed a mutualism with sea anemone species. I show how this event triggered the evolutionary radiation of the group, generating species that now inhabit diverse habitats of the coral reefs. Following the appearance of the mutualism, the diversification of the clownfishes was catalysed by hybridisation events which shuffled genes, allowing hybrids to reach new fitness optima. While the clownfishes appeared in the region of the coral triangle, one lineage colonised the eastern shores of Africa. I reconstructed the geographic history of the group and showed that this event led to the rapid appearance of new species, replicating the evolutionary patterns of the original radiation. To better grasp the mechanisms of ecological speciation, I conducted analyses at the population level and identified evolutionary patterns similar to those found at the clade level. I discuss how this result suggests a continuity bridging micro- and macroevolution that has so far only been theorised. In parallel to this case study, I question whether biotic and abiotic interactions can promote or restrain ecological speciation. Indeed, I show how the ecological setting of species can drastically impact their diversification dynamics. Moreover, trade-offs can occur between specialisations along different ecological axes, allowing species to cohabit.
Overall, I show in this work that although a few simple rules explain the mechanism of ecological speciation, the unavoidable interactions with an ever-changing ecological context mean that diversification events always produce a different outcome. It is thus essential to account for the ecological settings of species when discussing their evolutionary dynamics. ECOLOGICAL SPECIATION AS TOLD THROUGH THE STUDY OF THE EVOLUTION OF CLOWNFISHES AND A FEW OTHERS. Ecological speciation is at the origin of most of the biodiversity we encounter today. As opportunities arise, species colonise ecological space, letting natural selection shape their mean phenotype. Despite the ubiquity of this phenomenon in nature, many questions related to it remain to be elucidated. It is in order to better understand this mechanism that I study the clownfishes, famous inhabitants of coral reefs. In this work, I demonstrate that the development of the mutualistic behaviour linking clownfishes to sea anemones was the event that triggered their diversification. Following this first event, I illustrate how hybridisation between early lineages reshaped the group's genetic diversity and catalysed their evolutionary radiation. I then reconstruct the geographic expansion of the clownfishes over time from the coral triangle, their place of origin, to the coasts of Africa. To refine these group-level analyses, I go on to study in finer detail the populations of a single clownfish species. This fine resolution allows me to understand more precisely which ecological factors enable clownfishes to differentiate. The results of these analyses suggest that it is important to understand the links between the ecological context and the diversification of species.
I address this question in the second part of this work by showing that landscape heterogeneity, or the links maintained with a mutualistic partner, strongly influence the evolutionary dynamics of species. Finally, I illustrate the trade-offs each species makes by specialising, or not, in its interactions with its environment. More generally, I highlight throughout this work the influence of the ecological context on the outcome of ecological speciation. It is these interactions between organisms and their environment that are at the origin of the incredible diversity of life. It is therefore essential to take them into account when studying the evolution of species.