960 results for "least common subgraph algorithm"
Abstract:
In this paper, we propose a methodology to determine the most efficient and least costly approach to crew pairing optimization. We develop an optimization algorithm, implemented in the Java programming language within the Eclipse open-source IDE, to solve crew scheduling problems.
Abstract:
For a wide range of environmental, hydrological, and engineering applications there is a fast-growing need for high-resolution imaging. In this context, waveform tomographic imaging of crosshole georadar data is a powerful method able to provide images of pertinent electrical properties in near-surface environments with unprecedented spatial resolution. In contrast, conventional ray-based tomographic methods, which consider only a very limited part of the recorded signal (first-arrival traveltimes and maximum first-cycle amplitudes), suffer from inherent limitations in resolution and may prove to be inadequate in complex environments. For a typical crosshole georadar survey, the potential improvement in resolution when using waveform-based approaches instead of ray-based approaches is in the range of one order of magnitude. Moreover, the spatial resolution of waveform-based inversions is comparable to that of common logging methods. While waveform tomographic imaging has become well established in exploration seismology over the past two decades, it is still comparatively underdeveloped in the georadar domain despite corresponding needs. Recently, different groups have presented finite-difference time-domain waveform inversion schemes for crosshole georadar data, which are adaptations and extensions of Tarantola's seminal nonlinear generalized least-squares approach developed for the seismic case. First applications of these new crosshole georadar waveform inversion schemes to synthetic and field data have shown promising results. However, little is known about the limits and performance of such schemes in complex environments. To this end, the general motivation of my thesis is the evaluation of the robustness and limitations of waveform inversion algorithms for crosshole georadar data in order to apply such schemes to a wide range of real-world problems. One crucial issue in making any waveform scheme applicable and effective for real-world crosshole georadar problems is the accurate estimation of the source wavelet, which is unknown in reality. Waveform inversion schemes for crosshole georadar data require forward simulations of the wavefield in order to iteratively solve the inverse problem. Therefore, accurate knowledge of the source wavelet is critically important for successful application of such schemes. Relatively small differences in the estimated source wavelet shape can lead to large differences in the resulting tomograms. In the first part of my thesis, I explore the viability and robustness of a relatively simple iterative deconvolution technique that incorporates the estimation of the source wavelet into the waveform inversion procedure rather than adding additional model parameters to the inversion problem. Extensive tests indicate that this source wavelet estimation technique is simple yet effective, and is able to provide remarkably accurate and robust estimates of the source wavelet in the presence of strong heterogeneity in both the dielectric permittivity and electrical conductivity, as well as significant ambient noise in the recorded data.
Furthermore, our tests also indicate that the approach is insensitive to the phase characteristics of the starting wavelet, which is not the case when directly incorporating the wavelet estimation into the inverse problem. Another critical issue with crosshole georadar waveform inversion schemes that clearly needs to be investigated is the consequence of the common assumption of frequency-independent electromagnetic constitutive parameters. This is crucial since, in reality, these parameters are known to be frequency-dependent and complex, and thus recorded georadar data may show significant dispersive behaviour. In particular, in the presence of water, there is a wide body of evidence showing that the dielectric permittivity can be significantly frequency-dependent over the GPR frequency range, due to a variety of relaxation processes. The second part of my thesis is therefore dedicated to the evaluation of the reconstruction limits of a non-dispersive crosshole georadar waveform inversion scheme in the presence of varying degrees of dielectric dispersion. I show that the inversion algorithm, combined with the iterative deconvolution-based source wavelet estimation procedure, which is partially able to account for the frequency-dependent effects through an "effective" wavelet, performs remarkably well in weakly to moderately dispersive environments and has the ability to provide adequate tomographic reconstructions.
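For readers unfamiliar with the iterative deconvolution idea, the sketch below illustrates one plausible form of such a source-wavelet update: a stabilized spectral division between observed traces and traces simulated with the current wavelet estimate, averaged over all trace pairs. It is a minimal numpy illustration under assumed array shapes and a hypothetical water-level parameter `eps`, not the exact procedure used in the thesis.

```python
import numpy as np

def update_source_wavelet(d_obs, d_syn, w_current, eps=1e-3):
    """One iterative-deconvolution update of the source wavelet estimate.

    d_obs, d_syn : 2-D arrays (n_traces x n_samples) of observed traces and
                   traces simulated with the current wavelet `w_current`.
    Returns an updated wavelet of the same length as `w_current`.
    """
    n = d_obs.shape[1]
    D_obs = np.fft.rfft(d_obs, n=n, axis=1)
    D_syn = np.fft.rfft(d_syn, n=n, axis=1)
    W_cur = np.fft.rfft(w_current, n=n)

    # Stabilized (water-level) spectral division, summed over traces.
    num = np.sum(D_obs * np.conj(D_syn), axis=0)
    den = np.sum(D_syn * np.conj(D_syn), axis=0).real
    transfer = num / (den + eps * den.max())

    # Transfer the observed/simulated mismatch onto the wavelet estimate.
    W_new = W_cur * transfer
    return np.fft.irfft(W_new, n=n)[: len(w_current)]
```

In practice, an update of this kind would be alternated with a few waveform-inversion iterations until both the wavelet estimate and the tomogram stabilize.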
Abstract:
From a managerial point of view, the more efficient, simple, and parameter-free (ESP) an algorithm is, the more likely it will be used in practice for solving real-life problems. Following this principle, an ESP algorithm for solving the Permutation Flowshop Sequencing Problem (PFSP) is proposed in this article. Using an Iterated Local Search (ILS) framework, the so-called ILS-ESP algorithm is able to compete in performance with other well-known ILS-based approaches, which are considered among the most efficient algorithms for the PFSP. However, while other similar approaches still employ several parameters that can affect their performance if not properly chosen, our algorithm does not require any particular fine-tuning process since it uses basic "common sense" rules for the local search, perturbation, and acceptance criterion stages of the ILS metaheuristic. Our approach defines a new operator for the ILS perturbation process, a new acceptance criterion based on extremely simple and transparent rules, and a biased randomization process for the initial solution that randomly generates different alternative initial solutions of similar quality, attained by applying a biased randomization to a classical PFSP heuristic. This diversification of the initial solution aims at avoiding poorly designed starting points and thus allows the methodology to take advantage of current trends in parallel and distributed computing. A set of extensive tests, based on literature benchmarks, has been carried out in order to validate our algorithm and compare it against other approaches. These tests show that our parameter-free algorithm is able to compete with state-of-the-art metaheuristics for the PFSP. Also, the experiments show that, when using parallel computing, it is possible to improve on the top ILS-based metaheuristic by simply incorporating our biased randomization process with a high-quality pseudo-random number generator into it.
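As a rough illustration of the ILS skeleton described above, the Python sketch below evaluates the PFSP makespan, applies a first-improvement insertion local search, a small random destroy-and-reinsert perturbation, and a deliberately simple acceptance rule. The initial ordering, the actual ILS-ESP perturbation operator, its acceptance criterion, and the biased-randomized NEH construction from the article are only approximated; function names and parameters are illustrative.

```python
import random

def makespan(perm, p):
    """Completion time of the last job on the last machine.
    p[j][m] = processing time of job j on machine m."""
    n_mach = len(p[0])
    c = [0.0] * n_mach
    for j in perm:
        c[0] += p[j][0]
        for m in range(1, n_mach):
            c[m] = max(c[m], c[m - 1]) + p[j][m]
    return c[-1]

def local_search(perm, p):
    """First-improvement insertion neighbourhood."""
    best = makespan(perm, p)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for k in range(len(perm)):
                if i == k:
                    continue
                cand = perm[:i] + perm[i + 1:]
                cand.insert(k, perm[i])
                val = makespan(cand, p)
                if val < best:
                    perm, best = cand, val
                    improved = True
    return perm, best

def perturb(perm, n_moves=2):
    """Remove a few jobs and reinsert them at random positions."""
    perm = perm[:]
    for _ in range(n_moves):
        j = perm.pop(random.randrange(len(perm)))
        perm.insert(random.randrange(len(perm) + 1), j)
    return perm

def ils_pfsp(p, iterations=1000, seed=0):
    random.seed(seed)
    current = sorted(range(len(p)), key=lambda j: -sum(p[j]))  # NEH-like seed order
    current, cur_val = local_search(current, p)
    best, best_val = current, cur_val
    for _ in range(iterations):
        cand, val = local_search(perturb(current), p)
        if val <= cur_val:          # simple, parameter-free acceptance
            current, cur_val = cand, val
        if val < best_val:
            best, best_val = cand, val
    return best, best_val
```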
Abstract:
Revenue management (RM) is a complicated business process that can best be described as control of sales (using prices, restrictions, or capacity), usually using software as a tool to aid decisions. RM software can play a mere informative role, supplying analysts with formatted and summarized data, which they use to make control decisions (setting a price or allocating capacity for a price point), or, at the other extreme, play a deeper role, automating the decision process completely. The RM models and algorithms in the academic literature by and large concentrate on the latter, completely automated, level of functionality. A firm considering using a new RM model or RM system needs to evaluate its performance. Academic papers justify the performance of their models using simulations, where customer booking requests are simulated according to some process and model, and the revenue performance of the algorithm is compared to an alternate set of algorithms. Such simulations, while an accepted part of the academic literature, and indeed providing research insight, often lack credibility with management. Even methodologically, they are usually flawed, as the simulations only test "within-model" performance and say nothing as to the appropriateness of the model in the first place. Even simulations that test against alternate models or competition are limited by their inherent reliance on fixing some model as the universe for their testing. These problems are exacerbated with RM models that attempt to model customer purchase behavior or competition, as the right models for competitive actions or customer purchases remain somewhat of a mystery, or at least there is no consensus on their validity. How then to validate a model? Putting it another way, we want to show that a particular model or algorithm is the cause of a certain improvement to the RM process compared to the existing process. We take care to emphasize that we want to prove the said model is the cause of performance, and to compare against an (incumbent) process rather than against an alternate model. In this paper we describe a "live" testing experiment that we conducted at Iberia Airlines on a set of flights. A set of competing algorithms controlled a set of flights during adjacent weeks, and their behavior and results were observed over a relatively long period of time (9 months). In parallel, a group of control flights was managed using the traditional mix of manual and algorithmic control (the incumbent system). Such "sandbox" testing, while common at many large internet search and e-commerce companies, is relatively rare in the revenue management area. Sandbox testing has an indisputable model of customer behavior, but the experimental design and analysis of results is less clear. In this paper we describe the philosophy behind the experiment, the organizational challenges, and the design and setup of the experiment, and outline the analysis of the results. This paper is a complement to a (more technical) related paper that describes the econometrics and statistical analysis of the results.
Abstract:
We herein present a preliminary practical algorithm for evaluating complementary and alternative medicine (CAM) for children, which relies on basic bioethical principles and considers the influence of CAM on global child healthcare. CAM is currently involved in almost all sectors of pediatric care and frequently represents a challenge to the pediatrician. The aim of this article is to provide a decision-making tool to assist the physician, especially as it remains difficult to keep up to date with the latest developments in the field. The reasonable application of our algorithm, together with common sense, should enable the pediatrician to decide whether pediatric (P-)CAM represents potential harm to the patient, and allow ethically sound counseling. In conclusion, we propose a pragmatic algorithm designed to evaluate P-CAM, briefly explain the underlying rationale, and give a concrete clinical example.
Abstract:
Visible and near infrared (vis-NIR) spectroscopy is widely used to detect soil properties. The objective of this study is to evaluate the combined effect of moisture content (MC) and the modeling algorithm on the prediction of soil organic carbon (SOC) and pH. Partial least squares (PLS) regression and an artificial neural network (ANN) for modeling SOC and pH at different MC levels were compared in terms of prediction performance. A total of 270 soil samples were used. Before spectral measurement, dry soil samples were weighed to determine the amount of water to be added by weight to achieve the specified gravimetric MC levels of 5, 10, 15, 20, and 25 %. A fiber-optic vis-NIR spectrophotometer (350-2500 nm) was used to measure spectra of soil samples in diffuse reflectance mode. Spectra preprocessing and PLS regression were carried out using Unscrambler® software. Statistica® software was used for ANN modeling. The best prediction result for SOC was obtained using the ANN (RMSEP = 0.82 % and RPD = 4.23) for soil samples with 25 % MC. The best prediction results for pH were obtained with PLS for dry soil samples (RMSEP = 0.65 and RPD = 1.68) and soil samples with 10 % MC (RMSEP = 0.61 and RPD = 1.71). Whereas the ANN showed better performance for SOC prediction at all MC levels, PLS showed better predictive accuracy for pH at all MC levels except 25 % MC. Therefore, based on the data set used in the current study, the ANN is recommended for the analysis of SOC at all MC levels, whereas PLS is recommended for the analysis of pH at MC levels below 20 %.
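The PLS side of such a comparison can be sketched in a few lines. The snippet below uses scikit-learn instead of the Unscrambler® workflow, random placeholder spectra instead of the 270 measured samples, and an assumed number of latent variables, so it only illustrates how RMSEP and RPD are obtained from a calibration/validation split.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

def rpd(y_true, y_pred):
    """Ratio of performance to deviation: SD of reference values / RMSEP."""
    rmsep = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return np.std(y_true, ddof=1) / rmsep

# Placeholder data: X = reflectance spectra (n_samples, n_wavelengths), y = SOC (%)
rng = np.random.default_rng(0)
X = rng.random((270, 2151))          # e.g. 350-2500 nm at 1 nm steps
y = rng.random(270) * 3.0

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
y_hat = pls.predict(X_val).ravel()
print(f"RMSEP = {np.sqrt(np.mean((y_val - y_hat) ** 2)):.3f}, RPD = {rpd(y_val, y_hat):.2f}")
```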
Abstract:
As opposed to objective definitions in soil physics, the subjective term "soil physical quality" is increasingly found in publications in the soil physics area. A supposed indicator of soil physical quality that has been the focus of attention, especially in the Brazilian literature, is the Least Limiting Water Range (RLL), translated into Portuguese as "Intervalo Hídrico Ótimo" (IHO). In this paper, the four limiting water contents that define the RLL are discussed in the light of objectively determinable soil physical properties, pointing to inconsistencies in the RLL definition and calculation. The interpretation of the RLL as an indicator of crop productivity or soil physical quality is also discussed, showing its inability to consider common phenological and pedological boundary conditions. It is shown that the so-called "critical densities" found by the RLL through a commonly applied calculation method are questionable. Considering the availability of robust models for agronomy, ecology, hydrology, meteorology and other related areas, the attractiveness of the RLL as an indicator to Brazilian soil physicists is not related to its (never proven) effectiveness, but rather to the simplicity with which it is dealt. Determining the respective limiting contents in a simplified manner, while relegating the study of, or concern for, the actual functioning of the system to a lower priority, goes against scientific construction and systemic understanding. This study suggests a realignment of research in soil physics in Brazil with scientific precepts, towards mechanistic soil physics, to replace the currently predominant search for empirical correlations that falls below the state of the art of soil physics.
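The paper questions the RLL/IHO rather than endorsing it, but for readers unfamiliar with the quantity, the sketch below shows how it is conventionally computed from its four limiting water contents (water content at field capacity, at the permanent wilting point, at a critical penetration resistance, and at a minimum air-filled porosity). The 0.10 m3/m3 aeration limit and the particle density of 2.65 g/cm3 are common assumptions, not values taken from this paper.

```python
def llwr(theta_fc, theta_wp, theta_pr, bulk_density, particle_density=2.65,
         air_porosity_limit=0.10):
    """Least Limiting Water Range from its four commonly used limits
    (all water contents in m3/m3).

    theta_fc : water content at field capacity
    theta_wp : water content at the permanent wilting point
    theta_pr : water content at which penetration resistance reaches the
               critical value (often 2 MPa), taken from a fitted PR curve
    The aeration limit is total porosity minus the minimum acceptable
    air-filled porosity.
    """
    total_porosity = 1.0 - bulk_density / particle_density
    theta_afp = total_porosity - air_porosity_limit   # aeration (upper) limit
    upper = min(theta_fc, theta_afp)
    lower = max(theta_wp, theta_pr)
    return max(0.0, upper - lower)

# Example: a soil where aeration, not field capacity, caps the upper limit
print(llwr(theta_fc=0.32, theta_wp=0.15, theta_pr=0.18, bulk_density=1.55))
```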
Abstract:
Objective: To investigate the association between common carotid artery intima-media thickness (cIMT) and exposure to secondhand smoke (SHS) in children. Methods: Data were available at baseline in the Quebec Adiposity and Lifestyle Investigation in Youth (QUALITY) study, an ongoing longitudinal investigation of Caucasian children aged 8-10 years at cohort inception who had at least one obese parent. Data on exposure to parents', siblings' and friends' smoking were collected in interviewer-administered child questionnaires and self-report parent questionnaires. Blood cotinine was measured with a high-sensitivity ELISA. cIMT was measured by ultrasound. The association between blood cotinine and cIMT was investigated in multivariable linear regression analyses controlling for age, body mass index, and child smoking status. Results: Mean (SD) cIMT (0.5803 (0.04602) mm) did not differ across age or sex. Overall, 26%, 6% and 3% of children were exposed to parents', siblings' and friends' smoking, respectively. Cotinine ranged from 0.13 ng/ml to 7.38 ng/ml (median (IQR) = 0.18 ng/ml). In multivariable analyses, a 1 ng/ml increase in cotinine was associated with a 0.090 mm increase in cIMT (p = 0.034). Conclusion: In children as young as 8-10 years of age, exposure to SHS relates to cIMT, a marker of pre-clinical atherosclerosis. Given the wide range of health effects of SHS, increased public health efforts are needed to reduce exposure among children in homes and private vehicles.
Abstract:
The pharmaceutical industry has been facing several challenges during the last years, and the optimization of its drug discovery pipeline is believed to be the only viable solution. High-throughput techniques contribute actively to this optimization, especially when complemented by computational approaches aiming at rationalizing the enormous amount of information that they can produce. In silico techniques, such as virtual screening or rational drug design, are now routinely used to guide drug discovery. Both heavily rely on the prediction of the molecular interaction (docking) occurring between drug-like molecules and a therapeutically relevant target. Several software packages are available to this end, but despite the very promising picture drawn in most benchmarks, they still hold several hidden weaknesses. As pointed out in several recent reviews, the docking problem is far from being solved, and there is now a need for methods able to identify binding modes with high accuracy, which is essential to reliably compute the binding free energy of the ligand. This quantity is directly linked to its affinity and can be related to its biological activity. Accurate docking algorithms are thus critical for both the discovery and the rational optimization of new drugs. In this thesis, a new docking software aiming at this goal is presented, EADock. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of the diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure. This validation illustrates the efficiency of our sampling heuristic, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures in this benchmark could be explained by the presence of crystal contacts in the experimental structure. EADock has been used to understand molecular interactions involved in the regulation of the Na,K-ATPase and in the activation of the nuclear hormone peroxisome proliferator-activated receptor α (PPARα). It also helped to understand the action of common pollutants (phthalates) on PPARγ, and the impact of biotransformations of the anticancer drug Imatinib (Gleevec®) on its binding mode to the Bcr-Abl tyrosine kinase. Finally, a fragment-based rational drug design approach using EADock was developed, and led to the successful design of new peptidic ligands for the α5β1 integrin and for human PPARα. In both cases, the designed peptides presented activities comparable to those of well-established ligands such as the anticancer drug Cilengitide and Wy14,643, respectively.
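The validation statistics quoted above rest on a simple success criterion: a predicted binding mode counts as correct when its heavy-atom RMSD to the crystallographic pose is below 2 Å. The sketch below shows that computation in its most basic form; atom matching, symmetry handling and the clustering performed internally by EADock are omitted.

```python
import numpy as np

def rmsd(coords_a, coords_b):
    """Heavy-atom RMSD (in Å) between two poses with matching atom order,
    both given as (n_atoms, 3) arrays of Cartesian coordinates."""
    diff = coords_a - coords_b
    return float(np.sqrt((diff ** 2).sum(axis=1).mean()))

def success_rate(predicted_poses, crystal_pose, threshold=2.0):
    """Fraction of predicted poses within `threshold` Å of the crystal pose."""
    hits = [rmsd(p, crystal_pose) <= threshold for p in predicted_poses]
    return sum(hits) / len(hits)
```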
Abstract:
OBJECTIVE: The natural course of chronic hepatitis C varies widely. To improve the profiling of patients at risk of developing advanced liver disease, we assessed the relative contribution of factors for liver fibrosis progression in hepatitis C. DESIGN: We analysed 1461 patients with chronic hepatitis C with an estimated date of infection and at least one liver biopsy. Risk factors for accelerated fibrosis progression rate (FPR), defined as ≥0.13 Metavir fibrosis units per year, were identified by logistic regression. Examined factors included age at infection, sex, route of infection, HCV genotype, body mass index (BMI), significant alcohol drinking (≥20 g/day for ≥5 years), HIV coinfection and diabetes. In a subgroup of 575 patients, we assessed the impact of single nucleotide polymorphisms previously associated with fibrosis progression in genome-wide association studies. Results were expressed as attributable fraction (AF) of risk for accelerated FPR. RESULTS: Age at infection (AF 28.7%), sex (AF 8.2%), route of infection (AF 16.5%) and HCV genotype (AF 7.9%) contributed to accelerated FPR in the Swiss Hepatitis C Cohort Study, whereas significant alcohol drinking, anti-HIV, diabetes and BMI did not. In genotyped patients, variants at rs9380516 (TULP1), rs738409 (PNPLA3), rs4374383 (MERTK) (AF 19.2%) and rs910049 (major histocompatibility complex region) significantly added to the risk of accelerated FPR. Results were replicated in three additional independent cohorts, and a meta-analysis confirmed the role of age at infection, sex, route of infection, HCV genotype, rs738409, rs4374383 and rs910049 in accelerating FPR. CONCLUSIONS: Most factors accelerating liver fibrosis progression in chronic hepatitis C are unmodifiable.
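The abstract reports results as attributable fractions (AF) for an accelerated fibrosis progression rate. One common way to obtain such a figure, sketched below with synthetic data and hypothetical variable names, is to fit a logistic regression and apply Miettinen's formula, using the adjusted odds ratio as an approximation of the rate ratio; the cohort's actual AF methodology may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic stand-in for the cohort: one row per patient, with a binary
# outcome accel_fpr (progression rate >= 0.13 Metavir units/year) and two
# illustrative covariates. Variable names are hypothetical.
rng = np.random.default_rng(1)
male = rng.binomial(1, 0.6, 500)
infection_age = rng.uniform(15, 60, 500)
p = 1.0 / (1.0 + np.exp(-(-2.0 + 0.8 * male + 0.03 * infection_age)))
df = pd.DataFrame({"accel_fpr": rng.binomial(1, p),
                   "male": male,
                   "infection_age": infection_age})

# Adjusted odds ratio for male sex from a logistic regression.
model = smf.logit("accel_fpr ~ male + infection_age", data=df).fit(disp=0)
odds_ratio = float(np.exp(model.params["male"]))

# Miettinen's attributable fraction: AF = p_c * (RR - 1) / RR, with the
# adjusted OR standing in for the rate ratio and p_c the proportion of
# exposed subjects among the cases.
p_c = df.loc[df["accel_fpr"] == 1, "male"].mean()
af = p_c * (odds_ratio - 1.0) / odds_ratio
print(f"OR = {odds_ratio:.2f}, attributable fraction = {af:.1%}")
```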
Abstract:
AIMS: We aimed to assess the prevalence and management of clinical familial hypercholesterolaemia (FH) among patients with acute coronary syndrome (ACS). METHODS AND RESULTS: We studied 4778 patients with ACS from a multi-centre cohort study in Switzerland. Based on personal and familial history of premature cardiovascular disease and LDL-cholesterol levels, two validated algorithms for the diagnosis of clinical FH were used: the Dutch Lipid Clinic Network algorithm to assess possible (score 3-5 points) or probable/definite FH (>5 points), and the Simon Broome Register algorithm to assess possible FH. At the time of hospitalization for ACS, 1.6% had probable/definite FH [95% confidence interval (CI) 1.3-2.0%, n = 78] and 17.8% possible FH (95% CI 16.8-18.9%, n = 852), respectively, according to the Dutch Lipid Clinic algorithm. The Simon Broome algorithm identified 5.4% (95% CI 4.8-6.1%, n = 259) patients with possible FH. Among 1451 young patients with premature ACS, the Dutch Lipid Clinic algorithm identified 70 (4.8%, 95% CI 3.8-6.1%) patients with probable/definite FH, and 684 (47.1%, 95% CI 44.6-49.7%) patients had possible FH. Excluding patients with secondary causes of dyslipidaemia such as alcohol consumption, acute renal failure, or hyperglycaemia did not change the prevalence. One year after ACS, among 69 survivors with probable/definite FH and available follow-up information, 64.7% were using high-dose statins, 69.0% had decreased their LDL-cholesterol by at least 50%, and 4.6% had LDL-cholesterol ≤1.8 mmol/L. CONCLUSION: A phenotypic diagnosis of possible FH is common in patients hospitalized with ACS, particularly among those with premature ACS. Optimizing long-term lipid treatment of patients with FH after ACS is required.
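For reference, the mapping from a Dutch Lipid Clinic Network score to the categories used in this abstract can be written as a trivial function; the scoring of the individual criteria themselves (family history, clinical history, physical signs, LDL-cholesterol level, DNA testing) is not reproduced here.

```python
def dlcn_category(score: int) -> str:
    """Map a precomputed Dutch Lipid Clinic Network score to the categories
    used in the abstract: 3-5 points = possible FH, >5 = probable/definite FH.
    The score itself is the sum of points awarded for the DLCN criteria,
    which are not reproduced in this sketch."""
    if score > 5:
        return "probable/definite FH"
    if score >= 3:
        return "possible FH"
    return "unlikely FH"

assert dlcn_category(4) == "possible FH"
assert dlcn_category(7) == "probable/definite FH"
```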
Abstract:
The Ebro Delta holds a large seabird community, including a local common tern (Sterna hirundo) population of 3,085 pairs in 2000, which breeds scattered in several colonies. At the El Canalot colony, 1,178 (1999) and 1,156 pairs (2000) of this species bred distributed in 32 and 38 sub-colonies, respectively. These sub-colonies varied in size from 1 to 223 pairs and were placed near the main breeding colonies of yellow-legged gulls (Larus cachinnans) and Audouin's gulls (L. audouinii), which are potential egg predators of terns. We studied egg predation during 1999 (6 sub-colonies) and 2000 (27 sub-colonies). Overall, we found that 10.6% of the nests in 1999 and 16.7% in 2000 suffered partial or total egg predation, with predation being total in 81.1% of the predatory events. Predation was significantly higher in small sub-colonies (< 11 pairs): 49.4% in 1999 and 75.5% in 2000. Only attacks from yellow-legged gulls were observed, and defence behaviour of terns was significantly more frequent against this gull species (40.5 hours of observation), suggesting that in most cases the egg predation recorded was due to this species. The probability of egg predation was significantly and negatively correlated with distance to the nearest yellow-legged gull sub-colony, although this relationship was no longer significant after adjustment for sub-colony size. On the other hand, distance to the nearest Audouin's gull sub-colony did not show any effect. Our results suggest that the impact of large gulls (at least yellow-legged gulls) upon smaller seabirds breeding in the area might be important, especially when the latter breed in small sub-colonies. Further studies are needed to analyse the general impact of large gulls upon the breeding populations of other colonial bird species in the area.
Abstract:
This study aimed to identify wavelengths based on leaf reflectance (400-1050 nm) to estimate white mold severity in common beans at different seasons. Two experiments were carried out, one during fall and another in winter. Partial Least Squares (PLS) regression was used to establish a set of wavelengths that better estimates the disease severity at a specific date. To this end, observations were previously divided into two sub-groups: the first one (calibration) was used for model building and the second for model testing. Error measurements and the correlation between measured and predicted values of the disease severity index were employed to select the best wavelengths in both seasons. The average disease severity indices of the two experiments were 5.8% and 7.4%, which is considered low. The spectral transition bands between blue and green, green and red, and red and infrared were the most sensitive for disease estimation. Beyond the transition ranges, other spectral regions also presented wavelengths with potential to determine the disease severity, such as red, green, and near infrared.
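A minimal sketch of such a PLS-based wavelength ranking is given below; it uses scikit-learn, random placeholder reflectance spectra, and an assumed number of latent variables, and simply ranks wavelengths by the magnitude of their PLS regression coefficients rather than reproducing the study's exact selection procedure.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Placeholder data: X = leaf reflectance over 400-1050 nm, y = severity index (%)
wavelengths = np.arange(400, 1051)               # 1 nm steps, assumed grid
rng = np.random.default_rng(0)
X = rng.random((60, wavelengths.size))
y = rng.random(60) * 20.0

n_cal = 40                                        # calibration / test split
pls = PLSRegression(n_components=8).fit(X[:n_cal], y[:n_cal])

# Rank wavelengths by the magnitude of their PLS regression coefficients;
# the largest ones indicate the bands most sensitive to disease severity.
coef = np.abs(pls.coef_).ravel()
top = wavelengths[np.argsort(coef)[::-1][:10]]
print("Most informative wavelengths (nm):", sorted(top))
```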
Abstract:
In 1995, a pioneering MD-PhD program was initiated in Brazil for the training of medical scientists in experimental sciences at the Federal University of Rio de Janeiro. The program’s aim was achieved with respect to publication of theses in the form of papers with international visibility and also in terms of fostering the scientific careers of the graduates. The expansion of this type of program is one of the strategies for improving the preparation of biomedical researchers in Brazil. A noteworthy absence of interest in carrying out clinical research limits the ability of young Brazilian physicians to solve biomedical problems. To understand the students’ views of science, we used qualitative and quantitative triangulation methods, as well as participant observation to evaluate the students’ concepts of science and common sense. Subjective aspects were clearly less evident in their concepts of science. There was a strong concern about "methodology", "truth" and "usefulness". "Intuition", "creativity" and "curiosity" were the least mentioned thematic categories. Students recognized the value of intuition when it appeared as an explicit option but they did not refer to it spontaneously. Common sense was associated with "consensus", "opinion" and ideas that "require scientific validation". Such observations indicate that MD-PhD students share with their senior academic colleagues the same reluctance to consider common sense as a valid adjunct for the solution of scientific problems. Overcoming this difficulty may be an important step toward stimulating the interest of physicians in pursuing experimental research.
Abstract:
Ordered gene problems are a very common class of optimization problems. Because of their popularity, countless algorithms have been developed in an attempt to find high-quality solutions to them. It is also common to see many different types of problems reduced to ordered gene style problems, since many popular heuristics and metaheuristics exist for them. Multiple ordered gene problems are studied, namely the travelling salesman problem, the bin packing problem, and the graph colouring problem. In addition, two bioinformatics problems not traditionally seen as ordered gene problems are studied: DNA error correction and DNA fragment assembly. These problems are studied with multiple variations and combinations of heuristics and metaheuristics, using two distinct types of representations. The majority of the algorithms are built around the Recentering-Restarting Genetic Algorithm. The algorithm variations were successful on all problems studied, and particularly so for the two bioinformatics problems. For DNA error correction, multiple cases were found with 100% of the codes being corrected. The algorithm variations were also able to beat all other state-of-the-art DNA fragment assemblers on 13 out of 16 benchmark problem instances.
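The abstract does not detail the Recentering-Restarting Genetic Algorithm itself, but a bare-bones genetic algorithm over an ordered gene (permutation) representation, of the kind all of these problems reduce to, can be sketched as follows for a toy travelling salesman instance; the operator choices (order crossover, swap mutation, truncation survival) are illustrative assumptions, not the thesis's exact configuration.

```python
import random

def tour_length(perm, dist):
    """Length of the closed tour visiting cities in the order given by perm."""
    return sum(dist[perm[i]][perm[(i + 1) % len(perm)]] for i in range(len(perm)))

def order_crossover(p1, p2):
    """Order crossover (OX): copy a slice from p1, fill the rest in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [g for g in p2 if g not in child[a:b]]
    k = 0
    for i in range(n):
        if child[i] is None:
            child[i] = fill[k]
            k += 1
    return child

def mutate(perm, rate=0.1):
    """Swap two genes with a small probability."""
    perm = perm[:]
    if random.random() < rate:
        i, j = random.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
    return perm

def permutation_ga(dist, pop_size=50, generations=200, seed=0):
    random.seed(seed)
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: tour_length(p, dist))
        survivors = pop[: pop_size // 2]                 # truncation survival
        children = [mutate(order_crossover(*random.sample(survivors, 2)))
                    for _ in range(pop_size - len(survivors))]
        pop = survivors + children
    return min(pop, key=lambda p: tour_length(p, dist))
```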