50 results for Meta heuristic algorithm
in University of Queensland eSpace - Australia
Abstract:
Hannenhalli and Pevzner developed the first polynomial-time algorithm for the combinatorial problem of sorting signed genomic data. Their algorithm computes the minimum number of reversals required to rearrange one genome into another when there is no gene duplication. In this paper, we show how to extend the Hannenhalli-Pevzner approach to genomes with multigene families. We propose a new heuristic algorithm to compute the reversal distance between two genomes with multigene families via the concept of binary integer programming without removing gene duplicates. The experimental results on simulated and real biological data demonstrate that the proposed algorithm is able to find the reversal distance accurately. ©2005 IEEE
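The full Hannenhalli-Pevzner distance requires analysing cycles in the breakpoint graph; as a minimal, purely illustrative sketch (not the paper's binary-integer-programming method, and with invented function names), the snippet below counts breakpoints of a signed permutation, which gives a simple lower bound on the reversal distance because a single reversal can remove at most two breakpoints.

```python
def signed_breakpoints(perm):
    """Count breakpoints of a signed permutation of 1..n.

    A breakpoint is a position i where perm[i+1] - perm[i] != 1,
    after framing the permutation with 0 and n+1. Each reversal
    removes at most two breakpoints, so breakpoints/2 is a lower
    bound on the reversal distance.
    """
    n = len(perm)
    framed = [0] + list(perm) + [n + 1]
    return sum(1 for a, b in zip(framed, framed[1:]) if b - a != 1)

# Example: genome (+3, -1, +2) relative to the identity (+1, +2, +3)
print(signed_breakpoints([3, -1, 2]))  # 4 breakpoints -> reversal distance >= 2
```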
Abstract:
The problem of designing spatially cohesive nature reserve systems that meet biodiversity objectives is formulated as a nonlinear integer programming problem. The multiobjective function minimises a combination of boundary length, area and failed representation of the biological attributes we are trying to conserve. The task is to reserve a subset of sites that best meet this objective. We use data on the distribution of habitats in the Northern Territory, Australia, to show how simulated annealing and a greedy heuristic algorithm can be used to generate good solutions to such large reserve design problems, and to compare the effectiveness of these methods.
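A rough sketch of the simulated-annealing approach described above (not the authors' implementation; the site data structure, weights and move scheme are all invented for the example) could look like this:

```python
import math
import random

def anneal_reserve(sites, adjacency, targets, w_area=1.0, w_boundary=1.0,
                   w_shortfall=10.0, steps=20000, t0=5.0):
    """Toy simulated annealing for reserve selection.

    sites:     {site_id: {"area": float, "perimeter": float,
                          "features": {feature: amount}}}
    adjacency: {site_id: set of neighbouring site_ids};
               shared boundaries are treated as unit length.
    targets:   {feature: amount that must be represented}.
    The objective is a weighted sum of reserved area, exposed boundary
    length and shortfall against the representation targets.
    """
    site_ids = list(sites)

    def cost(selected):
        area = sum(sites[s]["area"] for s in selected)
        # a boundary shared by two selected sites is subtracted once from
        # each of their perimeters, i.e. it becomes fully internal
        boundary = sum(sites[s]["perimeter"] - len(adjacency[s] & selected)
                       for s in selected)
        held = {}
        for s in selected:
            for f, amt in sites[s]["features"].items():
                held[f] = held.get(f, 0.0) + amt
        shortfall = sum(max(0.0, t - held.get(f, 0.0)) for f, t in targets.items())
        return w_area * area + w_boundary * boundary + w_shortfall * shortfall

    current = set(random.sample(site_ids, max(1, len(site_ids) // 2)))
    cur_cost = cost(current)
    best, best_cost = set(current), cur_cost
    for step in range(steps):
        temp = t0 * (1.0 - step / steps) + 1e-9   # linear cooling schedule
        candidate = set(current)
        candidate ^= {random.choice(site_ids)}    # add or drop one random site
        cand_cost = cost(candidate)
        if cand_cost < cur_cost or random.random() < math.exp((cur_cost - cand_cost) / temp):
            current, cur_cost = candidate, cand_cost
            if cur_cost < best_cost:
                best, best_cost = set(current), cur_cost
    return best, best_cost
```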
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. Here, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. The minimum-order stable linear predictor is then obtained by solving an optimization problem. The popular genetic algorithm approach is adopted since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
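As a rough sketch of the kind of genetic-algorithm search mentioned above (a generic skeleton, not the authors' formulation; the encoding of filter coefficients and the stability-aware fitness function are problem-specific and omitted here):

```python
import random

def genetic_minimise(fitness, n_bits, pop_size=40, generations=200,
                     p_cross=0.8, p_mut=0.02):
    """Generic genetic-algorithm skeleton: tournament selection,
    one-point crossover and bit-flip mutation over fixed-length
    bit-strings. `fitness` maps a list of 0/1 genes to a value
    to be minimised (e.g. prediction error plus a stability penalty)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # tournament selection of two parents
            parents = [min(random.sample(pop, 3), key=fitness) for _ in range(2)]
            if random.random() < p_cross:
                cut = random.randrange(1, n_bits)
                child = parents[0][:cut] + parents[1][cut:]
            else:
                child = parents[0][:]
            child = [b ^ (random.random() < p_mut) for b in child]  # bit-flip mutation
            new_pop.append(child)
        pop = new_pop
        best = min(pop + [best], key=fitness)
    return best
```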
Abstract:
Qu-Prolog is an extension of Prolog which performs meta-level computations over object languages, such as predicate calculi and lambda-calculi, which have object-level variables, and quantifier or binding symbols creating local scopes for those variables. As in Prolog, the instantiable (meta-level) variables of Qu-Prolog range over object-level terms, and in addition other Qu-Prolog syntax denotes the various components of the object-level syntax, including object-level variables. Further, the meta-level operation of substitution into object-level terms is directly represented by appropriate Qu-Prolog syntax. Again as in Prolog, the driving mechanism in Qu-Prolog computation is a form of unification, but this is substantially more complex than for Prolog because of Qu-Prolog's greater generality, and especially because substitution operations are evaluated during unification. In this paper, the Qu-Prolog unification algorithm is specified, formalised and proved correct. Further, the analysis of the algorithm is carried out in a framework which straightforwardly allows the 'completeness' of the algorithm to be proved: though fully explicit answers to unification problems are not always provided, no information is lost in the unification process.
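For background only, the sketch below implements ordinary first-order (Prolog-style) unification, which is the base case that Qu-Prolog generalises; it does not model quantified object-level terms or the explicit substitution operations evaluated during Qu-Prolog unification, and the term encoding is invented for the example.

```python
def unify(t1, t2, subst=None):
    """Standard first-order unification. Terms are strings (variables start
    with an uppercase letter, constants with lowercase) or tuples
    ('functor', arg1, arg2, ...). Returns a substitution dict, or None
    if the terms do not unify."""
    if subst is None:
        subst = {}

    def walk(t):  # follow variable bindings to their current value
        while isinstance(t, str) and t[0].isupper() and t in subst:
            t = subst[t]
        return t

    def occurs(v, t):  # occurs check to avoid cyclic bindings
        t = walk(t)
        if t == v:
            return True
        return isinstance(t, tuple) and any(occurs(v, a) for a in t[1:])

    t1, t2 = walk(t1), walk(t2)
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1[0].isupper():
        return None if occurs(t1, t2) else {**subst, t1: t2}
    if isinstance(t2, str) and t2[0].isupper():
        return None if occurs(t2, t1) else {**subst, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and t1[0] == t2[0] and len(t1) == len(t2):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

# f(X, g(b)) unifies with f(a, Y), giving {'X': 'a', 'Y': ('g', 'b')}
print(unify(('f', 'X', ('g', 'b')), ('f', 'a', 'Y')))
```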
Abstract:
This paper reviews the attitudes, skills and knowledge that engineering innovators should possess. It critically analyses and compares sets of graduate attributes from the USA, Australia and Malaysia in terms of which of these relate to the ability to innovate. Innovation can be described as an integrative, meta attribute that overarches most of the other graduate attributes. Because of the "graduate attribute paradox", it is shown that graduates who meet the attributes stated by industry do not necessarily satisfy industry's requirements. It is argued that the culture of the engineering school is an important influence on fostering innovation in engineers.
Abstract:
Recently Adams and Bischof (1994) proposed a novel region growing algorithm for segmenting intensity images. The inputs to the algorithm are the intensity image and a set of seeds - individual points or connected components - that identify the individual regions to be segmented. The algorithm grows these seed regions until all of the image pixels have been assimilated. Unfortunately the algorithm is inherently dependent on the order of pixel processing. This means, for example, that raster order processing and anti-raster order processing do not, in general, lead to the same tessellation. In this paper we propose an improved seeded region growing algorithm that retains the advantages of the Adams and Bischof algorithm - fast execution, robust segmentation, and no tuning parameters - but is pixel order independent. (C) 1997 Elsevier Science B.V.
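A compact sketch of the basic priority-queue seeded-region-growing loop is given below (in the spirit of the Adams and Bischof scheme; the order-independent handling of ties proposed in this paper is not modelled, and the function and variable names are illustrative):

```python
import heapq
import numpy as np

def seeded_region_growing(image, seeds):
    """Basic seeded region growing sketch.

    image: 2-D numpy array of intensities.
    seeds: {label: list of (row, col) seed pixels}, labels must be >= 1.
    Unlabelled pixels are assimilated in order of |intensity - region mean|
    using a priority queue; returns an integer label image (0 = unlabelled).
    """
    labels = np.zeros(image.shape, dtype=int)
    sums, counts, heap = {}, {}, []

    def push_neighbours(r, c, lab):
        mean = sums[lab] / counts[lab]
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            rr, cc = r + dr, c + dc
            if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and labels[rr, cc] == 0:
                heapq.heappush(heap, (abs(float(image[rr, cc]) - mean), rr, cc, lab))

    for lab, pixels in seeds.items():
        sums[lab], counts[lab] = 0.0, 0
        for (r, c) in pixels:
            labels[r, c] = lab
            sums[lab] += float(image[r, c])
            counts[lab] += 1
    for lab, pixels in seeds.items():
        for (r, c) in pixels:
            push_neighbours(r, c, lab)

    while heap:
        _, r, c, lab = heapq.heappop(heap)
        if labels[r, c] != 0:
            continue  # already assimilated by another region
        labels[r, c] = lab
        sums[lab] += float(image[r, c])
        counts[lab] += 1
        push_neighbours(r, c, lab)
    return labels
```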
Abstract:
This systematic review aimed to collate randomized controlled trials (RCTs) of various interventions used to treat tardive dyskinesia (TD) and, where appropriate, to combine the data for meta-analysis. Clinical trials were identified by electronic searches, handsearches and contact with principal investigators. Data were extracted independently by two reviewers, for outcomes related to improvement, deterioration, side-effects and drop-out rates. Data were pooled using the Mantel-Haenszel odds ratio (fixed effect model). For treatments that had significant effects, the number needed to treat (NNT) was calculated. From 296 controlled clinical trials, data were extracted from 47 trials. For most interventions, we could identify no RCT-derived evidence of efficacy. A meta-analysis showed that baclofen, deanol and diazepam were no more effective than a placebo. Single RCTs demonstrated a lack of evidence of any effect for bromocriptine, ceruletide, clonidine, estrogen, gamma-linolenic acid, hydergine, lecithin, lithium, progabide, selegiline and tetrahydroisoxazolopyridinol. The meta-analysis found that five interventions were effective: L-dopa, oxypertine, sodium valproate, tiapride and vitamin E; neuroleptic reduction was marginally significant. Data from single RCTs revealed that insulin, alpha-methyldopa and reserpine were more effective than a placebo. There was a significantly increased risk of adverse events associated with baclofen, deanol, L-dopa, oxypertine and reserpine. Meta-analysis of the impact of placebo (n=485) showed that 37.3% of participants showed an improvement. Interpretation of this systematic review requires caution as the individual trials identified tended to have small sample sizes. For many compounds, data from only one trial were available, and where meta-analyses were possible, these were based on a small number of trials. Despite these concerns, the review facilitated the interpretation of the large and diverse range of treatments used for TD. Clinical recommendations for the treatment of TD are made, based on the availability of RCT-derived evidence, the strength of that evidence and the presence of adverse effects. (C) 1999 Elsevier Science B.V. All rights reserved.
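As an illustration of the two pooling statistics named above (the Mantel-Haenszel fixed-effect odds ratio and the number needed to treat), the sketch below applies them to invented 2x2 trial counts; it is not the review's data or software.

```python
def mantel_haenszel_or(tables):
    """Fixed-effect Mantel-Haenszel pooled odds ratio.

    tables: list of (a, b, c, d) 2x2 counts per trial, where a/b are
    events/non-events on treatment and c/d are events/non-events on placebo.
    """
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

def number_needed_to_treat(p_treated, p_control):
    """NNT = 1 / absolute difference in event (improvement) rates."""
    return 1.0 / abs(p_treated - p_control)

# Invented counts for two small trials of one intervention
trials = [(12, 8, 6, 14), (9, 11, 5, 15)]
print(round(mantel_haenszel_or(trials), 2))
print(round(number_needed_to_treat(0.53, 0.37), 1))  # about 6 patients per extra improvement
```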
Abstract:
Motivation: Prediction methods for identifying binding peptides could minimize the number of peptides required to be synthesized and assayed, and thereby facilitate the identification of potential T-cell epitopes. We developed a bioinformatic method for the prediction of peptide binding to MHC class II molecules. Results: Experimental binding data and expert knowledge of anchor positions and binding motifs were combined with an evolutionary algorithm (EA) and an artificial neural network (ANN): binding data extraction --> peptide alignment --> ANN training and classification. This method, termed PERUN, was implemented for the prediction of peptides that bind to HLA-DR4(B1*0401). The respective positive predictive values of PERUN predictions of high-, moderate-, low- and zero-affinity binders were assessed as 0.8, 0.7, 0.5 and 0.8 by cross-validation, and 1.0, 0.8, 0.3 and 0.7 by experimental binding. This illustrates the synergy between experimentation and computer modeling, and its application to the identification of potential immunotherapeutic peptides.
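The per-class positive predictive values quoted above are straightforward to compute once each peptide has a predicted and a measured affinity class; a minimal sketch with toy labels (not PERUN output) is:

```python
def positive_predictive_values(predicted, actual,
                               classes=("high", "moderate", "low", "zero")):
    """PPV per affinity class: of the peptides predicted in a class, the
    fraction whose measured binding falls in the same class.
    predicted/actual are parallel lists of class labels."""
    ppv = {}
    for cls in classes:
        idx = [i for i, p in enumerate(predicted) if p == cls]
        ppv[cls] = sum(actual[i] == cls for i in idx) / len(idx) if idx else None
    return ppv

# Toy labels for six peptides, purely illustrative
pred = ["high", "high", "moderate", "low", "zero", "zero"]
true = ["high", "moderate", "moderate", "zero", "zero", "zero"]
print(positive_predictive_values(pred, true))
```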
Abstract:
BACKGROUND. Management of patients with ductal carcinoma in situ (DCIS) is a dilemma, as mastectomy provides nearly a 100% cure rate but at the expense of physical and psychologic morbidity. It would be helpful if we could predict which patients with DCIS are at sufficiently high risk of local recurrence after conservative surgery (CS) alone to warrant postoperative radiotherapy (RT) and which patients are at sufficient risk of local recurrence after CS + RT to warrant mastectomy. The authors reviewed the published studies and identified the factors that may be predictive of local recurrence after management by mastectomy, CS alone, or CS + RT. METHODS. The authors examined patient, tumor, and treatment factors as potential predictors for local recurrence and estimated the risks of recurrence based on a review of published studies. They examined the effects of patient factors (age at diagnosis and family history), tumor factors (subtype of DCIS, grade, tumor size, necrosis, and margins), and treatment (mastectomy, CS alone, and CS + RT). The 95% confidence intervals (CI) of the recurrence rates for each of the studies were calculated for subtype, grade, and necrosis, using the exact binomial method; the summary recurrence rate and 95% CI for each treatment category were calculated by quantitative meta-analysis using the fixed and random effects models applied to proportions. RESULTS. Meta-analysis yielded a summary recurrence rate of 22.5% (95% CI = 16.9-28.2) for studies employing CS alone, 8.9% (95% CI = 6.8-11.0) for CS + RT, and 1.4% (95% CI = 0.7-2.1) for studies involving mastectomy alone. These summary figures indicate a clear and statistically significant separation in recurrence rates, and therefore outcome, between the treatment categories, despite the likelihood that the patients who underwent CS alone had smaller, possibly low grade lesions with clear margins. Patients with the risk factors of necrosis, high grade cytologic features, or comedo subtype were found to derive the greatest improvement in local control with the addition of RT to CS. Local recurrence among patients treated by CS alone is approximately 20%, and one-half of the recurrences are invasive cancers. For most patients, RT reduces the risk of recurrence after CS alone by at least 50%. The differences in local recurrence between CS alone and CS + RT are most apparent for those patients with high grade tumors, DCIS with necrosis, comedo subtype, or DCIS with close or positive surgical margins. CONCLUSIONS. The authors recommend that radiation be added to CS if patients with DCIS who also have the risk factors for local recurrence choose breast conservation over mastectomy. The patients who may be suitable for CS alone outside of a clinical trial may be those who have low grade lesions with little or no necrosis, and with clear surgical margins. Use of the summary statistics when discussing outcomes with patients may help the patient make treatment decisions. Cancer 1999;85:616-28. (C) 1999 American Cancer Society.
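For the statistics quoted above, a brief sketch of an exact binomial (Clopper-Pearson) confidence interval for a single study's recurrence rate, together with a simple sample-size-weighted pooling of proportions (a simplification of the fixed- and random-effects models used in the paper), follows; the counts are invented.

```python
from scipy.stats import beta

def exact_binomial_ci(recurrences, n, alpha=0.05):
    """Clopper-Pearson (exact binomial) confidence interval for a
    recurrence proportion, as applied per study in the abstract."""
    lower = beta.ppf(alpha / 2, recurrences, n - recurrences + 1) if recurrences > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, recurrences + 1, n - recurrences) if recurrences < n else 1.0
    return lower, upper

def pooled_rate(studies):
    """Crude pooling of proportions, weighting each study by its sample
    size; a simplification of the meta-analytic models cited above."""
    total_n = sum(n for _, n in studies)
    return sum(k for k, _ in studies) / total_n

# Invented (recurrences, patients) counts for three CS-alone series
studies = [(18, 80), (25, 110), (12, 60)]
print(exact_binomial_ci(18, 80))
print(pooled_rate(studies))
```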
Abstract:
We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
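The suboptimality measure defined above is simply the percentage by which a heuristic's reserve cost exceeds the optimal (minimum possible) cost; for example (with invented costs):

```python
def suboptimality(heuristic_cost, optimal_cost):
    """Percentage increase of a heuristic solution's cost (number or
    total area of selected sites) over the optimal cost."""
    return 100.0 * (heuristic_cost - optimal_cost) / optimal_cost

print(suboptimality(heuristic_cost=54, optimal_cost=50))  # 8.0, i.e. 8% above optimal
```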
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to an element (containing this point) in another mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This certainly minimizes the number of elements to which the inverse mapping is applied. In this regard, the present algorithm is very effective and efficient. (2) Analytical solutions for the local coordinates of any point in a four-node quadrilateral element, which are derived in a rigorous mathematical manner in the context of this paper, make it possible to carry out the inverse mapping process very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with the original solution and therefore guarantees an interpolated solution of extremely high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated through a challenging problem. The related results from the test problem have demonstrated the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
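The paper derives closed-form expressions for the local coordinates of a point in a 4-node bilinear quadrilateral; those expressions are not reproduced here, but the same inverse-mapping step can be sketched numerically with a Newton iteration on the bilinear shape functions (illustrative code, not the authors' analytical solution):

```python
import numpy as np

def inverse_map_quad4(nodes, point, tol=1e-10, max_iter=25):
    """Local coordinates (xi, eta) of `point` in a 4-node bilinear
    quadrilateral whose corners `nodes` are ordered counter-clockwise,
    corresponding to (-1,-1), (1,-1), (1,1), (-1,1) in the parent element."""
    nodes = np.asarray(nodes, dtype=float)      # shape (4, 2)
    target = np.asarray(point, dtype=float)
    local = np.zeros(2)                         # start at the element centre
    for _ in range(max_iter):
        s, t = local
        shape = 0.25 * np.array([(1 - s) * (1 - t), (1 + s) * (1 - t),
                                 (1 + s) * (1 + t), (1 - s) * (1 + t)])
        d_ds = 0.25 * np.array([-(1 - t), (1 - t), (1 + t), -(1 + t)])
        d_dt = 0.25 * np.array([-(1 - s), -(1 + s), (1 + s), (1 - s)])
        residual = shape @ nodes - target                     # global position error
        jac = np.column_stack((d_ds @ nodes, d_dt @ nodes))   # d(x,y)/d(s,t)
        step = np.linalg.solve(jac, -residual)
        local += step
        if np.linalg.norm(step) < tol:
            break
    return local   # the point lies inside the element iff both values are in [-1, 1]

# Rectangle with corners (0,0), (2,0), (2,1), (0,1); point (0.5, 0.25) maps to about (-0.5, -0.5)
print(inverse_map_quad4([(0, 0), (2, 0), (2, 1), (0, 1)], (0.5, 0.25)))
```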
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A sputum smear microscopy plus trial-of-antibiotics algorithm, applied to a selected group of tuberculosis suspects, may increase diagnostic accuracy in district hospitals in developing countries.
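The performance figures above come from a standard 2x2 comparison of the algorithm's result against mycobacterial culture; a minimal sketch of the definitions, with illustrative counts chosen only to roughly match the quoted percentages (not the study's raw table), is:

```python
def diagnostic_measures(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 table of the
    algorithm's result (positive/negative) versus culture (the reference)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }

# Illustrative counts only, not the Hlabisa study data
print(diagnostic_measures(tp=190, fp=11, fn=24, tn=55))
```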