45 results for Meta-heuristics algorithms
Abstract:
We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability), and there was some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
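The abstract does not name the heuristics it tested; as a minimal sketch, a greedy richness-based heuristic of the kind commonly benchmarked against optimal reserve selection can be written as follows (site and feature names are invented for illustration):

```python
# Greedy reserve selection heuristic (a common baseline; not necessarily one
# of the three heuristics used in the study). Sites are chosen one at a time,
# each time taking the site that covers the most still-unrepresented features.
def greedy_reserve_selection(sites):
    """sites: dict mapping site id -> set of feature ids present at that site.
    Returns a list of selected site ids that together cover all features."""
    remaining = set().union(*sites.values())   # features not yet represented
    selected = []
    while remaining:
        # pick the site covering the most unrepresented features
        best = max(sites, key=lambda s: len(sites[s] & remaining))
        gain = sites[best] & remaining
        if not gain:            # no site adds coverage; data are inconsistent
            break
        selected.append(best)
        remaining -= gain
    return selected

sites = {
    "A": {"wetland", "forest"},
    "B": {"forest"},
    "C": {"grassland", "wetland"},
    "D": {"grassland"},
}
print(greedy_reserve_selection(sites))   # → ['A', 'C'] covers all features
```

Efficiency in the study's sense is then simply the length (or total area) of the returned list, compared against the optimal minimum.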
Abstract:
Despite many successes of conventional DNA sequencing methods, some DNAs remain difficult or impossible to sequence. Unsequenceable regions occur in the genomes of many biologically important organisms, including the human genome. Such regions range in length from tens to millions of bases, and may contain valuable information such as the sequences of important genes. The authors have recently developed a technique that renders a wide range of problematic DNAs amenable to sequencing. The technique is known as sequence analysis via mutagenesis (SAM). This paper presents a number of algorithms for analysing and interpreting data generated by this technique.
Abstract:
This paper reviews the attitudes, skills and knowledge that engineering innovators should possess. It critically analyses and compares sets of graduate attributes from the USA, Australia and Malaysia in terms of which of these relate to the ability to innovate. Innovation can be described as an integrative, meta-attribute that overarches most of the other graduate attributes. Due to the “graduate attribute paradox”, it is shown that graduates who meet the stated attributes do not necessarily satisfy the requirements of industry. It is argued that the culture of the engineering school is an important influence on fostering innovation in engineers.
Abstract:
The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices and has never been applied to eigenanalysis for power system small signal stability. This paper analyzes differences between the BR and the QR algorithms, with performance compared in terms of CPU time (based on stopping criteria) and storage requirement. The BR algorithm utilizes accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in eigenanalysis tasks for 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is more efficient for large-scale power system small signal stability eigenanalysis.
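The BR algorithm itself is not available in standard numerical libraries; as a baseline sketch, the QR-algorithm pipeline it is compared against can be exercised on an upper Hessenberg matrix with NumPy (the matrix below is random, a stand-in rather than a power-system state matrix):

```python
import numpy as np

# Build an upper Hessenberg matrix (zero below the first subdiagonal) and
# compute all its eigenvalues. numpy.linalg.eigvals calls LAPACK, which uses
# the QR algorithm internally -- the method the BR algorithm is benchmarked
# against in the paper.
rng = np.random.default_rng(0)
A = rng.standard_normal((6, 6))
H = np.triu(A, k=-1)            # keep the first subdiagonal: Hessenberg form
eigs = np.linalg.eigvals(H)     # QR algorithm via LAPACK

# Sanity checks: eigenvalue sum equals the trace, product equals the determinant.
assert np.isclose(eigs.sum(), np.trace(H))
assert np.isclose(eigs.prod(), np.linalg.det(H))
```

Small-signal stability analysis then reduces to inspecting the real parts of these eigenvalues: the system is stable when all lie in the left half-plane.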
Abstract:
This systematic review aimed to collate randomized controlled trials (RCTs) of various interventions used to treat tardive dyskinesia (TD) and, where appropriate, to combine the data for meta-analysis. Clinical trials were identified by electronic searches, handsearches and contact with principal investigators. Data were extracted independently by two reviewers, for outcomes related to improvement, deterioration, side-effects and drop-out rates. Data were pooled using the Mantel-Haenszel Odds Ratio (fixed effect model). For treatments that had significant effects, the number needed to treat (NNT) was calculated. From 296 controlled clinical trials, data were extracted from 47 trials. For most interventions, we could identify no RCT-derived evidence of efficacy. A meta-analysis showed that baclofen, deanol and diazepam were no more effective than a placebo. Single RCTs demonstrated a lack of evidence of any effect for bromocriptine, ceruletide, clonidine, estrogen, gamma linolenic acid, hydergine, lecithin, lithium, progabide, selegiline and tetrahydroisoxazolopyridinol. The meta-analysis found that five interventions were effective: L-dopa, oxypertine, sodium valproate, tiapride and vitamin E; neuroleptic reduction was marginally significant. Data from single RCTs revealed that insulin, alpha methyl dopa and reserpine were more effective than a placebo. There was a significantly increased risk of adverse events associated with baclofen, deanol, L-dopa, oxypertine and reserpine. Meta-analysis of the impact of placebo (n=485) showed that 37.3% of participants showed an improvement. Interpretation of this systematic review requires caution as the individual trials identified tended to have small sample sizes. For many compounds, data from only one trial were available, and where meta-analyses were possible, these were based on a small number of trials.
Despite these concerns, the review facilitated the interpretation of the large and diverse range of treatments used for TD. Clinical recommendations for the treatment of TD are made, based on the availability of RCT-derived evidence, the strength of that evidence and the presence of adverse effects. (C) 1999 Elsevier Science B.V. All rights reserved.
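The fixed-effect Mantel-Haenszel pooling used above can be sketched in a few lines. The trial counts below are invented for illustration, not taken from the review:

```python
# Fixed-effect Mantel-Haenszel pooled odds ratio across several 2x2 tables.
# OR_MH = sum(a_i * d_i / n_i) / sum(b_i * c_i / n_i), where each trial is
# (a, b, c, d) = (treated events, treated non-events,
#                 control events, control non-events) and n_i is the total.
def mantel_haenszel_or(tables):
    num = sum(a * d / (a + b + c + d) for a, b, c, d in tables)
    den = sum(b * c / (a + b + c + d) for a, b, c, d in tables)
    return num / den

trials = [(12, 8, 6, 14), (20, 10, 11, 19)]   # hypothetical trial counts
print(round(mantel_haenszel_or(trials), 2))   # → 3.47
```

Each trial contributes in proportion to its size, which is why the review's small-trial caveat matters: a pooled estimate dominated by one or two small tables is fragile.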
Abstract:
Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model. (C) 1998 Elsevier Science S.A.
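The central difference recurrence that these subcycling schemes modify can be sketched for a single undamped oscillator (the values below are illustrative; the classical stability limit is dt < 2/omega):

```python
import math

# Explicit central difference integration of u'' = -(k/m) u, the base scheme
# that subcycling algorithms split across multiple time steps.
# Recurrence: u_{n+1} = 2 u_n - u_{n-1} + dt^2 * a_n, started with a
# fictitious step u_{-1} = u_0 - dt v_0 + (dt^2 / 2) a_0.
def central_difference(m, k, u0, v0, dt, steps):
    a0 = -k / m * u0
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * a0
    u = u0
    for _ in range(steps):
        a = -k / m * u
        u_next = 2.0 * u - u_prev + dt**2 * a
        u_prev, u = u, u_next
    return u

m, k = 1.0, 4.0                  # omega = 2 rad/s, so dt must stay below 1.0
u_end = central_difference(m, k, 1.0, 0.0, 0.01, 100)   # integrate to t = 1
# u_end should closely track the exact solution cos(omega * t) = cos(2.0)
print(u_end, math.cos(2.0))
```

Subcycling assigns a smaller dt to the stiff part of the mesh and a larger one elsewhere; the paper's point is that this splitting can erode the clean stability limit of the recurrence above.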
Abstract:
Extended gcd calculation has a long history and plays an important role in computational number theory and linear algebra. Recent results have shown that finding optimal multipliers in extended gcd calculations is difficult. We present an algorithm which uses lattice basis reduction to produce small integer multipliers x(1), ..., x(m) for the equation s = gcd(s(1), ..., s(m)) = x(1)s(1) + ... + x(m)s(m), where s(1), ..., s(m) are given integers. The method generalises to produce small unimodular transformation matrices for computing the Hermite normal form of an integer matrix.
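The lattice-reduction step is precisely what makes the paper's multipliers small; a plain iterated extended Euclid, sketched below, produces valid but generally much larger multipliers for the same equation:

```python
from math import gcd

# Iterated extended Euclid: returns g = gcd(s_1, ..., s_m) together with
# integer multipliers x_i such that g = sum(x_i * s_i). Unlike the
# lattice-basis-reduction method of the paper, the x_i are not minimised.
def egcd(a, b):
    if b == 0:
        return a, 1, 0
    g, x, y = egcd(b, a % b)
    return g, y, x - (a // b) * y

def multi_egcd(nums):
    g, coeffs = nums[0], [1]
    for n in nums[1:]:
        g, x, y = egcd(g, n)
        coeffs = [c * x for c in coeffs] + [y]
    return g, coeffs

s = [30, 42, 70]
g, x = multi_egcd(s)
print(g, x)                                    # → 2 [36, -24, -1]
assert g == gcd(*s) == sum(c * n for c, n in zip(x, s))
```

Even on this tiny example the multipliers (36, -24, -1) are far from the smallest possible, which is the gap the lattice-reduction approach closes.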
Abstract:
BACKGROUND. Management of patients with ductal carcinoma in situ (DCIS) is a dilemma, as mastectomy provides nearly a 100% cure rate but at the expense of physical and psychologic morbidity. It would be helpful if we could predict which patients with DCIS are at sufficiently high risk of local recurrence after conservative surgery (CS) alone to warrant postoperative radiotherapy (RT) and which patients are at sufficient risk of local recurrence after CS + RT to warrant mastectomy. The authors reviewed the published studies and identified the factors that may be predictive of local recurrence after management by mastectomy, CS alone, or CS + RT. METHODS. The authors examined patient, tumor, and treatment factors as potential predictors for local recurrence and estimated the risks of recurrence based on a review of published studies. They examined the effects of patient factors (age at diagnosis and family history), tumor factors (subtype of DCIS, grade, tumor size, necrosis, and margins), and treatment (mastectomy, CS alone, and CS + RT). The 95% confidence intervals (CI) of the recurrence rates for each of the studies were calculated for subtype, grade, and necrosis, using the exact binomial; the summary recurrence rate and 95% CI for each treatment category were calculated by quantitative meta-analysis using the fixed and random effects models applied to proportions. RESULTS. Meta-analysis yielded a summary recurrence rate of 22.5% (95% CI = 16.9-28.2) for studies employing CS alone, 8.9% (95% CI = 6.8-11.0) for CS + RT, and 1.4% (95% CI = 0.7-2.1) for studies involving mastectomy alone. These summary figures indicate a clear and statistically significant separation in recurrence rates, and therefore outcomes, between the treatment categories, despite the likelihood that the patients who underwent CS alone had smaller, possibly low grade, lesions with clear margins.
The patients with risk factors of presence of necrosis, high grade cytologic features, or comedo subtype were found to derive the greatest improvement in local control with the addition of RT to CS. Local recurrence among patients treated by CS alone is approximately 20%, and one-half of the recurrences are invasive cancers. For most patients, RT reduces the risk of recurrence after CS alone by at least 50%. The differences in local recurrence between CS alone and CS + RT are most apparent for those patients with high grade tumors or DCIS with necrosis, or of the comedo subtype, or DCIS with close or positive surgical margins. CONCLUSIONS. The authors recommend that radiation be added to CS if patients with DCIS who also have the risk factors for local recurrence choose breast conservation over mastectomy. The patients who may be suitable for CS alone outside of a clinical trial may be those who have low grade lesions with little or no necrosis, and with clear surgical margins. Use of the summary statistics when discussing outcomes with patients may help the patient make treatment decisions. Cancer 1999;85:616-28. (C) 1999 American Cancer Society.
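The per-study "exact binomial" intervals mentioned above are conventionally Clopper-Pearson intervals, which can be sketched with SciPy (the counts below are illustrative, chosen to echo the 22.5% summary rate rather than taken from any particular study):

```python
from scipy.stats import beta

# Clopper-Pearson "exact" 95% CI for a binomial proportion, via the
# beta-distribution quantile identities:
#   lower = Beta(alpha/2;     k,     n - k + 1)
#   upper = Beta(1 - alpha/2; k + 1, n - k)
def exact_binomial_ci(k, n, alpha=0.05):
    lo = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lo, hi

lo, hi = exact_binomial_ci(9, 40)     # e.g. 9 recurrences among 40 patients
print(f"rate {9 / 40:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```

The interval is deliberately conservative (at least nominal coverage), which matters for the small per-study counts typical of DCIS series.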
Abstract:
Objective: To review the epidemiological evidence for the association between passive smoking and lung cancer. Method: Primary studies and meta-analyses examining the relationship between passive smoking and lung cancer were identified through a computerised literature search of Medline and Embase, secondary references, and experts in the field of passive smoking. Primary studies meeting the inclusion criteria were meta-analysed. Results: From 1981 to the end of 1999 there have been 76 primary epidemiological studies of passive smoking and lung cancer, and 20 meta-analyses. There were 43 primary studies that met the inclusion criteria for this meta-analysis; more than in previous assessments. The pooled relative risk (RR) for never-smoking women exposed to environmental tobacco smoke (ETS) from spouses, compared with unexposed never-smoking women, was 1.29 (95% CI 1.17-1.43). Sequential cumulative meta-analysed results for each year from 1981 were calculated: since 1992 the RR has been greater than 1.25. For Western industrialised countries the RR for never-smoking women exposed to ETS, compared with unexposed never-smoking women, was 1.21 (95% CI 1.10-1.33). Previously published international spousal meta-analyses have all produced statistically significant RRs greater than 1.17. Conclusions: The abundance of evidence in this paper, and the consistency of findings across domestic and workplace primary studies, dosimetric extrapolations and meta-analyses, clearly indicates that non-smokers exposed to ETS are at increased risk of lung cancer. Implications: The recommended public health policy is for a total ban on smoking in enclosed public places and work sites.
Abstract:
We aimed to determine the effectiveness of the vaginally administered spermicide nonoxynol-9 (N-9) among women for the prevention of HIV and other sexually transmitted infections (STIs). We did a systematic review of randomised controlled trials. Nine such trials including 5096 women, predominantly sex workers, comparing N-9 with placebo or no treatment, were included. Primary outcomes were new HIV infection, new episodes of various STIs, and genital lesions. Five trials included HIV and nine included STI outcomes, and all but one (2% of the data) contributed to the meta-analysis. Overall, relative risks of HIV infection (1.12, 95% confidence interval 0.88-1.42), gonorrhoea (0.91, 0.67-1.24), chlamydia (0.88, 0.77-1.01), cervical infection (1.01, 0.84-1.22), trichomoniasis (0.84, 0.69-1.02), bacterial vaginosis (0.88, 0.74-1.04) and candidiasis (0.97, 0.84-1.12) were not significantly different in the N-9 and placebo or no treatment groups. Genital lesions were more common in the N-9 group (1.18, 1.02-1.36). Our review has found no statistically significant reduction in risk of HIV and STIs, and the confidence intervals indicate that any protection that may exist is likely to be very small. There is some evidence of harm through genital lesions. N-9 cannot be recommended for HIV and STI prevention.
Abstract:
In this paper, a genetic algorithm (GA) is applied to the optimum design of reinforced concrete liquid retaining structures, a problem with three discrete design variables: slab thickness, reinforcement diameter and reinforcement spacing. GA, being a search technique based on the mechanics of natural genetics, couples a Darwinian survival-of-the-fittest principle with a random yet structured information exchange amongst a population of artificial chromosomes. As a first step, a penalty-based strategy is entailed to transform the constrained design problem into an unconstrained problem, which is appropriate for GA application. A numerical example is then used to demonstrate the strength and capability of the GA in this problem domain. It is shown that near-optimal solutions are obtained at high convergence speed after exploring only a minute portion of the search space. The method can be extended to even more complex optimization problems in other domains.
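The penalty-based strategy described above can be sketched on a toy constrained problem (minimise x² + y² subject to x + y ≥ 1, whose optimum is x = y = 0.5). The encoding, operators and parameters below are illustrative, not those of the paper:

```python
import random

# Penalty-based GA sketch: the constraint x + y >= 1 is folded into the
# objective as a quadratic penalty, turning the constrained problem into an
# unconstrained one suitable for a GA.
def fitness(ind):
    x, y = ind
    penalty = 1000.0 * max(0.0, 1.0 - (x + y)) ** 2   # penalise x + y < 1
    return x * x + y * y + penalty

def evolve(pop_size=60, gens=120, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(-2, 2), rng.uniform(-2, 2)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]                 # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            child = tuple((u + v) / 2 + rng.gauss(0.0, 0.05)   # blend + mutate
                          for u, v in zip(a, b))
            children.append(child)
        pop = parents + children                       # elitist replacement
    return min(pop, key=fitness)

best = evolve()
print(best, fitness(best))    # converges near the optimum x = y = 0.5
```

Because the survivors are carried over unchanged, the best fitness is non-increasing across generations, mirroring the fast convergence the paper reports after exploring only a small fraction of the search space.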