959 results for efficient algorithms
Abstract:
Market-based transmission expansion planning gives investors information on where the most cost-efficient place to invest is, and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion of Expected Economical Loss (EEL) is proposed as an index to evaluate a system's overall expected economical losses during operation in a competitive market. It stands on both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. Consequently, it enables planners to assess the worth of providing reliable services. The EEL also contains valuable information to guide investors' decisions. This index can truly reflect the random behavior of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), showing how the EEL can predict the current system bottleneck under future operational conditions and how to use EEL as one of several planning objectives to determine future optimal plans. A well-known simulation method, Monte Carlo simulation, is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
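The core of a Monte Carlo reliability index like EEL is sampling random component-outage states and averaging the resulting economic losses. A minimal sketch of that idea follows; the two-line system, the outage probabilities, and the loss figures are invented for illustration and are not the market/adequacy model of the paper.

```python
import random

def estimate_eel(loss_given_state, outage_prob, n_samples=100_000, seed=0):
    """Crude Monte Carlo estimate of an expected-economic-loss style index:
    sample random outage states of the components and average the losses.
    Toy stand-in for the full market/adequacy simulation in the abstract."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        # True means the component is on outage in this sampled system state.
        state = tuple(rng.random() < p for p in outage_prob)
        total += loss_given_state(state)
    return total / n_samples

# Hypothetical two-line system: losing line 0 costs 100, losing line 1 costs 40
# (arbitrary monetary units); the analytic expectation is 100*0.05 + 40*0.02 = 5.8.
loss = lambda s: 100.0 * s[0] + 40.0 * s[1]
eel = estimate_eel(loss, outage_prob=[0.05, 0.02])
```

With 100,000 samples the estimate should sit within a few tenths of the analytic value 5.8.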
Abstract:
Quantum computers promise to greatly increase the efficiency of solving problems such as factoring large integers, combinatorial optimization and quantum physics simulation. One of the greatest challenges now is to implement the basic quantum-computational elements in a physical system and to demonstrate that they can be reliably and scalably controlled. One of the earliest proposals for quantum computation is based on implementing a quantum bit with two optical modes containing one photon. The proposal is appealing because of the ease with which photon interference can be observed. Until now, however, it has suffered from the requirement for non-linear couplings between optical modes containing few photons. Here we show that efficient quantum computation is possible using only beam splitters, phase shifters, single-photon sources and photo-detectors. Our methods exploit feedback from photo-detectors and are robust against errors from photon loss and detector inefficiency. The basic elements are accessible to experimental investigation with current technology.
Abstract:
Despite many successes of conventional DNA sequencing methods, some DNAs remain difficult or impossible to sequence. Unsequenceable regions occur in the genomes of many biologically important organisms, including the human genome. Such regions range in length from tens to millions of bases, and may contain valuable information such as the sequences of important genes. The authors have recently developed a technique that renders a wide range of problematic DNAs amenable to sequencing. The technique is known as sequence analysis via mutagenesis (SAM). This paper presents a number of algorithms for analysing and interpreting data generated by this technique.
Abstract:
The reconstruction of a complex scene from multiple images is a fundamental problem in the field of computer vision. Volumetric methods have proven to be a strong alternative to traditional correspondence-based methods due to their flexible visibility models. In this paper we analyse existing methods for volumetric reconstruction and identify three key properties of voxel colouring algorithms: a water-tight surface model, a monotonic carving order, and causality. We present a new voxel colouring algorithm which embeds all reconstructions of a scene into a single output. While modelling exact visibility for arbitrary camera locations, Embedded Voxel Colouring removes the need for the a priori threshold selection present in previous work. An efficient implementation is given, along with results demonstrating the advantages of a posteriori threshold selection.
Abstract:
Growing economic globalisation (a means of market extension) may increase the economic vulnerability of firms in modern industries, especially those in which firms experience substantial economies of scale. The possibility is explored that globalisation activates competitive pressures that force firms into a situation where their leverage (fixed costs relative to variable costs, or overhead costs relative to operating costs, or capital intensity) rises substantially. Consequently, they become increasingly vulnerable to a sudden adverse change in economic conditions, such as a collapse in the demand for their industry's product. This is explored for monopolistically competitive markets and also for oligopolistic markets of the type considered and modelled by Sweezy using kinked demand curves. In addition, globalisation is hypothesised to induce firms to become more uniformly efficient. While this has static efficiency advantages, this lack of heterogeneity in the productive efficiency of firms can make for economic inefficiency in the adjustment of the industry to altered economic conditions. It is shown that lack of variation in the economic efficiency of firms can impede the speed of market adjustment to new equilibria and may destabilise market equilibria.
Abstract:
Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model. (C) 1998 Elsevier Science S.A.
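The central difference method that underlies these subcycling schemes is easy to sketch for a single degree of freedom. The code below is an illustrative single-DOF integrator (not the multi-time-step variant the abstract studies); the conditional stability limit dt < 2/omega it enforces is exactly the kind of restriction the subcycling algorithms must respect in each partition.

```python
import math

def central_difference(m, k, u0, v0, dt, steps):
    """Explicit central-difference integration of m*u'' + k*u = 0.
    Conditionally stable: requires dt < 2/omega with omega = sqrt(k/m)."""
    omega = math.sqrt(k / m)
    assert dt < 2.0 / omega, "time step exceeds the stability limit"
    # Start-up: approximate u(-dt) from the initial displacement,
    # velocity and acceleration a0 = -(k/m)*u0.
    u_prev = u0 - dt * v0 + 0.5 * dt**2 * (-k / m) * u0
    u = u0
    for _ in range(steps):
        u_next = 2.0 * u - u_prev + dt**2 * (-k / m) * u
        u_prev, u = u, u_next
    return u

# Unit oscillator (omega = 1) integrated over roughly one period (2*pi):
# the displacement should return close to its starting value of 1.
u_end = central_difference(m=1.0, k=1.0, u0=1.0, v0=0.0, dt=0.01, steps=628)
```

Raising dt past 2/omega makes the recurrence blow up, which is the instability the abstract traces back to the central difference method itself.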
Abstract:
Extended gcd calculation has a long history and plays an important role in computational number theory and linear algebra. Recent results have shown that finding optimal multipliers in extended gcd calculations is difficult. We present an algorithm which uses lattice basis reduction to produce small integer multipliers x(1), ..., x(m) for the equation s = gcd(s(1), ..., s(m)) = x(1)s(1) + ... + x(m)s(m), where s(1), ..., s(m) are given integers. The method generalises to produce small unimodular transformation matrices for computing the Hermite normal form of an integer matrix.
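For context, the multi-integer relation s = x(1)s(1) + ... + x(m)s(m) can be produced by folding the classic two-argument extended Euclidean algorithm over the list. This is only a baseline sketch: unlike the lattice-basis-reduction method of the abstract, it makes no attempt to keep the multipliers small.

```python
def ext_gcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) = a*x + b*y."""
    if b == 0:
        return (a, 1, 0)
    g, x, y = ext_gcd(b, a % b)
    return (g, y, x - (a // b) * y)

def multi_ext_gcd(nums):
    """Return (g, xs) with g = gcd(nums) = sum(x*s for x, s in zip(xs, nums)).
    Plain iterative folding; the multipliers are NOT size-reduced as in
    the lattice-based algorithm the abstract describes."""
    g, xs = nums[0], [1]
    for s in nums[1:]:
        g2, u, v = ext_gcd(g, s)
        xs = [x * u for x in xs] + [v]
        g = g2
    return g, xs

g, xs = multi_ext_gcd([12, 18, 27])  # g = 3, with 12*4 + 18*(-4) + 27*1 = 3
```

Folding in this way tends to grow the intermediate multipliers, which is precisely why size-reduced (lattice) methods matter for large inputs.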
Abstract:
Peanut, one of the world's most important oilseed crops, has a narrow germplasm base and lacks sources of resistance to several major diseases. The species is considered recalcitrant to transformation, with few confirmed transgenic plants upon particle bombardment or Agrobacterium treatment. Reported transformation methods are limited by low efficiency, cultivar specificity, chimeric or infertile transformants, or availability of explants. Here we present a method to efficiently transform cultivars in both botanical types of peanut, by (1) particle bombardment into embryogenic callus derived from mature seeds, (2) escape-free (not stepwise) selection for hygromycin B resistance, (3) brief osmotic desiccation followed by sequential incubation on charcoal and cytokinin-containing media; resulting in efficient conversion of transformed somatic embryos into fertile, non-chimeric, transgenic plants. The method produces three to six independent transformants per bombardment of 10 cm^2 embryogenic callus. Potted, transgenic plant lines can be regenerated within 9 months of callus initiation, or 6 months after bombardment. Transgene copy number ranged from one to 20 with multiple integration sites. There was ca. 50% coexpression of hph and luc or uidA genes coprecipitated on separate plasmids. Reporter gene (luc) expression was confirmed in T1 progeny from each of six tested independent transformants. Insufficient seeds were produced under containment conditions to determine segregation ratios. The practicality of the technique for efficient cotransformation with selected and unselected genes is demonstrated using major commercial peanut varieties in Australia (cv. NC-7, a virginia market type) and Indonesia (cv. Gajah, a spanish market type).
Abstract:
We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
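The heuristic algorithms compared against the optimal one are typically greedy rules such as "pick the site covering the most unrepresented features". The sketch below illustrates that family of heuristics; the site names and land types are invented for the example, and real studies use richer rules and tie-breaking.

```python
def greedy_reserve_selection(sites):
    """Greedy heuristic: repeatedly choose the site that covers the most
    still-unrepresented features. Illustrative only -- such heuristics can
    be suboptimal, which is the suboptimality the study quantifies."""
    uncovered = set().union(*sites.values())
    chosen = []
    while uncovered:
        best = max(sites, key=lambda s: len(sites[s] & uncovered))
        if not sites[best] & uncovered:
            break  # remaining features occur in no candidate site
        chosen.append(best)
        uncovered -= sites[best]
    return chosen

# Hypothetical candidate sites mapped to the land types they contain.
sites = {
    "A": {"heath", "forest"},
    "B": {"forest", "wetland"},
    "C": {"wetland"},
    "D": {"heath", "wetland", "grassland"},
}
picked = greedy_reserve_selection(sites)  # "D" is chosen first (covers 3 types)
```

Efficiency in the abstract's sense corresponds to the number (or total area) of sites in `picked`; nested feature sets let a few sites cover everything, while rare features force extra selections.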
Abstract:
Giles and Goss (1980) have suggested that, if a futures market provides a forward pricing function, then it is an efficient market. In this article a simple test for whether the Australian Wool Futures market is efficient is proposed. The test is based on applying cointegration techniques to test the Law of One Price over a three, six, nine, and twelve month spread of futures prices. We found that the futures market is efficient for up to a six-month spread, but no further into the future. Because futures market prices can be used to predict spot prices up to six months in advance, woolgrowers can use the futures price to assess when they market their clip, but not for longer-term production planning decisions. (C) 1999 John Wiley & Sons, Inc.
Abstract:
Overcoming the phenomenon known as difficult synthetic sequences has been a major goal in solid-phase peptide synthesis for over 30 years. In this work the advantages of amide backbone-substitution in the solid-phase synthesis of difficult peptides are augmented by developing an activated N-alpha-acyl transfer auxiliary. Apart from disrupting troublesome intermolecular hydrogen-bonding networks, the primary function of the activated N-alpha-auxiliary was to facilitate clean and efficient acyl capture of large or beta-branched amino acids and improve acyl transfer yields to the secondary N-alpha-amine. We found o-hydroxyl-substituted nitrobenzyl (Hnb) groups were suitable N-alpha-auxiliaries for this purpose. The relative acyl transfer efficiency of the Hnb auxiliary was superior to the 2-hydroxy-4-methoxybenzyl (Hmb) auxiliary with protected amino acids of varying size. Significantly, this difference in efficiency was more pronounced between more sterically demanding amino acids. The Hnb auxiliary is readily incorporated at the N-alpha-amine during SPPS by reductive alkylation of its corresponding benzaldehyde derivative and conveniently removed by mild photolysis at 366 nm. The usefulness of the Hnb auxiliary for the improvement of coupling efficiencies in the chain-assembly of difficult peptides was demonstrated by the efficient Hnb-assisted Fmoc solid-phase synthesis of a known hindered difficult peptide sequence, STAT-91. This work suggests the Hnb auxiliary will significantly enhance our ability to synthesize difficult polypeptides and increases the applicability of amide-backbone substitution.
Abstract:
Realistic time frames in which management decisions are made often preclude the completion of the detailed analyses necessary for conservation planning. Under these circumstances, efficient alternatives may assist in approximating the results of more thorough studies that require extensive resources and time. We outline a set of concepts and formulas that may be used in lieu of detailed population viability analyses and habitat modeling exercises to estimate the protected areas required to provide desirable conservation outcomes for a suite of threatened plant species. We used expert judgment of parameters and assessment of a population size that results in a specified quasiextinction risk based on simple dynamic models. The area required to support a population of this size is adjusted to take into account deterministic and stochastic human influences, including small-scale disturbance, deterministic trends such as habitat loss, and changes in population density through processes such as predation and competition. We set targets for different disturbance regimes and geographic regions. We applied our methods to Banksia cuneata, Boronia keysii, and Parsonsia dorrigoensis, resulting in target areas for conservation of 1102, 733, and 1084 ha, respectively. These results provide guidance on target areas and priorities for conservation strategies.
Abstract:
Peptides that induce and recall T-cell responses are called T-cell epitopes. T-cell epitopes may be useful in a subunit vaccine against malaria. Computer models that simulate peptide binding to MHC are useful for selecting candidate T-cell epitopes, since they minimize the number of experiments required for their identification. We applied a combination of computational and immunological strategies to select candidate T-cell epitopes. A total of 86 experimental binding assays were performed in three rounds of identification of HLA-A11 binding peptides from the six preerythrocytic malaria antigens. Thirty-six peptides were experimentally confirmed as binders. We show that cyclical refinement of the artificial neural network (ANN) models results in a significant improvement in the efficiency of identifying potential T-cell epitopes. (C) 2001 by Elsevier Science Inc.
Abstract:
In this paper, a genetic algorithm (GA) is applied to the optimum design of reinforced concrete liquid-retaining structures, which involves three discrete design variables: slab thickness, reinforcement diameter and reinforcement spacing. The GA, a search technique based on the mechanics of natural genetics, couples a Darwinian survival-of-the-fittest principle with a random yet structured information exchange amongst a population of artificial chromosomes. As a first step, a penalty-based strategy is employed to transform the constrained design problem into an unconstrained one, which is appropriate for GA application. A numerical example is then used to demonstrate the strength and capability of the GA in this problem domain. It is shown that near-optimal solutions are obtained at an extremely fast rate of convergence, after exploration of only a minute portion of the search space. The method can be extended to even more complex optimization problems in other domains.
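The penalty-based strategy described above can be sketched on a toy problem: a constraint violation is added to the objective, giving an unconstrained fitness a GA can minimise directly. The problem below (minimise x^2 subject to x >= 2) and all GA settings are invented for illustration and are far simpler than the reinforced-concrete design model.

```python
import random

def penalised_fitness(x, penalty=1000.0):
    """Toy constrained problem: minimise x^2 subject to x >= 2.
    The violation of x >= 2 is folded into the objective as a penalty,
    turning the constrained problem into an unconstrained one."""
    violation = max(0.0, 2.0 - x)
    return x * x + penalty * violation

def tiny_ga(generations=200, pop_size=30, seed=1):
    """Minimal elitist GA: keep the fitter half, breed the rest by
    averaging two parents (crossover) plus Gaussian noise (mutation)."""
    random.seed(seed)
    pop = [random.uniform(-10.0, 10.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=penalised_fitness)          # survival of the fittest
        parents = pop[: pop_size // 2]
        children = [
            (random.choice(parents) + random.choice(parents)) / 2.0
            + random.gauss(0.0, 0.1)
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=penalised_fitness)

best = tiny_ga()  # should settle near the constrained optimum x = 2
```

The penalty weight trades off constraint satisfaction against objective quality; real applications such as the one in the abstract tune it (or adapt it) so infeasible designs are reliably outcompeted.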