977 results for QA
Abstract:
We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (sometimes for the original problem, sometimes for the coarsest) and then iteratively refined at each level. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (most notably in the form of multigrid techniques). However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial optimisation problems. In this paper we address the issue of multilevel refinement for such problems and, with the aid of examples and results in graph partitioning, graph colouring and the travelling salesman problem, make a case for its use as a metaheuristic. The results provide compelling evidence that, although the multilevel framework cannot be considered a panacea for combinatorial problems, it can provide an extremely useful addition to the combinatorial optimisation toolkit. We also give a possible explanation for the underlying process and extract some generic guidelines for its future use on other combinatorial problems.
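The coarsen–solve–refine loop described in this abstract can be sketched concretely. The following is a minimal illustration, not the paper's method: it applies the multilevel paradigm to a toy number-partitioning problem (split a list into two sets with near-equal sums), and the problem choice and all helper names are assumptions made for demonstration.

```python
# Multilevel paradigm sketch on a toy problem: recursively coarsen,
# solve the coarsest level, then extend and refine at each finer level.

def coarsen(items):
    """Merge consecutive pairs of items; remember which items were merged."""
    merged, mapping = [], []
    for i in range(0, len(items) - 1, 2):
        merged.append(items[i] + items[i + 1])
        mapping.append((i, i + 1))
    if len(items) % 2:                       # odd leftover item stays alone
        merged.append(items[-1])
        mapping.append((len(items) - 1,))
    return merged, mapping

def initial_solution(items):
    """Greedy: place each item (largest first) on the lighter side."""
    assign, totals = [0] * len(items), [0, 0]
    for i in sorted(range(len(items)), key=lambda i: -items[i]):
        side = 0 if totals[0] <= totals[1] else 1
        totals[side] += items[i]
        assign[i] = side
    return assign

def extend(coarse_assign, mapping, n):
    """Project a coarse assignment back onto the finer level."""
    assign = [0] * n
    for side, group in zip(coarse_assign, mapping):
        for i in group:
            assign[i] = side
    return assign

def refine(items, assign):
    """Local search: flip single items while the imbalance shrinks."""
    def imbalance(a):
        return abs(sum(x for x, s in zip(items, a) if s == 0)
                   - sum(x for x, s in zip(items, a) if s == 1))
    improved = True
    while improved:
        improved = False
        for i in range(len(items)):
            trial = assign[:]
            trial[i] ^= 1
            if imbalance(trial) < imbalance(assign):
                assign, improved = trial, True
    return assign

def multilevel_solve(items, min_size=2):
    if len(items) <= min_size:
        return refine(items, initial_solution(items))
    coarse, mapping = coarsen(items)
    coarse_assign = multilevel_solve(coarse, min_size)
    return refine(items, extend(coarse_assign, mapping, len(items)))

items = [7, 3, 8, 2, 5, 9, 1, 4]   # total 39, so best imbalance is 1
assign = multilevel_solve(items)
```

The key multilevel ingredients are visible: the coarse solution is never discarded, only extended and locally improved at each finer level.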
Abstract:
A complete model of particle impact degradation during dilute-phase pneumatic conveying is developed, which combines a degradation model, based on the experimental determination of breakage matrices, and a physical model of solids and gas flow in the pipeline. The solids flow in a straight pipe element is represented by a model consisting of two zones: a strand-type flow zone immediately downstream of a bend, followed by a fully suspended flow region after dispersion of the strand. The breakage matrices constructed from data on 90° single-impact tests are shown to give a good representation of the degradation occurring in a 90° pipe bend. Numerical results are presented for degradation of granulated sugar in a large-scale pneumatic conveyor.
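A breakage matrix of the kind described above can be applied as a simple matrix–vector product: column j holds the mass fractions of output size classes produced when particles in input class j undergo one impact. The matrix entries below are made up for illustration (the paper determines them experimentally from single-impact tests):

```python
# Illustrative breakage-matrix calculation with invented values.
import numpy as np

# Size classes ordered coarse -> fine; each column sums to 1 (mass is conserved).
B = np.array([
    [0.70, 0.00, 0.00],   # fraction remaining in the coarse class
    [0.20, 0.80, 0.00],   # fraction broken into the middle class
    [0.10, 0.20, 1.00],   # fraction broken into fines
])

feed = np.array([0.5, 0.3, 0.2])      # inlet mass fractions per size class
after_bend = B @ feed                 # size distribution after one bend impact

# Repeated impacts (e.g. a series of bends) compound as matrix powers.
after_three_bends = np.linalg.matrix_power(B, 3) @ feed
```

Because each column sums to one, the total mass fraction stays at one while mass migrates toward the fine classes with every impact, which is the qualitative behaviour the degradation model captures.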
Abstract:
The graph-partitioning problem is to divide a graph into several pieces so that the number of vertices in each piece is the same within some defined tolerance and the number of cut edges is minimised. Important applications of the problem arise, for example, in parallel processing, where data sets need to be distributed across the memory of a parallel machine. Very effective heuristic algorithms that run in real time have been developed for this problem, but it is not known how good the partitions are, since the problem is, in general, NP-complete. This paper reports an evolutionary search algorithm for finding benchmark partitions. A distinctive feature is the use of a multilevel heuristic algorithm to provide an effective crossover. The technique is tested on several example graphs, and it is demonstrated that our method can achieve extremely high-quality partitions, significantly better than those found by state-of-the-art graph-partitioning packages.
Abstract:
Identification, when sought, is not necessarily obtained. Operational guidance that is normatively acceptable may be necessary for such cases. We proceed to formalize and illustrate modes of exchanges of individual identity, and provide recovery procedures drawn from specific prescriptions in an ancient body of law for situations in which, for given types of purposes, individuals of some relevant kind had become intermixed and indistinguishable. Rules were devised, in a variety of domains, for coping with situations that occur if and when the goal of identification was frustrated. We propose or discuss mathematical representations of such recovery procedures.
Abstract:
The article focuses on an information system to exploit the use of metadata within film and television production. It is noted that the television and film industries are used to working on big projects. These involve the use of actual film, video tape, and PERT charts for project planning. Scripts are in most instances revised. It is essential to attach information to these items in order to manage, track and retrieve them. The use of metadata eases the operations involved in these industries.
Abstract:
Tony Mann provides the crossword solution under the pseudonym of Parsman.
Abstract:
Book review of: Peter Aughton, The Transit of Venus: The Brief, Brilliant Life of Jeremiah Horrocks, Father of British Astronomy, Orion, 2004, 0-297-84721-X, £18.99.
Abstract:
Tony Mann reviews: Owen Gingerich, The Book Nobody Read: In Pursuit of the Revolutions of Nicolaus Copernicus, Heinemann, 2004, 0-434-01315-3, £12.99.
Abstract:
Book review of: Scarlett Thomas, PopCo, London and New York: Fourth Estate, 2004. 1-84115-763-5, £12.99.
Abstract:
Induction Skull Melting (ISM) is used for heating, melting, mixing and, possibly, evaporating reactive liquid metals at high temperatures when minimum contact with solid walls is required. The numerical model presented here involves the complete time-dependent process analysis based on the coupled electromagnetic, temperature and turbulent velocity fields during the melting and liquid shape changes. The simulation is validated against measurements of liquid metal height, temperature and heat losses in a commercial-size ISM furnace. The often observed limiting temperature plateau for ever-increasing electrical power input is explained by the turbulent convective heat losses. Various methods to increase the superheat within the liquid melt, as well as the process energy efficiency and stability, are proposed.
Abstract:
The television and film industries are used to working on large projects. These projects use media and documents of various types, ranging from actual film and videotape to items such as PERT charts for project planning. Some items, such as scripts, evolve over a period and go through many versions. It is often necessary to attach information to these “objects” in order to manage, track, and retrieve them. On large productions there may be hundreds of personnel who need access to this material and who in their turn generate new items which form some part of the final production. The requirements for this industry in terms of an information system may be generalized and a distributed software architecture built, primarily using the internet, to serve the needs of these projects. This architecture must enable potentially very large collections of objects to be managed in a secure environment with distributed responsibilities held by many working on the production. Copyright © 2005 by the Society of Motion Picture and Television Engineers, Inc.
Abstract:
The SB distributional model of Johnson's 1949 paper was introduced by a transformation to normality, that is, z ~ N(0, 1), consisting of a linear scaling to the range (0, 1), a logit transformation, and an affine transformation, z = γ + δu. The model, in its original parameterization, has often been used in forest diameter distribution modelling. In this paper, we define the SB distribution in terms of the inverse transformation from normality, including an initial linear scaling transformation, u = γ′ + δ′z (δ′ = 1/δ and γ′ = −γ/δ). The SB model in terms of the new parameterization is derived, and maximum likelihood estimation schemes are presented for both model parameterizations. The statistical properties of the two alternative parameterizations are compared empirically on 20 data sets of diameter distributions of Changbai larch (Larix olgensis Henry). The new parameterization is shown to be statistically better than Johnson's original parameterization for the data sets considered here.
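The two parameterizations can be illustrated with a short sketch. The parameter values below (location ξ, range λ, shapes γ, δ) are made up for demonstration and are not taken from the larch data sets:

```python
# The SB transformation in both directions: Johnson's original maps a
# diameter x to a standard normal z; the inverse parameterization maps
# z back to x. Parameter values here are purely illustrative.
import math
import random

xi, lam = 5.0, 45.0          # location and range: x lies in (xi, xi + lam)
gamma, delta = 0.3, 1.2      # Johnson's shape parameters

# Inverse parameterization: delta' = 1/delta, gamma' = -gamma/delta
delta_p = 1.0 / delta
gamma_p = -gamma / delta

def to_normal(x):
    """Johnson's direction: z = gamma + delta * logit((x - xi)/lam)."""
    u = (x - xi) / lam
    return gamma + delta * math.log(u / (1.0 - u))

def from_normal(z):
    """Inverse direction: x = xi + lam * logistic(gamma' + delta' * z)."""
    v = gamma_p + delta_p * z
    return xi + lam / (1.0 + math.exp(-v))

# Since the two maps are mutual inverses, SB variates can be simulated
# by transforming standard normal draws.
random.seed(1)
sample = [from_normal(random.gauss(0.0, 1.0)) for _ in range(5)]
```

Every simulated value necessarily falls inside (ξ, ξ + λ), which is what makes the SB family attractive for bounded quantities such as tree diameters.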