97 results for Clustering search algorithm
Abstract:
An order-of-magnitude sensitivity gain is described for using quasar spectra to investigate possible time or space variation in the fine structure constant alpha. Applied to a sample of 30 absorption systems, spanning redshifts 0.5 < z < 1.6, we derive limits on variations in alpha over a wide range of epochs. For the whole sample, Delta alpha/alpha = (-1.1 +/- 0.4) x 10^-5. This deviation is dominated by measurements at z > 1, where Delta alpha/alpha = (-1.9 +/- 0.5) x 10^-5. For z < 1, Delta alpha/alpha = (-0.2 +/- 0.4) x 10^-5. While this is consistent with a time-varying alpha, further work is required to explore possible systematic errors in the data, although careful searches have so far revealed none.
Abstract:
To translate and transfer solution data between two totally different meshes (i.e. mesh 1 and mesh 2), a consistent point-searching algorithm for solution interpolation in unstructured meshes consisting of 4-node bilinear quadrilateral elements is presented in this paper. The proposed algorithm has the following significant advantages: (1) The use of a point-searching strategy allows a point in one mesh to be accurately related to the element (containing this point) in the other mesh. Thus, to translate/transfer the solution of any particular point from mesh 2 to mesh 1, only one element in mesh 2 needs to be inversely mapped. This minimizes the number of elements to which the inverse mapping is applied, making the present algorithm very effective and efficient. (2) Analytical solutions for the local coordinates of any point in a four-node quadrilateral element, derived in a rigorous mathematical manner in this paper, make it possible to carry out the inverse mapping very effectively and efficiently. (3) The use of consistent interpolation enables the interpolated solution to be compatible with the original solution and therefore guarantees an interpolated solution of very high accuracy. After the mathematical formulations of the algorithm are presented, the algorithm is tested and validated on a challenging problem. The results of the test problem demonstrate the generality, accuracy, effectiveness, efficiency and robustness of the proposed consistent point-searching algorithm. Copyright (C) 1999 John Wiley & Sons, Ltd.
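The core of such a scheme is the inverse mapping from a physical point to the local coordinates of the bilinear element that contains it, followed by interpolation with the element's own shape functions. The sketch below illustrates that idea only; it uses a Newton iteration for the inverse map rather than the closed-form solution derived in the paper, and the node ordering, tolerance and test field are illustrative assumptions.

```python
# Illustrative sketch (not the paper's closed-form derivation): recover the local
# coordinates (xi, eta) of a physical point inside a 4-node bilinear quadrilateral
# by Newton iteration, then interpolate a nodal field with the element's own shape
# functions. Nodes are assumed to be ordered counter-clockwise.
import numpy as np

def shape_functions(xi, eta):
    """Bilinear shape functions N_i and their derivatives w.r.t. (xi, eta)."""
    N = 0.25 * np.array([(1 - xi) * (1 - eta),
                         (1 + xi) * (1 - eta),
                         (1 + xi) * (1 + eta),
                         (1 - xi) * (1 + eta)])
    dN = 0.25 * np.array([[-(1 - eta), -(1 - xi)],
                          [ (1 - eta), -(1 + xi)],
                          [ (1 + eta),  (1 + xi)],
                          [-(1 + eta),  (1 - xi)]])
    return N, dN

def inverse_map(nodes, point, tol=1e-12, max_iter=50):
    """Find (xi, eta) such that sum_i N_i(xi, eta) * nodes[i] equals `point`."""
    xi = np.zeros(2)                       # start at the element centre
    for _ in range(max_iter):
        N, dN = shape_functions(*xi)
        residual = N @ nodes - point       # mapping error in physical space
        if np.linalg.norm(residual) < tol:
            break
        jacobian = dN.T @ nodes            # 2x2 Jacobian d(x, y)/d(xi, eta)
        xi -= np.linalg.solve(jacobian.T, residual)
    return xi

def interpolate(nodes, nodal_values, point):
    """Consistent interpolation: evaluate the nodal field at an arbitrary point."""
    N, _ = shape_functions(*inverse_map(nodes, point))
    return N @ nodal_values

# Example: a unit-square element carrying the linear field u = x + 2*y.
quad = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
u = np.array([0.0, 1.0, 3.0, 2.0])
print(interpolate(quad, u, np.array([0.3, 0.6])))   # recovers 0.3 + 2*0.6 = 1.5
```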
Abstract:
OBJECTIVE: To evaluate a diagnostic algorithm for pulmonary tuberculosis based on smear microscopy and an objective response to a trial of antibiotics. SETTING: Adult medical wards, Hlabisa Hospital, South Africa, 1996-1997. METHODS: Adults with chronic chest symptoms and an abnormal chest X-ray had sputum examined for Ziehl-Neelsen stained acid-fast bacilli by light microscopy. Those with negative smears were treated with amoxycillin for 5 days and assessed. Those who had not improved were treated with erythromycin for 5 days and reassessed. Response was compared with mycobacterial culture. RESULTS: Of 280 suspects who completed the diagnostic pathway, 160 (57%) had a positive smear, 46 (17%) responded to amoxycillin, 34 (12%) responded to erythromycin and 40 (14%) were treated as smear-negative tuberculosis. The sensitivity (89%) and specificity (84%) of the full algorithm for culture-positive tuberculosis were high. However, 11 patients (positive predictive value [PPV] 95%) were incorrectly diagnosed with tuberculosis, and 24 cases of tuberculosis (negative predictive value [NPV] 70%) were not identified. NPV improved to 75% when anaemia was included as a predictor. Algorithm performance was independent of human immunodeficiency virus status. CONCLUSION: A sputum smear microscopy plus trial-of-antibiotics algorithm, applied among a selected group of tuberculosis suspects, may increase diagnostic accuracy in district hospitals in developing countries.
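For reference, the four performance figures quoted above follow from a standard 2x2 comparison of the algorithm against mycobacterial culture. The sketch below only restates those definitions; the counts passed in are illustrative values chosen to roughly reproduce the reported percentages, not the study's actual table.

```python
# Minimal sketch of the standard 2x2 diagnostic metrics; the counts below are
# illustrative placeholders, not the study's raw data.
def diagnostic_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV from a 2x2 table."""
    return {
        "sensitivity": tp / (tp + fn),   # algorithm-positive among culture-positive
        "specificity": tn / (tn + fp),   # algorithm-negative among culture-negative
        "ppv": tp / (tp + fp),           # culture-positive among algorithm-positive
        "npv": tn / (tn + fn),           # culture-negative among algorithm-negative
    }

# Hypothetical counts that approximately reproduce 89% / 84% / 95% / 70%.
print(diagnostic_metrics(tp=194, fp=11, fn=24, tn=56))
```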
Abstract:
Normal mixture models are increasingly being used to model the distributions of a wide variety of random phenomena and to cluster sets of continuous multivariate data. However, for a data set containing a group or groups of observations with longer-than-normal tails or atypical observations, the use of normal components may unduly affect the fit of the mixture model. In this paper, we consider a more robust approach by modelling the data with a mixture of t distributions. The use of the ECM algorithm to fit this t mixture model is described, and examples of its use are given in the context of clustering multivariate data in the presence of atypical observations in the form of background noise.
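As a rough illustration of why t components are more robust: each observation receives a latent precision weight that shrinks for points far from a component's centre, so outliers exert less pull on the estimated means and scale matrices. The sketch below is a minimal EM/ECM-style fitter assuming a fixed, user-chosen degrees of freedom nu for every component (the paper's ECM algorithm also estimates nu); the component count, initialisation and iteration budget are illustrative choices.

```python
# Sketch of fitting a mixture of multivariate t distributions with fixed nu.
import numpy as np
from scipy.stats import multivariate_t

def fit_t_mixture(X, K, nu=4.0, n_iter=100, seed=0):
    n, p = X.shape
    rng = np.random.default_rng(seed)
    pi = np.full(K, 1.0 / K)                              # mixing proportions
    mu = X[rng.choice(n, K, replace=False)]               # initial centres
    Sigma = np.array([np.cov(X.T) for _ in range(K)])     # initial scale matrices
    for _ in range(n_iter):
        # E-step: responsibilities tau and latent precision weights u.
        dens = np.column_stack([multivariate_t.pdf(X, loc=mu[g], shape=Sigma[g], df=nu)
                                for g in range(K)])
        tau = pi * dens + 1e-300
        tau /= tau.sum(axis=1, keepdims=True)
        u = np.empty_like(tau)
        for g in range(K):
            diff = X - mu[g]
            maha = np.einsum('ij,jk,ik->i', diff, np.linalg.inv(Sigma[g]), diff)
            u[:, g] = (nu + p) / (nu + maha)               # downweights outlying points
        # M-step (CM-steps): update proportions, centres and scale matrices.
        w = tau * u
        pi = tau.mean(axis=0)
        for g in range(K):
            mu[g] = w[:, g] @ X / w[:, g].sum()
            diff = X - mu[g]
            Sigma[g] = (w[:, g, None] * diff).T @ diff / tau[:, g].sum()
    return pi, mu, Sigma, tau.argmax(axis=1)               # hard cluster labels

# Example: two noisy 2-D clusters plus a few gross outliers as background noise.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal([0, 0], 0.5, (150, 2)),
               rng.normal([4, 4], 0.5, (150, 2)),
               rng.uniform(-10, 10, (10, 2))])
_, centres, _, labels = fit_t_mixture(X, K=2)
print(centres)
```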
Abstract:
The aim of this study was to investigate the association between false belief comprehension, the exhibition of pretend play and the use of mental state terms in pre-school children. Forty children, aged between 36 and 54 months, were videotaped engaging in free play with each parent. The exhibition of six distinct acts of pretend play and the expression of 16 mental state terms were coded during play. Each child was also administered a pantomime task and three standard false belief tasks. Reliable associations were found between false belief performance and the pretence categories of object substitution and role assignment, and the exhibition of imaginary object pantomimes. Moreover, the use of mental state terms was positively correlated with false belief and the pretence categories of object substitution, imaginary play and role assignment, and negatively correlated with the exhibition of body part object pantomimes. These findings indicate that the development of a mental state lexicon and some, but not all, components of pretend play are dependent on the capacity for metarepresentational cognition.
Abstract:
In this paper, the minimum-order stable recursive filter design problem is proposed and investigated. This problem plays an important role in pipelined implementations in signal processing. First, the existence of a high-order stable recursive filter is proved theoretically, and an upper bound on the highest order of stable filters is given. The minimum-order stable linear predictor is then obtained by solving an optimization problem. The popular genetic algorithm approach is adopted, since it is a heuristic probabilistic optimization technique that has been widely used in engineering design. Finally, an illustrative example is used to show the effectiveness of the proposed algorithm.
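To make the optimization step concrete, the sketch below shows one way a genetic algorithm can search for the coefficients of a recursive linear predictor of a given order while rejecting unstable candidates (characteristic roots outside the unit circle) through a penalty. It is a generic GA sketch, not the paper's formulation: the encoding, operators, population size and test signal are illustrative assumptions.

```python
# Generic GA sketch for a stable order-m linear predictor x_hat[n] = sum_k a_k x[n-k].
import numpy as np

def prediction_error(a, x):
    """Mean squared one-step prediction error of the predictor with coefficients a."""
    m = len(a)
    # Each row of `lags` holds [x[n-1], ..., x[n-m]] for n = m .. len(x)-1.
    lags = np.column_stack([x[m - k:len(x) - k] for k in range(1, m + 1)])
    return np.mean((x[m:] - lags @ a) ** 2)

def is_stable(a):
    # Roots of z^m - a_1 z^(m-1) - ... - a_m must lie strictly inside the unit circle.
    return bool(np.all(np.abs(np.roots(np.r_[1.0, -np.asarray(a)])) < 1.0))

def ga_predictor(x, order, pop=40, gens=150, sigma=0.1, seed=0):
    rng = np.random.default_rng(seed)
    P = rng.normal(0.0, 0.5, size=(pop, order))               # initial population
    def fitness(cands):
        return np.array([prediction_error(a, x) + (0.0 if is_stable(a) else 1e6)
                         for a in cands])                      # instability penalty
    for _ in range(gens):
        parents = P[np.argsort(fitness(P))[:pop // 2]]         # truncation selection
        pairs = parents[rng.integers(0, len(parents), (pop - len(parents), 2))]
        mask = rng.random((len(pairs), order)) < 0.5
        kids = np.where(mask, pairs[:, 0], pairs[:, 1])        # uniform crossover
        kids = kids + rng.normal(0.0, sigma, kids.shape)       # Gaussian mutation
        P = np.vstack([parents, kids])
    return P[np.argmin(fitness(P))]

# Example: a damped oscillation is exactly predictable by a stable 2nd-order recursion.
n = np.arange(400)
signal = 0.95 ** n * np.cos(0.2 * np.pi * n)
print(ga_predictor(signal, order=2))   # should approach [2*0.95*cos(0.2*pi), -0.95**2]
```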
Abstract:
Using data from the H I Parkes All Sky Survey (HIPASS), we have searched for neutral hydrogen in galaxies in a region of ~25 x 25 deg^2 centred on NGC 1399, the nominal centre of the Fornax cluster. Within a velocity search range of 300-3700 km s^-1 and to a 3-sigma lower flux limit of ~40 mJy, 110 galaxies with H I emission were detected, one of which was previously uncatalogued. None of the detections has early-type morphology. Previously unknown velocities for 14 galaxies have been determined, with a further four velocity measurements being significantly dissimilar to published values. Identification of an optical counterpart is relatively unambiguous for more than ~90 per cent of our H I galaxies. The galaxies appear to be embedded in a sheet at the cluster velocity which extends for more than 30 degrees across the search area. At the nominal cluster distance of ~20 Mpc, this corresponds to an elongated structure more than 10 Mpc in extent. A velocity gradient across the structure is detected, with radial velocities increasing by ~500 km s^-1 from south-east to north-west. The clustering of galaxies evident in optical surveys is only weakly suggested in the spatial distribution of our H I detections. Of 62 H I detections within a 10-degree projected radius of the cluster centre, only two are within the core region (projected radius
Abstract:
In this paper, a genetic algorithm (GA) is applied to the optimum design of reinforced concrete liquid-retaining structures, which involve three discrete design variables: slab thickness, reinforcement diameter and reinforcement spacing. The GA, a search technique based on the mechanics of natural genetics, couples a Darwinian survival-of-the-fittest principle with a random yet structured information exchange amongst a population of artificial chromosomes. As a first step, a penalty-based strategy is employed to transform the constrained design problem into an unconstrained one, which is appropriate for GA application. A numerical example is then used to demonstrate the strength and capability of the GA in this problem domain. It is shown that near-optimal solutions are obtained after exploring only a minute portion of the search space, with very fast convergence. The method can be extended to more complex optimization problems in other domains.
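The penalty transformation is the key step: each candidate design is scored by its cost plus a term proportional to how badly it violates the design constraints, so a standard GA can operate on the resulting unconstrained fitness. The sketch below illustrates that set-up for three discrete variables encoded as indices into candidate catalogues; the catalogues, cost function and constraint check are placeholders, not the paper's reinforced-concrete design model.

```python
# Schematic penalty-based GA over three discrete design variables (placeholder model).
import numpy as np

THICKNESS = np.arange(200, 601, 25)         # candidate slab thicknesses (mm)
BAR_DIA   = np.array([10, 12, 16, 20, 25])  # candidate reinforcement diameters (mm)
SPACING   = np.arange(100, 301, 25)         # candidate reinforcement spacings (mm)
CATALOGUE = [THICKNESS, BAR_DIA, SPACING]

def cost(t, d, s):
    # Placeholder objective: concrete depth plus a steel-quantity term per metre width.
    return t + 50.0 * (np.pi * d**2 / 4) * (1000.0 / s) / 1000.0

def constraint_violation(t, d, s):
    # Placeholder limit-state check; a real model would assess bending, crack width,
    # minimum steel, etc. Returns a normalised violation >= 0 (0 means feasible).
    steel_area = (np.pi * d**2 / 4) * (1000.0 / s)
    return max(0.0, 1.0 - steel_area * t / 2.5e5)

def penalised_fitness(idx, r=1e3):
    t, d, s = (cat[i] for cat, i in zip(CATALOGUE, idx))
    return cost(t, d, s) + r * constraint_violation(t, d, s)   # unconstrained form

def ga(pop=40, gens=100, p_mut=0.2, seed=1):
    rng = np.random.default_rng(seed)
    P = np.column_stack([rng.integers(0, len(cat), pop) for cat in CATALOGUE])
    for _ in range(gens):
        fit = np.array([penalised_fitness(ind) for ind in P])
        parents = P[np.argsort(fit)[:pop // 2]]                 # truncation selection
        pairs = parents[rng.integers(0, len(parents), (pop - len(parents), 2))]
        mask = rng.random((len(pairs), 3)) < 0.5
        kids = np.where(mask, pairs[:, 0], pairs[:, 1])          # uniform crossover
        for j, cat in enumerate(CATALOGUE):                      # random-reset mutation
            flip = rng.random(len(kids)) < p_mut
            kids[flip, j] = rng.integers(0, len(cat), flip.sum())
        P = np.vstack([parents, kids])
    best = P[np.argmin([penalised_fitness(ind) for ind in P])]
    return [cat[i] for cat, i in zip(CATALOGUE, best)]

print(ga())   # e.g. [thickness_mm, bar_diameter_mm, spacing_mm]
```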
Abstract:
An equivalent algorithm is proposed to simulate the thermal effects of magma intrusion in geological systems composed of porous rocks. Based on physical and mathematical equivalence, the original magma solidification problem with a moving boundary between the rock and the intruded magma is transformed into a new problem without the moving boundary but with a physically equivalent heat source. From the analysis of an ideal solidification model, this physically equivalent heat source is determined in this paper. The major advantage of the proposed equivalent algorithm is that a fixed finite element mesh with a variable integration time step can be employed to simulate the thermal effect of intruded magma solidification using the conventional finite element method. The related numerical results demonstrate the correctness and usefulness of the proposed equivalent algorithm for simulating the thermal effect of intruded magma solidification in geological systems. (C) 2003 Elsevier B.V. All rights reserved.
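A generic way to express that idea (illustrative only, not the paper's specific derivation): the latent heat released as the magma solidifies is recast as an equivalent volumetric source in the standard fixed-mesh heat conduction equation, with a solid fraction f_s tracking the phase change; a simple linear f_s(T) over the solidification interval is assumed below.

```latex
% Generic equivalent-heat-source form. The latent heat L released between the solidus
% T_s and liquidus T_l enters the fixed-mesh energy equation as a source term driven
% by the rate of change of the solid fraction f_s, removing the explicit moving boundary.
\rho c_p \frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \, \nabla T \right) + Q_{\mathrm{eq}},
\qquad
Q_{\mathrm{eq}} = \rho L \, \frac{\partial f_s}{\partial t},
\qquad
f_s(T) =
\begin{cases}
  1, & T \le T_s,\\[2pt]
  \dfrac{T_l - T}{T_l - T_s}, & T_s < T < T_l,\\[2pt]
  0, & T \ge T_l.
\end{cases}
```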
Abstract:
Öhman and colleagues provided evidence for preferential processing of pictures depicting fear-relevant animals by showing that pictures of snakes and spiders are found faster among pictures of flowers and mushrooms than vice versa, and that the speed of detecting fear-relevant animals was not affected by set size whereas the speed of detecting flowers/mushrooms was. Experiment 1 replicated this finding. Experiment 2, however, found similar search advantages when pictures of cats and horses or of wolves and big cats were to be found among pictures of flowers and mushrooms. Moreover, Experiment 3, in a within-subject comparison, failed to find faster identification of snakes and spiders than of cats and horses among flowers and mushrooms. The present findings seem to indicate that previous reports of preferential processing of pictures of snakes and spiders in a visual search task may reflect a processing advantage for animal pictures in general rather than fear-relevance.