956 results for Cluster Counting Algorithm


Relevance: 30.00%

Abstract:

PURPOSE: To compare the Full Threshold (FT) and SITA Standard (SS) strategies in glaucomatous patients undergoing automated perimetry for the first time. METHODS: Thirty-one glaucomatous patients who had never undergone perimetry were examined with automated perimetry (Humphrey, program 30-2) using both FT and SS on the same day, with an interval of at least 15 minutes between tests. The order of the examinations was randomized, and only one eye per patient was analyzed. Three analyses were performed: a) all examinations, regardless of the order of application; b) only the first examinations; c) only the second examinations. To calculate the sensitivity of both strategies, the following criteria were used to define abnormality: glaucoma hemifield test (GHT) outside normal limits, pattern standard deviation (PSD) <5%, or a cluster of 3 adjacent points with p<5% in the pattern deviation probability plot. RESULTS: When the results of all examinations were analyzed regardless of the order in which they were performed, the number of depressed points with p<0.5% in the pattern deviation probability map was significantly greater with SS (p=0.037), and the sensitivities were 87.1% for SS and 77.4% for FT (p=0.506). When only the first examinations were compared, there were no statistically significant differences in the number of depressed points, but the sensitivity of SS (100%) was significantly greater than that of FT (70.6%) (p=0.048). When only the second examinations were compared, there were no statistically significant differences in the number of depressed points, and the sensitivities of SS (76.5%) and FT (85.7%) did not differ significantly (p=0.664). CONCLUSION: SS may have a higher sensitivity than FT in glaucomatous patients undergoing automated perimetry for the first time. However, this difference tends to disappear in subsequent examinations.
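
The cluster criterion above is itself a small connected-component test. Below is a minimal sketch of such a check, assuming the pattern deviation plot is represented as a 2D boolean grid of depressed (p<5%) points with 4-connected adjacency; the actual grid layout and statistics of the Humphrey analyzer are not modeled.

```python
import numpy as np

def has_cluster(depressed, k=3):
    """Return True if the grid contains a connected cluster of at least k
    depressed points (4-connected adjacency).  `depressed` is a 2D boolean
    array standing in for the pattern deviation probability plot."""
    visited = np.zeros_like(depressed, dtype=bool)
    rows, cols = depressed.shape
    for i in range(rows):
        for j in range(cols):
            if depressed[i, j] and not visited[i, j]:
                stack, size = [(i, j)], 0
                visited[i, j] = True
                while stack:                      # flood-fill one component
                    x, y = stack.pop()
                    size += 1
                    for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nx, ny = x + dx, y + dy
                        if (0 <= nx < rows and 0 <= ny < cols
                                and depressed[nx, ny] and not visited[nx, ny]):
                            visited[nx, ny] = True
                            stack.append((nx, ny))
                if size >= k:
                    return True
    return False
```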

Relevance: 30.00%

Abstract:

More than 140 years after the first description of Friedreich ataxia, the autosomal recessive ataxias have become one of the most complex fields in Neurogenetics. This group of diseases currently comprises more than 20 clinical entities and an even larger number of associated genes. Some disorders are very rare and restricted to isolated populations, while others are found worldwide. A significant number of recessive ataxias are treatable, so the responsibility for an accurate diagnosis is high. The purpose of this review is to update the practitioner on clinical and pathophysiological aspects of these disorders and to present an algorithm to guide the diagnosis.

Relevance: 30.00%

Abstract:

We describe and present initial results of a weak lensing survey of nearby (z ≲ 0.1) galaxy clusters in the Sloan Digital Sky Survey (SDSS). In this first study, galaxy clusters are selected from the SDSS spectroscopic galaxy cluster catalogs of Miller et al. and Berlind et al. We report a total of seven individual low-redshift cluster weak lensing measurements that include A2048, A1767, A2244, A1066, A2199, and two clusters specifically identified with the C4 algorithm. Our program of weak lensing of nearby galaxy clusters in the SDSS will eventually reach ~200 clusters, making it the largest weak lensing survey of individual galaxy clusters to date.

Relevance: 30.00%

Abstract:

Context. Cluster properties can be studied in more detail in pairs of clusters, where the effects of interactions are expected to be strong. Aims. We discuss the properties of the double cluster Abell 1758 at a redshift z ≈ 0.279. These clusters show strong evidence for merging. Methods. We analyse the optical properties of the North and South clusters of Abell 1758 based on deep imaging obtained with the Canada-France-Hawaii Telescope (CFHT) archive Megaprime/Megacam camera in the g' and r' bands, covering a total region of about 1.05 × 1.16 deg², or 16.1 × 17.6 Mpc². Our X-ray analysis is based on archival XMM-Newton images. Numerical simulations were performed using an N-body algorithm to treat the dark-matter component, a semi-analytical galaxy-formation model for the evolution of the galaxies, and a grid-based hydrodynamic code with a piecewise parabolic method (PPM) scheme for the dynamics of the intra-cluster medium. We computed galaxy luminosity functions (GLFs) and 2D temperature and metallicity maps of the X-ray gas, which we then compared to the results of our numerical simulations. Results. The GLFs of Abell 1758 North are well fit by Schechter functions in the g' and r' bands, but with a small excess of bright galaxies, particularly in the r' band; their faint-end slopes are similar in both bands. In contrast, the GLFs of Abell 1758 South are not well fit by Schechter functions: excesses of bright galaxies are seen in both bands, and the faint end of the GLF is not well defined in g'. The GLF computed from our numerical simulations assuming a halo mass-luminosity relation agrees with those derived from the observations. From the X-ray analysis, the most striking features are structures in the metal distribution. We found two elongated regions of high metallicity in Abell 1758 North with two peaks towards the centre. In contrast, Abell 1758 South shows a deficit of metals in its central regions. By comparing observational results to those derived from numerical simulations, we could reproduce the most prominent features present in the metallicity map and propose an explanation for the dynamical history of the cluster. In particular, we found that in the metal-rich elongated regions of the North cluster, winds had been more efficient than ram-pressure stripping in transporting metal-enriched gas to the outskirts. Conclusions. We confirm the merging structure of the North and South clusters, at both optical and X-ray wavelengths.
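
For reference, the Schechter form used for the GLF fits can be evaluated directly. Below is a minimal sketch in the absolute-magnitude form; the parameter values (phi*, M*, alpha) are placeholders for illustration, not the fitted values from this study.

```python
import numpy as np

def schechter_mag(M, phi_star, M_star, alpha):
    """Schechter GLF in absolute-magnitude form:
    phi(M) = 0.4 ln(10) phi* x^(alpha + 1) exp(-x),  x = 10**(-0.4 (M - M*))."""
    x = 10.0 ** (-0.4 * (M - M_star))
    return 0.4 * np.log(10.0) * phi_star * x ** (alpha + 1.0) * np.exp(-x)

# Evaluate a toy GLF on a magnitude grid (placeholder parameters).
M = np.linspace(-24.0, -16.0, 81)
phi = schechter_mag(M, phi_star=1e-3, M_star=-21.0, alpha=-1.1)
print(phi.max())
```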

Relevance: 30.00%

Abstract:

The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions play a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for exploring and characterizing the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROIs) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROIs. In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss inherent in averaging (e.g., to yield a single "representative" time series per ROI), which in turn may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping. Using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted from ROIs by signal averaging or first principal component estimation). The usefulness of the CGA approach on real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI.
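
Conceptually, the gain from using multiple eigen-time series per ROI can be sketched with a plain regression-based Granger test. The following is a simplified illustration, not the authors' CGA (which integrates PCA with partial canonical correlation); the synthetic ROI data, component counts, and lag order are arbitrary.

```python
import numpy as np

def pca_components(X, k):
    """Return the first k principal-component time series of X (time x voxels)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T                      # time x k eigen-time series

def granger_stat(src, dst, p=1):
    """F-like statistic for 'src Granger-causes dst', where src and dst are
    (time x k) arrays of component time series for two ROIs."""
    T = dst.shape[0]
    Y = dst[p:]
    own = np.hstack([dst[p - i - 1:T - i - 1] for i in range(p)])  # past of dst
    oth = np.hstack([src[p - i - 1:T - i - 1] for i in range(p)])  # past of src
    def rss(Z):
        Z = np.hstack([np.ones((Z.shape[0], 1)), Z])
        B, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        return ((Y - Z @ B) ** 2).sum()
    r_restricted = rss(own)                   # dst's own past only
    r_full = rss(np.hstack([own, oth]))       # add src's past
    return (r_restricted - r_full) / r_full   # > 0 suggests an influence

# Toy usage on synthetic data (significance would be assessed by bootstrap).
rng = np.random.default_rng(0)
roiA = pca_components(rng.standard_normal((200, 50)), k=3)
roiB = pca_components(rng.standard_normal((200, 40)), k=3)
print(granger_stat(roiA, roiB, p=2))
```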

Relevance: 30.00%

Abstract:

Scheduling of constrained-deadline sporadic task systems on multiprocessor platforms is an area that has received much attention in the recent past. It is widely believed that finding an optimal scheduler is hard, and therefore most studies have focused on developing algorithms with good processor utilization bounds. These algorithms can be broadly classified into two categories: partitioned scheduling, in which tasks are statically assigned to individual processors, and global scheduling, in which each task is allowed to execute on any processor in the platform. In this paper we consider a third, more general approach called cluster-based scheduling. In this approach each task is statically assigned to a processor cluster, tasks in each cluster are globally scheduled among themselves, and clusters in turn are scheduled on the multiprocessor platform. We develop techniques to support such cluster-based scheduling algorithms, and also consider properties that minimize the total processor utilization of individual clusters. In the last part of this paper, we develop new virtual cluster-based scheduling algorithms. For implicit-deadline sporadic task systems, we develop an optimal scheduling algorithm that is neither Pfair nor ERfair. We also show that the processor utilization bound of US-EDF{m/(2m−1)} can be improved by using virtual clustering. Since neither partitioned nor global strategies dominate the other, cluster-based scheduling is a natural research direction towards achieving improved processor utilization bounds.
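
As a rough illustration of the cluster-based idea, the sketch below statically assigns tasks to clusters with a first-fit heuristic and a GFB-style global-EDF admission bound per cluster. It is a toy stand-in for the paper's virtual clustering techniques, with made-up task parameters.

```python
from dataclasses import dataclass

@dataclass
class Task:
    wcet: float        # worst-case execution time
    period: float      # minimum inter-arrival time (implicit deadline)
    @property
    def u(self):
        return self.wcet / self.period

def assign_to_clusters(tasks, cluster_sizes):
    """First-fit assignment of tasks to clusters; within each cluster the
    tasks would then be scheduled by a global policy such as global EDF.
    The admission test U <= m - (m - 1) * u_max is a GFB-style bound and
    only a toy stand-in for the paper's analysis."""
    clusters = [{"m": m, "tasks": [], "U": 0.0, "u_max": 0.0}
                for m in cluster_sizes]
    for t in sorted(tasks, key=lambda t: -t.u):       # heaviest first
        for c in clusters:
            u_max = max(c["u_max"], t.u)
            if c["U"] + t.u <= c["m"] - (c["m"] - 1) * u_max:
                c["tasks"].append(t)
                c["U"] += t.u
                c["u_max"] = u_max
                break
        else:
            raise ValueError("no cluster admits this task under the bound")
    return clusters

# Example: six tasks packed onto two 2-processor clusters.
demo = [Task(1, 4), Task(1, 5), Task(2, 5), Task(1, 3), Task(2, 10), Task(1, 8)]
for c in assign_to_clusters(demo, [2, 2]):
    print(len(c["tasks"]), round(c["U"], 2))
```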

Relevance: 30.00%

Abstract:

The Casa da Música Foundation, responsible for the management of the Casa da Música do Porto building, needs to obtain statistical data on the number of the building's visitors. This information is a valuable tool for the elaboration of periodic reports on the success of this cultural institution. For this reason, it was necessary to develop a system capable of returning the number of visitors for a requested period of time. This is a complex task due to the building's unique architectural design, characterized by very large doors and halls, and the sudden large numbers of people that pass through them in the moments preceding and following the different activities occurring in the building. To arrive at a technical solution for this challenge, several image-processing methods for people detection with still cameras were first studied. The next step was the development of a real-time algorithm, using the OpenCV libraries and computer vision concepts, to count individuals with the desired accuracy; this algorithm incorporates the scientific and technical knowledge acquired in the study of the previous methods. The themes developed in this thesis comprise the fields of background maintenance, shadow and highlight detection, and blob detection and tracking. As a complement to the work, a graphical interface was also built to help in the development, testing, and tuning of the proposed system. Furthermore, tests of the system were performed to validate the proposed techniques under a limited set of circumstances. The results obtained revealed that the algorithm successfully counted the number of people in complex environments with reliable accuracy.
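
Below is a minimal sketch of the detection stages described above (background subtraction, shadow suppression, blob extraction), using the modern OpenCV Python API rather than the thesis's original libraries; the video source, thresholds, and area filter are illustrative, and the tracking and counting logic would be layered on top.

```python
import cv2

cap = cv2.VideoCapture("entrance.mp4")          # hypothetical input video
subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)
    # MOG2 marks shadows with value 127; keep only confident foreground.
    _, mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 500]  # person-sized
    print(f"candidate blobs in frame: {len(blobs)}")
cap.release()
```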

Relevance: 30.00%

Abstract:

This paper presents a step count algorithm designed to work in real time using low computational power. This proposal is our first step in the development of an indoor navigation system based on Pedestrian Dead Reckoning (PDR). We present two approaches to solve this problem and compare them based on their step-counting error, as well as their suitability for use in a real-time system.
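
A minimal sketch of one common approach to this problem: peak detection on the accelerometer magnitude signal with a refractory gap between steps. The threshold and gap values below are illustrative, not taken from the paper.

```python
import numpy as np

def count_steps(acc, fs, thresh=1.2, min_gap=0.3):
    """Toy step counter: count peaks in the accelerometer magnitude signal.
    `acc` is an (N, 3) array in g, `fs` the sample rate in Hz; `thresh` and
    `min_gap` (seconds between steps) are illustrative parameters."""
    mag = np.linalg.norm(acc, axis=1)
    gap = int(min_gap * fs)
    steps, last = 0, -gap
    for i in range(1, len(mag) - 1):
        is_peak = mag[i] > thresh and mag[i] >= mag[i - 1] and mag[i] > mag[i + 1]
        if is_peak and i - last >= gap:
            steps += 1
            last = i
    return steps

# Toy usage on synthetic data.
rng = np.random.default_rng(0)
acc = rng.normal(loc=[0, 0, 1], scale=0.3, size=(500, 3))
print(count_steps(acc, fs=50))
```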

Relevance: 30.00%

Abstract:

The present paper reports on the precipitation of Al3Sc structures in an aluminum-scandium alloy, simulated with a synchronous parallel kinetic Monte Carlo (spkMC) algorithm. The spkMC implementation is based on the vacancy diffusion mechanism. To filter the raw data generated by the spkMC simulations, the density-based spatial clustering of applications with noise (DBSCAN) method was employed. The spkMC and DBSCAN algorithms were implemented in the C language using the MPI library. The simulations were conducted on the SeARCH cluster located at the University of Minho. The Al3Sc precipitation was successfully simulated at the atomistic scale with spkMC, and DBSCAN proved to be a valuable aid in identifying the precipitates by performing a cluster analysis of the simulation results. The simulation results achieved are in good agreement with those reported in the literature for sequential kinetic Monte Carlo (kMC) simulations. The parallel implementation of kMC provided a 4x speedup over the sequential version.
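
A toy cluster analysis in the spirit described above, grouping simulated atom positions into precipitates with scikit-learn's DBSCAN (the paper's own implementation is in C); the coordinates and the eps/min_samples parameters are illustrative, not those used with the spkMC output.

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Synthetic stand-in for spkMC output: one dense precipitate plus
# dispersed solute atoms that DBSCAN should label as noise.
rng = np.random.default_rng(1)
precipitate = rng.normal(loc=[10, 10, 10], scale=0.5, size=(40, 3))
solutes = rng.uniform(0, 50, size=(60, 3))
positions = np.vstack([precipitate, solutes])

labels = DBSCAN(eps=1.5, min_samples=5).fit_predict(positions)
n_clusters = len(set(labels)) - (1 if -1 in labels else 0)  # -1 = noise
print(f"precipitates found: {n_clusters}")
```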

Relevance: 30.00%

Abstract:

The parameterized expectations algorithm (PEA) involves a long simulation and a nonlinear least squares (NLS) fit, both embedded in a loop. Both steps are natural candidates for parallelization. This note shows that parallelization can lead to substantial speedups for the PEA. I provide example code for a simple model that can serve as a template for the parallelization of more interesting models, as well as a download link for a bootable CD image that allows a cluster to be created and the example code executed in minutes, with no need to install any software.
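
A minimal sketch of parallelizing the simulation step with Python's multiprocessing, running independent chains and pooling them before the fit; the AR(1) process and its parameters are placeholders, not the model from the note's example code.

```python
import multiprocessing as mp
import numpy as np

def simulate_chain(args):
    """Simulate one independent chain of a toy AR(1) state process; this
    stands in for the long simulation step of the PEA."""
    seed, n = args
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = 0.0
    for t in range(1, n):
        x[t] = 0.9 * x[t - 1] + 0.1 * rng.standard_normal()
    return x

if __name__ == "__main__":
    # Run independent chains in parallel, then pool them for the NLS fit
    # (the fit is the second parallelizable step and is omitted here).
    with mp.Pool(4) as pool:
        chains = pool.map(simulate_chain, [(s, 10_000) for s in range(4)])
    sim = np.concatenate(chains)
    print(sim.shape)
```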

Relevance: 30.00%

Abstract:

Forest fires are defined as uncontrolled fires, often occurring in wildland areas, that can also affect houses or agricultural resources. Their causes are both natural (e.g., lightning) and anthropogenic (human negligence or arson). The major environmental factors influencing fire ignition and propagation are climate and vegetation. Wildfires are most common and severe during drought periods and on windy days. Moreover, under water-stress conditions, which occur after a long hot and dry period, the vegetation is more vulnerable to fire. These conditions are common in the United States and Canada, where forest fires represent a serious problem. We focused our analysis on the state of Florida, for which a large dataset of forest fire detections is readily available. The USDA Forest Service Remote Sensing Application Center, in collaboration with the NASA Goddard Space Flight Center and the University of Maryland, has compiled daily MODIS Thermal Anomalies (fires and biomass burning images) produced by NASA using a contextual algorithm that exploits the strong emission of mid-infrared radiation from fires. The fire classes were converted into GIS format: daily MODIS fire detections are provided as the centroids of the 1-kilometer pixels and compiled into daily Arc/INFO point coverages.

Relevance: 30.00%

Abstract:

The invaded cluster (IC) dynamics introduced by Machta et al. [Phys. Rev. Lett. 75, 2792 (1995)] is extended to the fully frustrated Ising model on a square lattice. The properties of the dynamics, which exhibits numerical evidence of self-organized criticality, are studied. The fluctuations in the IC dynamics are shown to be intrinsic to the algorithm, and the fluctuation-dissipation theorem is no longer valid. The relaxation time is found to be very short and does not present a critical size dependence.
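
For orientation, the basic invaded-cluster update can be sketched with a union-find invasion loop, shown below for the plain ferromagnetic model on an open lattice; the fully frustrated extension studied in the paper, with mixed-sign couplings, is not modeled here.

```python
import random

def invaded_cluster_step(spins, L, rng=random):
    """One IC update for a ferromagnetic Ising model on an open L x L
    lattice (basic Machta et al. dynamics): insert satisfied bonds in
    random order until some cluster spans the lattice, then flip each
    cluster with probability 1/2.  `spins` is a flat list of +/-1."""
    n = L * L
    parent = list(range(n))
    lo_x = [i // L for i in range(n)]; hi_x = lo_x[:]
    lo_y = [i % L for i in range(n)];  hi_y = lo_y[:]

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    bonds = [(i, i + L) for i in range(n - L)] + \
            [(i, i + 1) for i in range(n) if i % L != L - 1]
    bonds = [(a, b) for a, b in bonds if spins[a] == spins[b]]  # satisfied only
    rng.shuffle(bonds)

    for a, b in bonds:                     # invade bonds one at a time
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra
            lo_x[ra] = min(lo_x[ra], lo_x[rb]); hi_x[ra] = max(hi_x[ra], hi_x[rb])
            lo_y[ra] = min(lo_y[ra], lo_y[rb]); hi_y[ra] = max(hi_y[ra], hi_y[rb])
        r = find(a)
        if hi_x[r] - lo_x[r] == L - 1 or hi_y[r] - lo_y[r] == L - 1:
            break                          # a spanning cluster appeared: stop

    flip = {}
    for i in range(n):                     # flip clusters independently
        r = find(i)
        if r not in flip:
            flip[r] = rng.random() < 0.5
        if flip[r]:
            spins[i] = -spins[i]
    return spins

# Toy usage: a few IC sweeps on a 16 x 16 lattice.
spins = [random.choice((-1, 1)) for _ in range(16 * 16)]
for _ in range(5):
    invaded_cluster_step(spins, 16)
```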