979 results for methods: statistical


Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVE: To make individual assessments using an automated quantification methodology in order to screen for perfusion abnormalities in cerebral SPECT examinations among a sample of subjects with OCD. METHODS: Statistical parametric mapping (SPM) was used to compare 26 brain SPECT images from patients with OCD, one at a time, against an image bank of 32 normal subjects, using a statistical threshold of p < 0.05 (corrected for multiple comparisons at the level of individual voxels or clusters). The resulting maps were searched for regions containing voxels that remained above this threshold. RESULTS: Six of the 26 OCD images (23.07%) showed abnormalities at the cluster or voxel level under the criteria described above. However, seven of the 32 images from the normal group (21.8%) were also flagged as cases of perfusional abnormality. CONCLUSION: The automated quantification method was not considered a useful tool for clinical practice as an analysis complementary to visual inspection.
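
As a rough illustration of this kind of single-subject screening (not the study's actual SPM pipeline, which uses random-field theory rather than Bonferroni), one can compare a subject's image voxel by voxel against a bank of controls with a t-like statistic; all arrays below are synthetic placeholders.

```python
# Hedged sketch of voxel-wise screening of one image against a normative bank.
# Array names, shapes and the Bonferroni correction are illustrative assumptions.
import numpy as np
from scipy import stats

def screen_subject(subject_img, normal_bank, alpha=0.05):
    """subject_img: 3-D array; normal_bank: (n_controls, x, y, z) array."""
    n = normal_bank.shape[0]
    mean = normal_bank.mean(axis=0)
    sd = normal_bank.std(axis=0, ddof=1)
    # t-like statistic for one observation vs. a control group (df = n - 1)
    t = (subject_img - mean) / (sd * np.sqrt(1 + 1 / n))
    p = 2 * stats.t.sf(np.abs(t), df=n - 1)     # two-tailed p-value per voxel
    # Bonferroni over all voxels: simpler and more conservative than SPM's
    # random-field correction, but enough to show the screening logic
    return p < (alpha / p.size)                  # boolean abnormality map

rng = np.random.default_rng(0)
bank = rng.normal(100, 10, size=(32, 8, 8, 8))   # 32 controls, as in the study
subject = rng.normal(100, 10, size=(8, 8, 8))
print(screen_subject(subject, bank).sum(), "voxels flagged")
```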

Relevance:

60.00%

Publisher:

Abstract:

OBJECTIVES: The aim of the study was to assess whether prospective follow-up data within the Swiss HIV Cohort Study can be used to predict which patients stop smoking and, among smokers who stop, which ones start smoking again. METHODS: We built prediction models first using clinical reasoning ('clinical models') and then by selecting from numerous candidate predictors using advanced statistical methods ('statistical models'). Our clinical models were based on literature suggesting that motivation drives smoking cessation, while dependence drives relapse in those attempting to stop. Our statistical models were based on automatic variable selection using additive logistic regression with component-wise gradient boosting. RESULTS: Of 4833 smokers, 26% stopped smoking, at least temporarily; among those who stopped, 48% started smoking again. The predictive performance of our clinical and statistical models was modest. A basic clinical model for cessation, with patients classified into three motivational groups, was nearly as discriminatory as a constrained statistical model with just the most important predictors (the ratio of nonsmoking visits to total visits, alcohol or drug dependence, psychiatric comorbidities, recent hospitalization and age). A basic clinical model for relapse, based on the maximum number of cigarettes per day prior to stopping, was not as discriminatory as a constrained statistical model with just the ratio of nonsmoking visits to total visits. CONCLUSIONS: Predicting smoking cessation and relapse is difficult, so simple models are nearly as discriminatory as complex ones. Patients with a history of attempting to stop and those known to have stopped recently are the best candidates for an intervention.
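
The variable-selection scheme named here, component-wise gradient boosting for additive logistic regression, can be sketched in a few lines. The study used R's mboost; the toy Python version below (the function name, data and settings are all illustrative assumptions) only shows the mechanics: at each step, fit one univariate base learner per predictor to the negative gradient and update only the best-fitting one, so unhelpful predictors keep zero coefficients and are never "selected".

```python
# Minimal component-wise gradient boosting for logistic regression (a sketch,
# not the study's code): per-step selection of a single best predictor.
import numpy as np

def glmboost(X, y, n_steps=200, nu=0.1):
    """X: (n, p) standardized predictors; y: 0/1 outcomes."""
    n, p = X.shape
    coef = np.zeros(p)
    intercept = np.log(y.mean() / (1 - y.mean()))
    for _ in range(n_steps):
        eta = intercept + X @ coef
        resid = y - 1 / (1 + np.exp(-eta))       # negative gradient of log-loss
        # OLS slope (no intercept) of the residual on each column separately
        b = X.T @ resid / (X ** 2).sum(axis=0)
        sse = ((resid[:, None] - X * b) ** 2).sum(axis=0)
        j = np.argmin(sse)                       # best-fitting predictor
        coef[j] += nu * b[j]                     # small step on that one only
    return intercept, coef                       # zero coefs = never selected

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 10))
y = (rng.random(500) < 1 / (1 + np.exp(-(1.5 * X[:, 0] - X[:, 3])))).astype(float)
_, coef = glmboost(X, y)
print(np.nonzero(coef)[0])   # indices of the selected predictors
```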

Relevance:

60.00%

Publisher:

Abstract:

Soil organic matter (SOM) plays an important role in the physical, chemical and biological properties of soil, so quantifying SOM accurately is important for soil management in sustainable agriculture. The objective of this work was to evaluate the amount of SOM in oxisols by different methods and to compare the methods, using principal component analysis, with regard to their limitations. The methods used were Walkley-Black, elemental analysis, total organic carbon (TOC) and thermogravimetry. According to our results, TOC and elemental analysis were the most satisfactory methods for carbon quantification, owing to their better accuracy and reproducibility.
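
A minimal sketch of the comparison step, assuming a samples-by-methods table of carbon values (the data, method noise levels and units below are invented): PCA on the standardized table shows how much variance the methods share and which methods load on the residual components.

```python
# Hedged sketch: comparing SOM quantification methods with PCA.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

methods = ["Walkley-Black", "elemental", "TOC", "thermogravimetry"]
rng = np.random.default_rng(2)
truth = rng.uniform(5, 40, size=30)                      # assumed carbon values
data = np.column_stack([truth + rng.normal(0, s, 30)     # each method = truth
                        for s in (1.0, 0.5, 0.5, 2.0)])  # plus its own noise

pca = PCA(n_components=2).fit(StandardScaler().fit_transform(data))
print(pca.explained_variance_ratio_)        # PC1 ~ the common carbon signal
# Loadings on PC2 show which methods deviate most from the shared signal
for m, load in zip(methods, pca.components_[1]):
    print(f"{m:18s} PC2 loading {load:+.2f}")
```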

Relevance:

60.00%

Publisher:

Abstract:

A flow injection method for the quantitative analysis of vancomycin hydrochloride, C₆₆H₇₅Cl₂N₉O₂₄·HCl (HVCM), based on the reaction with copper(II) ions, is presented. HVCM forms a lilac-blue complex with copper ions at pH ≅ 4.5 in aqueous solutions, with maximum absorption at 555 nm. The detection limit was estimated at about 8.5×10⁻⁵ mol L⁻¹, the quantitation limit at about 2.5×10⁻⁴ mol L⁻¹, and about 30 determinations can be performed per hour. The accuracy of the method was tested through recovery procedures in the presence of four different excipients, in a 1:1 w/w proportion. The results were compared with those obtained with the batch spectrophotometric and HPLC methods. Statistical comparison was done using the Student's t-test procedure. Complete agreement was found, at the 95% confidence level, between the proposed flow injection and the batch spectrophotometric methods, which show similar precision (RSD: 2.1% vs. 1.9%).
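
The statistical comparison step can be illustrated with a paired Student's t-test between the two methods' results on the same samples; the recovery values below are invented for illustration.

```python
# Hedged sketch of the method-comparison step: paired t-test between
# flow-injection and batch spectrophotometric results (made-up numbers).
import numpy as np
from scipy import stats

fia   = np.array([99.1, 101.3, 98.7, 100.4, 99.8, 101.0])  # % recovery, FIA
batch = np.array([99.5, 100.8, 98.2, 100.9, 99.4, 100.6])  # % recovery, batch

t, p = stats.ttest_rel(fia, batch)
print(f"t = {t:.3f}, p = {p:.3f}")
if p > 0.05:
    print("No significant difference at the 95% confidence level")
```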

Relevance:

60.00%

Publisher:

Abstract:

The dynamical processes that lead to open cluster disruption cause a cluster's mass to decrease. To investigate such processes observationally, it is important to identify open cluster remnants (OCRs), which are intrinsically poorly populated; because of this, distinguishing them from field-star fluctuations remains an unresolved issue. In this work, we developed a statistical diagnostic tool to distinguish poorly populated star concentrations from background field fluctuations. We use 2MASS photometry to explore one of the conditions required for a stellar group to be a physical group: producing distinct sequences in a colour-magnitude diagram (CMD). We use automated tools to (i) derive the limiting radius; (ii) decontaminate the field and assign membership probabilities; (iii) fit isochrones; and (iv) compare object and field CMDs, in light of the isochrone solution, to verify their similarity. If the object cannot be statistically explained as a field fluctuation, we derive its probable age, distance modulus, reddening and uncertainties in a self-consistent way. As a test, we apply the tool to open clusters and comparison fields. Finally, we study the OCR candidates DoDz 6, NGC 272, ESO 435 SC48 and ESO 325 SC15. The tool is optimized to treat these low-statistics objects and to single out the best OCR candidates for studies of kinematics and chemical composition. The study of possible OCRs should provide a deeper understanding of OCR properties and constraints for theoretical models, including insights into the evolution of open clusters and their dissolution rates.
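
One ingredient of such a diagnostic can be sketched as follows, with synthetic photometry: compare the object's binned colour-magnitude (Hess) diagram with that of the field, and build a null distribution from random field subsamples of the same size. This is an illustrative stand-in, not the authors' pipeline.

```python
# Sketch: is an object's CMD distinguishable from field fluctuations?
import numpy as np

def hess(colour, mag, bins):
    h, _, _ = np.histogram2d(colour, mag, bins=bins, density=True)
    return h

rng = np.random.default_rng(3)
field_col, field_mag = rng.normal(0.8, 0.4, 2000), rng.uniform(10, 16, 2000)
# Object = field-like stars plus a narrow (cluster-like) sequence
obj_col = np.r_[rng.normal(0.8, 0.4, 40), rng.normal(0.5, 0.05, 20)]
obj_mag = np.r_[rng.uniform(10, 16, 40), rng.uniform(11, 15, 20)]

bins = [np.linspace(-0.5, 2.5, 16), np.linspace(10, 16, 16)]
d_obs = np.abs(hess(obj_col, obj_mag, bins) - hess(field_col, field_mag, bins)).sum()

# Null distribution: same-size random field subsamples vs. the full field
null = []
for _ in range(500):
    idx = rng.choice(2000, size=obj_col.size, replace=False)
    null.append(np.abs(hess(field_col[idx], field_mag[idx], bins)
                       - hess(field_col, field_mag, bins)).sum())
print("p-value vs. field fluctuation:", np.mean(np.array(null) >= d_obs))
```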

Relevance:

60.00%

Publisher:

Abstract:

Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hamper the follow-up of the components over time and consequently contribute to misinterpretation of the data. To deal with these limitations, we introduce a powerful statistical tool for analysing jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession-model parameters that best represent the data. In this work we present a large number of tests to validate the technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, exhaustively varying the quantities associated with the method. Our results show that, even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within 1 per cent. Even for a non-precessing jet, our optimization method successfully pointed out the lack of precession.
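
The core of the cross-entropy method for continuous optimization is compact enough to sketch: sample candidate parameter sets from a Gaussian, keep the elite fraction, and refit the Gaussian to the elite until it concentrates. The objective below is a toy stand-in, not the authors' precession model.

```python
# Minimal cross-entropy optimizer for continuous multi-extremal problems
# (a sketch of the general technique named in the abstract).
import numpy as np

def cross_entropy_min(objective, mu, sigma, n_samples=100, n_elite=10,
                      n_iter=60, seed=0):
    rng = np.random.default_rng(seed)
    mu, sigma = np.asarray(mu, float), np.asarray(sigma, float)
    for _ in range(n_iter):
        pop = rng.normal(mu, sigma, size=(n_samples, mu.size))
        scores = np.array([objective(x) for x in pop])
        elite = pop[np.argsort(scores)[:n_elite]]       # best candidates
        mu = elite.mean(axis=0)                         # refit the sampler
        sigma = elite.std(axis=0) + 1e-12               # avoid collapse to 0
    return mu

# Toy objective with a known minimum at (3, -2)
f = lambda x: (x[0] - 3) ** 2 + (x[1] + 2) ** 2
print(cross_entropy_min(f, mu=[0, 0], sigma=[5, 5]))
```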

Relevance:

60.00%

Publisher:

Abstract:

We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second from five. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. The images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained with the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to indicate quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions with more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As with any model fitting performed in the image plane, caution is required when analyzing images constructed from a poorly sampled (u, v) plane.
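
A minimal sketch of the model image and performance function described above, with an assumed per-source parameterization (peak position, amplitude, major/minor widths and orientation, standing in for the paper's exact six parameters):

```python
# Sketch: a model image built from a sum of elliptical Gaussian sources and
# the squared-difference performance function the fit would minimize.
import numpy as np

def elliptical_gaussian(X, Y, x0, y0, amp, sig_maj, sig_min, theta):
    ct, st = np.cos(theta), np.sin(theta)
    u = (X - x0) * ct + (Y - y0) * st            # rotate into the major axis
    v = -(X - x0) * st + (Y - y0) * ct
    return amp * np.exp(-0.5 * ((u / sig_maj) ** 2 + (v / sig_min) ** 2))

def model_image(params, shape):
    """params: (N_s, 6) array, one row of parameters per source."""
    Y, X = np.mgrid[:shape[0], :shape[1]]
    return sum(elliptical_gaussian(X, Y, *p) for p in params)

def performance(params, observed):
    return np.sum((model_image(params, observed.shape) - observed) ** 2)

truth = np.array([[20, 20, 1.0, 4, 2, 0.5], [40, 30, 0.6, 3, 3, 0.0]])
obs = model_image(truth, (64, 64))
print(performance(truth, obs))   # 0 at the true parameters
```

An optimizer such as the cross-entropy sketch shown earlier could then search the (N_s, 6) parameter space for the minimum of this performance function.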

Relevance:

60.00%

Publisher:

Abstract:

We estimate the conditions for detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, the number of observations and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ∼4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in the masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger values, in accordance with results found in real resonant extrasolar systems.
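
For illustration only, here is a circular-orbit toy model of the radial-velocity signal from a pair near the 2/1 period ratio, sampled at roughly 100 epochs with few-m/s noise as in the text; all parameter values are invented. In this approximation each planet contributes a sinusoid, so a companion whose semi-amplitude is several times smaller is quickly buried in the noise.

```python
# Toy two-planet radial-velocity signal (circular orbits, made-up parameters).
import numpy as np

def rv_two_planets(t, K1, P1, K2, P2, phi2=0.0):
    """Circular-orbit approximation: RV = sum of two sinusoids (m/s)."""
    return (K1 * np.sin(2 * np.pi * t / P1)
            + K2 * np.sin(2 * np.pi * t / P2 + phi2))

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 1000, 100))                 # ~100 epochs
rv = rv_two_planets(t, K1=40.0, P1=30.0, K2=12.0, P2=60.0)   # P2/P1 = 2
rv_obs = rv + rng.normal(0, 3.0, t.size)               # 3 m/s noise

print(f"rms of clean signal: {rv.std():.1f} m/s vs. noise 3.0 m/s")
```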

Relevance:

60.00%

Publisher:

Abstract:

Astronomy has evolved almost exclusively through the use of spectroscopic and imaging techniques, operated separately. With the development of modern technologies, it is possible to obtain data cubes that combine both techniques simultaneously, producing images with spectral resolution. Extracting information from them can be quite complex, and hence the development of new methods of data analysis is desirable. We present a method for the analysis of data cubes (data from single-field observations, containing two spatial dimensions and one spectral dimension) that uses Principal Component Analysis (PCA) to express the data in a form of reduced dimensionality, facilitating efficient information extraction from very large data sets. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates ordered by principal components of decreasing variance. The new coordinates are referred to as eigenvectors, and the projections of the data on to these coordinates produce images we call tomograms. The association of the tomograms (images) with the eigenvectors (spectra) is important for the interpretation of both. The eigenvectors are mutually orthogonal, and this property is fundamental for their handling and interpretation. When the data cube contains objects that present uncorrelated physical phenomena, the eigenvectors' orthogonality may be instrumental in separating and identifying them. By handling eigenvectors and tomograms, one can enhance features, remove noise, compress data, extract spectra, etc. We applied the method, for illustration purposes only, to the central region of the low-ionization nuclear emission region (LINER) galaxy NGC 4736, and demonstrate that it has a previously unknown type 1 active nucleus. Furthermore, we show that this nucleus is displaced from the centre of the stellar bulge.
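
PCA tomography as described reduces to a few linear-algebra steps; the sketch below uses a synthetic cube and an SVD (the rows of Vt are the eigenvector spectra, and the projections reshaped to images are the tomograms).

```python
# Minimal PCA-tomography sketch: flatten a (ny, nx, nl) cube into a
# (pixels x wavelengths) matrix, decompose it, reshape projections to images.
import numpy as np

rng = np.random.default_rng(5)
ny, nx, nl = 16, 16, 50
cube = rng.normal(0, 0.1, (ny, nx, nl))
cube[6:10, 6:10, :] += np.linspace(0, 1, nl)      # an embedded spectral feature

M = cube.reshape(ny * nx, nl)
M = M - M.mean(axis=0)                            # centre each wavelength channel
U, S, Vt = np.linalg.svd(M, full_matrices=False)  # rows of Vt = eigenvectors
tomograms = (M @ Vt.T).reshape(ny, nx, nl)        # projections as images

variance = S ** 2 / (ny * nx - 1)
print("variance fraction of PC1:", variance[0] / variance.sum())
print("tomogram 1 shape:", tomograms[:, :, 0].shape)
```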

Relevance:

60.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

60.00%

Publisher:

Abstract:

There are several papers on pruning methods in the artificial neural networks literature; however, with rare exceptions, none of them presents an appropriate statistical evaluation of such methods. In this article, we statistically verify the ability of some methods to reduce the number of neurons in the hidden layer of a multilayer perceptron (MLP) network while maintaining the same level of classification error as the initial net. Seven pruning methods are evaluated. The experimental investigation was carried out on five sets of generated data and two sets of real data. Three variables were tracked in the study: the apparent classification error rate on the test set (REA); the number of hidden neurons remaining after application of the pruning method; and the number of training/retraining epochs, to evaluate the computational effort. The non-parametric Friedman test was used for the statistical analysis.
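
The analysis step named here, a nonparametric Friedman test comparing methods across data sets, is available in SciPy; the error rates below are invented for illustration.

```python
# Hedged sketch of the evaluation step: Friedman test on the error rates of
# several pruning methods over matched data sets (made-up numbers).
from scipy.stats import friedmanchisquare

# Each list: apparent error rates (REA) of one method over 7 data sets
method_a = [0.12, 0.08, 0.15, 0.11, 0.09, 0.14, 0.10]
method_b = [0.13, 0.09, 0.16, 0.12, 0.10, 0.15, 0.11]
method_c = [0.10, 0.07, 0.13, 0.10, 0.08, 0.12, 0.09]

stat, p = friedmanchisquare(method_a, method_b, method_c)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")
# A small p suggests at least one method's error ranking differs systematically
```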

Relevance:

60.00%

Publisher:

Abstract:

The dynamics of dissipative and coherent N-body systems, such as a Bose-Einstein condensate, which can be described by an extended Gross-Pitaevskii formalism, is investigated. In order to analyze chaotic and unstable regimes, two approaches are considered: a metric one, based on calculations of Lyapunov exponents, and an algorithmic one, based on the Lempel-Ziv criterion. The consistency of both approaches is established, with the Lempel-Ziv algorithm found to be an efficient complement to the metric approach for the fast characterization of dynamical behaviors obtained from finite sequences.
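
The algorithmic approach can be illustrated with a small LZ76-style complexity counter applied to binarized sequences; the signals below are toy stand-ins for the condensate dynamics, and a periodic signal should parse into far fewer phrases than a random one.

```python
# Sketch of a Lempel-Ziv (LZ76-style) complexity count on binarized signals.
import numpy as np

def lz_complexity(bits):
    """Count the phrases in an LZ76-style parsing of a 0/1 sequence."""
    s = "".join(map(str, bits))
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it already occurred in the prefix
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        c += 1
        i += k
    return c

rng = np.random.default_rng(6)
regular = (np.sin(np.linspace(0, 40 * np.pi, 2000)) > 0).astype(int)
random_ = (rng.random(2000) > 0.5).astype(int)
print("periodic:", lz_complexity(regular), " random:", lz_complexity(random_))
```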

Relevance:

60.00%

Publisher:

Abstract:

Aim: The aim of this study was to understand the biogeography of Brachygastra. As the spatial component of evolution is of fundamental importance to understanding the processes shaping the evolution of taxa, the known geological history of the Neotropical region was used together with the current phylogeny and distribution of species to investigate questions concerning the biogeography of Brachygastra: the ancestral ranges of Brachygastra species; their areal relationships and their congruence with previously published hypotheses; and the possible associated vicariance events, including the influence of land bridges between North and South America and the split between the Amazon and Atlantic forests.

Location: Neotropical region, from Mexico to central Argentina and the southern USA.

Methods: Statistical dispersal–vicariance analysis (S-DIVA) was used to reconstruct the possible ancestral ranges of Brachygastra species based on their phylogeny (divided into three groups: lecheguana, scutellaris and smithii). A Brooks parsimony analysis (BPA) and a component analysis were performed to reconstruct the areal relationships of these species within the Neotropics.

Results: S-DIVA suggested a widespread South American ancestral region for Brachygastra. The ancestral B. azteca probably reached the Nearctic before a subsequent vicariance event separated it from the species groups ((lecheguana (scutellaris + smithii))), which stayed in the Atlantic forest. The ancestor of the (scutellaris + smithii) groups possibly reached the Amazon by dispersal, and the subsequent vicariance event splitting the Atlantic forest and the Amazon separated the groups, leaving scutellaris in the Atlantic forest and smithii in the Amazon. BPA and component analyses suggested that the Nearctic is a sister area to the other regions, that the Andes plus Mesoamerica is a sister area to the remaining Neotropical regions, and that the Amazon is closely related to the Atlantic forest.

Main conclusions: The phylogeny and distribution of Brachygastra suggest the influence of a land bridge between the Northern and Southern Hemispheres on the cladogenesis of B. azteca, and the importance of the formation of the two blocks of forest in South America to the cladogenesis of the main groups of Brachygastra. Future comparisons with the distribution patterns of other taxa should enable a more precise identification of the possible events and outcomes, adding robustness to the hypothesized areal relationships.