901 results for Subfractals, Subfractal Coding, Model Analysis, Digital Imaging, Pattern Recognition
Abstract:
The response of monsoon circulation in the northern and southern hemispheres to 6 ka orbital forcing has been examined in 17 atmospheric general circulation models and 11 coupled ocean–atmosphere general circulation models. The atmospheric response to increased summer insolation at 6 ka in the northern subtropics strengthens the northern-hemisphere summer monsoons and leads to increased monsoonal precipitation in western North America, northern Africa and China; ocean feedbacks amplify this response and lead to a further increase in monsoon precipitation in these three regions. The atmospheric response to reduced summer insolation at 6 ka in the southern subtropics weakens the southern-hemisphere summer monsoons and leads to decreased monsoonal precipitation in northern South America, southern Africa and northern Australia; ocean feedbacks weaken this response, so the decrease in rainfall is smaller than might otherwise be expected. The role of the ocean in monsoonal circulation in other regions is more complex. There is no discernible impact of orbital forcing in the monsoon region of North America in the atmosphere-only simulations, but a strong increase in precipitation in the ocean–atmosphere simulations. In contrast, there is a strong atmospheric response to orbital forcing over northern India, but ocean feedback reduces the strength of the change in the monsoon, although it still remains stronger than today. Although the magnitude and exact location of regional precipitation changes differ from model to model, the same basic mechanisms are involved in the oceanic modulation of the response to orbital forcing, and this gives rise to a robust ensemble response for each of the monsoon systems. Comparison of simulated and reconstructed changes in regional climate suggests that the coupled ocean–atmosphere simulations produce more realistic changes in the northern-hemisphere monsoons than atmosphere-only simulations, though they underestimate the observed changes in precipitation in all regions. Evaluation of the southern-hemisphere monsoons is limited by the lack of quantitative reconstructions, but suggests that model skill in simulating these monsoons is limited.
Abstract:
Climate controls fire regimes through its influence on the amount and types of fuel present and on their dryness. CO2 concentration constrains primary production by limiting photosynthetic activity in plants. However, although fuel accumulation depends on biomass production, and hence on CO2 concentration, the quantitative relationship between atmospheric CO2 concentration and biomass burning is not well understood. Here a fire-enabled dynamic global vegetation model (the Land surface Processes and eXchanges model, LPX) is used to attribute glacial–interglacial changes in biomass burning to two causes: the increase in CO2, which would be expected to raise primary production and therefore fuel loads even in the absence of climate change, and the effects of climate change itself. Four general circulation models provided last glacial maximum (LGM) climate anomalies – that is, differences from the pre-industrial (PI) control climate – from the Palaeoclimate Modelling Intercomparison Project Phase 2, allowing the construction of four scenarios for LGM climate. Modelled carbon fluxes from biomass burning were corrected for the model's observed prediction biases in contemporary regional average values for biomes. With both LGM climate and low CO2 (185 ppm) effects included, the modelled global flux at the LGM was in the range of 1.0–1.4 Pg C year⁻¹, about a third less than that modelled for PI time. LGM climate with pre-industrial CO2 (280 ppm) yielded unrealistic results, with global biomass-burning fluxes similar to or even greater than in the pre-industrial climate. It is inferred that a substantial part of the increase in biomass burning after the LGM must be attributed to the effect of increasing CO2 concentration on primary production and fuel load. Today, by analogy, both rising CO2 and global warming must be considered as risk factors for increasing biomass burning, and both effects need to be included in models that project future fire risks.
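The per-biome bias correction described above amounts to rescaling modelled fluxes so that contemporary regional averages match observations. A minimal sketch of that idea, assuming a simple ratio correction (the function and variable names are illustrative, not LPX's actual code):

```python
import numpy as np

def bias_correct_fluxes(fluxes, biome_ids, obs_mean, mod_mean):
    """Rescale modelled biomass-burning fluxes biome by biome.

    fluxes    : per-grid-cell carbon fluxes from the model run
    biome_ids : biome label assigned to each grid cell
    obs_mean  : dict biome -> observed contemporary mean flux
    mod_mean  : dict biome -> modelled contemporary mean flux
    """
    corrected = fluxes.astype(float)
    for biome, observed in obs_mean.items():
        # multiply each biome's cells by the observed/modelled ratio
        corrected[biome_ids == biome] *= observed / mod_mean[biome]
    return corrected
```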
Abstract:
Obesity prevalence is increasing. The management of this condition requires a detailed analysis of global risk factors in order to develop personalised advice. This study aimed to identify current dietary patterns and habits in a Spanish population interested in personalised nutrition, and to investigate associations with weight status. Self-reported dietary and anthropometric data from the Spanish participants in the Food4Me study were used in a multidimensional exploratory analysis to define specific dietary profiles. Two opposing factors were obtained according to food-group intake: Factor 1, characterised by more frequent consumption of foods traditionally considered unhealthy, and Factor 2, dominated by consumption of "Mediterranean diet" foods. Factor 1 showed a direct relationship with BMI (β = 0.226; r² = 0.259; p < 0.001), while the association with Factor 2 was inverse (β = −0.037; r² = 0.230; p = 0.348). Four categories were defined (Prudent, Healthy, Western, and Compensatory) by classifying the sample into higher or lower adherence to each factor and combining the possibilities. The Western and Compensatory dietary patterns, characterised by the consumption of high-density foods, showed positive associations with overweight prevalence. Further analysis showed that prevention of overweight should focus on limiting the intake of known deleterious foods rather than exclusively promoting healthy products.
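The two-step analysis (latent dietary factors extracted from food-group intakes, then regressed against BMI) can be sketched as follows; the file name, column layout, and the choice of scikit-learn's FactorAnalysis are assumptions for illustration, not the study's actual pipeline:

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

# food-group intake frequencies per participant (columns hypothetical)
data = pd.read_csv("food4me_intake.csv")
bmi = data.pop("bmi")

# extract two opposing latent dietary factors from the intake matrix
scores = FactorAnalysis(n_components=2, random_state=0).fit_transform(data)

# regress BMI on each factor score to estimate the beta coefficients
for k in range(2):
    beta = LinearRegression().fit(scores[:, [k]], bmi).coef_[0]
    print(f"Factor {k + 1}: beta = {beta:.3f}")
```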
Abstract:
Texture is one of the most important visual attributes used in image analysis. It is used in many content-based image retrieval systems, where it allows the identification of a larger number of images from distinct origins. This paper presents a novel approach to image analysis and retrieval based on complexity analysis. The approach consists of a texture segmentation step, performed by complexity analysis through the box-counting fractal dimension, followed by estimation of the complexity of each computed region by the multiscale fractal dimension. Experiments have been performed on an MRI database in both pattern recognition and image retrieval contexts. Results show the accuracy of the method and also indicate how performance changes as the texture segmentation process is altered.
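The box-counting fractal dimension used in the segmentation step counts, over a range of box sizes, how many boxes contain part of the pattern, and takes the slope of log(count) against log(1/size). A minimal sketch for a binary image (the box sizes are an arbitrary choice here):

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16, 32)):
    """Estimate the box-counting fractal dimension of a binary image."""
    counts = []
    for s in sizes:
        h, w = img.shape
        trimmed = img[: h - h % s, : w - w % s]            # tile exactly into s x s boxes
        boxes = trimmed.reshape(trimmed.shape[0] // s, s,
                                trimmed.shape[1] // s, s)
        counts.append((boxes.sum(axis=(1, 3)) > 0).sum())  # occupied boxes at this scale
    # dimension = slope of log N(s) against log(1/s)
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope
```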
Abstract:
The use of implants for the rehabilitation of fully edentulous, partially edentulous, or single-tooth cases is increasing, owing to the high success rate of this type of treatment. The objective of this study was to analyze the mechanical behavior of different positions of two dental implants in a rehabilitation of four teeth in the anterior maxilla. The groups studied were divided according to the positioning of the implants. Group 1: internal hexagonal implants in the position of the lateral incisors and a pontic in the region of the central incisors; Group 2: internal hexagonal implants in the position of the central incisors with cantilevered lateral incisors; and Group 3: internal hexagonal implants alternated with suspended elements. The Electronic Speckle Pattern Interferometry (ESPI) technique was selected for the mechanical evaluation of the performance of the three groups. The results are shown as interferometric phase maps representing the displacement field of the prosthetic structure.
Abstract:
The identification of gasoline adulteration by organic solvents is not an easy task, because the compounds that constitute the solvents are already present in the gasoline composition. In this work, the combination of hydrogen nuclear magnetic resonance (¹H NMR) spectroscopic fingerprints with pattern-recognition multivariate Soft Independent Modeling of Class Analogy (SIMCA) chemometric analysis provides an original and alternative approach to screening Brazilian commercial gasoline quality in a Monitoring Program for Quality Control of Automotive Fuels. SIMCA was performed on spectroscopic fingerprints to classify the quality of representative commercial gasoline samples selected by Hierarchical Cluster Analysis (HCA) and collected over a 6-month period from different gas stations in São Paulo state, Brazil. After optimization of the ¹H NMR-SIMCA algorithm, 92.0% of commercial gasoline samples were correctly classified, which is considered acceptable. The chemometric method is recommended for routine application in quality-control monitoring programs, since its measurements are fast and can be easily automated. Police laboratories could also employ this method for rapid screening analysis to discourage adulteration practices.
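SIMCA builds a separate principal-component model for each class and judges new samples by how well each class model reconstructs them. A compact sketch of that class-modelling idea (real chemometric SIMCA additionally applies per-class residual thresholds, e.g. F-tests, rather than the simple nearest-model assignment shown here):

```python
import numpy as np
from sklearn.decomposition import PCA

class SimpleSIMCA:
    """One PCA model per class; classify by smallest reconstruction residual."""

    def __init__(self, n_components=3):
        self.n_components = n_components
        self.models = {}

    def fit(self, X, y):
        for label in np.unique(y):
            self.models[label] = PCA(self.n_components).fit(X[y == label])
        return self

    def predict(self, X):
        labels = list(self.models)
        residuals = []
        for label in labels:
            pca = self.models[label]
            recon = pca.inverse_transform(pca.transform(X))
            residuals.append(((X - recon) ** 2).sum(axis=1))  # per-sample residual
        return np.asarray(labels)[np.argmin(residuals, axis=0)]
```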
Abstract:
A body of research has developed within the context of nonlinear signal and image processing that deals with the automatic, statistical design of digital window-based filters. Based on pairs of ideal and observed signals, a filter is designed in an effort to minimize the error between the ideal and filtered signals. The goodness of an optimal filter depends on the relation between the ideal and observed signals, but the goodness of a designed filter also depends on the amount of sample data from which it is designed. In order to lessen the design cost, a filter is often chosen from a given class of filters, thereby constraining the optimization and increasing the error of the optimal filter. To a great extent, the problem of filter design concerns striking the correct balance between the degree of constraint and the design cost. From a different perspective and in a different context, the problem of constraint versus sample size has been a major focus of study within the theory of pattern recognition. This paper discusses the design problem for nonlinear signal processing, shows how the issue naturally transitions into pattern recognition, and then provides a review of salient related pattern-recognition theory. In particular, it discusses classification rules, constrained classification, the Vapnik-Chervonenkis theory, and implications of that theory for morphological classifiers and neural networks. The paper closes by discussing some design approaches developed for nonlinear signal processing, and how the nature of these naturally leads to a decomposition of the error of a designed filter into a sum of the following components: the Bayes error of the unconstrained optimal filter, the cost of constraint, the cost of reducing complexity by compressing the original signal distribution, the design cost, and the contribution of prior knowledge to a decrease in the error. The main purpose of the paper is to present fundamental principles of pattern recognition theory within the framework of active research in nonlinear signal processing.
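The closing error decomposition can be written compactly; the symbols below are an illustrative reading of the components listed, not the paper's own notation:

```latex
\[
\varepsilon_{\mathrm{designed}}
  = \varepsilon_{\mathrm{Bayes}}
  + \Delta_{\mathrm{constraint}}
  + \Delta_{\mathrm{compression}}
  + \Delta_{\mathrm{design}}
  - \Delta_{\mathrm{prior}}
\]
```

Here ε_Bayes is the error of the unconstrained optimal filter, Δ_constraint the cost of restricting to a filter class, Δ_compression the cost of compressing the original signal distribution, Δ_design the finite-sample design cost, and Δ_prior the error reduction contributed by prior knowledge.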
Abstract:
Physical parameters of different types of lenses were measured through digital speckle pattern interferometry (DSPI) using a multimode diode laser as the light source. When such lasers emit two or more longitudinal modes simultaneously, the speckle image of an object appears covered with contour fringes. By performing quantitative fringe evaluation, the radii of curvature as well as the refractive indices of the lenses were determined. The quantitative fringe evaluation was carried out through the four- and eight-stepping techniques, and the branch-cut method was employed for phase unwrapping. From all these parameters the focal length was calculated. This whole-field multi-wavelength method enables the characterization of spherical and aspherical lenses, both positive and negative.
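The four-stepping evaluation mentioned above recovers the wrapped phase from four intensity frames shifted by π/2. A minimal sketch under that standard assumption (the paper's branch-cut unwrapping is far more robust on noisy speckle phase maps than the naive row-wise unwrap shown):

```python
import numpy as np

def four_step_phase(i1, i2, i3, i4):
    """Wrapped phase from four intensity frames stepped by pi/2.

    With I_k = A + B*cos(phi + k*pi/2), k = 0..3, the standard
    four-step formula is phi = atan2(I4 - I2, I1 - I3) (mod 2*pi).
    """
    return np.arctan2(i4 - i2, i1 - i3)

def naive_unwrap(wrapped):
    # simple 1-D unwrapping along rows; real speckle phase maps need a
    # 2-D method such as the branch-cut algorithm used in the paper
    return np.unwrap(wrapped, axis=1)
```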
Abstract:
Structural health monitoring (SHM) refers to the ability to monitor the state of aerospace, civil, and mechanical systems and to decide the level of damage or deterioration within them. In this sense, this paper deals with the application of a two-step auto-regressive and auto-regressive with exogenous inputs (AR-ARX) model for linear prediction in damage diagnosis of structural systems. This damage detection algorithm is based on the monitoring of residual errors as damage-sensitive indexes, obtained through vibration response measurements. In complex structures there are many positions under observation and a large amount of data to be handled, making visualization of the signals difficult. This paper therefore also investigates data compression using principal component analysis. In order to establish a threshold value, fuzzy c-means clustering is used to quantify the damage-sensitive index in an unsupervised learning mode. Tests are performed on a benchmark problem proposed by IASC-ASCE with different damage patterns. The resulting diagnosis showed high correlation with the actual integrity state of the structure.
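The residual-error index at the heart of such algorithms can be sketched simply: fit an AR model on response data from the healthy structure, then flag new data whose one-step prediction residuals grow. This is a stripped-down stand-in for the paper's two-step AR-ARX pipeline (the model order and least-squares fitting are illustrative choices):

```python
import numpy as np

def fit_ar(signal, order):
    """Least-squares AR(order) fit on a healthy-condition response signal."""
    rows = np.array([signal[t - order:t][::-1] for t in range(order, len(signal))])
    coeffs, *_ = np.linalg.lstsq(rows, signal[order:], rcond=None)
    return coeffs

def residual_index(signal, coeffs):
    """Damage-sensitive index: std of one-step AR prediction residuals."""
    order = len(coeffs)
    pred = np.array([coeffs @ signal[t - order:t][::-1]
                     for t in range(order, len(signal))])
    return np.std(signal[order:] - pred)
```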
Abstract:
The purpose of this study was to use electromyography (EMG) to evaluate the activation of agonist and antagonist muscles in spastic patients, and to test the feasibility of developing an instrument that provides quantitative data on a patient's spasticity. Thirty hemiplegic and 15 normal volunteers underwent EMG of the flexor and extensor carpi ulnaris muscles during flexion and extension movements of the wrist. Individuals with less severe spasticity (modified Ashworth Scale (mAS) grades 0 to 3) presented a deficit in the activation of the flexor muscles on the plegic side relative to the non-plegic side, while individuals severely compromised by spasticity (mAS grade 4) presented a deficit of reciprocal inhibition. It was also evident that the non-plegic limb does not present neuromotor behavior similar to that of a normal limb. Surface electromyography is a practical clinical instrument for evaluating patients with spasticity, and hemiplegic patients need to be evaluated on both sides (affected and unaffected), because the uncompromised side does not follow a normal pattern.
Abstract:
There is increasing interest in the diving behavior of marine mammals. However, identifying foraging among recorded dives often requires several assumptions. The simultaneous acquisition of images of the prey encountered, together with records of diving behavior will allow researchers to more fully investigate the nature of subsurface behavior. We tested a novel digital camera linked to a time-depth recorder on Antarctic fur seals (Arctocephalus gazella). During the austral summer 2000-2001, this system was deployed on six lactating female fur seals at Bird Island, South Georgia, each for a single foraging trip. The camera was triggered at depths greater than 10 m. Five deployments recorded still images (640 x 480 pixels) at 3-sec intervals (total 8,288 images), the other recorded movie images at 0.2-sec intervals (total 7,598 frames). Memory limitation (64 MB) restricted sampling to approximately 1.5 d of 5-7 d foraging trips. An average of 8.5% of still pictures (2.4%-11.6%) showed krill (Euphausia superba) distinctly, while at least half the images in each deployment were empty, the remainder containing blurred or indistinct prey. In one deployment krill images were recorded within 2.5 h (16 km, assuming 1.8 m/sec travel speed) of leaving the beach. Five of the six deployments also showed other fur seals foraging in conjunction with the study animal. This system is likely to generate exciting new avenues for interpretation of diving behavior.
Abstract:
The analysis of spatial relations among objects in an image is an important vision problem that involves both shape analysis and structural pattern recognition. In this paper, we propose a new approach to characterize the spatial relation "along", an important feature of spatial configurations that has been overlooked in the literature up to now. We propose a mathematical definition of the degree to which an object A is along an object B, based on the region between A and B and on a degree of elongatedness of this region. In order to better fit the perceptual meaning of the relation, distance information is included as well. To cover a wider range of potential applications, both the crisp and fuzzy cases are considered. In the crisp case, the objects are represented as 2D regions or 1D contours, and the definition of the alongness between them is derived from a visibility notion and from the region between the objects. However, the computational complexity of this approach leads us to propose a new model that calculates the between region using the convex hull of the contours. On the fuzzy side, the region-based approach is extended. Experimental results obtained using synthetic shapes and brain structures in medical imaging corroborate the proposed model and the derived measures of alongness, showing that they agree with common sense.
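The convex-hull variant of the between-region computation can be sketched with shapely: take the convex hull of the union of the two objects and subtract the objects themselves; the elongatedness of what remains then drives the alongness score. The elongatedness measure below is a crude proxy for illustration, not the paper's definition:

```python
from shapely.geometry import Polygon
from shapely.ops import unary_union

def between_region(a: Polygon, b: Polygon):
    """Region between A and B: hull(A union B) minus A and B."""
    return unary_union([a, b]).convex_hull.difference(a).difference(b)

def alongness(a: Polygon, b: Polygon) -> float:
    # thin, stretched between-regions score high; compact ones score low
    region = between_region(a, b)
    if region.is_empty or region.area == 0:
        return 0.0
    return region.length ** 2 / region.area
```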