768 results for fuzzy entropy
Abstract:
Understanding complex social-ecological systems, and anticipating how they may respond to rapid change, requires an approach that incorporates environmental, social, economic, and policy factors, usually in a context of fragmented data availability. We employed fuzzy cognitive mapping (FCM) to integrate these factors in the assessment of future wildfire risk in the Chiquitania region, Bolivia. In this region, dealing with wildfires is becoming increasingly challenging due to reinforcing feedbacks between multiple drivers. We conducted semi-structured interviews and constructed different FCMs in focus groups to understand the regional dynamics of wildfire from diverse perspectives. We used FCM modelling to evaluate possible adaptation scenarios in the context of future drier climatic conditions. Scenarios also considered possible failure to respond in time to the emergent risk. This approach proved to have great potential to support decision-making for risk management. It helped identify key forcing variables and generate insights into potential risks and trade-offs of different strategies. All scenarios showed increased wildfire risk in the event of more droughts. The ‘Hands-off’ scenario resulted in amplified impacts driven by intensifying trends, particularly affecting agricultural production. The ‘Fire management’ scenario, which adopted a bottom-up approach to improve controlled burning, showed fewer trade-offs between wildfire risk reduction and production compared to the ‘Fire suppression’ scenario. Findings highlighted the importance of considering strategies that involve all actors who use fire, and the need to nest these strategies for a more systemic approach to managing wildfire risk. The FCM model could be used as a decision-support tool and serve as a ‘boundary object’ to facilitate collaboration and integration of different forms of knowledge and perceptions of fire in the region. This approach also has the potential to support decisions in other dynamic frontier landscapes around the world that are facing increased risk of large wildfires.
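For readers unfamiliar with FCM inference, the sketch below shows the standard iterative activation update that underlies scenario analysis of this kind; the concept names and edge weights are illustrative placeholders, not values elicited in the study.

```python
import numpy as np

# Minimal fuzzy cognitive map (FCM) inference sketch; concepts and weights are
# illustrative placeholders, not values from the Chiquitania study.
concepts = ["drought", "uncontrolled_burning", "fire_management",
            "wildfire_risk", "agricultural_production"]

# W[i, j] = causal influence of concept i on concept j, in [-1, 1].
W = np.array([
    [0.0,  0.3, 0.0,  0.6, -0.2],   # drought
    [0.0,  0.0, 0.0,  0.7,  0.4],   # uncontrolled burning
    [0.0, -0.5, 0.0, -0.4,  0.1],   # fire management
    [0.0,  0.0, 0.0,  0.0, -0.6],   # wildfire risk
    [0.0,  0.2, 0.0,  0.0,  0.0],   # agricultural production
])

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

def run_fcm(state, W, clamped=None, steps=50, tol=1e-4):
    """Iterate A(t+1) = f(A(t) + A(t) @ W); 'clamped' fixes driver concepts (scenarios)."""
    clamped = clamped or {}
    for _ in range(steps):
        new = sigmoid(state + state @ W)
        for idx, val in clamped.items():
            new[idx] = val                      # hold scenario drivers constant
        if np.max(np.abs(new - state)) < tol:   # converged to a fixed point
            return new
        state = new
    return state

# 'Drier climate' scenario: clamp the drought concept near its maximum.
initial = np.full(len(concepts), 0.5)
final = run_fcm(initial, W, clamped={0: 0.9})
print(dict(zip(concepts, final.round(2))))
```

Comparing the fixed points reached under different clamped drivers is what yields the scenario-level trade-offs discussed in the abstract.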
Abstract:
Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depend on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hamper the tracking of the components over time, which consequently might contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool to analyse jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, exhaustively varying the quantities associated with the method. Our results have shown that even in the most challenging tests, the cross-entropy method was able to find the correct parameters to within 1 per cent. Even for a non-precessing jet, our optimization method successfully identified the absence of precession.
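The cross-entropy method itself is a generic stochastic search; a minimal sketch of the continuous-optimization loop is given below, with a toy quadratic objective standing in for the fit between a precession model and the observed knot offsets. The sample size, elite fraction and smoothing factor are illustrative choices, not the authors' settings.

```python
import numpy as np

rng = np.random.default_rng(0)

def cross_entropy_minimize(objective, mean, std, n_samples=100, n_elite=10,
                           iters=100, smooth=0.7):
    """Generic cross-entropy method for continuous multi-extremal optimization (sketch)."""
    mean, std = np.asarray(mean, float), np.asarray(std, float)
    for _ in range(iters):
        samples = rng.normal(mean, std, size=(n_samples, mean.size))
        scores = np.array([objective(s) for s in samples])
        elite = samples[np.argsort(scores)[:n_elite]]              # best candidates
        mean = smooth * elite.mean(axis=0) + (1 - smooth) * mean   # smoothed parameter update
        std = smooth * elite.std(axis=0) + (1 - smooth) * std
        if np.all(std < 1e-6):                                     # distribution has collapsed
            break
    return mean

# Toy stand-in for the misfit between a precession model and observed knot offsets.
true_params = np.array([2.0, -1.0, 0.5])
objective = lambda p: np.sum((p - true_params) ** 2)
best = cross_entropy_minimize(objective, mean=np.zeros(3), std=np.ones(3) * 5.0)
print(best.round(3))
```

In the paper the objective would instead measure the discrepancy between the model's predicted knot positions and the measured right ascension and declination offsets.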
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two main benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. Those images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with an accuracy similar to that obtained from the traditional Astronomical Image Processing System (AIPS) task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to determine quantitatively the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique should be used in situations involving the analysis of complex emission regions having more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As in the case of any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
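A minimal sketch of the model-image construction and the squared-difference performance function follows. Reading the six parameters as peak position (two coordinates), peak intensity, major-axis length, eccentricity and orientation is an assumption about the parameterization, and the two-component 'observed' map is a purely illustrative toy.

```python
import numpy as np

def elliptical_gaussian(X, Y, x0, y0, peak, a, ecc, theta):
    """One elliptical Gaussian component: major axis a, eccentricity ecc, orientation theta."""
    b = a * np.sqrt(1.0 - ecc ** 2)                      # semi-minor axis
    xr = (X - x0) * np.cos(theta) + (Y - y0) * np.sin(theta)
    yr = -(X - x0) * np.sin(theta) + (Y - y0) * np.cos(theta)
    return peak * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))

def model_image(params, shape):
    """Sum of N_s components; params is a flat array with 6 values per component."""
    Y, X = np.mgrid[0:shape[0], 0:shape[1]]
    img = np.zeros(shape)
    for p in params.reshape(-1, 6):
        img += elliptical_gaussian(X, Y, *p)
    return img

def performance(params, observed):
    """Squared-difference merit function that the cross-entropy search would minimize."""
    return np.sum((model_image(params, observed.shape) - observed) ** 2)

# Two-component toy 'observed' map with Gaussian background noise.
truth = np.array([20, 20, 1.0, 4.0, 0.5, 0.3,
                  40, 30, 0.6, 3.0, 0.7, 1.2])
observed = model_image(truth, (64, 64)) + np.random.default_rng(1).normal(0, 0.01, (64, 64))
print(performance(truth, observed))
```

The cross-entropy optimizer of the previous entry would be applied to `performance`, with the flat parameter vector as the search space.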
Abstract:
Non-linear methods for estimating variability in time-series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern in these tools, i.e., the classification of the temporal organization of a data set might indicate one series as relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might present incorrect results due to this lack of consistency. In this study, we present a method which gains consistency by applying ApEn repeatedly over a wide range of combinations of window lengths and matching error tolerances. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time-series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. In order to validate the tool we performed shuffled and surrogate data analyses. Statistical analysis confirmed the consistency of the method.
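A compact illustration of the idea follows: a textbook ApEn estimator evaluated over a grid of embedding dimensions and tolerances. The grid ranges and the simple summation over the grid are assumptions about how vApEn aggregates the values, not the authors' exact definition.

```python
import numpy as np

def apen(x, m, r):
    """Textbook approximate entropy: embedding dimension m, tolerance r, Chebyshev distance."""
    x = np.asarray(x, float)
    N = len(x)

    def phi(mm):
        templates = np.array([x[i:i + mm] for i in range(N - mm + 1)])
        # Fraction of templates within tolerance r of each template (self-matches included).
        counts = [np.mean(np.max(np.abs(templates - t), axis=1) <= r) for t in templates]
        return np.mean(np.log(counts))

    return phi(m) - phi(m + 1)

def volumetric_apen(x, m_values=(1, 2, 3), r_fractions=np.linspace(0.1, 0.5, 5)):
    """vApEn sketch: accumulate ApEn over a grid of window lengths and matching tolerances."""
    r_values = r_fractions * np.std(x)
    return sum(apen(x, m, r) for m in m_values for r in r_values)

rng = np.random.default_rng(0)
n = np.arange(500)
print(volumetric_apen(np.sin(0.1 * n)))        # ordered signal: lower value
print(volumetric_apen(rng.normal(size=500)))   # white noise: higher value
```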
Abstract:
This paper is concerned with the computational efficiency of fuzzy clustering algorithms when the data set to be clustered is described by a proximity matrix only (relational data) and the number of clusters must be automatically estimated from such data. A fuzzy variant of an evolutionary algorithm for relational clustering is derived and compared against two systematic (pseudo-exhaustive) approaches that can also be used to automatically estimate the number of fuzzy clusters in relational data. An extensive collection of experiments involving 18 artificial and two real data sets is reported and analyzed.
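The systematic (pseudo-exhaustive) strategy can be illustrated as below. For brevity the sketch uses ordinary object data with fuzzy c-means and the Xie-Beni validity index as stand-ins; the paper itself addresses relational (proximity-only) data, so the clusterer and index differ.

```python
import numpy as np

rng = np.random.default_rng(0)

def fuzzy_cmeans(X, c, m=2.0, iters=100, tol=1e-5):
    """Plain fuzzy c-means on object data (illustrative stand-in for relational clustering)."""
    U = rng.dirichlet(np.ones(c), size=len(X))               # membership matrix, rows sum to 1
    for _ in range(iters):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = d ** (-2 / (m - 1))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.max(np.abs(U_new - U)) < tol:
            return centers, U_new
        U = U_new
    return centers, U

def xie_beni(X, centers, U, m=2.0):
    """Fuzzy cluster validity index: compactness over separation (lower is better)."""
    d2 = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2
    sep = min(np.sum((a - b) ** 2) for i, a in enumerate(centers)
              for j, b in enumerate(centers) if i < j)
    return np.sum((U ** m) * d2) / (len(X) * sep)

# Pseudo-exhaustive loop: fit every candidate number of clusters, keep the best validity value.
X = np.vstack([rng.normal(loc, 0.3, size=(50, 2)) for loc in ([0, 0], [3, 3], [0, 3])])
scores = {c: xie_beni(X, *fuzzy_cmeans(X, c)) for c in range(2, 7)}
print(min(scores, key=scores.get))    # estimated number of clusters (expected: 3 for this toy data)
```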
Abstract:
This paper tackles the problem of showing that evolutionary algorithms for fuzzy clustering can be more efficient than systematic (i.e., repetitive) approaches when the number of clusters in a data set is unknown. To do so, a fuzzy version of an Evolutionary Algorithm for Clustering (EAC) is introduced. A fuzzy cluster validity criterion and a fuzzy local search algorithm are used instead of their hard counterparts employed by EAC. Theoretical complexity analyses for both the systematic and evolutionary algorithms of interest are provided. Examples with computational experiments and statistical analyses are also presented.
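A compact sketch of such an evolutionary loop is shown below, in the spirit of a fuzzy EAC rather than the authors' implementation: candidate prototype sets of varying size are mutated, refined by a few fuzzy c-means steps (the 'fuzzy local search'), and selected by a fuzzy validity index. All operators and parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def fcm_step(X, centers, m=2.0):
    """One fuzzy c-means update: memberships from the given prototypes, then new prototypes."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
    U = d ** (-2 / (m - 1))
    U /= U.sum(axis=1, keepdims=True)
    Um = U ** m
    return (Um.T @ X) / Um.sum(axis=0)[:, None], U

def xie_beni(X, centers, U, m=2.0):
    """Fuzzy validity index (lower is better)."""
    d2 = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) ** 2
    sep = min(np.sum((a - b) ** 2) for i, a in enumerate(centers)
              for j, b in enumerate(centers) if i < j)
    return np.sum((U ** m) * d2) / (len(X) * sep)

def mutate(centers, X):
    """Change the number of prototypes: drop one, or seed a new one at a random data point."""
    if rng.random() < 0.5 and len(centers) > 2:
        return np.delete(centers, rng.integers(len(centers)), axis=0)
    return np.vstack([centers, X[rng.integers(len(X))]])

# Toy data with four well-separated groups.
X = np.vstack([rng.normal(loc, 0.3, size=(40, 2)) for loc in ([0, 0], [4, 0], [0, 4], [4, 4])])
population = [X[rng.choice(len(X), k, replace=False)] for k in (2, 3, 5)]
for _ in range(30):
    children = [mutate(c, X) for c in population]
    scored = []
    for centers in population + children:
        for _ in range(5):                       # fuzzy local search: a few c-means steps
            centers, _ = fcm_step(X, centers)
        _, U = fcm_step(X, centers)              # memberships matching the refined prototypes
        scored.append((xie_beni(X, centers, U), centers))
    scored.sort(key=lambda pair: pair[0])
    population = [c for _, c in scored[:3]]      # keep the fittest individuals
print(len(population[0]))                         # estimated number of clusters (expected: 4)
```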
Abstract:
This paper presents an automatic method to detect and classify weathered aggregates by assessing changes in color and texture. The method allows the extraction of aggregate features from images and their automatic classification based on surface characteristics. The concept of entropy is used to extract features from digital images. An analysis of the use of this concept is presented and two classification approaches, based on neural network architectures, are proposed. The classification performance of the proposed approaches is compared to the results obtained by other algorithms commonly considered for classification purposes. The obtained results confirm that the presented method strongly supports the detection of weathered aggregates.
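The entropy-based feature extraction can be sketched as follows; the patch size, bin count and synthetic test images are illustrative, and the paper's neural-network classifiers are not reproduced here.

```python
import numpy as np

def patch_entropy(gray_patch, bins=64):
    """Shannon entropy of a patch's gray-level histogram, used as a texture feature."""
    hist, _ = np.histogram(gray_patch, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_features(image, patch=32):
    """Split an image into non-overlapping patches and return one entropy value per patch."""
    h, w = image.shape
    feats = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            feats.append(patch_entropy(image[i:i + patch, j:j + patch]))
    return np.array(feats)

# Smooth vs. highly textured synthetic surfaces: weathering shifts the entropy
# distribution, which is what a downstream classifier would exploit.
rng = np.random.default_rng(0)
smooth = np.clip(0.5 + 0.02 * rng.normal(size=(128, 128)), 0, 1)
rough = rng.uniform(size=(128, 128))
print(entropy_features(smooth).mean(), entropy_features(rough).mean())
```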
Abstract:
We study and compare the information loss of a large class of Gaussian bipartite systems. It includes the usual Caldeira-Leggett-type model as well as Anosov models (parametric oscillators, the inverted oscillator environment, etc.), which exhibit instability, one of the most important characteristics of chaotic systems. We establish a rigorous connection between the quantum Lyapunov exponents and coherence loss, and show that in the case of unstable environments coherence loss is completely determined by the upper quantum Lyapunov exponent, a behavior which is more universal than that of the Caldeira-Leggett-type model.
Abstract:
Measurements of X-ray diffraction, electrical resistivity, and magnetization are reported across the Jahn-Teller phase transition in LaMnO3. Using a thermodynamic equation, we obtained the pressure derivative of the critical temperature (T_JT), dT_JT/dP = -28.3 K/GPa. This approach also reveals that 5.7(3) J/(mol K) comes from the volume change and 0.8(2) J/(mol K) from the magnetic exchange interaction change across the phase transition. Around T_JT, a robust increase in the electrical conductivity takes place and the electronic entropy change, which is assumed to be negligible for the majority of electronic systems, was found to be 1.8(3) J/(mol K).
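The 'thermodynamic equation' is presumably the Clausius-Clapeyron relation for a first-order transition; a hedged reading of the quoted numbers, with the split of the entropy change into volume, magnetic and electronic contributions taken directly from the abstract, is:

```latex
% Presumed Clausius-Clapeyron form; numerical values are those quoted in the abstract.
\frac{dT_{\mathrm{JT}}}{dP} = \frac{\Delta V}{\Delta S} = -28.3\ \mathrm{K/GPa},
\qquad
\Delta S \simeq \Delta S_{\mathrm{vol}} + \Delta S_{\mathrm{mag}} + \Delta S_{\mathrm{el}}
\approx (5.7 + 0.8 + 1.8)\ \mathrm{J/(mol\,K)}.
```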
Abstract:
An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way for automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate change and general surveillance. Regions representing aquatic, rural and urban areas are identified and the accuracy of the proposed segmentation methodology is evaluated. The comparison with gray-level images revealed that the color information is fundamental to obtaining an accurate segmentation.
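A minimal sketch of entropy-based segmentation in this spirit is given below; the block size, histogram bins, thresholds and the mapping of low/medium/high entropy to aquatic/rural/urban classes are illustrative assumptions, not the paper's calibrated procedure.

```python
import numpy as np

def local_entropy_map(channel, block=16, bins=32):
    """Per-block Shannon entropy of one color channel (values assumed in [0, 1])."""
    h, w = channel.shape
    ent = np.zeros((h // block, w // block))
    for i in range(ent.shape[0]):
        for j in range(ent.shape[1]):
            patch = channel[i * block:(i + 1) * block, j * block:(j + 1) * block]
            hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0))
            p = hist[hist > 0] / hist.sum()
            ent[i, j] = -np.sum(p * np.log2(p))
    return ent

def segment(rgb, thresholds=(2.0, 4.0)):
    """Rough three-class labelling (e.g. aquatic / rural / urban) from per-block entropy
    averaged over the color channels; the thresholds are purely illustrative."""
    ent = np.mean([local_entropy_map(rgb[..., c]) for c in range(3)], axis=0)
    return np.digitize(ent, thresholds)        # 0, 1, 2 = low / medium / high entropy

rng = np.random.default_rng(0)
fake_scene = rng.uniform(size=(128, 128, 3))   # stand-in for a satellite tile
print(np.bincount(segment(fake_scene).ravel(), minlength=3))
```

Using all three color channels rather than a single gray-level map is the point the abstract's comparison makes.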
Abstract:
The time evolution of the out-of-equilibrium Mott insulator is investigated numerically through calculations of space-time-resolved density and entropy profiles resulting from the release of a gas of ultracold fermionic atoms from an optical trap. For adiabatic, moderate and sudden switching-off of the trapping potential, the out-of-equilibrium dynamics of the Mott insulator is found to differ profoundly from that of the band insulator and the metallic phase, displaying a self-induced stability that is robust within a wide range of densities, system sizes and interaction strengths. The connection between the entanglement entropy and changes of phase, known for equilibrium situations, is found to extend to the out-of-equilibrium regime. Finally, the relation between the system's long-time behavior and the thermalization limit is analyzed.
Abstract:
Nuclear receptors are important targets for pharmaceuticals, but similarities between family members cause difficulties in obtaining highly selective compounds. Synthetic ligands that are selective for thyroid hormone (TH) receptor beta (TR beta) vs. TR alpha reduce cholesterol and fat without effects on heart rate; thus, it is important to understand TR beta-selective binding. Binding of three selective ligands (GC-1, KB141, and GC-24) is characterized at the atomic level; preferential binding depends on a nonconserved residue (Asn-331 beta) in the TR beta ligand-binding cavity (LBC), and GC-24 gains extra selectivity from insertion of a bulky side group into an extension of the LBC that only opens up with this ligand. Here we report that the natural TH 3,5,3'-triiodothyroacetic acid (Triac) exhibits a previously unrecognized mechanism of TR beta selectivity. TR X-ray structures reveal a better fit of the ligand with the TR alpha LBC. The TR beta LBC, however, expands relative to TR alpha in the presence of Triac (549 Å³ vs. 461 Å³), and molecular dynamics simulations reveal that water occupies the extra space. Increased solvation compensates for weaker interactions of the ligand with TR beta and permits greater flexibility of the Triac carboxylate group in TR beta than in TR alpha. We propose that this effect results in lower entropic restraint and decreases the free energy of interactions between Triac and TR beta, explaining subtype-selective binding. Similar effects could potentially be exploited in nuclear receptor drug design.
Abstract:
Nowadays, noninvasive diagnostic methods are increasingly in demand, since the population requires fast, simple and painless exams. These methods have become possible because of the growth of technology that provides the necessary means of collecting and processing signals. New methods of analysis, such as nonlinear dynamics, have been developed to understand the complexity of voice signals and to explore their dynamic nature. The purpose of this paper is to characterize healthy and pathological voice signals with the aid of relative entropy measures. Phase space reconstruction is also used as a way to select regions of interest in the signals. Three groups of samples were used, one from healthy individuals and the other two from people with vocal fold nodules and Reinke's edema. All of them are recordings of the sustained vowel /a/ in Brazilian Portuguese. The paper shows that nonlinear dynamical methods seem to be a suitable technique for voice signal analysis, due to the chaotic component of the human voice. Relative entropy is well suited due to its sensitivity to uncertainty, since the pathologies are characterized by an increase in signal complexity and unpredictability. The results showed that the pathological groups had higher entropy values, in accordance with the other vocal acoustic parameters presented. This suggests that these techniques may improve and complement the voice analysis methods currently available to clinicians.
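The two ingredients, phase-space reconstruction by time-delay embedding and a relative-entropy (Kullback-Leibler) comparison, can be sketched as below. The embedding parameters, the use of embedded-vector norms as the compared quantity, and the synthetic 'healthy'/'pathological' signals are illustrative assumptions, not the paper's estimator.

```python
import numpy as np

def embed(signal, dim=3, tau=10):
    """Phase-space reconstruction by time-delay (Takens) embedding."""
    n = len(signal) - (dim - 1) * tau
    return np.column_stack([signal[i * tau:i * tau + n] for i in range(dim)])

def relative_entropy(p_samples, q_samples, bins=30):
    """Kullback-Leibler divergence between the histogrammed distributions of two sample sets."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    p = (p + 1e-12) / (p.sum() + bins * 1e-12)   # regularize to avoid log(0)
    q = (q + 1e-12) / (q.sum() + bins * 1e-12)
    return np.sum(p * np.log(p / q))

rng = np.random.default_rng(0)
t = np.arange(16000) / 8000.0                              # two seconds at 8 kHz
healthy = np.sin(2 * np.pi * 150 * t)                      # quasi-periodic stand-in signal
pathological = healthy + 0.4 * rng.normal(size=t.size)     # noisier, more complex signal
# Compare the distributions of distances from the origin in the reconstructed phase space.
print(relative_entropy(np.linalg.norm(embed(healthy), axis=1),
                       np.linalg.norm(embed(pathological), axis=1)))
```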
Abstract:
Results from infrared photodissociation (IRPD) spectroscopy and kinetics of singly hydrated, protonated proline indicate that the water molecule hydrogen bonds preferentially to the formally neutral carboxylic acid at low temperatures and to the protonated N-terminus, which bears the formal charge, at higher temperatures. Hydration isomer populations obtained from IRPD kinetic data as a function of temperature are used to generate a van't Hoff plot that reveals that C-terminal binding is enthalpically favored by 4.2-6.4 kJ/mol, whereas N-terminal binding is entropically favored by 31-43 J/(mol K), consistent with a higher calculated barrier for water molecule rotation at the C-terminus.
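The van't Hoff analysis implied here takes the ratio of the two hydration-isomer populations as an effective equilibrium constant; assuming the standard linear form:

```latex
% Standard van't Hoff relation assumed for the isomer-population ratio K(T).
\ln K(T) = -\frac{\Delta H}{R}\,\frac{1}{T} + \frac{\Delta S}{R},
\qquad
K(T) = \frac{[\text{N-terminal isomer}]}{[\text{C-terminal isomer}]}.
```

The slope of ln K versus 1/T then gives -ΔH/R and the intercept gives ΔS/R, which is how the enthalpic preference for C-terminal binding and the entropic preference for N-terminal binding quoted above would be extracted.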