985 results for Generalized Procrustes Analysis
Abstract:
Optical fiber microwires (OFMs) are nonlinear optical waveguides that support several spatial modes. The multimodal generalized nonlinear Schrödinger equation (MM-GNLSE) is derived taking into account linear and nonlinear modal coupling. A detailed theoretical description of four-wave mixing (FWM) that accounts for modal coupling is developed. Both the intramode and intermode phase-matching conditions are calculated for an optical microwire in the strong-guiding regime. Finally, the FWM dynamics are studied and the amplitude evolution of the pump beams, the signal, and the idler is analyzed.
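As a rough numerical illustration of the kind of phase-matching calculation described above, the sketch below evaluates the degenerate-FWM phase mismatch for a single mode, with the propagation constant Taylor-expanded around the pump, so that the mismatch is Δβ ≈ β2 Ω² + (β4/12) Ω⁴ + 2γP0 and the parametric gain is g = sqrt((γP0)² - (Δβ/2)²). All parameter values are assumed placeholders; this is not the paper's MM-GNLSE model.

    # Sketch: degenerate four-wave-mixing phase mismatch and parametric gain
    # for a single mode; dispersion and nonlinearity values are assumed.
    import numpy as np

    beta2 = -1.0e-26   # s^2/m, anomalous group-velocity dispersion (assumed)
    beta4 = 1.0e-55    # s^4/m, fourth-order dispersion (assumed)
    gamma = 0.1        # 1/(W*m), nonlinear parameter (assumed)
    P0 = 1.0           # W, pump power (assumed)

    omega = np.linspace(-2e14, 2e14, 2001)   # signal-pump detuning, rad/s
    dbeta = beta2 * omega**2 + (beta4 / 12.0) * omega**4 + 2.0 * gamma * P0
    gain = np.sqrt(np.maximum((gamma * P0)**2 - (dbeta / 2.0)**2, 0.0))

    print("max parametric gain (1/m):", gain.max())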
Abstract:
International Scientific Forum, ISF 2013, 12-14 December 2013, Tirana.
Abstract:
3rd SMTDA Conference Proceedings, 11-14 June 2014, Lisbon, Portugal.
Abstract:
Linear unmixing decomposes a hyperspectral image into a collection of reflectance spectra of the materials present in the scene, called endmember signatures, and the corresponding abundance fractions at each pixel in a spatial area of interest. This paper introduces a new unmixing method, called Dependent Component Analysis (DECA), which overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometrical properties of hyperspectral data. DECA models the abundance fractions as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. The performance of the method is illustrated using simulated and real data.
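A minimal generative sketch of the linear mixing model with Dirichlet-distributed abundances that DECA assumes; the endmember spectra and Dirichlet parameters are synthetic, chosen only for illustration, and this is not the GEM inference step itself.

    # Sketch: pixels as non-negative, sum-to-one mixtures of endmember spectra.
    import numpy as np

    rng = np.random.default_rng(0)
    n_bands, n_endmembers, n_pixels = 50, 3, 1000

    # Synthetic endmember signatures (columns of the mixing matrix M)
    M = rng.uniform(0.1, 0.9, size=(n_bands, n_endmembers))

    # Abundances drawn from a Dirichlet density: non-negative and summing to one
    alpha = np.array([2.0, 5.0, 3.0])          # assumed Dirichlet parameters
    S = rng.dirichlet(alpha, size=n_pixels).T   # shape (n_endmembers, n_pixels)

    # Observed spectra: linear mixture plus sensor noise
    Y = M @ S + 0.01 * rng.standard_normal((n_bands, n_pixels))

    print(Y.shape, S.sum(axis=0).round(3)[:5])  # abundances sum to 1 per pixel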
Abstract:
OBJECTIVE To analyze the association between concentrations of air pollutants and hospital admissions for respiratory causes in children. METHODS Ecological time-series study. Daily counts of hospital admissions of children aged < 6 years and daily concentrations of air pollutants (PM10, SO2, NO2, O3 and CO) were analyzed in the Região da Grande Vitória, ES, Southeastern Brazil, from January 2005 to December 2010. For the statistical analysis, two techniques were combined: Poisson regression with generalized additive models and principal component analysis. These techniques complemented each other and provided more significant estimates of relative risk. The models were adjusted for temporal trend, seasonality, day of the week, meteorological factors and autocorrelation. In the final adjustment of the model, it was necessary to include autoregressive moving average (ARMA(p, q)) models for the residuals in order to eliminate the autocorrelation structures present in the components. RESULTS For every 10.49 μg/m³ increase (interquartile range) in levels of the pollutant PM10, there was a 3.0% increase in relative risk estimated with the principal-component, seasonal-autoregressive generalized additive model, whereas the usual generalized additive model estimated a 2.0% increase. CONCLUSIONS Compared with the usual generalized additive model, the proposed generalized additive model with principal component analysis showed, in general, better results in estimating relative risk and goodness of fit.
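For the arithmetic behind such figures: in a Poisson model, the relative risk for an increase of delta in a pollutant is RR = exp(beta * delta). The sketch below only reproduces the reported 2.0% and 3.0% increases from assumed coefficients; it is not the paper's fitted model.

    # Relative-risk arithmetic only; coefficients are assumed values chosen to
    # match the reported 2.0% and 3.0% increases for a 10.49 ug/m3 rise in PM10.
    import math

    delta_pm10 = 10.49                              # interquartile range, ug/m3
    beta_gam = math.log(1.02) / delta_pm10          # assumed coefficient, usual GAM
    beta_gam_pca = math.log(1.03) / delta_pm10      # assumed coefficient, GAM with PCA (seasonal AR)

    for name, beta in [("usual GAM", beta_gam), ("GAM with PCA (seasonal AR)", beta_gam_pca)]:
        rr = math.exp(beta * delta_pm10)
        print(f"{name}: RR = {rr:.3f} ({(rr - 1) * 100:.1f}% increase)")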
Abstract:
A dynamical approach to study the behaviour of generalized population growth models built from Beta(p, 2) densities, with strong Allee effect, is presented. The dynamical analysis of the respective unimodal maps is performed using symbolic dynamics techniques. The complexity of the corresponding discrete dynamical systems is measured in terms of topological entropy. Different population dynamics regimes are obtained when the intrinsic growth rates are modified: extinction, bistability, chaotic semistability and essential extinction.
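To make these regimes concrete, here is a minimal sketch that iterates a unimodal map with a strong Allee effect of the assumed form f(x) = r x^(p-1) (1 - x), a Beta(p, 2)-shaped growth function with p = 3, and reports the long-run behaviour for a few illustrative growth rates r. The functional form and parameter values are assumptions, not taken from the paper.

    # Sketch: iterate f(x) = r * x**(p-1) * (1 - x) (strong Allee effect for p > 2)
    # and classify the long-run regime; growth rates are illustrative only.
    import numpy as np

    def orbit_tail(r, p=3.0, x0=0.5, n_iter=2000, n_keep=100):
        x, tail = x0, []
        for i in range(n_iter):
            x = r * x**(p - 1.0) * (1.0 - x)
            if i >= n_iter - n_keep:
                tail.append(x)
        return np.array(tail)

    # r = 3.5: extinction (growth too weak to clear the Allee threshold)
    # r = 5.0: positive equilibrium (bistable: starts below the threshold go extinct)
    # r = 6.2: sustained fluctuations (periodic or chaotic)
    # r = 6.6: essential extinction (orbit overshoots below the Allee threshold)
    for r in (3.5, 5.0, 6.2, 6.6):
        tail = orbit_tail(r)
        if tail.max() < 1e-8:
            regime = "extinction"
        elif tail.std() < 1e-8:
            regime = "positive equilibrium"
        else:
            regime = "sustained oscillations (periodic or chaotic)"
        print(f"r = {r:4.2f}: {regime}")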
Abstract:
This paper introduces a new method to blindly unmix hyperspectral data, termed dependent component analysis (DECA). This method decomposes a hyperspectral image into a collection of reflectance (or radiance) spectra of the materials present in the scene (endmember signatures) and the corresponding abundance fractions at each pixel. DECA assumes that each pixel is a linear mixture of the endmember signatures weighted by the corresponding abundance fractions. These abundances are modeled as mixtures of Dirichlet densities, thus enforcing the constraints on abundance fractions imposed by the acquisition process, namely non-negativity and constant sum. The mixing matrix is inferred by a generalized expectation-maximization (GEM) type algorithm. This method overcomes the limitations of unmixing methods based on Independent Component Analysis (ICA) and on geometry-based approaches. The effectiveness of the proposed method is illustrated using simulated data based on U.S.G.S. laboratory spectra and real hyperspectral data collected by the AVIRIS sensor over Cuprite, Nevada.
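As a complementary sketch of the two abundance constraints mentioned above (non-negativity and sum-to-one), the snippet below estimates abundances for a single pixel with known endmember signatures using non-negative least squares with an appended sum-to-one row, a common FCLS-style approximation. This is not DECA's GEM inference, and all data are synthetic.

    # Sketch: constrained abundance estimation for one pixel with known endmembers.
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(1)
    n_bands, n_endmembers = 30, 3

    M = rng.uniform(0.1, 0.9, size=(n_bands, n_endmembers))  # assumed endmembers
    true_s = np.array([0.6, 0.3, 0.1])                        # assumed abundances
    y = M @ true_s + 0.005 * rng.standard_normal(n_bands)     # observed pixel

    delta = 100.0  # weight that softly enforces the sum-to-one constraint
    M_aug = np.vstack([M, delta * np.ones((1, n_endmembers))])
    y_aug = np.append(y, delta)

    s_hat, _ = nnls(M_aug, y_aug)
    print("estimated abundances:", s_hat.round(3), "sum:", s_hat.sum().round(3))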
Abstract:
This paper studies the statistical distributions of worldwide earthquakes from 1963 to 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multidimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
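A minimal sketch of the classical (Shannon) entropy and Jensen-Shannon divergence used to compare regional event distributions; the two histograms below are synthetic, for illustration only, not the paper's catalog data, and the fractional generalizations are not shown.

    # Sketch: Shannon entropy and Jensen-Shannon divergence of two histograms.
    import numpy as np

    def shannon_entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def jensen_shannon(p, q):
        m = 0.5 * (p + q)
        return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

    # Normalized magnitude histograms for two hypothetical grid cells
    p = np.array([0.50, 0.30, 0.15, 0.05])
    q = np.array([0.40, 0.35, 0.15, 0.10])

    print("H(p) =", round(shannon_entropy(p), 4))
    print("JSD(p, q) =", round(jensen_shannon(p, q), 6))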
Abstract:
Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power-law (PL) distributions. For early years, the data reveal double-PL behavior, while, for more recent periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data and adopt multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
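For the power-law part, a minimal sketch of how a single PL exponent can be estimated by maximum likelihood for tail values above a chosen x_min, using the standard continuous-power-law estimator alpha_hat = 1 + n / sum(ln(x_i / x_min)); the data are synthetic, not the accident series.

    # Sketch: maximum-likelihood fit of a power-law exponent on synthetic data.
    import numpy as np

    rng = np.random.default_rng(2)
    x_min, alpha_true, n = 10.0, 2.3, 5000

    # Draw synthetic power-law samples by inverse-transform sampling
    u = rng.uniform(size=n)
    x = x_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))

    alpha_hat = 1.0 + n / np.sum(np.log(x / x_min))
    print(f"true alpha = {alpha_true}, estimated alpha = {alpha_hat:.3f}")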
Abstract:
23rd Euromicro International Conference on Parallel, Distributed, and Network-Based Processing (PDP 2015), 4-6 March 2015, Turku, Finland.
Abstract:
The objective of this thesis is to analyze the behaviour of wind flow as it passes beside a forest. For this analysis, a parametric study was carried out based on generalized situations, and a set of abacuses (design charts) relating forest and wind characteristics was produced. The abacuses were compared with a particular real case, Alexandrovo (Bulgaria), from which it was concluded that their applicability to projects on complex terrain is low; from a quantitative point of view they should be used for flat terrain, with hc being the most important parameter.
Abstract:
Complex microwave structures; wake field computation; PETRA III; generalized multipole technique; antennas.
Abstract:
This study compared tidepool fish assemblages within and among habitats at Iparana and Pecém beaches, State of Ceará, Northeast Brazil, using visual census techniques. A total of 8,914 fishes, representing 25 families and 43 species, were recorded. The most abundant taxon was Sparisoma spp., followed by Haemulon parra (Desmarest, 1823), Acanthurus chirurgus (Bloch, 1787) and Abudefduf saxatilis (Linnaeus, 1758). Haemulidae was the most abundant family in number of individuals, followed by Scaridae, Acanthuridae and Pomacentridae. Within- and between-site differences in species assemblages probably reflected environmental discontinuities and more localized features, such as pool-isolation episodes or environmental complexity, acting either in isolation or interactively. The locality of Iparana was probably subjected to greater fishing pressure and tourism than Pecém, a potential cause of the lower fish abundance and biodiversity observed there. We conclude that tidepool ichthyofauna may be quite variable between and within reef sites. Thus, observations made in one area may not be generalized to adjacent sites, nor may damage caused in one area be mitigated by the protection of adjacent sites.
Abstract:
There is recent interest in the generalization of classical factor models in which the idiosyncratic factors are assumed to be orthogonal and there are identification restrictions on the cross-sectional and time dimensions. In this study, we describe and implement a Bayesian approach to generalized factor models. A flexible framework is developed to determine the variations attributed to common and idiosyncratic factors. We also propose a unique methodology to select the (generalized) factor model that best fits a given set of data. Applying the proposed methodology to simulated data and foreign exchange rate data, we provide a comparative analysis between the classical and generalized factor models. We find that, in the shift from the classical to the generalized model, the estimates of the covariance and correlation structures change significantly, while the estimates of the factor loadings and of the variation attributed to common factors change less dramatically.
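A minimal sketch of the classical factor-model decomposition that the abstract generalizes: observed series equal common factors times loadings plus idiosyncratic noise, and the share of variance attributed to the common component can be read off per series. All dimensions and parameters are assumed, for illustration only; this is not the paper's Bayesian estimator or model-selection procedure.

    # Sketch: simulate a classical factor model and compute common-variance shares.
    import numpy as np

    rng = np.random.default_rng(3)
    n_obs, n_series, n_factors = 500, 8, 2

    Lambda = rng.normal(size=(n_series, n_factors))   # factor loadings
    F = rng.normal(size=(n_obs, n_factors))           # common factors
    E = 0.5 * rng.normal(size=(n_obs, n_series))      # idiosyncratic factors
    X = F @ Lambda.T + E                              # observed data

    # Share of total variance attributed to the common component, per series
    common_var = np.var(F @ Lambda.T, axis=0)
    total_var = np.var(X, axis=0)
    print("common-variance share:", (common_var / total_var).round(2))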