917 results for Wavelet Packet Decomposition


Relevance:

20.00%

Publisher:

Abstract:

The Spanish savings banks attracted considerable interest in the scientific arena, especially after the removal of regulatory constraints in the second half of the 1980s. Nonetheless, there is a lack of research on the mainstream paths defined by strategic groups and on the analysis of total factor productivity. Therefore, drawing on the resource-based view of the firm and on cluster analysis, we use changes in structure and performance ratios to identify the strategic groups extant in the sector. We obtain a three-way division, which we link with different input-output specifications defining strategic paths. On the basis of these three dissimilar approaches we then compute and decompose a Hicks-Moorsteen total factor productivity index. The results support an interesting interpretation under a multi-strategic approach, and reveal the drawbacks of employing cluster analysis in a complex strategic environment. Moreover, we propose an ex-post method for analysing the outcomes of the decomposed total factor productivity index that could be combined with non-traditional techniques of forming strategic groups, such as cognitive approaches.
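
As a rough illustration of the clustering step described above, the sketch below forms three groups of banks from changes in structure and performance ratios with k-means. It assumes scikit-learn and pandas are available, and the ratio names and data are invented placeholders rather than the variables actually used in the study.

```python
# Sketch, assuming scikit-learn/pandas: three strategic groups from changes in
# structure and performance ratios (all column names and values are invented).
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
banks = pd.DataFrame({
    "d_loans_to_assets":    rng.normal(0.02, 0.05, 30),    # change in a structure ratio
    "d_deposits_to_assets": rng.normal(-0.01, 0.04, 30),
    "d_roa":                rng.normal(0.001, 0.003, 30),  # change in a performance ratio
    "d_cost_income":        rng.normal(-0.02, 0.06, 30),
})

X = StandardScaler().fit_transform(banks)                  # put ratios on comparable scales
banks["strategic_group"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
print(banks.groupby("strategic_group").mean())             # profile of each group
```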

Relevance:

20.00%

Publisher:

Abstract:

Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is to the Spanish service sector, with economic and environmental data for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in the emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show a decrease in CO2 emissions due to lower emission coefficients (i.e., emissions per unit of output), compensated by an increase in emissions caused both by changes in the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes.
Keywords: structural decomposition analysis, input-output subsystems, CO2 emissions, service sector.
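
The decomposition itself follows the standard structural decomposition logic for emissions E = e'Ly. The sketch below (illustrative three-sector data, average of the two polar decompositions; not necessarily the exact subsystem formulation used in the paper) shows how the change in emissions splits into emission-intensity, technology and final-demand effects.

```python
# Hedged sketch of a standard structural decomposition of emissions E = e' L y
# into emission-intensity, technology (Leontief) and final-demand effects,
# using the average of the two polar decompositions. Data are illustrative.
import numpy as np

def leontief_inverse(A):
    return np.linalg.inv(np.eye(A.shape[0]) - A)

# illustrative 3-sector economy for years 0 and 1
e0 = np.array([0.9, 0.5, 0.2]); e1 = np.array([0.7, 0.45, 0.18])   # emissions per unit output
A0 = np.array([[0.1, 0.2, 0.0], [0.1, 0.1, 0.2], [0.0, 0.1, 0.1]])
A1 = A0 + 0.02
y0 = np.array([100., 80., 60.]); y1 = np.array([110., 95., 75.])

L0, L1 = leontief_inverse(A0), leontief_inverse(A1)
dE = e1 @ L1 @ y1 - e0 @ L0 @ y0

# the two polar decompositions, averaged
intensity  = 0.5 * ((e1 - e0) @ L1 @ y1 + (e1 - e0) @ L0 @ y0)
technology = 0.5 * (e0 @ (L1 - L0) @ y1 + e1 @ (L1 - L0) @ y0)
demand     = 0.5 * (e0 @ L0 @ (y1 - y0) + e1 @ L1 @ (y1 - y0))

assert np.isclose(dE, intensity + technology + demand)
print(f"dE = {dE:.2f} = intensity {intensity:.2f} + technology {technology:.2f} + demand {demand:.2f}")
```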

Relevance:

20.00%

Publisher:

Abstract:

Measuring the contribution of individual transactions to the total risk of a credit portfolio is a major issue in financial institutions. VaR Contributions (VaRC) and Expected Shortfall Contributions (ESC) have become two popular ways of quantifying these risks. However, the usual Monte Carlo (MC) approach is known to be very time-consuming for computing these risk contributions. In this paper we consider the Wavelet Approximation (WA) method for Value at Risk (VaR) computation presented in [Mas10] in order to calculate the Expected Shortfall (ES) and the risk contributions under the Vasicek one-factor model framework. We decompose the VaR and the ES as a sum of sensitivities representing the marginal impact on the total portfolio risk. Moreover, we present technical improvements to the WA method that considerably reduce the computational effort of the approximation while, at the same time, increasing its accuracy.
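
For orientation, the sketch below computes VaR, ES and Euler-style risk contributions under the Vasicek one-factor model by plain Monte Carlo, i.e. the slow baseline that the Wavelet Approximation is designed to replace; portfolio parameters are invented and the wavelet machinery itself is not shown.

```python
# Plain Monte Carlo baseline (not the wavelet approximation): VaR, ES and
# Euler-style contributions under the Vasicek one-factor model.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, n_sims, alpha = 50, 100_000, 0.99
ead = rng.uniform(0.5, 2.0, n)               # exposures (LGD taken as 100%)
pd_ = rng.uniform(0.005, 0.03, n)            # default probabilities
rho = 0.15                                   # common asset correlation
c = norm.ppf(pd_)                            # default thresholds

Z = rng.standard_normal((n_sims, 1))         # systematic factor
eps = rng.standard_normal((n_sims, n))       # idiosyncratic factors
defaults = np.sqrt(rho) * Z + np.sqrt(1.0 - rho) * eps < c
losses = defaults * ead                      # per-name loss in each scenario
L = losses.sum(axis=1)                       # portfolio loss

var = np.quantile(L, alpha)
tail = L >= var
es = L[tail].mean()

# Euler allocations: E[loss_i | L ~ VaR] and E[loss_i | L >= VaR]
near_var = np.abs(L - var) <= np.quantile(np.abs(L - var), 0.001)
varc = losses[near_var].mean(axis=0)         # VaR contributions (noisy in plain MC)
esc = losses[tail].mean(axis=0)              # ES contributions, summing to ES
print(f"VaR={var:.2f}  ES={es:.2f}  sum(VaRC)={varc.sum():.2f}  sum(ESC)={esc.sum():.2f}")
```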

Relevance:

20.00%

Publisher:

Abstract:

Recognition by the T-cell receptor (TCR) of immunogenic peptides (p) presented by Class I major histocompatibility complexes (MHC) is the key event in the immune response against virus-infected cells or tumor cells. A study of the 2C TCR/SIYR/H-2K(b) system using computational alanine scanning and a much faster binding free energy decomposition based on the Molecular Mechanics-Generalized Born Surface Area (MM-GBSA) method is presented. The results show that the TCR-p-MHC binding free energy decomposition using this approach, including entropic terms, provides a detailed and reliable description of the interactions between the molecules at an atomistic level. Comparison of the decomposition results with experimentally determined activity differences for alanine mutants yields a correlation of 0.67 when entropy is neglected and 0.72 when entropy is taken into account. Similarly, comparison of experimental activities with variations in binding free energies determined by computational alanine scanning yields correlations of 0.72 and 0.74 when entropy is neglected or taken into account, respectively. Some key interactions for TCR-p-MHC binding are analyzed and some possible side-chain replacements are proposed in the context of TCR protein engineering. In addition, a comparison of the two theoretical approaches for estimating the role of each side chain in the complexation is given, and a new ad hoc approach to decompose the vibrational entropy term into atomic contributions, the linear decomposition of the vibrational entropy (LDVE), is introduced. The latter allows the rapid calculation of the entropic contribution of individual side chains of interest to the binding. This new method is based on the idea that the most important contributions to the vibrational entropy of a molecule originate from residues that contribute most to the vibrational amplitude of the normal modes. The LDVE approach is shown to provide results very similar to those of the exact but highly computationally demanding method.
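
The sketch below illustrates one possible reading of the LDVE idea: each normal mode's vibrational entropy is spread over atoms in proportion to their squared amplitude in that mode, so per-residue entropic contributions can be summed from atomic ones. The frequencies and eigenvectors are random stand-ins for a real normal-mode analysis, and the exact weighting used in the paper may differ.

```python
# Hedged illustration of the LDVE idea: distribute each mode's vibrational
# entropy over atoms according to their squared amplitude in that mode.
# Frequencies and eigenvectors below are random placeholders, not a real
# normal-mode calculation.
import numpy as np

kB, h, c, T = 1.380649e-23, 6.62607015e-34, 2.99792458e10, 300.0  # SI units, c in cm/s

def mode_entropy(wavenumber_cm):
    """Quantum harmonic-oscillator vibrational entropy of one mode (J/K)."""
    x = h * c * wavenumber_cm / (kB * T)
    return kB * (x / (np.exp(x) - 1.0) - np.log(1.0 - np.exp(-x)))

n_atoms = 20
n_modes = 3 * n_atoms - 6
rng = np.random.default_rng(2)
freqs = rng.uniform(20.0, 3000.0, n_modes)                    # wavenumbers in cm^-1
vecs = rng.standard_normal((n_modes, n_atoms, 3))
vecs /= np.linalg.norm(vecs.reshape(n_modes, -1), axis=1)[:, None, None]  # unit modes

s_mode = np.array([mode_entropy(f) for f in freqs])           # entropy of each mode
weights = (vecs ** 2).sum(axis=2)                             # atom share of each mode
s_atom = (weights * s_mode[:, None]).sum(axis=0)              # per-atom contributions

assert np.isclose(s_atom.sum(), s_mode.sum())                 # atomic terms add up to S_vib
print("total S_vib (J/mol/K):", s_atom.sum() * 6.02214076e23)
```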

Relevance:

20.00%

Publisher:

Abstract:

The PhD thesis "Sound Texture Modeling" deals with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modelling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (a hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of the more general taxonomy of natural events proposed by Gaver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that are not textural in their entirety, those segmentation models can help identify the textural portions of an audio recording that are useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we developed a pattern-based synthesis system that allows a database of units to be explored sonically by means of their representation in a perceptual feature space. Concatenative synthesis with "molecules" built from sparse atomic representations also allows low-level correlations in perceptual audio features to be captured, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a higher-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models contribute as low-level building blocks within a semi-automated soundscape generation system.
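
A minimal sketch of the wavelet front end of such a texture model is shown below, assuming the PyWavelets package: the signal is decomposed into a tree of approximation and detail bands whose coefficients would then be modelled probabilistically (the hidden Markov tree and its EM training are not shown).

```python
# Minimal sketch, assuming PyWavelets: the wavelet decomposition front end of
# the texture model. The hidden Markov tree fitted to these coefficients via
# EM is not shown here.
import numpy as np
import pywt

fs = 16_000
t = np.arange(0, 2.0, 1.0 / fs)
rng = np.random.default_rng(3)
signal = rng.standard_normal(t.size) * np.exp(-((t - 1.0) ** 2) / 0.1)  # stand-in "texture"

# multilevel wavelet tree: one approximation band plus detail bands per scale
coeffs = pywt.wavedec(signal, wavelet="db4", level=6)
for i, c in enumerate(coeffs):
    print(f"band {i}: {c.size} coefficients, energy {np.sum(c**2):.3f}")

# a trained probabilistic model would resample/modify these coefficients;
# here we simply invert the transform to check reconstruction
reconstructed = pywt.waverec(coeffs, wavelet="db4")
print("max reconstruction error:", np.max(np.abs(reconstructed[:signal.size] - signal)))
```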

Relevance:

20.00%

Publisher:

Abstract:

The calyptrate dipterans are the most important decomposers of human cadavers. Knowledge of their species and distribution is of great importance to forensic entomology, especially because of the enormous diversity found in Brazil. Carcasses of domestic pigs (Sus scrofa L.) were used as experimental models to attract calyptrates of forensic interest during the winters of 2006 and 2007 and the summers of 2006 and 2008. A total of 24,423 specimens from 44 species were collected (19 Muscidae, 2 Fanniidae and 23 Sarcophagidae), three of which were new records of occurrence and 20 of which were new forensic records for the state of Rio de Janeiro. Fourteen of these species were newly identified as forensically important in Brazil.

Relevance:

20.00%

Publisher:

Abstract:

TCP flows from applications such as the web or FTP are well supported by a Guaranteed Minimum Throughput Service (GMTS), which provides a minimum network throughput to the flow and, if possible, an extra throughput. We propose a scheme for a GMTS using Admission Control (AC) that is able to provide different minimum throughputs to different users and that is suitable for "standard" TCP flows. Moreover, we consider a multidomain scenario where the scheme is used in one of the domains, and we propose some mechanisms for the interconnection with neighboring domains. The whole scheme uses a small set of packet classes in a core-stateless network, where each class has a different discarding priority in the queues assigned to it. The AC method involves only edge nodes and uses a special probing packet flow (marked with the highest discarding priority) that is sent continuously from ingress to egress along a path. The available throughput in the path is obtained at the egress using measurements of flow aggregates, and is then sent back to the ingress. At the ingress, each flow is detected implicitly and then admission controlled. If it is accepted, it receives the GMTS and its packets are marked with the lowest discarding priorities; otherwise, it receives a best-effort service. The scheme is evaluated through simulation in a simple "bottleneck" topology using different traffic loads consisting of "standard" TCP flows that carry files of varying sizes.
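
A schematic of the ingress-side admission decision described above might look as follows; the class names and the bookkeeping are illustrative and not the paper's exact algorithm.

```python
# Hedged sketch of the ingress-side admission decision: the egress reports the
# available throughput measured from the probing class, and a newly detected
# "standard" TCP flow is admitted only if its guaranteed minimum fits;
# otherwise it falls back to best effort. Illustrative only.
from dataclasses import dataclass, field

@dataclass
class IngressAdmissionControl:
    available: float = 0.0                     # Mb/s reported by egress probes
    admitted: dict = field(default_factory=dict)

    def update_available(self, measured_mbps: float) -> None:
        """Feedback from egress: throughput currently seen by probe packets."""
        self.available = measured_mbps

    def on_new_flow(self, flow_id: str, min_mbps: float) -> str:
        """Implicitly detected new flow: admit to GMTS or fall back to best effort."""
        if min_mbps <= self.available:
            self.admitted[flow_id] = min_mbps
            return "GMTS: mark packets with lowest discarding priority"
        return "best-effort: no guarantee, default marking"

ac = IngressAdmissionControl()
ac.update_available(measured_mbps=12.0)
print(ac.on_new_flow("flow-1", min_mbps=5.0))   # admitted
print(ac.on_new_flow("flow-2", min_mbps=20.0))  # exceeds available -> best effort
```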

Relevance:

20.00%

Publisher:

Abstract:

All-optical label swapping (AOLS) forms a key technology towards the implementation of all-optical packet switching (AOPS) nodes for the future optical Internet. The capital expenditure of deploying AOLS increases with the size of the label space (i.e. the number of labels used), since a special optical device is needed for each label recognized on every node. Label space sizes are affected by the way in which demands are routed: for instance, shortest-path routing uses fewer labels but leads to high link utilization, while minimum-interference routing leads to the opposite. This paper studies all-optical label stacking (AOLStack), an extension of the AOLS architecture. AOLStack aims at reducing the label space while easing the compromise with link utilization. An integer linear program is proposed with the objective of analyzing how AOLStack softens the aforementioned trade-off. Furthermore, a heuristic aimed at finding good solutions in polynomial time is proposed as well. Simulation results show that AOLStack either a) reduces the label space with a small increase in link utilization or, equivalently, b) makes better use of the residual bandwidth to decrease the number of labels even further.
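
As a toy stand-in for the kind of formulation involved (assuming the PuLP package; this is not the paper's ILP), the sketch below routes each demand on one candidate path, caps the number of demands per link, and minimizes a simple proxy for the label space, namely the total number of label entries along the chosen paths.

```python
# Toy ILP with PuLP (an assumption; not the paper's formulation): route each
# demand on one candidate path, cap demands per link, and minimize a proxy for
# the label space (total label entries along the chosen paths).
import pulp

demands = {
    "d1": [("A", "B", "C"), ("A", "D", "C")],
    "d2": [("A", "B", "C"), ("A", "D", "E", "C")],
    "d3": [("B", "C"), ("B", "D", "C")],
}
link_capacity = 2                              # max demands per link

def links(path):
    return list(zip(path[:-1], path[1:]))

prob = pulp.LpProblem("toy_label_space", pulp.LpMinimize)
x = {(d, i): pulp.LpVariable(f"x_{d}_{i}", cat="Binary")
     for d, paths in demands.items() for i in range(len(paths))}

# objective: one label entry per node on every chosen path
prob += pulp.lpSum(len(p) * x[d, i]
                   for d, paths in demands.items() for i, p in enumerate(paths))

# each demand is routed on exactly one of its candidate paths
for d, paths in demands.items():
    prob += pulp.lpSum(x[d, i] for i in range(len(paths))) == 1

# link-utilization cap
all_links = {lk for paths in demands.values() for p in paths for lk in links(p)}
for lk in all_links:
    prob += pulp.lpSum(x[d, i] for d, paths in demands.items()
                       for i, p in enumerate(paths) if lk in links(p)) <= link_capacity

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (d, i), var in x.items():
    if var.value() is not None and var.value() > 0.5:
        print(d, "->", " - ".join(demands[d][i]))
```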

Relevance:

20.00%

Publisher:

Abstract:

Piecewise linear systems arise as mathematical models in many practical applications, often from the linearization of nonlinear systems. There are two main approaches to dealing with these systems, according to their continuous- or discrete-time aspects. We propose an approach based on state transformation, more particularly the partition of the phase portrait into different regions, where each subregion is modeled as a two-dimensional linear time-invariant system. Then the Takagi-Sugeno model, which is a combination of the local models, is calculated. The simulation results show that the Alpha partition is well suited for dealing with such systems.
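
The blending step can be sketched as follows: two local linear time-invariant models, one per region of the partitioned phase portrait, are combined through smooth membership functions into a Takagi-Sugeno model. The partition boundary, membership functions and matrices below are illustrative.

```python
# Hedged sketch: a Takagi-Sugeno model as a membership-weighted blend of local
# linear time-invariant submodels, one per region of the partitioned phase
# portrait. Partition boundary and membership functions are illustrative.
import numpy as np

A1 = np.array([[0.0, 1.0], [-2.0, -0.5]])    # local LTI model in region 1
A2 = np.array([[0.0, 1.0], [-6.0, -1.5]])    # local LTI model in region 2

def memberships(x1, steepness=5.0):
    """Smooth membership in region 1 (x1 < 0) versus region 2 (x1 > 0)."""
    w2 = 1.0 / (1.0 + np.exp(-steepness * x1))
    return np.array([1.0 - w2, w2])

def ts_dynamics(x):
    w = memberships(x[0])
    A = w[0] * A1 + w[1] * A2                # fuzzy blend of the local models
    return A @ x

# simple forward-Euler simulation of the blended system
x, dt = np.array([1.0, 0.0]), 0.01
for _ in range(1000):
    x = x + dt * ts_dynamics(x)
print("state after 10 s:", x)
```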

Relevance:

20.00%

Publisher:

Abstract:

This work analyzes whether the relationship between risk and returns predicted by the Capital Asset Pricing Model (CAPM) holds in the Brazilian stock market. The analysis is based on a discrete wavelet decomposition over different time scales. This technique allows the relationship to be analyzed over different time horizons, from the short term (2 to 4 days) up to the long term (64 to 128 days). The results indicate that there is a negative or null relationship between systematic risk and returns for Brazil from 2004 to 2007. As the average excess return of a market portfolio in relation to a risk-free asset during that period was positive, this relationship would be expected to be positive; that is, higher systematic risk should result in higher excess returns, which did not occur. Therefore, during that period, appropriate compensation for systematic risk was not observed in the Brazilian market. The scales that proved most significant for the risk-return relation were the first three, which correspond to short-term horizons. When the data are treated year by year, thereby separating positive and negative premiums, the risk-return relation predicted by the CAPM shows some relevance in certain years, but this pattern did not persist over the years. Therefore, there is no evidence strong enough to confirm that asset pricing follows the model.
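
The scale-by-scale estimation can be sketched as follows, assuming PyWavelets: asset and market excess returns are decomposed with a discrete wavelet transform and a beta is estimated from the detail coefficients at each scale. The returns are random placeholders and the wavelet filter and number of levels are assumptions, not those of the paper.

```python
# Sketch, assuming PyWavelets: betas per wavelet scale from detail coefficients.
# Returns are random placeholders; the filter (db4) and 6 levels are assumptions.
import numpy as np
import pywt

rng = np.random.default_rng(4)
n = 1024
market = rng.normal(0.0005, 0.015, n)             # market excess returns
asset = 0.8 * market + rng.normal(0.0, 0.01, n)   # asset excess returns (true beta 0.8)

level = 6
cm = pywt.wavedec(market, "db4", level=level)     # [cA6, cD6, cD5, ..., cD1]
ca = pywt.wavedec(asset, "db4", level=level)

for j in range(1, level + 1):
    d_m, d_a = cm[-j], ca[-j]                     # detail coefficients at scale j
    beta_j = np.cov(d_a, d_m, ddof=0)[0, 1] / np.var(d_m)
    print(f"scale {j} ({2**j}-{2**(j+1)}-day horizon): beta = {beta_j:.2f}")
```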

Relevance:

20.00%

Publisher:

Abstract:

This paper performs an empirical decomposition of international inequality in Ecological Footprint in order to quantify the extent to which explanatory variables such as a country's affluence, economic structure, demographic characteristics, climate and technology contributed to international differences in natural resource consumption during the period 1993-2007. We use a Regression-Based Inequality Decomposition approach. As a result, the methodology qualitatively extends the results obtained in standard environmental impact regressions, as it encompasses further social dimensions of the Sustainable Development concept, i.e. equity within generations. The results obtained point to prioritizing policies that take into account both future and present generations.
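
A generic regression-based (Fields-type) decomposition, which may differ in detail from the estimator used in the paper, can be sketched as follows: regress the log of per-capita footprint on the explanatory variables and attribute a share of the inequality (here, the variance of logs) to each factor through its covariance with the outcome. All data are random placeholders.

```python
# Hedged sketch of a regression-based inequality decomposition (generic
# Fields-type variant; the paper's estimator may differ). Data are invented.
import numpy as np

rng = np.random.default_rng(5)
n = 120                                              # country-year observations
X = np.column_stack([
    rng.normal(9.0, 1.2, n),    # log GDP per capita (affluence)
    rng.normal(0.3, 0.1, n),    # industry share (economic structure)
    rng.normal(0.5, 0.2, n),    # urbanisation rate (demography)
    rng.normal(12.0, 8.0, n),   # mean temperature (climate)
])
beta_true = np.array([0.6, 1.5, 0.8, -0.01])
y = 1.0 + X @ beta_true + rng.normal(0.0, 0.3, n)    # log footprint per capita

Xd = np.column_stack([np.ones(n), X])
beta = np.linalg.lstsq(Xd, y, rcond=None)[0]         # OLS coefficients

labels = ["affluence", "structure", "demography", "climate"]
total_var = np.var(y)
shares = [beta[k] * np.cov(Xd[:, k], y, ddof=0)[0, 1] / total_var
          for k in range(1, 5)]
for name, share in zip(labels, shares):
    print(f"{name:10s}: {100 * share:5.1f}% of inequality in log footprint")
print(f"residual  : {100 * (1.0 - sum(shares)):5.1f}%")
```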

Relevance:

20.00%

Publisher:

Abstract:

In the present paper we discuss and compare two different energy decomposition schemes: Mayer's Hartree-Fock energy decomposition into diatomic and monoatomic contributions [Chem. Phys. Lett. 382, 265 (2003)], and the Ziegler-Rauk dissociation energy decomposition [Inorg. Chem. 18, 1558 (1979)]. The Ziegler-Rauk scheme is based on a separation of a molecule into fragments, while Mayer's scheme can be used in cases where a fragmentation of the system into clearly separable parts is not possible. In the Mayer scheme, the density of a free atom is deformed to give the one-atom Mulliken density, which subsequently interacts to give rise to the diatomic interaction energy. We give a detailed analysis of the diatomic energy contributions in the Mayer scheme and a close look at the one-atom Mulliken densities. The Mulliken density ρA has a single large maximum around the nuclear position of atom A, but exhibits slightly negative values in the vicinity of neighboring atoms. The main connecting point between the two analysis schemes is the electrostatic energy. Both decomposition schemes use the same electrostatic energy expression but differ in how the fragment densities are defined. In the Mayer scheme, the electrostatic component originates from the interaction of the Mulliken densities, while in the Ziegler-Rauk scheme the undisturbed fragment densities interact. The values of the electrostatic energy resulting from the two schemes differ significantly but typically have the same order of magnitude. Both methods are useful and complementary, since Mayer's decomposition focuses on the energy of the finally formed molecule, whereas the Ziegler-Rauk scheme describes the bond formation starting from undeformed fragment densities.
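
As a toy numerical illustration of the electrostatic term the two schemes share, the sketch below evaluates the Coulomb interaction of two spherical Gaussian "fragment densities" (atomic units); swapping in different charges and widths mimics the choice between Mulliken-type and undisturbed fragment densities, although real molecular densities are of course far richer.

```python
# Toy sketch of the shared electrostatic term: Coulomb interaction of two
# spherical Gaussian "fragment densities" (atomic units). Charges and widths
# are illustrative stand-ins for Mulliken versus undisturbed fragment densities.
import numpy as np
from scipy.special import erf

def gaussian_coulomb(q1, q2, sigma1, sigma2, R):
    """Electrostatic energy of two normalized Gaussian charge clouds a distance R apart."""
    s = np.sqrt(2.0 * (sigma1**2 + sigma2**2))
    return q1 * q2 * erf(R / s) / R

R = 2.0  # bohr
print("free-fragment-like densities :", gaussian_coulomb(-1.0, -1.0, 0.8, 0.8, R))
print("Mulliken-like densities      :", gaussian_coulomb(-0.8, -1.2, 0.6, 1.0, R))
print("point-charge limit (large R) :", gaussian_coulomb(-1.0, -1.0, 0.8, 0.8, 20.0))
```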

Relevance:

20.00%

Publisher:

Abstract:

The present work provides a generalization of Mayer's energy decomposition for the density-functional theory (DFT) case. It is shown that the one- and two-atom Hartree-Fock energy components in Mayer's approach can be represented as the action of a one-atom potential VA on a one-atom density ρA or ρB. To treat the exchange-correlation term in the DFT energy expression in a similar way, the exchange-correlation energy density per electron is expanded into a linear combination of basis functions. Calculations carried out for a number of density functionals demonstrate that the DFT and Hartree-Fock two-atom energies agree with each other to a reasonable extent. The two-atom energies for strong covalent bonds are within the range of typical bond dissociation energies and are therefore a convenient computational tool for assessing individual bond strengths in polyatomic molecules. For nonspecific nonbonding interactions, the two-atom energies are small; they can be either repulsive or slightly attractive, but the DFT results more frequently yield small attractive values than the Hartree-Fock case. The hydrogen bond in the water dimer is calculated to lie between the strong covalent and nonbonding interactions on the energy scale.
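
The expansion step mentioned above can be illustrated schematically: the sketch below fits a mock one-dimensional "energy density per electron" onto a linear combination of Gaussian basis functions by least squares, as a stand-in for the three-dimensional molecular case.

```python
# Schematic sketch of the expansion step: fit a mock "exchange-correlation
# energy density per electron" onto Gaussian basis functions by least squares.
# The 1D target and basis are illustrative stand-ins for the molecular case.
import numpy as np

r = np.linspace(-5.0, 5.0, 400)
target = -0.7 * np.exp(-0.4 * r**2) - 0.2 * np.exp(-0.1 * (r - 1.5)**2)   # mock eps_xc(r)

centers = np.linspace(-4.0, 4.0, 9)
widths = np.full_like(centers, 1.0)
B = np.exp(-((r[:, None] - centers[None, :]) ** 2) / (2.0 * widths[None, :] ** 2))

coeffs, *_ = np.linalg.lstsq(B, target, rcond=None)    # expansion coefficients
fit = B @ coeffs
print("expansion coefficients:", np.round(coeffs, 3))
print("max fit error:", np.max(np.abs(fit - target)))
```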

Relevance:

20.00%

Publisher:

Abstract:

This paper performs an empirical decomposition of international inequality in Ecological Footprint in order to quantify the extent to which explanatory variables such as a country's affluence, economic structure, demographic characteristics, climate and technology contributed to international differences in natural resource consumption during the period 1993-2007. We use a Regression-Based Inequality Decomposition approach. As a result, the methodology qualitatively extends the results obtained in standard environmental impact regressions, as it encompasses further social dimensions of the Sustainable Development concept, i.e. equity within generations. The results obtained point to prioritizing policies that take into account both future and present generations.
Keywords: Ecological Footprint Inequality, Regression-Based Inequality Decomposition, Intragenerational equity, Sustainable development.

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the problem of spatial data mapping. A new method based on wavelet interpolation and geostatistical prediction (kriging) is proposed. The method, wavelet analysis residual kriging (WARK), is developed in order to address the problems arising for highly variable data in the presence of spatial trends, cases in which stationary prediction models have very limited applicability. Wavelet analysis is used to model large-scale structures, and kriging of the remaining residuals focuses on small-scale peculiarities. WARK is able to model spatial patterns that feature a multiscale structure. In the present work WARK is applied to rainfall data and the validation results are compared with the ones obtained from neural network residual kriging (NNRK). NNRK is also a residual-based method, which uses an artificial neural network to model large-scale non-linear trends. The comparison of the results demonstrates the high-quality performance of WARK in predicting hot spots and reproducing the global statistical characteristics of the distribution and the spatial correlation structure.
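
A one-dimensional sketch of the WARK idea (the paper works with two-dimensional rainfall fields) is given below, assuming PyWavelets: the large-scale trend is taken from a coarse wavelet approximation and the residuals are interpolated by ordinary kriging with an exponential variogram. Data, variogram parameters and the wavelet filter are illustrative.

```python
# Sketch, assuming PyWavelets: wavelet trend plus ordinary kriging of residuals
# along a 1D transect. Data, variogram and wavelet choices are illustrative.
import numpy as np
import pywt

rng = np.random.default_rng(6)
x = np.arange(64, dtype=float)                         # station coordinates
field = 10.0 + 5.0 * np.sin(x / 10.0) + rng.normal(0.0, 1.0, x.size)  # observations

# large-scale trend: keep only the coarse wavelet approximation
coeffs = pywt.wavedec(field, "db4", level=3)
coeffs_trend = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(coeffs_trend, "db4")[: x.size]
resid = field - trend

def gamma(h, sill=1.0, corr_range=8.0):
    """Exponential variogram."""
    return sill * (1.0 - np.exp(-np.abs(h) / corr_range))

def ordinary_kriging(xs, zs, x0):
    """Ordinary kriging prediction of the residual field at location x0."""
    n = xs.size
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(xs[:, None] - xs[None, :])
    A[n, n] = 0.0
    b = np.append(gamma(xs - x0), 1.0)
    weights = np.linalg.solve(A, b)[:n]
    return weights @ zs

x0 = 20.5                                              # unsampled location
pred = np.interp(x0, x, trend) + ordinary_kriging(x, resid, x0)
print(f"WARK-style prediction at x = {x0}: {pred:.2f}")
```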