968 results for Polygon decomposition


Relevance: 10.00%

Abstract:

Brazilian studies of entomological succession patterns in carcasses have been used to describe the necrophagous entomofauna of a given municipality or region for forensic purposes. With the same objectives, an ecological study of 10 calyptrate dipteran species was carried out during the winter of 2007 and the summer of 2008 in the metropolitan region of the municipality of Rio de Janeiro. The aim of this study was to describe several aspects of the phenology of these species in three neighbouring areas. Carcasses of three domestic pigs (Sus scrofa L.) were used in each season as models for forensic and legal medicine investigations in the region. Temperature, relative humidity and rainfall were measured daily, and their relations with the population abundance of the colonising species and with the decomposition stages were analysed. Ten fly species were recorded colonising the carcasses: five belonging to the family Calliphoridae, three to the Muscidae, one to the Fanniidae and one to the Sarcophagidae. The data show preferences of these species for climatic season and decomposition stage, as well as for the study area, and suggest that short distances can significantly influence the abundance of some species.

Relevance: 10.00%

Abstract:

In several computer graphics areas, a refinement criterion is often needed to decide whether to go on or to stop sampling a signal. When the sampled values are homogeneous enough, we assume that they represent the signal fairly well and we do not need further refinement; otherwise more samples are required, possibly with adaptive subdivision of the domain. For this purpose, a criterion which is very sensitive to variability is necessary. In this paper, we present a family of discrimination measures, the f-divergences, meeting this requirement. These convex functions have been well studied and successfully applied to image processing and several areas of engineering. Two applications to global illumination are shown: oracles for hierarchical radiosity and criteria for adaptive refinement in ray-tracing. We obtain significantly better results than with classic criteria, showing that f-divergences are worth further investigation in computer graphics. Also, a discrimination measure based on the entropy of the samples for refinement in ray-tracing is introduced. The recursive decomposition of entropy provides us with a natural method to deal with the adaptive subdivision of the sampling region.
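
As a minimal illustration of such a criterion (not the paper's oracle; the threshold is a hypothetical tuning parameter), the sketch below flags a sampling region for subdivision when the squared Hellinger distance, one member of the f-divergence family, between the sample distribution and the uniform (perfectly homogeneous) case is large:

```python
# Sketch of an f-divergence refinement test: homogeneous samples give a
# distribution close to uniform (small divergence, no refinement needed);
# variable samples give a large divergence, triggering subdivision.
import numpy as np

def needs_refinement(samples, threshold=0.05):
    s = np.asarray(samples, dtype=float)
    p = s / s.sum()                      # discrete distribution of the samples
    u = np.full_like(p, 1.0 / p.size)    # uniform reference (homogeneous case)
    hellinger2 = 0.5 * np.sum((np.sqrt(p) - np.sqrt(u)) ** 2)
    return hellinger2 > threshold

print(needs_refinement([0.9, 1.0, 1.1, 1.0]))   # homogeneous -> False
print(needs_refinement([0.1, 2.5, 0.2, 3.0]))   # highly variable -> True
```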

Relevance: 10.00%

Abstract:

This paper surveys control architectures proposed in the literature and describes a control architecture being developed for a semi-autonomous underwater vehicle for intervention missions (SAUVIM) at the University of Hawaii. Conceived as a hybrid, this architecture is organized in three layers: planning, control and execution. The mission is planned as a sequence of subgoals. Each subgoal has an associated task supervisor responsible for arranging a set of pre-programmed task modules in order to achieve the subgoal. Task modules are the key concept of the architecture: they are the main building blocks and can be dynamically re-arranged by the task supervisor. In our architecture, deliberation takes place at the planning layer, while reaction is handled through the parallel execution of the task modules. Hence, the system presents both a hierarchical and a heterarchical decomposition, showing a predictable response while keeping rapid reactivity to the dynamic environment.
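
A minimal sketch of the layered idea, with hypothetical class names (this is not SAUVIM code): task modules as pre-programmed building blocks, one task supervisor per subgoal, and the mission as a sequence of subgoals.

```python
# Hypothetical sketch of the planning/control/execution split described above.
from typing import Callable, List

class TaskModule:
    """A pre-programmed building block; run() reports success or failure."""
    def __init__(self, name: str, action: Callable[[], bool]):
        self.name, self.action = name, action
    def run(self) -> bool:
        return self.action()

class TaskSupervisor:
    """Arranges a set of task modules in order to achieve one subgoal."""
    def __init__(self, subgoal: str, modules: List[TaskModule]):
        self.subgoal, self.modules = subgoal, modules
    def execute(self) -> bool:
        # Modules could be re-arranged dynamically here; this sketch simply
        # runs them in sequence and reports overall success.
        return all(module.run() for module in self.modules)

# Mission plan: a sequence of subgoals, one supervisor per subgoal.
mission = [
    TaskSupervisor("reach worksite", [TaskModule("dive", lambda: True),
                                      TaskModule("navigate", lambda: True)]),
    TaskSupervisor("inspect valve", [TaskModule("station-keep", lambda: True)]),
]
print(all(sup.execute() for sup in mission))  # True if every subgoal succeeds
```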

Relevance: 10.00%

Abstract:

Evolution of compositions in time, space, temperature or other covariates is frequent in practice. For instance, the radioactive decomposition of a sample changes its composition with time. Some of the involved isotopes decompose into other isotopes of the sample, thus producing a transfer of mass from some components to others, but preserving the total mass present in the system. This evolution is traditionally modelled as a system of ordinary differential equations on the mass of each component. However, this kind of evolution can be decomposed into a compositional change, expressed in terms of simplicial derivatives, and a mass evolution (constant in this example). A first result is that the simplicial system of differential equations is non-linear, despite some subcompositions behaving linearly. The goal is to study the characteristics of such simplicial systems of differential equations, such as linearity and stability. This is performed by extracting the compositional differential equations from the mass equations. Then, simplicial derivatives are expressed in coordinates of the simplex, thus reducing the problem to the standard theory of systems of differential equations, including stability. The characterisation of stability of these non-linear systems relies on the linearisation of the system of differential equations at the stationary point, if any. The eigenvalues of the linearised matrix and the associated behaviour of the orbits are the main tools. For a three-component system, these orbits can be plotted both in coordinates of the simplex and in a ternary diagram. A characterisation of processes with transfer of mass in closed systems in terms of stability is thus obtained. Two examples are presented for illustration; one of them is a radioactive decay.
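
For a concrete (hypothetical) instance of the linearisation step, consider a closed two-component system with mass transfer in both directions, A ⇌ B, with rates k1 and k2. In the single ilr coordinate z = ln(xA/xB)/√2 the compositional system is one-dimensional, and stability at the stationary point follows from the sign of the linearised derivative:

```python
# Stability of a compositional ODE via linearisation at its stationary point.
import numpy as np

k1, k2 = 0.8, 0.2  # hypothetical forward/backward transfer rates

def dz_dt(z):
    # Mass transfer mA' = -k1*mA + k2*mB (total mass preserved) induces,
    # with r = xA/xB = exp(sqrt(2)*z), this nonlinear compositional ODE:
    r = np.exp(np.sqrt(2.0) * z)
    return (-k1 + k2 / r - k1 * r + k2) / np.sqrt(2.0)

z_star = np.log(k2 / k1) / np.sqrt(2.0)   # stationary point: r* = k2/k1

# Linearise at z*: the single eigenvalue is d(dz/dt)/dz, approximated by
# central differences; a negative value means the stationary composition
# is stable (here it equals -(k1*r* + k2/r*) = -1.0).
h = 1e-6
eigenvalue = (dz_dt(z_star + h) - dz_dt(z_star - h)) / (2.0 * h)
print(z_star, dz_dt(z_star), eigenvalue)
```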

Relevance: 10.00%

Abstract:

Scarcities of environmental services are no longer merely a remote hypothesis. Consequently, the analysis of their inequalities between nations becomes of paramount importance for the achievement of sustainability, in terms either of international policy or of universalist ethical principles of equity. This paper aims, on the one hand, at revising methodological aspects of inequality measurement for certain environmental data and, on the other, at extending the scarce empirical evidence on the international distribution of the Ecological Footprint (EF) by using a longer EF time series. Most of the techniques currently prominent in the literature are reviewed and then tested on EF data, with interesting results. We look in depth at Lorenz dominance analyses and consider the underlying properties of different inequality indices. The indices which fit best with environmental inequality measurement are CV² and GE(2) because of their neutrality property; however, a trade-off may occur when subgroup decompositions are performed. A weighting factor decomposition method is proposed in order to isolate weighting factor changes in inequality growth rates. Finally, the only non-ambiguous way of decomposing inequality by source is the natural decomposition of CV², which additionally allows the interpretation of marginal term contributions. Empirically, this paper contributes to the measurement of EF inequality: this inequality has been quite stable, and its change over time is due to per capita vector changes rather than population changes. Almost the entirety of the EF inequality is explainable by differences in the means between the World Bank country groups. This finding suggests that international environmental agreements should be attempted on a regional basis in an attempt to achieve greater consensus between the parties involved. Additionally, source decomposition warns of the dangers of confining CO2 emissions reduction to crop-based energies because of the implications for basic needs satisfaction. Keywords: ecological footprint; ecological inequality measurement; inequality decomposition.
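
A minimal sketch of the natural decomposition of CV² by source, on synthetic data (the per-capita EF sources below are hypothetical): each source k contributes cov(y_k, y)/ȳ², and because var(y) = Σ_k cov(y_k, y), the contributions sum exactly to the CV² of the total.

```python
# Natural decomposition of the squared coefficient of variation by source.
import numpy as np

def cv2_source_decomposition(sources):
    """sources: (n_countries, n_sources) array; the total is the row sum."""
    total = sources.sum(axis=1)
    mu = total.mean()
    contrib = np.array([np.cov(sources[:, k], total, bias=True)[0, 1]
                        for k in range(sources.shape[1])]) / mu**2
    return contrib  # marginal contributions; they sum to CV^2 of the total

rng = np.random.default_rng(0)   # hypothetical per-capita EF by land type
data = rng.gamma(shape=2.0, scale=1.0, size=(50, 3))
c = cv2_source_decomposition(data)
total = data.sum(axis=1)
print(c, c.sum(), np.var(total) / total.mean()**2)  # last two values agree
```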

Relevance: 10.00%

Abstract:

Recently, White (2007) analysed international inequalities in Ecological Footprints per capita (EF hereafter) based on a two-factor decomposition of an index from the Atkinson family (Atkinson, 1970). Specifically, that paper evaluated the separate roles of environmental intensity (EF/GDP) and average income as explanatory factors for these global inequalities. However, besides other issues with its appeal, this decomposition suffers from the serious limitation of omitting the role exerted by probable factorial correlation (York et al., 2005). This paper proposes, as an alternative, a decomposition of a conceptually similar index, Theil's (Theil, 1967), which permits a clear decomposition in terms of the roles of both factors plus an inter-factor correlation, in line with Duro and Padilla (2006). This decomposition can, in turn, be extended to group inequality components (Shorrocks, 1980), an analysis that cannot be conducted with the Atkinson indices. The proposed methodology is implemented empirically with the aim of analysing international inequalities in EF per capita for the 1980-2007 period. Amongst other results, we find that the interactive component explains, to a significant extent, the apparent pattern of stability observed in overall international inequalities. Keywords: ecological footprint; international environmental distribution; inequality decomposition.
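
As an illustration of the two-factor logic (a sketch, not the paper's exact index: it uses the unweighted mean logarithmic deviation, and the data are synthetic), the inter-factor term appears as the residual after subtracting the two factor inequalities from the total:

```python
# Two-factor decomposition of a Theil-type index for EF per capita,
# written as intensity (EF/GDP) times average income (GDP per capita).
import numpy as np

def theil_L(x):
    """Mean logarithmic deviation (Theil's second measure), unweighted."""
    return np.log(x.mean()) - np.log(x).mean()

rng = np.random.default_rng(1)   # hypothetical country-level data
intensity = rng.lognormal(mean=-1.0, sigma=0.4, size=120)  # EF/GDP
income = rng.lognormal(mean=2.0, sigma=0.6, size=120)      # GDP per capita
ef_pc = intensity * income

t_total = theil_L(ef_pc)
t_int, t_inc = theil_L(intensity), theil_L(income)
interaction = t_total - t_int - t_inc   # inter-factor correlation term
print(t_total, t_int, t_inc, interaction)
```

For independent factors the interaction term is close to zero; correlated factors push it away from zero, which is exactly the component the Atkinson-based decomposition omits.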

Relevance: 10.00%

Abstract:

Spatial data on species distributions are available in two main forms: point locations and distribution maps (polygon ranges and grids). The former are often too temporally and spatially biased, and too discontinuous, to be useful (untransformed) in spatial analyses. A variety of modelling approaches are used to transform point locations into maps. We discuss the attributes that point location data and distribution maps must satisfy in order to be useful in conservation planning. We recommend that, before point location data are used to produce and/or evaluate distribution models, the dataset should be assessed against a set of criteria, including sample size, age of data, environmental/geographical coverage, independence, accuracy, time relevance and (often forgotten) representation of areas of permanent and natural presence of the species. Distribution maps must satisfy additional attributes if used for conservation analyses and strategies, including minimizing commission and omission errors, credibility of the source/assessors and availability for public screening. We review currently available global databases for mammals and show that they vary greatly in complying with these attributes. The heterogeneity and weakness of spatial data seriously constrain their utility for global and sub-global scale conservation analyses.

Relevance: 10.00%

Abstract:

In this paper, we study the average inter-crossing number between two random walks and two random polygons in three-dimensional space. The random walks and polygons considered here are the so-called equilateral random walks and polygons, in which each segment of the walk or polygon is of unit length. We show that the mean average inter-crossing number (ICN) between two equilateral random walks of the same length n is approximately linear in n, and we were able to determine the prefactor of the linear term, a = (3 ln 2)/8 ≈ 0.2599. In the case of two random polygons of length n, the mean average ICN is also linear, but the prefactor of the linear term is different from that of the random walks. These approximations apply when the starting points of the random walks or polygons are a distance p apart and p is small compared to n. We propose a fitting model to capture the theoretical asymptotic behaviour of the mean average ICN for large values of p. Our simulation results show that the model in fact works very well for the entire range of p. We also study the mean average ICN between two equilateral random walks or polygons of different lengths. An interesting result is that even if one random walk (polygon) has a fixed length, the mean average ICN between the two random walks (polygons) still approaches infinity as the length of the other random walk (polygon) approaches infinity. The data provided by our simulations match our theoretical predictions very well.
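
A small sketch of the simulation setup (generation only; computing the ICN itself, which averages crossings over projection directions, is beyond this snippet): an equilateral random walk takes unit-length steps with directions drawn uniformly on the sphere, and the leading-order prediction for the mean average ICN is a·n with a = (3 ln 2)/8.

```python
# Generate an equilateral random walk and print the predicted leading term.
import numpy as np

def equilateral_walk(n, rng):
    """Return an (n+1) x 3 array of vertices; every segment has unit length."""
    z = rng.uniform(-1.0, 1.0, n)            # uniform directions on the sphere
    phi = rng.uniform(0.0, 2 * np.pi, n)
    r = np.sqrt(1.0 - z**2)
    steps = np.column_stack((r * np.cos(phi), r * np.sin(phi), z))
    return np.vstack((np.zeros(3), np.cumsum(steps, axis=0)))

rng = np.random.default_rng(42)
w = equilateral_walk(100, rng)
seg = np.linalg.norm(np.diff(w, axis=0), axis=1)
print(np.allclose(seg, 1.0))                 # confirms unit-length segments
a = 3 * np.log(2) / 8
print(a, a * 100)                            # predicted leading term for n=100
```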

Relevance: 10.00%

Abstract:

This work analyzes whether the relationship between risk and returns predicted by the Capital Asset Pricing Model (CAPM) holds in the Brazilian stock market. The analysis is based on discrete wavelet decomposition across different time scales. This technique makes it possible to analyze the relationship over different time horizons, from the short term (2 to 4 days) up to the long term (64 to 128 days). The results indicate a negative or null relationship between systematic risk and returns for Brazil from 2004 to 2007. Since the average excess return of the market portfolio over a risk-free asset during that period was positive, this relationship would be expected to be positive; that is, higher systematic risk should result in higher excess returns, which did not occur. Therefore, during that period, appropriate compensation for systematic risk was not observed in the Brazilian market. The scales that proved most significant for the risk-return relation were the first three, which correspond to short-term horizons. When the data are treated year by year, thereby separating positive and negative premiums, the risk-return relation predicted by the CAPM shows some relevance in certain years; however, this pattern did not persist throughout the years. Therefore, there is no evidence strong enough to confirm that asset pricing follows the model.
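
A minimal sketch of the scale-by-scale beta estimation (assuming the PyWavelets package is available; the paper's exact wavelet, filter and data differ): decompose the asset and market return series, then estimate a beta from the detail coefficients of each level.

```python
# Scale-by-scale CAPM betas from a discrete wavelet decomposition.
import numpy as np
import pywt

def wavelet_betas(asset, market, wavelet="db4", level=6):
    da = pywt.wavedec(asset, wavelet, level=level)
    dm = pywt.wavedec(market, wavelet, level=level)
    betas = []
    for a, m in zip(da[1:], dm[1:]):   # skip the approximation coefficients
        betas.append(np.cov(a, m, bias=True)[0, 1] / np.var(m))
    return betas[::-1]                 # finest scale (2-4 days) first

rng = np.random.default_rng(0)         # hypothetical daily excess returns
market = rng.normal(0.0, 0.01, 1024)
asset = 0.8 * market + rng.normal(0.0, 0.01, 1024)
print(wavelet_betas(asset, market))    # values should hover around 0.8
```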

Relevance: 10.00%

Abstract:

A conceptually new approach is introduced for the decomposition of the molecular energy, calculated at the density functional level of theory, into a sum of one- and two-atomic energy components, and is realized in the "fuzzy atoms" framework. (Fuzzy atoms mean that the three-dimensional physical space is divided into atomic regions having no sharp boundaries but exhibiting a continuous transition from one to another.) The new scheme uses the new concept of "bond order density" to calculate the diatomic exchange energy components and gives values unexpectedly close to those calculated by the exact (Hartree-Fock) exchange for the same Kohn-Sham orbitals.
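
To make the "fuzzy atoms" idea concrete, here is a toy partition of space (a simplified stand-in: real schemes use Becke-type switching functions, not the Gaussian weights below): every point receives nonnegative atomic weights that sum to one and vary smoothly between atomic regions.

```python
# Toy fuzzy-atom weights: smooth, normalized, no sharp cell boundaries.
import numpy as np

atoms = np.array([[0.0, 0.0, 0.0],     # hypothetical two-atom geometry
                  [1.4, 0.0, 0.0]])

def fuzzy_weights(r, centers, width=1.0):
    d2 = ((r - centers) ** 2).sum(axis=1)
    g = np.exp(-d2 / width**2)          # smooth, center-peaked cell functions
    return g / g.sum()                  # normalize: weights sum to 1 everywhere

print(fuzzy_weights(np.array([0.7, 0.0, 0.0]), atoms))  # midpoint: ~[0.5, 0.5]
print(fuzzy_weights(np.array([0.1, 0.0, 0.0]), atoms))  # dominated by atom 0
```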

Relevance: 10.00%

Abstract:

In this paper, we characterize the non-emptiness of the equity core (Selten, 1978) and provide an easy-to-implement method for computing the Lorenz-maximal allocations in the equal division core (Dutta and Ray, 1991). Both results are based on a geometrical decomposition of the equity core as a finite union of polyhedra. Keywords: cooperative game; equity core; equal division core; Lorenz domination. JEL classification: C71.
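
A brute-force sketch of the equal division core membership test on a toy game (hypothetical worths; not the paper's polyhedral method): an allocation is excluded if some coalition could give each of its members more than their current payoff by splitting its worth equally.

```python
# Equal division core membership test for a small TU-game.
from itertools import combinations

def in_equal_division_core(x, v, n):
    """x: allocation; v: dict frozenset(coalition) -> worth; n: players."""
    players = range(n)
    for k in range(1, n + 1):
        for S in combinations(players, k):
            share = v[frozenset(S)] / k
            if all(share > x[i] for i in S):   # S blocks via equal division
                return False
    return True

# Hypothetical 3-player game
v = {frozenset(s): w for s, w in [((0,), 0), ((1,), 0), ((2,), 0),
                                  ((0, 1), 4), ((0, 2), 4), ((1, 2), 4),
                                  ((0, 1, 2), 9)]}
print(in_equal_division_core((3, 3, 3), v, 3))       # True: no coalition blocks
print(in_equal_division_core((6, 1.5, 1.5), v, 3))   # False: {1,2} blocks with 2 each
```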

Relevance: 10.00%

Abstract:

The Computational Biophysics Group at the Universitat Pompeu Fabra (GRIB-UPF) hosts two unique computational resources dedicated to the execution of large-scale molecular dynamics (MD) simulations: (a) the ACEMD molecular-dynamics software, used on standard personal computers with graphics processing units (GPUs); and (b) the GPUGRID.net computing network, supported by users distributed worldwide who volunteer GPUs for biomedical research. We leveraged these resources and developed studies, protocols and open-source software to elucidate the energetics and pathways of a number of biomolecular systems, with a special focus on flexible proteins with many degrees of freedom. First, we characterized ion permeation through the bactericidal model protein Gramicidin A, conducting one of the largest studies to date with the steered MD biasing methodology. Next, we addressed an open problem in structural biology, the determination of drug-protein association kinetics; we reconstructed the binding free energy and the association and dissociation rates of a drug-like model system through a spatial decomposition and a Markov-chain analysis. The work was published in the Proceedings of the National Academy of Sciences and became one of the few landmark papers elucidating a ligand-binding pathway. Furthermore, we investigated the unstructured Kinase Inducible Domain (KID), a 28-residue peptide central to signalling and transcriptional response; the kinetics of this challenging system were modelled with a Markovian approach in collaboration with Frank Noé's group at the Freie Universität Berlin. The impact of the funding includes three peer-reviewed publications in high-impact journals; three more papers under review; four MD analysis components released as open-source software; MD protocols; didactic material; and code for the hosting group.
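
A minimal sketch of the Markov-chain step used for kinetics (synthetic state sequence; not the group's software): estimate a transition probability matrix from a discretized trajectory, then read implied timescales off its eigenvalues.

```python
# Markov-chain analysis of a discretized (toy) MD trajectory.
import numpy as np

def transition_matrix(dtraj, n_states, lag=1):
    C = np.zeros((n_states, n_states))
    for a, b in zip(dtraj[:-lag], dtraj[lag:]):
        C[a, b] += 1.0                        # count transitions at the lag time
    return C / C.sum(axis=1, keepdims=True)   # row-normalize to probabilities

lag = 10
dtraj = np.random.default_rng(3).integers(0, 3, 5000)  # hypothetical state labels
T = transition_matrix(dtraj, 3, lag=lag)
eigvals = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]  # leading eigenvalue is 1
timescales = -lag / np.log(eigvals[1:])       # implied timescales, in lag units
print(T)
print(timescales)
```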

Relevance: 10.00%

Abstract:

To date, state-of-the-art estimates of seismic material parameters from multi-component sea-bed seismic data are based on the assumption that the sea-bed is a fully elastic half-space. In reality, however, the shallow sea-bed generally consists of soft, unconsolidated sediments characterized by strong to very strong seismic attenuation. To explore the potential implications, we apply a state-of-the-art elastic decomposition algorithm to synthetic data for a range of canonical sea-bed models consisting of a viscoelastic half-space of varying attenuation. We find that in the presence of strong seismic attenuation, as quantified by Q-values of 10 or less, significant errors arise in the conventional elastic estimation of seismic properties. Tests on synthetic data indicate that these errors can be largely avoided by accounting for the inherent attenuation of the seafloor when estimating the seismic parameters. This can be achieved by replacing the real-valued expressions for the elastic moduli in the governing equations of the parameter estimation with their complex-valued viscoelastic equivalents. The practical application of our parameter estimation procedure yields realistic estimates of the elastic seismic material properties of the shallow sea-bed, while the corresponding Q-estimates seem to be biased towards values that are too low, particularly for S-waves. Given that the estimation of inelastic material parameters is notoriously difficult, particularly in the immediate vicinity of the sea-bed, this is expected to be of interest and importance for civil and ocean engineering purposes.
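
A small sketch of the substitution described above (hypothetical soft-sediment values; the first-order relation M* ≈ M(1 + i/Q) is a common approximation): a real modulus built from density and velocity is replaced by its complex-valued viscoelastic equivalent.

```python
# Replace a real elastic modulus by its complex viscoelastic equivalent.
import numpy as np

def complex_modulus(rho, v, Q):
    M = rho * v**2                  # real elastic modulus from density, velocity
    return M * (1.0 + 1j / Q)       # first-order viscoelastic equivalent

# Hypothetical values: density 1600 kg/m^3, S-wave speed 200 m/s, Q = 10
mu = complex_modulus(1600.0, 200.0, 10.0)
v_complex = np.sqrt(mu / 1600.0)    # resulting attenuative (complex) wave speed
print(mu, v_complex)
```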