65 results for GENERATION MEANS ANALYSIS

at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain


Relevance:

40.00%

Publisher:

Abstract:

HEMOLIA (a project under the European Community's 7th Framework Programme) is a new-generation Anti-Money Laundering (AML) intelligent multi-agent alert and investigation system which, in addition to traditional financial data, makes extensive use of modern society's huge telecom data sources, thereby opening up a new dimension of capabilities to all money-laundering fighters (FIUs, LEAs) and financial institutions (banks, insurance companies, etc.). This Master's thesis project was carried out at AIA, one of the partners of the HEMOLIA project, in Barcelona. The objective of the thesis is to find the clusters in a network built from the financial data. An extensive literature survey has been carried out, and several standard network algorithms have been studied and implemented. The clustering problem is NP-hard, and algorithms such as K-Means and hierarchical clustering have been applied to problems in sociology, evolution, anthropology, etc. However, these algorithms have certain drawbacks which make them very difficult to implement. The thesis suggests (a) a possible improvement to the K-Means algorithm, (b) a novel approach to the clustering problem using genetic algorithms, and (c) a new algorithm for finding the cluster of a node using a genetic algorithm.
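
As a point of reference for the algorithms discussed above, the following is a minimal sketch of standard K-Means clustering on toy 2-D data; it is not the thesis' proposed improvement nor its genetic-algorithm variants, and all data are randomly generated placeholders.

```python
import numpy as np

# Minimal standard K-Means sketch on invented 2-D points (three blobs).
rng = np.random.default_rng(42)
points = np.vstack([rng.normal(loc, 0.5, size=(50, 2)) for loc in ([0, 0], [4, 4], [0, 5])])

def kmeans(X, k, n_iter=100):
    centers = X[rng.choice(len(X), size=k, replace=False)]   # random initial centroids
    for _ in range(n_iter):
        # assign each point to its nearest centroid
        labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
        new_centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                                else centers[j] for j in range(k)])
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers

labels, centers = kmeans(points, k=3)
print("cluster sizes:", np.bincount(labels))
print("centroids:\n", np.round(centers, 2))
```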

Relevance:

40.00%

Publisher:

Abstract:

A procedure based on quantum molecular similarity measures (QMSM) has been used to compare electron densities obtained from conventional ab initio and density functional methodologies at their respective optimized geometries. This method has been applied to a series of small molecules which have experimentally known properties and molecular bonds of diverse degrees of ionicity and covalency. The results show that in most cases the electron densities obtained from density functional methodologies are of a quality similar to that of post-Hartree-Fock generalized densities. For molecules where the Hartree-Fock methodology yields erroneous results, the density functional methodology is shown to usually yield more accurate densities than those provided by second-order Møller-Plesset perturbation theory.
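
To make the notion of a similarity measure between electron densities concrete, here is a small sketch of a Carbó-type similarity index computed for two toy one-dimensional Gaussian "densities"; the grid, the Gaussians and the specific index are illustrative assumptions, not the actual QMSM procedure used in the study.

```python
import numpy as np

# Toy Carbó-type similarity index between two 1-D "electron densities".
x = np.linspace(-10.0, 10.0, 2001)
dx = x[1] - x[0]

def integrate(f):
    """Simple Riemann-sum approximation of the integral over the grid."""
    return float(np.sum(f) * dx)

def gaussian_density(center, width):
    rho = np.exp(-((x - center) ** 2) / (2.0 * width ** 2))
    return rho / integrate(rho)            # normalize to unit "charge"

rho_a = gaussian_density(0.0, 1.0)          # toy stand-in for one method's density
rho_b = gaussian_density(0.3, 1.1)          # toy stand-in for another method's density

def overlap(p, q):
    """Overlap-like similarity measure Z_pq = integral of p*q."""
    return integrate(p * q)

# Carbó index: 1.0 for identical densities, smaller for more dissimilar ones.
carbo = overlap(rho_a, rho_b) / np.sqrt(overlap(rho_a, rho_a) * overlap(rho_b, rho_b))
print("Carbó similarity index:", round(carbo, 4))
```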

Relevance:

40.00%

Publisher:

Abstract:

Mitochondrial DNA (mtDNA), a maternally inherited 16.6-kb molecule crucial for energy production, is implicated in numerous human traits and disorders. It has been hypothesized that mutations in the mtDNA may contribute to the complex genetic basis of schizophrenia, given the evidence of maternal inheritance and the presence of schizophrenia symptoms in patients affected by a mitochondrial disorder related to an mtDNA mutation. The present project aims to study the association between mtDNA variants and an increased risk of schizophrenia in a cohort of patients and controls from the same population. The entire mtDNA of 55 schizophrenia patients with an apparent maternal transmission of the disease and of 38 controls was sequenced by Next Generation Sequencing (Ion Torrent PGM, Life Technologies) and compared to the reference sequence. The current method for establishing mtDNA haplotypes is Sanger sequencing, which is laborious, time-consuming and expensive. With the emergence of Next Generation Sequencing technologies, this sequencing can be done much more quickly and cost-efficiently. We have identified 14 variants that have not been previously reported. Two of them were missense variants, MTATP6 p.V113M and MTND5 p.F334L, along with three variants in rRNA genes and one variant in a tRNA gene. No significant differences were found in the number of variants between the two groups. We found that the sequence alignment algorithm employed to align NGS reads played a significant role in the analysis of the data and the resulting mtDNA haplotypes. Further development of the bioinformatics analysis and annotation step would be desirable to facilitate the application of NGS in mtDNA analysis.
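
A minimal sketch of the final comparison step, under the assumption that aligned reads have already been collapsed into a consensus sequence: variants are simply the positions where the consensus differs from the reference. The sequences below are short invented strings, not real mitochondrial data.

```python
# Compare a (hypothetical) consensus sequence against a (hypothetical) reference
# and report mismatches in conventional mtDNA variant notation, e.g. m.9A>C.
reference = "GATTACAGATTACAGATTACA"
consensus = "GATTACAGCTTACAGATTGCA"

def call_variants(ref, cons):
    """Return (1-based position, ref base, alt base) for every mismatch."""
    return [(i + 1, r, c) for i, (r, c) in enumerate(zip(ref, cons)) if r != c]

for pos, ref_base, alt_base in call_variants(reference, consensus):
    print(f"m.{pos}{ref_base}>{alt_base}")
```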

Relevance:

40.00%

Publisher:

Abstract:

With the aim of monitoring the dynamics of the Livingston Island ice cap, the Departament de Geodinàmica i Geofísica of the Universitat de Barcelona began yearly surveys on Johnsons Glacier in the austral summer of 1994-95. During this field campaign 10 shallow ice cores were sampled with a manual vertical ice-core drilling machine. The objectives were: i) to detect the tephra layer accumulated on the glacier surface, attributed to the 1970 Deception Island pyroclastic eruption and today interstratified; ii) to verify whether this layer might serve as a reference level; iii) to measure the 137Cs radio-isotope concentration accumulated in the 1965 snow stratum; iv) to use the isochrone layer as a means of verifying the age of the 1970 tephra layer; and v) to calculate both the equilibrium line of the glacier and the average mass balance over the last 28 years (1965-1993). The stratigraphy of the cores, their cumulative density curves and the isothermal ice temperatures recorded confirm that Johnsons Glacier is a temperate glacier. Wind, solar radiation heating and liquid water are the main agents controlling the vertical and horizontal redistribution of the volcanic and cryoclastic particles that are sedimented and remain interstratified within the glacier. It is because of this redistribution that the 1970 tephra layer does not always serve as a very good reference level. The position of the equilibrium line altitude (ELA) in 1993, obtained by the 137Cs spectrometric analysis, varies from about 200 m a.s.l. to 250 m a.s.l. This indicates a rising trend in the equilibrium line altitude from the beginning of the 1970s to the present day. The varying slope orientation of Johnsons Glacier relative to the prevailing NE wind gives rise to large local differences in snow accumulation, which locally modifies the equilibrium line altitude. In the cores studied, 137Cs appears to be associated with the 1970 tephra layer. This indicates an intense ablation episode throughout the sampled area (at least up to 330 m a.s.l.), which probably occurred synchronously with the 1970 tephra deposition or later. A rough estimate of the specific mass balance reveals a considerable accumulation gradient, with accumulation increasing with altitude.

Relevance:

30.00%

Publisher:

Abstract:

Research project carried out during a stay at the Laboratory of Archaeometry of the National Centre of Scientific Research "Demokritos" in Athens, Greece, between June and September 2006. This study is part of a broader investigation of the technological change documented in the production of Roman-type amphorae during the 1st century BC and the 1st century AD in the coastal territories of Catalonia. One part of the study involves calculating the mechanical properties of these amphorae and evaluating them as a function of amphora typology by means of Finite Element Analysis (FEA). FEA is a numerical approach that originated in the engineering sciences and has been used to estimate the mechanical behaviour of a model in terms, for example, of deformation and stress. An object, or rather its model, is divided into sub-domains called finite elements, to which the mechanical properties of the material under study are assigned. These finite elements are connected in a mesh whose constraints can be defined. When a given force is applied to a model, the behaviour of the object can be estimated by means of the set of linear equations that describe the response of the finite elements, providing a good approximation of the structural deformation. This computer simulation is therefore an important tool for understanding the functionality of archaeological ceramics. The procedure provides a quantitative model for predicting the failure of a ceramic object when it is subjected to different loading conditions. This model has been applied to different amphora typologies. Preliminary results show significant differences between the pre-Roman typology and the Roman typologies, as well as among the Roman amphora designs themselves, with important archaeological implications.
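
To illustrate the kind of computation FEA carries out (assembling element stiffness matrices and solving the resulting linear system K u = f), here is a minimal one-dimensional sketch; the bar geometry, material values and load are invented placeholders and have nothing to do with the actual amphora models.

```python
import numpy as np

# Minimal 1-D finite-element sketch: a bar fixed at one end and pulled at the
# other. Material values (E, A) are hypothetical, not data from the amphora study.
E = 70e9        # Young's modulus [Pa] (placeholder ceramic-like value)
A = 1e-4        # cross-sectional area [m^2]
L = 1.0         # bar length [m]
n_elem = 10
n_nodes = n_elem + 1
h = L / n_elem  # element length

# Assemble the global stiffness matrix from identical 2-node bar elements.
K = np.zeros((n_nodes, n_nodes))
k_e = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_elem):
    K[e:e + 2, e:e + 2] += k_e

# Load vector: axial force applied at the free end.
f = np.zeros(n_nodes)
f[-1] = 500.0   # [N]

# Apply the fixed boundary condition at node 0 and solve K u = f.
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])
print("tip displacement [m]:", u[-1])
```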

Relevance:

30.00%

Publisher:

Abstract:

We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of such dynamics under strongly simplifying assumptions (i.e. only 3 strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax the 'strongly simplifying assumptions' above, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations describe a behavior of the system very close to the one predicted by the replicator dynamics without imposing any of the assumptions of the analytical model. Our main conclusion is that analytical and computational models are good complements for research in the social sciences. Indeed, computational models are extremely useful for extending the scope of the analysis to complex scenarios.
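
A small sketch of the kind of computation involved: discrete-time replicator dynamics over three strategies in a finitely repeated Prisoner's Dilemma. The strategy set (Always Defect, Tit-for-Tat, Always Cooperate), the stage-game payoffs and the number of repetitions are assumptions chosen for illustration and need not match the paper's specification.

```python
import numpy as np

# Stage-game payoffs (temptation, reward, punishment, sucker) and repetitions
# are illustrative assumptions, not the paper's parameters.
T_, R, P, S = 5.0, 3.0, 1.0, 0.0
n_rounds = 10

def repeated_payoff(row, col):
    """Total payoff of `row` against `col` over n_rounds repetitions."""
    pay = 0.0
    my_last, their_last = 'C', 'C'          # TFT starts by cooperating
    for _ in range(n_rounds):
        a = 'D' if row == 'ALLD' else ('C' if row == 'ALLC' else their_last)
        b = 'D' if col == 'ALLD' else ('C' if col == 'ALLC' else my_last)
        pay += {('C', 'C'): R, ('C', 'D'): S,
                ('D', 'C'): T_, ('D', 'D'): P}[(a, b)]
        my_last, their_last = a, b
    return pay

strategies = ['ALLD', 'TFT', 'ALLC']
A = np.array([[repeated_payoff(r, c) for c in strategies] for r in strategies])

# Discrete-time replicator dynamics: shares grow in proportion to fitness.
x = np.array([0.4, 0.3, 0.3])               # initial population shares
for _ in range(200):
    fitness = A @ x
    x = x * fitness / (x @ fitness)

print(dict(zip(strategies, np.round(x, 3))))
```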

Relevance:

30.00%

Publisher:

Abstract:

Capital taxation is currently under debate, basically due to problems of administrative control and proper assessment of the levied assets. We analyze both problems focusing on one capital tax, the annual wealth tax (WT), which is applied in only five OECD countries, Spain being one of them. We concentrate our analysis on the top 1% of the adult population, which permits us to describe the evolution of wealth concentration in Spain over 1983-2001. On average the top 1% holds about 18% of total wealth, which rises to 19% when tax non-compliance and the under-assessment of housing, the main asset, are corrected for. The evolution suggests that wealth concentration has risen. Regarding the WT, we analyze whether it helps to reduce wealth inequality or whether, on the contrary, it reinforces vertical inequity (due to special concessions) and horizontal inequity (due to the de iure and de facto different treatment of assets). We analyze housing and equity shares in detail. By means of a time series analysis, we relate the reported values to reasonable price indicators and proxies of the propensity to save. We infer that net tax compliance is extremely low, where net compliance includes both what we commonly understand by (gross) tax compliance and the degree of under-assessment due to fiscal legislation (for housing). This is especially true for housing, whose level of net tax compliance is well below 50%. Hence, we corroborate the difficulties in taxing capital, and cast doubt on the current role of the WT in Spain in reducing wealth inequality.

Relevance:

30.00%

Publisher:

Abstract:

In this study we propose an application of the MuSIASEM approach to provide an integrated analysis of Laos across different scales. With the term "integrated analysis across scales" we mean the generation of a series of packages of quantitative indicators characterizing the performance of the socioeconomic activities of Laos when considering: (i) different hierarchical levels of organization (farming systems described at the level of households, rural villages, regions of Laos, and the whole country); and (ii) different dimensions of analysis (economic, social, ecological and technical). What is relevant in this application is that the information carried by these different packages of indicators is integrated in a system of accounting which establishes interlinkages across the indicators. This is an essential feature for studying sustainability trade-offs and for building more robust scenarios of possible changes. The multi-scale integrated representation presented in this study is based on secondary data (gathered in a three-year EU project, SEAtrans, and integrated with other available statistical sources) and is linked to GIS when dealing with the spatial representation of Laos. However, even if we use data referring to Laos, the goal of this study is not to provide useful information about a practical policy issue of Laos, but rather to illustrate the possibility of using a multipurpose grammar to produce an integrated set of sustainability indicators at three different levels: (i) local; (ii) meso; and (iii) macro. The technical issue addressed is the simultaneous adoption of two multi-level matrices, one referring to a characterization of human activity over a set of different categories, and another referring to a characterization of land uses over the same set of categories. In this way, it becomes possible to explain the characteristics of Laos (an integrated set of indicators defining the performance of the whole country) in relation to the characteristics of rural Laos and urban Laos. The characteristics of rural Laos can in turn be explained using the characteristics of three regions defined within Laos (Northern Laos, Central Laos and Southern Laos), which can themselves be defined, using an analogous package of indicators, starting from the characteristics of the three main typologies of farming systems found in the regions.
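
A rough sketch of the "two multi-level matrices" idea: a matrix of human activity and a matrix of land use defined over the same categories, aggregated bottom-up so that the same indicators can be computed at regional and national level. All category names and figures below are invented placeholders, not data from the Laos study.

```python
import numpy as np

# Two matrices over the same categories (columns); rows are lower-level units.
categories = ['paddy rice', 'shifting cultivation', 'livestock', 'off-farm']
regions = ['Northern', 'Central', 'Southern']

human_activity = np.array([   # hours of human activity per year (hypothetical)
    [4.0e8, 2.5e8, 1.0e8, 0.8e8],
    [5.5e8, 1.0e8, 1.2e8, 1.5e8],
    [3.0e8, 1.8e8, 0.9e8, 0.6e8],
])
land_use = np.array([         # hectares in use (hypothetical)
    [1.2e5, 3.0e5, 0.8e5, 0.0],
    [2.0e5, 1.1e5, 1.0e5, 0.0],
    [0.9e5, 2.2e5, 0.7e5, 0.0],
])

# Summing across regions gives the national row of each matrix, so the same
# accounting applies at both levels.
national_activity = human_activity.sum(axis=0)
national_land = land_use.sum(axis=0)

# A simple cross-matrix indicator: hours of human activity per hectare,
# computable at any level because both matrices share the same categories.
with np.errstate(divide='ignore', invalid='ignore'):
    intensity = np.where(national_land > 0, national_activity / national_land, np.nan)

for cat, val in zip(categories, intensity):
    print(f"{cat:22s} {val:10.1f} h/ha" if not np.isnan(val) else f"{cat:22s}        n/a")
```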

Relevance:

30.00%

Publisher:

Abstract:

The aim of this paper is the analysis of the Catalan economy using a National Accounting Matrix with Environmental Accounts (NAMEA) for 2001. We will focus on the analysis of the emission multipliers and we will also analyse the impact of a 10% reduction in greenhouse gas emissions on the emission multipliers. This emission-reduction percentage would bring the Catalan economy into compliance with the maximum emissions level allowed by the Kyoto Protocol. We consider three possible scenarios that would allow this goal to be met. First, we will simulate a 10% reduction in regional emissions and a 5% drop in the endogenous income of the multipliers' model (production, factorial and private income). Second, we will simulate a 10% reduction in emissions and a 10% increase in endogenous income. Finally, we will simulate a 10% reduction in emissions and a 5% increase in endogenous income. Additionally, we will analyse the decomposition of the emission multipliers into own effects, open effects and circular effects to capture the different channels of the emission generation process. Keywords: NAMEA, emission multipliers, Kyoto Protocol.
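
For readers unfamiliar with multiplier analysis, here is a toy sketch of how emission multipliers follow from a Leontief model: direct emission coefficients are combined with the Leontief inverse so that each multiplier gives total (direct plus indirect) emissions per unit of final demand. The three-sector figures are invented and the model is a plain input-output simplification, not the NAMEA-based multipliers model of the paper.

```python
import numpy as np

# Hypothetical 3-sector economy.
Z = np.array([[10.,  20.,   5.],        # inter-industry flows
              [15.,  30.,  25.],
              [ 5.,  10.,  20.]])
x = np.array([100., 200., 150.])        # gross output by sector
emissions = np.array([50., 120., 30.])  # CO2-eq emissions by sector (kt)

A = Z / x                               # technical coefficients a_ij = z_ij / x_j
L = np.linalg.inv(np.eye(3) - A)        # Leontief inverse (I - A)^-1
e = emissions / x                       # direct emission coefficients (kt per unit output)

emission_multipliers = e @ L            # total emissions per unit of final demand
print("emission multipliers by sector:", np.round(emission_multipliers, 4))

# A uniform 10% cut in sectoral emissions scales e, and hence the multipliers.
print("after a 10% emission cut:      ", np.round(0.9 * e @ L, 4))
```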

Relevance:

30.00%

Publisher:

Abstract:

Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is to the Spanish service sector, and the economic and environmental data are for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in the emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show a decrease in CO2 emissions due to lower emission coefficients (i.e., emissions per unit of output), compensated by an increase in emissions caused both by the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes. Keywords: structural decomposition analysis, input-output subsystems, CO2 emissions, service sector.
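
As an illustration of structural decomposition analysis in an input-output setting, the following sketch splits the change in total emissions between two years into an emission-intensity effect, a technology effect and a final-demand effect using the average of the two polar decompositions. The two-sector data are invented and the decomposition is the generic textbook form, not the subsystem-specific decomposition proposed in the paper.

```python
import numpy as np

def leontief(A):
    return np.linalg.inv(np.eye(A.shape[0]) - A)

# Year 0 and year 1 data (hypothetical, 2 sectors).
e0 = np.array([0.8, 0.3]);  e1 = np.array([0.6, 0.28])   # emissions per unit output
A0 = np.array([[0.1, 0.2], [0.3, 0.1]])
A1 = np.array([[0.12, 0.18], [0.28, 0.12]])
y0 = np.array([100., 80.]);  y1 = np.array([120., 95.])  # final demand

L0, L1 = leontief(A0), leontief(A1)
total0, total1 = e0 @ L0 @ y0, e1 @ L1 @ y1

# Average of the two "polar" decompositions of total1 - total0.
d_intensity  = 0.5 * ((e1 - e0) @ L0 @ y0 + (e1 - e0) @ L1 @ y1)
d_technology = 0.5 * (e1 @ (L1 - L0) @ y0 + e0 @ (L1 - L0) @ y1)
d_demand     = 0.5 * (e0 @ L0 @ (y1 - y0) + e1 @ L1 @ (y1 - y0))

print("total change:  ", round(total1 - total0, 3))
print("sum of effects:", round(d_intensity + d_technology + d_demand, 3))
```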

Relevance:

30.00%

Publisher:

Abstract:

In the PhD thesis "Sound Texture Modeling" we deal with the statistical modelling of textural sounds such as water, wind, rain, etc., for synthesis and classification. Our initial model is based on a wavelet tree signal decomposition and the modeling of the resulting sequence by means of a parametric probabilistic model that can be situated within the family of models trainable via expectation maximization (a hidden Markov tree model). Our model is able to capture key characteristics of the source textures (water, rain, fire, applause, crowd chatter), and faithfully reproduces some of the sound classes. In terms of a more general taxonomy of natural events proposed by Graver, we worked on models for natural event classification and segmentation. While the event labels comprise physical interactions between materials that do not have textural properties in their entirety, those segmentation models can help in identifying textural portions of an audio recording useful for analysis and resynthesis. Following our work on concatenative synthesis of musical instruments, we have developed a pattern-based synthesis system that allows one to sonically explore a database of units by means of their representation in a perceptual feature space. Concatenative synthesis with "molecules" built from sparse atomic representations also allows capturing low-level correlations in perceptual audio features, while facilitating the manipulation of textural sounds based on their physical and perceptual properties. We have approached the problem of sound texture modelling for synthesis from different directions, namely a low-level signal-theoretic point of view through a wavelet transform, and a more high-level point of view driven by perceptual audio features in the concatenative synthesis setting. The developed framework provides a unified approach to the high-quality resynthesis of natural texture sounds. Our research is embedded within the Metaverse 1 European project (2008-2011), where our models are contributing as low-level building blocks within a semi-automated soundscape generation system.
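
A small sketch of the unit-selection step behind concatenative synthesis: each database unit is represented by a vector of perceptual features and the unit nearest to a target point in (normalized) feature space is chosen. The feature set, the random database and the target trajectory are illustrative assumptions, not the descriptors actually used in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical database of 500 units, each with 3 features:
# loudness, spectral centroid [Hz], noisiness.
units = rng.uniform(low=[0.0, 200.0, 0.0], high=[1.0, 8000.0, 1.0], size=(500, 3))

def select_unit(target, database):
    """Return the index of the database unit nearest to `target` (features scaled to [0, 1])."""
    lo, hi = database.min(axis=0), database.max(axis=0)
    scaled = (database - lo) / (hi - lo)
    t = (np.asarray(target) - lo) / (hi - lo)
    return int(np.argmin(np.linalg.norm(scaled - t, axis=1)))

# A target trajectory (e.g. rising loudness over a constant spectral profile).
trajectory = [(0.2, 3000.0, 0.7), (0.5, 3000.0, 0.7), (0.8, 3000.0, 0.7)]
sequence = [select_unit(t, units) for t in trajectory]
print("selected unit indices:", sequence)
```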

Relevance:

30.00%

Publisher:

Abstract:

This is a survey of the CASE tools that currently support OCL in the automatic generation of Java code. The tools are studied and analysed using a test model consisting of a class diagram of the UML static model and a varied sample of OCL expressions, with the aim of detecting their shortcomings by analysing the generated code and determining whether or not each type of constraint is enforced, and whether it has been correctly implemented in the code.

Relevance:

30.00%

Publisher:

Abstract:

In an earlier investigation (Burger et al., 2000) five sediment cores near the Rodrigues Triple Junction in the Indian Ocean were studied, applying classical statistical methods (fuzzy c-means clustering, linear mixing model, principal component analysis) for the extraction of endmembers and the evaluation of the spatial and temporal variation of geochemical signals. Three main factors of sedimentation were expected by the marine geologists: a volcano-genetic, a hydro-hydrothermal and an ultra-basic factor. The display of fuzzy membership values and/or factor scores versus depth provided consistent results for two factors only; the ultra-basic component could not be identified. The reason for this may be that only traditional statistical methods were applied, i.e. the untransformed components were used together with the cosine-theta coefficient as similarity measure. During the last decade considerable progress in compositional data analysis was made and many case studies were published using new tools for exploratory analysis of these data. Therefore it makes sense to check whether the application of suitable data transformations, reduction of the D-part simplex to two or three factors and visual interpretation of the factor scores would lead to a revision of earlier results and to answers to open questions. In this paper we follow the lines of the paper of R. Tolosana-Delgado et al. (2005), starting with a problem-oriented interpretation of the biplot scattergram, extracting compositional factors, ilr-transforming the components and visualizing the factor scores in a spatial context: the compositional factors will be plotted versus depth (time) of the core samples in order to facilitate the identification of the expected sources of the sedimentary process. Key words: compositional data analysis, biplot, deep-sea sediments
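
Since the key step is the log-ratio transformation of the components, here is a minimal sketch of the ilr (isometric log-ratio) transform of a single composition; the 4-part "geochemical" composition and the particular orthonormal basis are illustrative choices, not the ones used on the actual core data.

```python
import numpy as np

def closure(x):
    """Rescale positive parts so they sum to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum(axis=-1, keepdims=True)

def ilr_basis(D):
    """Helmert-type orthonormal basis of the (D-1)-dimensional ilr space."""
    V = np.zeros((D, D - 1))
    for i in range(1, D):
        V[:i, i - 1] = 1.0 / i
        V[i, i - 1] = -1.0
        V[:, i - 1] *= np.sqrt(i / (i + 1.0))
    return V

def ilr(x):
    x = closure(x)
    return np.log(x) @ ilr_basis(x.shape[-1])

sample = [55.0, 30.0, 10.0, 5.0]    # e.g. % of four oxides (hypothetical)
print("ilr coordinates:", np.round(ilr(sample), 4))
```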

Relevance:

30.00%

Publisher:

Abstract:

"compositions" is a new R-package for the analysis of compositional and positive data. It contains four classes corresponding to the four different types of compositional and positive geometry (including the Aitchison geometry). It provides means for computation, plotting and high-level multivariate statistical analysis in all four geometries. These geometries are treated in a fully analogous way, based on the principle of working in coordinates and on the object-oriented programming paradigm of R. In this way, called functions automatically select the most appropriate type of analysis as a function of the geometry. The graphical capabilities include ternary diagrams and tetrahedrons, various compositional plots (boxplots, barplots, piecharts) and extensive graphical tools for principal components. Afterwards, portion and proportion lines, straight lines and ellipses in all geometries can be added to plots. The package is accompanied by a hands-on introduction, documentation for every function, demos of the graphical capabilities and plenty of usage examples. It allows direct and parallel computation in all four vector spaces and provides the beginner with a copy-and-paste style of data analysis, while letting advanced users keep the functionality and customizability they demand of R, as well as all necessary tools to add their own analysis routines. A complete example is included in the appendix.
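
To show the geometry the package is built on, here is a plain-Python sketch of the basic Aitchison operations mentioned above (perturbation, power transformation and the Aitchison distance); it illustrates the mathematics only and is not the R API of the "compositions" package.

```python
import numpy as np

def closure(x):
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def perturb(x, y):
    """Aitchison perturbation (the 'addition'): componentwise product, re-closed."""
    return closure(np.asarray(x, float) * np.asarray(y, float))

def power(x, a):
    """Aitchison power transformation (the 'scalar multiplication')."""
    return closure(np.asarray(x, float) ** a)

def aitchison_dist(x, y):
    """Aitchison distance: Euclidean distance between clr coordinates."""
    clr = lambda z: np.log(closure(z)) - np.log(closure(z)).mean()
    return float(np.linalg.norm(clr(x) - clr(y)))

x = [0.1, 0.3, 0.6]
y = [0.2, 0.2, 0.6]
print("perturbation:      ", np.round(perturb(x, y), 4))
print("power (a=2):       ", np.round(power(x, 2.0), 4))
print("Aitchison distance:", round(aitchison_dist(x, y), 4))
```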

Relevance:

30.00%

Publisher:

Abstract:

R, from http://www.r-project.org/, is 'GNU S', a language and environment for statistical computing and graphics. It is an environment in which many classical and modern statistical techniques have been implemented, though many are supplied as packages. There are 8 standard packages and many more are available through the CRAN family of Internet sites, http://cran.r-project.org. We started to develop a library of functions in R to support the analysis of mixtures, and our goal is a MixeR package for compositional data analysis that provides support for: operations on compositions (perturbation and power multiplication, subcomposition with or without residuals, centering of the data, computing Aitchison's, Euclidean and Bhattacharyya distances, compositional Kullback-Leibler divergence, etc.); graphical presentation of compositions in ternary diagrams and tetrahedrons with additional features (barycenter, geometric mean of the data set, percentile lines, marking and coloring of subsets of the data set, their geometric means, annotation of individual data in the set, ...); dealing with zeros and missing values in compositional data sets, with R procedures for simple and multiplicative replacement strategies; and the time series analysis of compositional data. We'll present the current status of MixeR development and illustrate its use on selected data sets.
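
As an example of one of the listed capabilities, here is a sketch of the multiplicative replacement strategy for zeros in a composition: each zero is replaced by a small delta and the remaining parts are scaled down so the composition still sums to one. This is plain Python for illustration, with an invented composition and delta; it is not the MixeR implementation.

```python
import numpy as np

def multiplicative_replacement(x, delta=0.005):
    """Replace zeros by delta and rescale non-zero parts multiplicatively."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()                      # close the composition to sum 1
    zeros = x == 0
    n_zero = zeros.sum()
    return np.where(zeros, delta, x * (1.0 - n_zero * delta))

comp = [0.55, 0.30, 0.0, 0.15]           # made-up 4-part composition with one zero
replaced = multiplicative_replacement(comp)
print("replaced composition:", np.round(replaced, 4), "sum =", replaced.sum())
```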