101 results for Markov Branching-processes


Relevance: 20.00%

Abstract:

The purpose of this work is to present and assess, from the perspective of the participating students, a research-training project launched during the 2003-2004 academic year around the preparation of the final-year dissertation (tesina) at the Vitoria School of Nursing, within the European Nursing Degree programme. It constitutes the starting point of a long-term project, initiated with the intention of developing theoretical principles and practical procedures that allow us to systematize training processes which, centred on research, articulate theory and practice and integrate a communicative and cooperative perspective.

Relevance: 20.00%

Abstract:

This paper argues that women’s absence from peace processes cannot be explained by their alleged lack of experience in dialogue and negotiation, but rather by a serious lack of will to include them in such important initiatives of change. Women have wide-ranging experience in dialogue processes, including many war and post-war contexts, but there has been a deliberate lack of effort to integrate them into formal peace processes. After introducing the research framework, the paper addresses women’s involvement in peace and analyzes the role played by women in peace processes through the cases of Sri Lanka and Northern Ireland. The paper concludes that peace processes are as gendered as wars, and for that reason gender has to be a guiding principle for including women in peace processes.

Relevance: 20.00%

Abstract:

"See the abstract at the beginning of the document in the attached file."

Relevance: 20.00%

Abstract:

The peace process in Northern Ireland demonstrates that new sovereignty formulas need to be explored in order to meet the demands of the populations and territories in conflict. The profound transformation of the classic symbolic elements of the nation-state within the context of the European Union has greatly contributed to the prospects for a resolution of this old conflict. Today’s discussions are focused on the search for instruments of shared sovereignty that are adapted to a complex and plural social reality. This new approach for finding a solution to the Irish conflict is particularly relevant to the Basque debate about formulating creative and modern solutions to similar conflicts over identity and sovereignty. The notion of shared sovereignty implemented in Northern Ireland (a formula for complex interdependent relations) is of significant relevance to the broader international community and is likely to become an increasingly potent and transcendent model for conflict resolution and peace building.

Relevance: 20.00%

Abstract:

In this paper, scales of classes of stochastic processes are introduced. New interpolation theorems and the boundedness of some transforms of stochastic processes are proved. An interpolation method for generously-monotonous processes is introduced. The conditions and statements of the interpolation theorems concern the fixed stochastic process, which differs from the classical results.

Relevance: 20.00%

Abstract:

At the end of 2008, the Convention on Cluster Munitions (CCM), which outlawed almost all types of cluster munitions, was signed. It was the product of the so-called Oslo Process, set up two years earlier in reaction to the failure to add a new protocol banning cluster munitions to the Convention on Certain Conventional Weapons (CCW). The position of the EU in these two processes was ambivalent: on the one hand, it was among the strongest proponents of a new protocol within the CCW; on the other hand, the member states were generally unable to act jointly in the Oslo Process. This working paper argues that the aspect of national security in particular, and the related relationship with the United States, influenced the stances of many member states and complicated the formation of a common European position. Common normative values of the EU were detected, which played a role in the CCW, but they remained secondary to other interests of the member states.

Relevance: 20.00%

Abstract:

Developments in the statistical analysis of compositional data over the last two decades have made possible a much deeper exploration of the nature of variability, and the possible processes associated with compositional data sets from many disciplines. In this paper we concentrate on geochemical data sets. First we explain how hypotheses of compositional variability may be formulated within the natural sample space, the unit simplex, including useful hypotheses of subcompositional discrimination and specific perturbational change. Then we develop, through standard methodology such as generalised likelihood ratio tests, statistical tools to allow the systematic investigation of a complete lattice of such hypotheses. Some of these tests are simple adaptations of existing multivariate tests, but others require special construction. We comment on the use of graphical methods in compositional data analysis and on the ordination of specimens. The recent development of the concept of compositional processes is then explained, together with the necessary tools for a staying-in-the-simplex approach, namely compositional singular value decompositions. All these statistical techniques are illustrated for a substantial compositional data set, consisting of 209 major-oxide and rare-element compositions of metamorphosed limestones from the Northeast and Central Highlands of Scotland. Finally we point out a number of unresolved problems in the statistical analysis of compositional processes.
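The two simplex operations central to this abstract, the centred log-ratio (clr) transform and perturbational change, can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's own code; the three-part composition and the perturbation vector are invented:

```python
import numpy as np

def closure(x):
    """Rescale a positive vector so its parts sum to 1 (a composition)."""
    x = np.asarray(x, dtype=float)
    return x / x.sum()

def clr(x):
    """Centred log-ratio transform: log of the parts minus their mean log."""
    lx = np.log(closure(x))
    return lx - lx.mean()

def perturb(x, p):
    """Perturbation, the simplex analogue of translation: closure of x * p."""
    return closure(closure(x) * closure(p))

rock = closure([50.0, 30.0, 20.0])   # invented three-part composition
change = closure([1.2, 1.0, 0.8])    # invented perturbation vector
altered = perturb(rock, change)      # still a composition: parts sum to 1
```

Perturbing `altered` by the component-wise inverse of `change` recovers `rock`, which is the group structure the "staying-in-the-simplex" approach relies on.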

Relevance: 20.00%

Abstract:

Aitchison and Bacon-Shone (1999) considered convex linear combinations of compositions. In other words, they investigated compositions of compositions, where the mixing composition follows a logistic Normal distribution (or a perturbation process) and the compositions being mixed follow a logistic Normal distribution. In this paper, I investigate the extension to situations where the mixing composition varies with a number of dimensions. Examples would be where the mixing proportions vary with time or distance or a combination of the two. Practical situations include a river where the mixing proportions vary along the river, or across a lake and possibly with a time trend. This is illustrated with a dataset similar to that used in the Aitchison and Bacon-Shone paper, which looked at how pollution in a loch depended on the pollution in the three rivers that feed the loch. Here, I explicitly model the variation in the linear combination across the loch, assuming that the mean of the logistic Normal distribution depends on the river flows and relative distance from the source origins.
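The deterministic core of the model, a convex linear combination of compositions, can be sketched as follows. This is a toy illustration only: the river compositions and mixing proportions are invented, and the paper's actual model puts a logistic Normal distribution on the mixing composition rather than fixing it:

```python
import numpy as np

def closure(x):
    """Rescale rows of positive data so each sums to 1."""
    x = np.asarray(x, dtype=float)
    return x / x.sum(axis=-1, keepdims=True)

def mix(sources, p):
    """Convex linear combination of compositions: z = sum_i p_i * x_i,
    where the mixing proportions p are themselves a composition."""
    return closure(p) @ closure(sources)

# Invented pollutant compositions of three rivers feeding a loch
rivers = [[0.7, 0.2, 0.1],
          [0.3, 0.4, 0.3],
          [0.1, 0.1, 0.8]]

# Mixing proportions at one point of the loch; in the paper these would
# vary with river flows and distance from the source origins
loch_point = mix(rivers, [0.5, 0.3, 0.2])
```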

Relevance: 20.00%

Abstract:

In this work, a new method is proposed for estimating the final product quality of batch processes in real time. This method reduces the time needed to obtain the quality results from laboratory analyses. A principal component analysis (PCA) model, built from historical data under normal operating conditions, is used to discern whether a finished batch is normal or not. A fault signature is computed for abnormal batches and passed through a classification model for estimation. The study proposes a method for using the information from contribution plots based on fault signatures, where the indicators represent the behaviour of the variables throughout the different stages of the process. A data set composed of the fault signatures of historical abnormal batches is built to search for patterns and to train the classification models that estimate the outcomes of future batches. The proposed methodology has been applied to a sequencing batch reactor (SBR). Several classification algorithms are tested to demonstrate the possibilities of the proposed methodology.
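The normal/abnormal screening step can be sketched as PCA on normal-operation data plus a squared-prediction-error (Q statistic) control limit. This is a generic sketch, not the paper's implementation: the data are synthetic and the 99th-percentile limit is an assumed, common choice:

```python
import numpy as np

rng = np.random.default_rng(0)
normal = rng.normal(size=(200, 10))             # historical batches, normal operation
mu = normal.mean(axis=0)

# PCA via SVD of the mean-centred data; keep 3 principal components
_, _, Vt = np.linalg.svd(normal - mu, full_matrices=False)
P = Vt[:3].T                                    # loading matrix, shape (10, 3)

def spe(X):
    """Q statistic (squared prediction error): squared residual after
    projecting a batch onto the PCA model of normal operation."""
    R = (X - mu) - (X - mu) @ P @ P.T
    return (R ** 2).sum(axis=1)

limit = np.percentile(spe(normal), 99)          # empirical control limit
abnormal_batch = rng.normal(size=(1, 10)) + 5.0 # shifted batch: should exceed it
```

A batch whose `spe` exceeds `limit` would then have its fault signature computed and passed to the classification model.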

Relevance: 20.00%

Abstract:

Quantitative or algorithmic trading is the automation of investment decisions obeying a fixed or dynamic set of rules to determine trading orders. It has increasingly made its way up to 70% of the trading volume of some of the biggest financial markets, such as the New York Stock Exchange (NYSE). However, there is not a significant amount of academic literature devoted to it, owing to the private nature of investment banks and hedge funds. This project aims to review the literature and discuss the available models in a subject where publications are scarce and infrequent. We review the basic and fundamental mathematical concepts needed for modelling financial markets, such as stochastic processes, stochastic integration and basic models for price and spread dynamics, necessary for building quantitative strategies. We also contrast these models with real market data at one-minute sampling frequency from the Dow Jones Industrial Average (DJIA). Quantitative strategies try to exploit two types of behaviour: trend following or mean reversion. The former is grouped in the so-called technical models and the latter in so-called pairs trading. Technical models have been discarded by financial theoreticians, but we show that they can be properly cast as well-defined scientific predictors if the signal they generate passes the test of being a Markov time. That is, we can tell whether the signal has occurred or not by examining the information up to the current time; or, more technically, the event is F_t-measurable. The concept of pairs trading, or market-neutral strategy, is on the other hand fairly simple. However, it can be cast in a variety of mathematical models, ranging from a method based on a simple Euclidean distance, through a co-integration framework, to stochastic differential equations such as the well-known mean-reverting Ornstein-Uhlenbeck process and its variations.
A model for forecasting any economic or financial magnitude may be properly defined with scientific rigour and yet lack any economic value, making it useless from a practical point of view. This is why the project could not be complete without a backtest of the strategies mentioned. Conducting a useful and realistic backtest is by no means a trivial exercise, since the "laws" that govern financial markets are constantly evolving in time. This is the reason why we emphasize the calibration of the strategies' parameters to the given market conditions. We find that the parameters of technical models are more volatile than their counterparts from market-neutral strategies, and that calibration must be done at high sampling frequency to constantly track the current market situation. As a whole, the goal of this project is to provide an overview of a quantitative approach to investment, reviewing basic strategies and illustrating them by means of a backtest with real financial market data. The sources of the data used in this project are Bloomberg for intraday time series and Yahoo! for daily prices. All numerical computations and graphics used and shown in this project were implemented in MATLAB from scratch as part of this thesis; no other mathematical or statistical software was used.
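The Ornstein-Uhlenbeck ingredient of pairs trading can be illustrated by simulating a mean-reverting spread and recovering its parameters by AR(1) least squares, a standard calibration route. This is a hedged sketch with invented parameters, not the thesis's MATLAB code:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulate a mean-reverting spread: exact discretisation of an
# Ornstein-Uhlenbeck process dX = theta*(mu - X)*dt + sigma*dW
theta, mu, sigma, dt, n = 2.0, 0.0, 0.3, 1.0 / 252, 5000
b_true = np.exp(-theta * dt)
sd = sigma * np.sqrt((1.0 - b_true ** 2) / (2.0 * theta))
x = np.empty(n)
x[0] = 0.5
for t in range(n - 1):
    x[t + 1] = mu + b_true * (x[t] - mu) + sd * rng.normal()

# Calibrate by AR(1) least squares: x[t+1] = a + b*x[t] + eps
M = np.column_stack([np.ones(n - 1), x[:-1]])
a_hat, b_hat = np.linalg.lstsq(M, x[1:], rcond=None)[0]
theta_hat = -np.log(b_hat) / dt      # estimated mean-reversion speed
mu_hat = a_hat / (1.0 - b_hat)       # estimated long-run mean

# Trading signal: z-score of the current spread against its estimated mean
z = (x[-1] - mu_hat) / x.std()
```

Re-running this calibration on a rolling window is one way to track the parameter drift the abstract describes.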

Relevance: 20.00%

Abstract:

As stated in Aitchison (1986), a proper study of relative variation in a compositional data set should be based on logratios, and dealing with logratios excludes dealing with zeros. Nevertheless, it is clear that zero observations might be present in real data sets, either because the corresponding part is completely absent (essential zeros) or because it is below the detection limit (rounded zeros). Because the second kind of zeros is usually understood as "a trace too small to measure", it seems reasonable to replace them by a suitable small value, and this has been the traditional approach. As stated, e.g., by Tauber (1999) and by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000), the principal problem in compositional data analysis is related to rounded zeros. One should be careful to use a replacement strategy that does not seriously distort the general structure of the data. In particular, the covariance structure of the involved parts, and thus the metric properties, should be preserved, as otherwise further analysis on subpopulations could be misleading. Following this point of view, a non-parametric imputation method is introduced in Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2000). This method is analyzed in depth by Martín-Fernández, Barceló-Vidal, and Pawlowsky-Glahn (2003), where it is shown that the theoretical drawbacks of the additive zero replacement method proposed in Aitchison (1986) can be overcome using a new multiplicative approach on the non-zero parts of a composition. The new approach has reasonable properties from a compositional point of view. In particular, it is "natural" in the sense that it recovers the "true" composition if replacement values are identical to the missing values, and it is coherent with the basic operations on the simplex. This coherence implies that the covariance structure of subcompositions with no zeros is preserved.
As a generalization of the multiplicative replacement, in the same paper a substitution method for missing values in compositional data sets is introduced.
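The multiplicative replacement of rounded zeros is simple enough to sketch directly; the composition and the replacement value `delta` below are invented for illustration:

```python
import numpy as np

def multiplicative_replacement(x, delta):
    """Replace rounded zeros by delta and shrink the non-zero parts
    multiplicatively so the total stays 1. Ratios between non-zero
    parts (hence their covariance structure) are preserved."""
    x = np.asarray(x, dtype=float)
    x = x / x.sum()
    zeros = x == 0
    return np.where(zeros, delta, x * (1.0 - delta * zeros.sum()))

comp = np.array([0.6, 0.4, 0.0])   # invented composition with one rounded zero
repl = multiplicative_replacement(comp, 0.01)
```

Because the non-zero parts are all multiplied by the same factor, their pairwise ratios, and therefore any logratio analysis of the zero-free subcomposition, are unaffected, which is the advantage over additive replacement.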

Relevance: 20.00%

Abstract:

This file contains the ontology of patterns of educational settings, as part of the formal framework for specifying, reusing and implementing educational settings. Furthermore, it includes the set of rules that extend the ontology of educational scenarios, as well as a brief description of the pattern level of this ontological framework.

Relevance: 20.00%

Abstract:

The paper discusses the maintenance challenges of organisations with a huge number of devices and proposes the use of probabilistic models to assist monitoring and maintenance planning. The proposal assumes connectivity of instruments to report relevant features for monitoring. Also, the existence of enough historical registers with diagnosed breakdowns is required to make the probabilistic models reliable and useful for the predictive maintenance strategies based on them. Regular Markov models based on estimated failure and repair rates are proposed to calculate the availability of the instruments, and Dynamic Bayesian Networks are proposed to model cause-effect relationships to trigger predictive maintenance services based on the influence between observed features and previously documented diagnostics.
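A minimal sketch of the availability calculation from failure and repair rates is a two-state (working/failed) continuous-time Markov model; the rates below are invented example values, not from the paper:

```python
import numpy as np

# Two-state continuous-time Markov model of one instrument:
# state 0 = working, state 1 = failed (rates are invented examples)
lam, mu = 0.02, 0.5                 # failures/day, repairs/day
Q = np.array([[-lam,  lam],
              [  mu,  -mu]])        # generator matrix

# Stationary distribution pi solves pi @ Q = 0 with pi summing to 1
M = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(M, b, rcond=None)

availability = pi[0]                # closed form: mu / (lam + mu)
```

With estimated rates per instrument, the same stationary-distribution computation yields a fleet-wide availability profile to prioritize maintenance.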

Relevance: 20.00%

Abstract:

The chemical composition of sediments and rocks, as well as their distribution at the Martian surface, represents a long-term archive of the processes which have formed the planetary surface. A survey of chemical compositions by means of Compositional Data Analysis represents a valuable tool to extract direct evidence for weathering processes and allows weathering and sedimentation rates to be quantified. clr-biplot techniques are applied for visualization of chemical relationships across the surface ("chemical maps"). The variability among individual suites of data is further analyzed by means of clr-PCA, in order to extract chemical alteration vectors between fresh rocks and their crusts and for an assessment of the different source reservoirs accessible to soil formation. Both techniques are applied to elucidate the influence of remote weathering by combined analysis of several soil-forming branches. Vector analysis in the simplex provides the opportunity to study atmosphere-surface interactions, including the role and composition of volcanic gases.
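The clr-PCA extraction of an alteration vector can be sketched on synthetic data: fresh compositions are perturbed multiplicatively into "crusts", and the first principal component of the clr-transformed pooled data recovers that direction. Everything here is invented (Dirichlet parameters, alteration factor); it is an illustration of the technique, not Martian data:

```python
import numpy as np

def clr(X):
    """Centred log-ratio transform applied row-wise to compositions."""
    L = np.log(X / X.sum(axis=1, keepdims=True))
    return L - L.mean(axis=1, keepdims=True)

rng = np.random.default_rng(2)
# Synthetic stand-in for fresh-rock compositions (4 hypothetical oxides)
fresh = rng.dirichlet([80, 50, 30, 20], size=40)

# Hypothetical alteration: a multiplicative (perturbation) change plus noise
factor = np.array([0.6, 1.0, 1.3, 1.8])
crust = fresh * factor * rng.lognormal(0.0, 0.05, size=fresh.shape)
crust /= crust.sum(axis=1, keepdims=True)

# clr-PCA on the pooled, centred data: SVD of the clr coordinates
Z = clr(np.vstack([fresh, crust]))
Z -= Z.mean(axis=0)
_, s, Vt = np.linalg.svd(Z, full_matrices=False)
pc1 = Vt[0]   # dominant clr direction: the fresh-to-crust alteration vector
```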