7 results for Discrete Time Branching Processes

in CORA - Cork Open Research Archive - University College Cork - Ireland


Relevance: 100.00%

Abstract:

The class of all Exponential-Polynomial-Trigonometric (EPT) functions is classical and equal to the Euler-d’Alembert class of solutions of linear differential equations with constant coefficients. The class of non-negative EPT functions defined on [0, ∞) was discussed in Hanzon and Holland (2010), of which EPT probability density functions are an important subclass. EPT functions can be represented as f(x) = c e^{Ax} b, where A is a square matrix, b a column vector and c a row vector; the triple (A, b, c) is the minimal realization of the EPT function. The minimal triple is unique only up to a basis transformation. Here the class of 2-EPT probability density functions on R is defined and shown to be closed under a variety of operations. The class is also generalised to include mixtures with a point mass at zero. This class coincides with the class of probability density functions with rational characteristic functions. It is illustrated that the Variance Gamma density is a 2-EPT density under a parameter restriction. A discrete 2-EPT process is a process which has stochastically independent 2-EPT random variables as increments. It is shown that the distributions of the minimum and maximum of such a process are EPT densities mixed with a point mass at zero. The Laplace transforms of these distributions correspond to the discrete-time Wiener-Hopf factors of the discrete-time 2-EPT process. A distribution of daily log-returns, observed over the period 1931-2011 from a prominent US index, is approximated with a 2-EPT density function. Without the non-negativity condition, it is illustrated how this problem is transformed into a discrete-time rational approximation problem. The rational approximation software RARL2 is used to carry out this approximation. The non-negativity constraint is then imposed via a convex optimisation procedure after the unconstrained approximation. Necessary and sufficient conditions are derived to characterise infinitely divisible EPT and 2-EPT functions. Infinitely divisible 2-EPT density functions generate 2-EPT Lévy processes. An asset's log-returns can be modelled as a 2-EPT Lévy process. Closed-form pricing formulae are then derived for European Options with specific times to maturity. Formulae for discretely monitored Lookback Options and 2-Period Bermudan Options are also provided. Certain Greeks, including Delta and Gamma, of these options are also computed analytically. MATLAB scripts are provided for calculations involving 2-EPT functions. Numerical option pricing examples illustrate the effectiveness of the 2-EPT approach to financial modelling.
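To make the realization concrete: evaluating an EPT function from its triple (A, b, c) is a single matrix exponential. Below is a minimal sketch in Python (NumPy/SciPy, rather than the MATLAB scripts mentioned above); the example triple is hypothetical, chosen so that f(x) = 2e^{-x} - 2e^{-2x}, a valid density on [0, ∞).

```python
import numpy as np
from scipy.linalg import expm
from scipy.integrate import quad

def ept(x, A, b, c):
    """Evaluate an EPT function f(x) = c * exp(A x) * b at x >= 0,
    given a realization (A, b, c): A square, b a column, c a row."""
    return (c @ expm(A * x) @ b).item()

# Hypothetical realization of f(x) = 2e^{-x} - 2e^{-2x}
A = np.array([[-1.0, 0.0],
              [0.0, -2.0]])
b = np.array([[1.0], [1.0]])
c = np.array([[2.0, -2.0]])

print("f(1.0) =", ept(1.0, A, b, c))                 # ~0.465
total, _ = quad(lambda x: ept(x, A, b, c), 0, np.inf)
print("integral over [0, inf) =", total)             # ~1.0, so f is a density
```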

Relevance: 100.00%

Abstract:

Phase-locked loops (PLLs) are a crucial component in modern communications systems. Comprising a phase detector, linear filter, and controllable oscillator, they are widely used in radio receivers to retrieve the information content from remote signals. As such, they are capable of signal demodulation, phase and carrier recovery, frequency synthesis, and clock synchronization. Continuous-time PLLs are a mature area of study, and have been covered in the literature since the early classical work by Viterbi [1] in the 1950s. With the rise of computing in recent decades, discrete-time digital PLLs (DPLLs) are a more recent discipline; most of the published literature dates from the 1990s onwards. Gardner [2] is a pioneer in this area. It is our aim in this work to address the difficulties encountered by Gardner [3] in his investigation of the DPLL output phase jitter, where additive noise on the input signal is combined with frequency quantization in the local oscillator. The model we use in our novel analysis of the system is also applicable to another of the cases examined by Gardner, namely the DPLL with a delay element integrated in the loop. This gives us the opportunity to look at this system in more detail, our analysis providing some unique insights into the variance 'dip' seen by Gardner in [3]. We initially provide background on probability theory and stochastic processes; these branches of mathematics are the basis for the study of noisy analogue and digital PLLs. We give an overview of the classical analogue PLL theory as well as the background on both the digital PLL and the circle map, referencing the model proposed by Teplinsky et al. [4, 5]. For our novel work, the case of combined frequency quantization and noisy input from [3] is investigated first numerically, and then analytically as a Markov chain via its Chapman-Kolmogorov equation. The resulting delay equation for the steady-state jitter distribution is treated using two separate asymptotic analyses to obtain approximate solutions. It is shown that the variance obtained in each case matches the numerical results well. Other properties of the output jitter, such as the mean, are also investigated. In this way, we arrive at a more complete understanding of the interaction between quantization and input noise in the first-order DPLL than is possible using simulation alone. We also carry out an asymptotic analysis of a particular case of the noisy first-order DPLL with delay, previously investigated by Gardner [3]. We show that a unique feature of the simulation results, namely the variance 'dip' seen for certain levels of input noise, is explained by this analysis. Finally, we look at the second-order DPLL with additive noise, using numerical simulations to see the effects of low levels of noise on the limit cycles. We show how these effects are similar to those seen in the noise-free loop with non-zero initial conditions.
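As a rough companion to the numerical investigation described above, the sketch below iterates a toy first-order DPLL phase-error map with additive input noise and a quantized frequency correction, and estimates the steady-state jitter variance. The update rule is an illustrative circle-map-style model under assumed parameters (K, omega, sigma, q are all hypothetical), not the exact equations analysed in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_dpll(n_steps, K=0.5, omega=0.0, sigma=0.05, q=1e-3):
    """Toy first-order DPLL: phase error phi (in cycles, mod 1), additive
    phase-detector noise of std sigma, frequency correction quantized to
    multiples of q. Illustrative model only."""
    phi = np.empty(n_steps)
    phi[0] = 0.25                                          # arbitrary initial error
    for n in range(n_steps - 1):
        noisy = phi[n] + sigma * rng.standard_normal()     # noisy phase sample
        corr = K * np.sin(2 * np.pi * noisy) / (2 * np.pi) # loop correction
        corr = q * np.round(corr / q)                      # frequency quantization
        phi[n + 1] = (phi[n] + omega - corr) % 1.0
    return phi

phi = simulate_dpll(100_000)
tail = (phi[20_000:] + 0.5) % 1.0 - 0.5   # discard transient, centre on zero
print("steady-state jitter variance:", tail.var())
```

Sweeping sigma in such a simulation is one way to look for the variance 'dip' discussed above before attempting any analytical treatment.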

Relevance: 100.00%

Abstract:

There is much common ground between the areas of coding theory and systems theory. Fitzpatrick has shown that a Gröbner basis approach leads to efficient algorithms in the decoding of Reed-Solomon codes and in scalar interpolation and partial realization. This thesis simultaneously generalizes and simplifies that approach and presents applications to discrete-time modeling, multivariable interpolation and list decoding. Gröbner basis theory has come into its own in the context of software and algorithm development. By generalizing the concept of polynomial degree, term orders are provided for multivariable polynomial rings and free modules over polynomial rings. The orders are not, in general, unique, and this adds, in no small way, to the power and flexibility of the technique. As well as being generating sets for ideals or modules, Gröbner bases always contain an element which is minimal with respect to the corresponding term order. Central to this thesis is a general algorithm, valid for any term order, that produces a Gröbner basis for the solution module (or ideal) of elements satisfying a sequence of generalized congruences. These congruences, based on shifts and homomorphisms, are applicable to a wide variety of problems, including key equations and interpolations. At the core of the algorithm is an incremental step. Iterating this step lends a recursive/iterative character to the algorithm. As a consequence, not all of the input to the algorithm need be available from the start and different "paths" can be taken to reach the final solution. The existence of a suitable chain of modules satisfying the criteria of the incremental step is a prerequisite for applying the algorithm.
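The dependence of a Gröbner basis on the chosen term order can be seen with an off-the-shelf computer algebra system. The minimal SymPy sketch below (a standard Buchberger-style computation on an example ideal, not the incremental algorithm of the thesis) computes bases of the same ideal under two different orders:

```python
from sympy import groebner, symbols

x, y = symbols('x y')
polys = [x**2 + y**2 - 1, x*y - 1]   # an example ideal in Q[x, y]

# Same ideal, two term orders -> generally different Groebner bases
print(groebner(polys, x, y, order='lex').exprs)
print(groebner(polys, x, y, order='grevlex').exprs)
```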

Relevance: 30.00%

Abstract:

This thesis contributes to the understanding of the processes involved in the formation and transformation of identities. It achieves this goal by establishing the critical importance of ‘background’ and ‘liminality’ in the shaping of identity. Drawing mainly from the work of cultural anthropology and philosophical hermeneutics, a theoretical framework is constructed from which transformative experiences can be analysed. The particular experience at the heart of this study is the phenomenon of conversion and the dynamics involved in the construction of that process. Establishing the axial age as the horizon from which the process of conversion emerged will be the main theme of the first part of the study. Identifying the ‘birth’ of conversion allows a deeper understanding of the historical dynamics that make up the process. From these fundamental dynamics a theoretical framework is constructed in order to analyse the conversion process. Applying this theoretical framework to a number of case studies will be the central focus of this study. The transformative experiences of Saint Augustine, the fourteenth-century nun Margaret Ebner, the communist revolutionary Karl Marx and the literary figure of Arthur Koestler will provide the material onto which the theoretical framework can be applied. A synthesis of the Judaic religious and the Greek philosophical traditions will be the main finding for the shaping of Augustine’s conversion experience. The dissolution of political order, coupled with the institutionalisation of the conversion process, will illuminate the mystical experiences of Margaret Ebner at a time when empathetic conversion reached its fullest expression. The final case studies examine two modern ‘conversions’ that seem to have an ideological rather than a religious basis. On closer examination it will be found that the German tradition of Biblical Criticism played a most influential role in the ‘conversion’ of Marx, and that mythology is the best medium through which to understand the experiences of Koestler. The main ideas emerging from this study highlight the fluidity of identity and the important role of ‘background’ in its transformation. The theoretical framework, as constructed for this study, is found to be a useful methodological tool that can offer insights into experiences, such as conversion, that would otherwise remain hidden from our enquiries.

Relevance: 30.00%

Abstract:

This thesis describes the optimisation of chemoenzymatic methods in asymmetric synthesis. Modern synthetic organic chemistry has experienced an enormous growth in biocatalytic methodologies; enzymatic transformations and whole-cell bioconversions have become generally accepted synthetic tools for asymmetric synthesis. Biocatalysts are exceptional catalysts, combining broad substrate scope with high regio-, enantio- and chemoselectivities, enabling the resolution of organic substrates with superb efficiency and selectivity. In this study, three biocatalytic applications in enantioselective synthesis were explored, and perhaps the most significant outcome of this work is the excellent enantioselectivity achieved through optimisation of reaction conditions, improving the synthetic utility of the biotransformations. In the first chapter, a summary of the literature discussing the stereochemical control of baker’s yeast (Saccharomyces cerevisiae) mediated reduction of ketones by the introduction of sulfur moieties is presented, setting the work of Chapter 2 in context. The focus of the second chapter was the synthesis and biocatalytic resolution of (±)-trans-2-benzenesulfonyl-3-n-butylcyclopentanone. For the first time, the practical limitations of this resolution have been addressed, providing synthetically useful quantities of enantiopure synthons for application in the total synthesis of both enantiomers of 4-methyloctanoic acid, the aggregation pheromone of the rhinoceros beetles of the genus Oryctes. The unique aspect of this enantioselective synthesis was the overall regio- and enantioselective introduction of the methyl group to the octanoic acid chain. This work is part of an ongoing research programme in our group focussed on baker’s yeast mediated kinetic resolution of 2-keto sulfones. The third chapter describes hydrolase-catalysed kinetic resolutions leading to a series of 3-aryl alkanoic acids. Hydrolysis of the ethyl esters with a series of hydrolases was undertaken to identify biocatalysts that yield the corresponding acids in highly enantioenriched form. Contrary to literature reports, in which a complete loss of efficiency and, accordingly, of enantioselection was described for the kinetic resolution of sterically demanding 3-arylalkanoic acids, the highest reported enantiopurities of these acids (up to >98% ee) were achieved in this study through optimisation of reaction conditions. Steric and electronic effects on the efficiency and enantioselectivity of the biocatalytic transformation were also explored. Furthermore, a novel approach to determining the absolute stereochemistry of the enantiopure 3-aryl alkanoic acids was investigated through a combination of co-crystallisation and X-ray diffraction linked with chiral HPLC analysis. The fourth chapter describes the development of a biocatalytic protocol for the asymmetric Henry reaction. Efficient kinetic resolution in the hydrolase-mediated transesterification of cis- and trans-β-nitrocyclohexanol derivatives was achieved. The combination of a base-catalysed intramolecular Henry reaction with hydrolase-mediated kinetic resolution, with a view to selective acetylation of a single stereoisomer, was investigated. While dynamic kinetic resolution in the intramolecular Henry reaction was not achieved, significant progress was made on each of the individual elements and, significantly, the feasibility of this process has been demonstrated.
The final chapter contains the full experimental details, including spectroscopic and analytical data for all compounds synthesised in this project, while details of the chiral HPLC analysis are included in the appendix. The data for the crystal structures are contained on the attached CD.
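For reference, the enantioselectivity figures quoted above can be read through two standard textbook quantities; the definitions below are the conventional ones (the enantiomeric excess, and the selectivity factor E of Chen et al. for a kinetic resolution), not formulae specific to this thesis:

```latex
% Enantiomeric excess of a mixture of R and S enantiomers:
\[
  \mathrm{ee} \;=\; \frac{\lvert [R] - [S] \rvert}{[R] + [S]} \times 100\%
\]
% Selectivity factor of a kinetic resolution, from the conversion c
% and the enantiomeric excess ee_s of the remaining substrate:
\[
  E \;=\; \frac{\ln\!\bigl[(1 - c)\,(1 - \mathrm{ee}_s)\bigr]}
               {\ln\!\bigl[(1 - c)\,(1 + \mathrm{ee}_s)\bigr]}
\]
```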

Relevance: 30.00%

Abstract:

Ecosystem goods and services provided by estuarine and near-coastal regions are increasingly recognised for their immense value, as is the biodiversity of these areas, and near-coastal communities have also been identified as sentinels of climate change. Population structure and reproductive biology of two bivalve molluscs, Cerastoderma edule and Mytilus edulis, were assessed at two study sites over a 16-month study period. Following an anomalously harsh winter, advancement of spawning time was observed in both species. Throughout Ireland and Europe the cockle has experienced mass surfacings in geographically distinct regions, and a concurrent study of cockles was undertaken to explore this phenomenon. Surfaced and buried cockles were collected on a monthly basis and their health compared. Age was highlighted as a source of variation between dying and healthy animals, with a parasite threshold possibly being reached around age three. Local factors dominated when looking at the cause of surfacing at each site. The health of mussels was also explored on a temporal and seasonal basis in an attempt to assess what constitutes a healthy organism. In essence, external drivers can tip the balance between “acceptable” levels of infection, where the mussel can still function physiologically, and “unacceptable” levels, where the prevalence and intensity of infection can result in physiological impairment at the individual and population level. Synecological studies of intertidal ecosystems are lacking, so all bivalves encountered during the sampling were assessed in terms of population structure, reproduction, and health. It became clear that some parasites may specialize on one host species while others are not so specific in host choice. Furthermore, the population genetics of the cockle, its parasite Meiogymnophallus minutus, and its hyperparasite Unikaryon legeri were examined. A single nucleotide polymorphism was detected upon comparison of Irish and Moroccan populations.

Relevance: 30.00%

Abstract:

It is estimated that the quantity of digital data being transferred, processed or stored at any one time currently stands at 4.4 zettabytes (4.4 × 2^70 bytes), and this figure is expected to have grown by a factor of 10, to 44 zettabytes, by 2020. Exploiting this data is, and will remain, a significant challenge. At present there is the capacity to store 33% of the digital data in existence at any one time; by 2020 this capacity is expected to fall to 15%. These statistics suggest that, in the era of Big Data, the identification of important, exploitable data will need to be done in a timely manner. Systems for the monitoring and analysis of data, e.g. stock markets, smart grids and sensor networks, can be made up of massive numbers of individual components. These components can be geographically distributed yet may interact with one another via continuous data streams, which in turn may affect the state of the sender or receiver. This introduces a dynamic causality, which further complicates the overall system by introducing a temporal constraint that is difficult to accommodate. Practical approaches to realising the system described above have led to a multiplicity of analysis techniques, each of which concentrates on specific characteristics of the system being analysed and treats these characteristics as the dominant component affecting the results being sought. The multiplicity of analysis techniques introduces another layer of heterogeneity, that is, heterogeneity of approach, partitioning the field to the extent that results from one domain are difficult to exploit in another. The question is asked: can a generic solution be identified for the monitoring and analysis of data that accommodates temporal constraints, bridges the gap between expert knowledge and raw data, and enables data to be effectively interpreted and exploited in a transparent manner? The approach proposed in this dissertation acquires, analyses and processes data in a manner that is free of the constraints of any particular analysis technique, while at the same time facilitating these techniques where appropriate. Constraints are applied by defining a workflow based on the production, interpretation and consumption of data. This supports the application of different analysis techniques to the same raw data without the danger of incorporating hidden bias. To illustrate and to realise this approach, a software platform has been created that allows for the transparent analysis of data, combining analysis techniques with a maintainable record of provenance so that independent third-party analysis can be applied to verify any derived conclusions. In order to demonstrate these concepts, a complex real-world example involving the near real-time capturing and analysis of neurophysiological data from a neonatal intensive care unit (NICU) was chosen. A system was engineered to gather raw data, analyse that data using different analysis techniques, uncover information, incorporate that information into the system and curate the evolution of the discovered knowledge. The application domain was chosen for three reasons: firstly, because it is complex and no comprehensive solution exists; secondly, because it requires tight interaction with domain experts, thus requiring the handling of subjective knowledge and inference; and thirdly, because, given the dearth of neurophysiologists, there is a real-world need to provide a solution for this domain.
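A workflow in which every derived result carries its own production/interpretation/consumption history can be sketched in a few lines. The Python sketch below is purely illustrative (the Record class and step names are hypothetical, not the platform's actual API): each analysis step appends a provenance entry, so an independent third party can trace and re-run the chain that produced a conclusion.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Record:
    """A data item that carries its provenance trail with it.
    Hypothetical sketch; not the dissertation's actual API."""
    value: object
    provenance: list = field(default_factory=list)

    def derive(self, fn, step_name):
        """Apply one analysis step and log it in the provenance trail."""
        out = Record(fn(self.value), provenance=list(self.provenance))
        out.provenance.append({
            "step": step_name,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        return out

raw = Record([3.1, 2.9, 3.4, 10.8, 3.0])  # e.g. a window of sensor readings
clean = raw.derive(lambda xs: [x for x in xs if x < 10.0], "outlier_filter")
mean = clean.derive(lambda xs: sum(xs) / len(xs), "mean")
print(mean.value)        # 3.1
print(mean.provenance)   # two entries: outlier_filter, then mean
```

Because the raw data and the logged steps travel together, applying a different analysis technique to the same raw record leaves an independent, comparable trail rather than overwriting the first.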