895 results for Markov chains. Convergence. Evolutionary Strategy. Large Deviations
Abstract:
2000 Mathematics Subject Classification: 60J80, 60J10.
Abstract:
Markovian models are widely used to analyse quality-of-service properties of both system designs and deployed systems. Thanks to the emergence of probabilistic model checkers, this analysis can be performed with high accuracy. However, its usefulness is heavily dependent on how well the model captures the actual behaviour of the analysed system. Our work addresses this problem for a class of Markovian models termed discrete-time Markov chains (DTMCs). We propose a new Bayesian technique for learning the state transition probabilities of DTMCs based on observations of the modelled system. Unlike existing approaches, our technique weighs observations based on their age, to account for the fact that older observations are less relevant than more recent ones. A case study from the area of bioinformatics workflows demonstrates the effectiveness of the technique in scenarios where the model parameters change over time.
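To make the age-weighting idea concrete, here is a minimal Python sketch assuming exponential decay weights and a Dirichlet prior; the paper's actual weighting scheme is not given here, so the decay rate, prior strength and the (time, from_state, to_state) observation format are illustrative assumptions.

```python
import numpy as np

def learn_dtmc(observations, n_states, decay=0.5, prior=1.0):
    """Estimate DTMC transition probabilities from timestamped observations.

    observations: list of (time, from_state, to_state) tuples.
    Older observations receive exponentially smaller weights, so the
    estimate tracks transition probabilities that drift over time.
    """
    now = max(t for t, _, _ in observations)
    # Dirichlet prior pseudo-counts for every transition.
    counts = np.full((n_states, n_states), prior)
    for t, i, j in observations:
        counts[i, j] += np.exp(-decay * (now - t))  # age-based weight
    # Posterior mean: normalise each row into a probability distribution.
    return counts / counts.sum(axis=1, keepdims=True)

# Example: older data favours 0 -> 0, recent data favours 0 -> 1.
obs = [(0, 0, 0), (1, 0, 0), (2, 1, 0), (3, 0, 1), (4, 0, 1), (5, 0, 1)]
print(learn_dtmc(obs, n_states=2))
```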
Abstract:
We investigate the theoretical and numerical computation of rare transitions in simple geophysical turbulent models. We consider the barotropic quasi-geostrophic and two-dimensional Navier–Stokes equations in regimes where bistability between two coexisting large-scale attractors exists. By means of large deviations and instanton theory, with the use of an Onsager–Machlup path integral formalism for the transition probability, we show how one can directly compute the most probable transition path between two coexisting attractors, analytically in an equilibrium (Langevin) framework and numerically otherwise. We adapt a class of numerical optimization algorithms known as minimum action methods to simple geophysical turbulent models. We show that by numerically minimizing an appropriate action functional in a large deviation limit, one can predict the most likely transition path for a rare transition between two states. By considering examples where theoretical predictions can be made, we show that the minimum action method successfully predicts the most likely transition path. Finally, we discuss the application and extension of such numerical optimization schemes to the computation of rare transitions observed in direct numerical simulations and experiments, and to other, more complex, turbulent systems.
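To illustrate the minimum action idea, the sketch below minimizes a discretized Freidlin–Wentzell action for a one-dimensional double-well SDE rather than the geophysical models of the paper; the drift b(x) = x − x³, the horizon T and the discretization N are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Minimum action method sketch for dx = b(x) dt + sqrt(eps) dW with a
# double-well drift b(x) = x - x**3 (stable states at x = -1 and x = +1).
def b(x):
    return x - x**3

T, N = 10.0, 100            # time horizon and number of path segments
dt = T / N

def action(interior):
    # Freidlin-Wentzell action S[x] = 0.5 * int |x' - b(x)|^2 dt,
    # discretised on a path with fixed endpoints x(0) = -1, x(T) = +1.
    path = np.concatenate(([-1.0], interior, [1.0]))
    mid = 0.5 * (path[1:] + path[:-1])
    vel = (path[1:] - path[:-1]) / dt
    return 0.5 * np.sum((vel - b(mid))**2) * dt

x0 = np.linspace(-1.0, 1.0, N + 1)[1:-1]   # straight-line initial guess
res = minimize(action, x0, method="L-BFGS-B")
print("minimal action:", res.fun)          # most probable path is in res.x
```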
Abstract:
Based on the empirical evidence accumulated up to 2011, using official statistics from the OECD data bank and the US Commerce Department, the article asks whether one can speak of a generally observable recession/crisis pattern, one universally recognizable across all major industrial countries (the G7), and whether universal changes in output, labor markets, consumption and investment can be identified that fit both earlier experience and the predictions of the established macro models. The answer, at least at the time of writing, is a firm no. Changes and volatility in most major macroeconomic indicators, such as output gaps, labor market distortions and large deviations from trend in consumption and investment, all exhibited wide differences in depth and duration across the G7 countries. The large deviations in output gaps, and especially the strong distortions in labor inputs and hours worked per capita over the crisis months, can hardly be explained by the existing DSGE and real business cycle model classes. Especially troubling are the difficulties in fitting the data into any established model, whether of the business cycle or some other type, in which financial distress reduces economic activity. It is argued that standard business cycle models with financial market imperfections have no mechanism for generating deviations from standard theory, and thus shed no light on the key factors underlying the 2007–2009 recession. That does not imply that the financial crisis is unimportant for understanding the recession, but it does indicate that we do not fully understand the channels through which financial distress reduced labor input. The study reviews the empirical literature on crises and macroeconomic shocks considered relevant in light of interpretations of financial globalization, and then assesses the performance of the US economy during recessions over a 60-year span, so that the severity of the recent crisis can be judged objectively, at least as regards the order of magnitude of movements in the major macro variables. Long historical trends in the privately held portion of the US federal debt indicate that the standard macro proposition, that public debt crowds out private investment and thus inhibits growth, can be strongly challenged, in so far as this ratio is a direct indicator of neither slowing growth nor recession.
Abstract:
Type systems for secure information flow aim to prevent a program from leaking information from H (high) to L (low) variables. Traditionally, bisimulation has been the prevalent technique for proving the soundness of such systems. This work introduces a new proof technique based on stripping and fast simulation, and shows that it can be applied in a number of cases where bisimulation fails. We present a progressive development of this technique over a representative sample of languages including a simple imperative language (core theory), a multiprocessing nondeterministic language, a probabilistic language, and a language with cryptographic primitives. In the core theory we illustrate the key concepts of this technique in a basic setting. A fast low simulation in the context of transition systems is a binary relation in which simulating states can match the moves of simulated states while maintaining the equivalence of low variables; stripping is a function that removes high commands from programs. We show that we can prove secure information flow by arguing that the stripping relation is a fast low simulation. We then extend the core theory to an abstract distributed language under a nondeterministic scheduler. Next, we extend it to a probabilistic language with a random assignment command; we generalize fast simulation to the setting of discrete-time Markov chains, and prove approximate probabilistic noninterference. Finally, we introduce cryptographic primitives into the probabilistic language and prove computational noninterference, provided that the underlying encryption scheme is secure.
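A toy illustration of stripping, under an assumed tuple encoding of a simple imperative language (not the paper's formal syntax), where variables prefixed with "h" are taken to be high:

```python
# Hypothetical AST: ("skip",), ("assign", var, expr), ("seq", c1, c2),
# ("if", guard, c1, c2), ("while", guard, c).
def is_high(var):
    return var.startswith("h")

def strip(cmd):
    """Remove high commands: replace assignments to high variables by skip
    and recurse through compound commands."""
    tag = cmd[0]
    if tag == "assign" and is_high(cmd[1]):
        return ("skip",)
    if tag == "seq":
        return ("seq", strip(cmd[1]), strip(cmd[2]))
    if tag == "if":
        return ("if", cmd[1], strip(cmd[2]), strip(cmd[3]))
    if tag == "while":
        return ("while", cmd[1], strip(cmd[2]))
    return cmd  # skip and low assignments are unchanged

prog = ("seq", ("assign", "h_secret", "l + 1"), ("assign", "l", "2"))
print(strip(prog))  # ('seq', ('skip',), ('assign', 'l', '2'))
```

Showing that a program is low-equivalent to its stripped version, via a fast low simulation, is then what establishes secure information flow in the core theory.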
Abstract:
The use of behavioural indicators of suffering and welfare in captive animals has produced ambiguous results. In comparisons between groups, those in worse condition tend to exhibit an increased overall rate of Behaviours Potentially Indicative of Stress (BPIS), but when comparing within groups, individuals differ in their stress-coping strategies. This dissertation presents analyses to unravel the behavioural profile of a sample of 26 captive capuchin monkeys of three different species (Sapajus libidinosus, S. flavius and S. xanthosternos) kept in different enclosure types. In total, 147.17 hours of data were collected. We explored four types of analysis: activity budgets, diversity indexes, Markov chains and sequence analyses, and social network analyses, resulting in nine indexes of behavioural occurrence and organization. In Chapter One we explore group differences. Results support predictions of minor sex and species differences and major differences in behavioural profile due to enclosure type: i. individuals in less enriched enclosures exhibited a more diverse BPIS repertoire and a decreased probability of sequences of six genus-normative behaviours; ii. the number of most probable behavioural transitions including at least one BPIS was higher in less enriched enclosures; iii. prominence indexes indicate that BPIS function as dead ends of behavioural sequences, and the prominence of three BPIS (pacing, self-directed, active I) was higher in less enriched enclosures. Overall, these data do not support BPIS as a repetitive pattern with a mantra-like calming effect. Rather, the picture that emerges is more supportive of BPIS as activities that disrupt the organization of behaviour, introducing "noise" that compromises an optimal activity budget. In Chapter Two we explored individual differences in stress-coping strategies. We classified individuals along six axes of exploratory behaviour. These were only weakly correlated, indicating low correlation among behavioural indicators of syndromes. Nevertheless, the results are suggestive of two broad stress-coping strategies, similar to the bold/proactive and shy/reactive pattern: more exploratory capuchin monkeys exhibited increased prominence of pacing, aberrant sexual display and active I BPIS, while less active animals exhibited an increased probability of significant sequences involving at least one BPIS and increased prominence of their own stereotypy. Capuchin monkeys are known for their cognitive capacities and behavioural flexibility; therefore, the search for a consistent set of behavioural indicators of welfare and individual differences requires further studies and larger data sets. With this work we aim to contribute to the design of scientifically grounded and statistically sound protocols for the collection of behavioural data that permit comparability of results and meta-analyses, whatever theoretical interpretation they may receive.
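As a sketch of the Markov chain side of such analyses, the following estimates a first-order behavioural transition matrix from an observed sequence; the behaviour labels are hypothetical stand-ins for the study's ethogram.

```python
import numpy as np

def transition_matrix(sequence, behaviours):
    """First-order Markov transition probabilities estimated from a
    sequence of observed behaviours."""
    idx = {b: i for i, b in enumerate(behaviours)}
    counts = np.zeros((len(behaviours), len(behaviours)))
    for a, b in zip(sequence, sequence[1:]):
        counts[idx[a], idx[b]] += 1
    rows = counts.sum(axis=1, keepdims=True)
    # Rows with no outgoing observations stay all-zero instead of NaN.
    return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

# Hypothetical scan data: 'forage' and 'groom' are genus-normative
# behaviours, 'pacing' is a BPIS.
seq = ["forage", "groom", "forage", "pacing", "pacing", "forage"]
print(transition_matrix(seq, ["forage", "groom", "pacing"]))
```

Indexes such as prominence, and the identification of BPIS as "dead ends", can then be derived from the structure of this matrix.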
Abstract:
The fatty acid (FA) composition of representatives of 18 polychaete families from the Southern Ocean shelf and deep sea (600 to 5337 m) was analysed in order to identify trophic biomarkers and elucidate possible feeding preferences. Total FA content was relatively low, with few exceptions, and ranged from 1.0 to 11.6% of total body dry weight. The most prominent FAs found were 20:5(n-3), 16:0, 22:6(n-3), 18:1(n-7), 20:4(n-6), 18:0, 20:1(n-11) and 18:1(n-9). For some polychaete families and species, FA profiles indicated selective feeding on certain dietary components, such as freshly deposited diatom remains (e.g., Spionidae, Fauveliopsidae and Flabelligeridae) or foraminiferans (e.g., Euphrosinidae, Nephtyidae and Syllidae). Feeding patterns were relatively consistent within families at the deep stations, while the FA composition differed between the deep and shelf stations within the same family. Fatty alcohols, indicative of wax ester storage, were found in almost all families (in proportions of 0.0 to 29.3% of total FAs and fatty alcohols). The development of this long-term storage mechanism for energy reserves possibly reflects an evolutionary strategy.
Abstract:
Authentication plays an important role in how we interact with computers, mobile devices, the web, etc. The idea of authentication is to uniquely identify a user before granting access to system privileges. For example, in recent years more corporate information and applications have become accessible via the Internet and intranets. Many employees work from remote locations and need access to secure corporate files. During this time, it is possible for malicious or unauthorized users to gain access to the system. For this reason, it is logical to have some mechanism in place to detect whether the logged-in user is the same user in control of the session. Therefore, highly secure authentication methods must be used. We posit that each of us is unique in our use of computer systems. It is this uniqueness that is leveraged to "continuously authenticate users" while they use web software. To monitor user behavior, n-gram models are used to capture user interactions with web-based software. This statistical language model essentially captures the sequences and sub-sequences of user actions, their orderings, and the temporal relationships that make them unique, providing a model of how each user typically behaves. Users are then continuously monitored during software operations. Large deviations from "normal behavior" can indicate malicious or unintended behavior. This approach is implemented in a system called Intruder Detector (ID) that models user actions as embodied in web logs generated in response to a user's actions. User identification through web logs is cost-effective and non-intrusive. We perform experiments on a large fielded system with web logs of approximately 4000 users. For these experiments, we use two classification techniques: binary and multi-class classification. We evaluate model-specific differences in user behavior based on coarse-grain (i.e., role) and fine-grain (i.e., individual) analysis. A specific set of metrics is used to provide valuable insight into how each model performs. Intruder Detector achieves accurate results when identifying legitimate users and user types. It is also able to detect outliers in role-based user behavior with optimal performance. In addition to web applications, this continuous monitoring technique can be used with other user-based systems such as mobile devices and the analysis of network traffic.
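A minimal sketch of the n-gram approach, assuming bigrams, add-one smoothing and a per-n-gram average log-likelihood as the deviation score; Intruder Detector's actual features, classifiers and metrics are richer than this.

```python
from collections import Counter
import math

def train_ngrams(actions, n=2):
    """Count n-grams of user actions to model 'normal' behaviour."""
    return Counter(tuple(actions[i:i + n]) for i in range(len(actions) - n + 1))

def avg_log_likelihood(model, actions, n=2, vocab_size=1000):
    """Average log-likelihood of a new session's n-grams under the trained
    model, with add-one smoothing; large negative values suggest deviation."""
    total = sum(model.values())
    grams = [tuple(actions[i:i + n]) for i in range(len(actions) - n + 1)]
    return sum(math.log((model[g] + 1) / (total + vocab_size))
               for g in grams) / len(grams)

history = ["login", "search", "open", "edit", "save", "search", "open"]
model = train_ngrams(history)
print(avg_log_likelihood(model, ["login", "search", "open"]))    # familiar
print(avg_log_likelihood(model, ["login", "export", "delete"]))  # deviates
```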
Abstract:
There is considerable interest in the use of genetic algorithms to solve problems arising in the areas of scheduling and timetabling. However, the classical genetic algorithm paradigm is not well equipped to handle the conflict between objectives and constraints that typically occurs in such problems. To overcome this, successful implementations frequently make use of problem-specific knowledge. This paper is concerned with the development of a GA for a nurse rostering problem at a major UK hospital. The structure of the constraints is used as the basis for a co-evolutionary strategy using co-operating sub-populations. Problem-specific knowledge is also used to define a system of incentives and disincentives, and a complementary mutation operator. Empirical results based on 52 weeks of live data show how these features are able to improve an unsuccessful canonical GA to the point where it provides a practical solution to the problem.
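A skeleton of the co-operating sub-population idea: each sub-population evolves one slice of the solution, and an individual is scored by joining it with the best current representatives of the other sub-populations. A toy OneMax objective stands in for the rostering fitness with its incentives and disincentives; all names and parameters are illustrative.

```python
import random

def random_slice(n):
    return [random.randint(0, 1) for _ in range(n)]

def mutate(s, rate):
    return [1 - g if random.random() < rate else g for g in s]

def fitness(slices):
    # Toy objective (OneMax). In the real roster this would combine
    # coverage constraints with the incentive/disincentive scores.
    return sum(sum(s) for s in slices)

def coevolve(n_slices=2, slice_len=10, pop_size=20, gens=50, rate=0.05):
    pops = [[random_slice(slice_len) for _ in range(pop_size)]
            for _ in range(n_slices)]
    best = [pop[0] for pop in pops]
    for _ in range(gens):
        for k in range(n_slices):
            # Co-operative evaluation: combine each individual with the
            # best representatives of the other sub-populations.
            score = lambda ind: fitness(best[:k] + [ind] + best[k + 1:])
            pops[k].sort(key=score, reverse=True)
            best[k] = pops[k][0]
            parents = pops[k][:pop_size // 2]
            pops[k] = parents + [mutate(random.choice(parents), rate)
                                 for _ in range(pop_size - len(parents))]
    return best

print(fitness(coevolve()))  # approaches 20 (all ones in both slices)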
Abstract:
Rubisco is responsible for the fixation of CO2 into organic compounds through photosynthesis and thus has great agronomic importance. It is well established that this enzyme suffers from slow catalysis, and its low specificity results in photorespiration, which is considered an energy waste for the plant. However, natural variation exists, and some Rubisco lineages, such as those in C4 plants, exhibit higher catalytic efficiencies coupled with lower specificities. These C4 kinetics could have evolved as an adaptation to the higher CO2 concentration present in C4 photosynthetic cells. In this study, using phylogenetic analyses on a large data set of C3 and C4 monocots, we show that the rbcL gene, which encodes the large subunit of Rubisco, evolved under positive selection in independent C4 lineages. This confirms that the selective pressures on Rubisco have been switched in C4 plants by the high-CO2 environment prevailing in their photosynthetic cells. Eight rbcL codons evolving under positive selection in C4 clades were involved in parallel changes among the 23 independent monocot C4 lineages included in this study. These amino acids are potentially responsible for the C4 kinetics, and their identification opens new avenues for human-directed Rubisco engineering. The introgression of C4-like high-efficiency Rubisco would strongly enhance C3 crop yields in the future CO2-enriched atmosphere.
Abstract:
This work presents the application of a multiobjective evolutionary algorithm (MOEA) to the solution of the optimal power flow (OPF) problem. The OPF is modeled as a constrained nonlinear optimization problem, non-convex and large-scale, with continuous and discrete variables. Violated inequality constraints are treated as additional objective functions of the problem. This strategy allows the physical and operational restrictions to be satisfied without compromising the quality of the solutions found. The developed MOEA is based on Pareto theory and employs a diversity-preserving mechanism to avoid premature convergence of the algorithm to locally optimal solutions. Fuzzy set theory is employed to extract the best compromise solutions from the Pareto set. Results for the IEEE-30, RTS-96 and IEEE-354 test systems are presented to validate the efficiency of the proposed model and solution technique.
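A sketch of the two ingredients named above, Pareto dominance and a fuzzy-set choice of best compromise, assuming linear membership functions over a minimization front; the two objectives and their values are hypothetical.

```python
import numpy as np

def dominates(a, b):
    """Pareto dominance for minimisation: a is no worse in every objective
    and strictly better in at least one."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

def best_compromise(front):
    """Fuzzy-set choice over a Pareto front: a linear membership per
    objective, then pick the solution with the largest total membership."""
    front = np.asarray(front, dtype=float)
    f_min, f_max = front.min(axis=0), front.max(axis=0)
    span = np.where(f_max > f_min, f_max - f_min, 1.0)
    mu = (f_max - front) / span          # 1 at best value, 0 at worst
    return int(np.argmax(mu.sum(axis=1)))

# Hypothetical two-objective front: (generation cost, constraint violation).
front = [(100.0, 0.5), (120.0, 0.1), (150.0, 0.0)]
print(dominates((100.0, 0.1), (120.0, 0.5)))  # True
print(best_compromise(front))                 # index of best trade-off: 1
```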
Abstract:
The past decade has seen the rise of high-resolution datasets. One of the main surprises of analysing such data has been the discovery of large genetic, phenotypic and behavioural variation and heterogeneous metabolic rates among individuals within natural populations. A parallel discovery from theory and experiments has shown a strong temporal convergence between evolutionary and ecological dynamics, but a general framework for analysing, from individual-level processes, the convergence between ecological and evolutionary dynamics and its implications for patterns of biodiversity in food webs has been lacking. Here, as a first approximation to take into account intraspecific variability and the convergence between ecological and evolutionary dynamics in large food webs, we develop a model from population genomics and microevolutionary processes that uses sexual reproduction, genetic-distance-based speciation and trophic interactions. We confront the model with the prey consumption per individual predator, species-level connectance and prey–predator diversity in several environmental situations, using a large food web with approximately 25,000 sampled prey and predator individuals. We show higher-than-expected diversity of abundant species in heterogeneous environmental conditions and strong deviations from the observed distribution of individual prey consumption (i.e. individual connectivity per predator) in all environmental conditions. The observed large variance in individual prey consumption, regardless of environmental variability, collapsed species-level connectance after small increases in sampling effort. These results suggest that (1) intraspecific variance in prey–predator interactions has a strong effect on the macroscopic properties of food webs and (2) intraspecific variance is a potential driver regulating the speed of the convergence between ecological and evolutionary dynamics in species-rich food webs. They also suggest that genetic–ecological drift driven by sexual reproduction, equal feeding rates among predator individuals, mutations and genetic-distance-based speciation can be used as a neutral food web dynamics test to detect the ecological and microevolutionary processes underlying the observed patterns of individual- and species-based food webs at local and macroecological scales.
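A minimal sketch of the genetic-distance-based speciation component, assuming Hamming distance on binary genomes and single-linkage clustering; the paper's actual model details (distance measure, threshold, genome representation) may differ.

```python
import numpy as np

def species_partition(genomes, threshold):
    """Genetic-distance-based speciation: two individuals belong to the same
    species if they are linked by a chain of pairs whose Hamming distance is
    below the threshold (single-linkage clustering via union-find)."""
    n = len(genomes)
    labels = list(range(n))
    def find(i):
        while labels[i] != i:
            labels[i] = labels[labels[i]]  # path halving
            i = labels[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if np.sum(genomes[i] != genomes[j]) < threshold:
                labels[find(i)] = find(j)
    return [find(i) for i in range(n)]

rng = np.random.default_rng(0)
genomes = rng.integers(0, 2, size=(6, 50))   # 6 individuals, 50 loci
print(species_partition(genomes, threshold=20))
```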
Abstract:
Let $(\Phi_t)_{t \in \mathbb{R}_+}$ be a Harris ergodic continuous-time Markov process on a general state space, with invariant probability measure $\pi$. We investigate the rates of convergence of the transition function $P^t(x,\cdot)$ to $\pi$; specifically, we find conditions under which $r(t)\,\|P^t(x,\cdot) - \pi\| \to 0$ as $t \to \infty$, for suitable subgeometric rate functions $r(t)$, where $\|\cdot\|$ denotes the usual total variation norm for a signed measure. We derive sufficient conditions for the convergence to hold, in terms of the existence of suitable points on which the first hitting time moments are bounded. In particular, for stochastically ordered Markov processes, explicit bounds on subgeometric rates of convergence are obtained. These results are illustrated in several examples.
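The flavour of such results, in a polynomial-rate special case, can be written schematically as follows; this is an illustration in the spirit of the hitting-time conditions described above, not the paper's exact statement.

```latex
% Schematic polynomial-rate special case: a bounded moment of the first
% hitting time \tau_C of a suitable (petite) set C yields a polynomial
% subgeometric rate of total variation convergence.
\[
  \sup_{x \in C} \mathbb{E}_x\!\left[\tau_C^{\,1+\beta}\right] < \infty
  \quad\Longrightarrow\quad
  t^{\beta}\,\bigl\|P^t(x,\cdot) - \pi\bigr\|
  \xrightarrow[t \to \infty]{} 0 ,
\]
\noindent corresponding to the rate function $r(t) = t^{\beta}$.
```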