942 results for Markov decision processes
Abstract:
This paper deals with the exponential stability of discrete-time singular systems with Markov jump parameters. We propose a set of coupled generalized Lyapunov equations (CGLE) that provides sufficient conditions for checking this property for this class of systems. A method for solving the resulting CGLE, based on iterations of standard singular Lyapunov equations, is also presented. We also give a numerical example to illustrate the effectiveness of the proposed approach.
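The abstract does not reproduce the CGLE themselves, but the flavor of the method can be sketched for the simpler, non-singular special case of a discrete-time Markov jump linear system, where the coupled equations read A_i^T (sum_j p_ij X_j) A_i - X_i + Q_i = 0. The Python sketch below iterates standard discrete Lyapunov solves until the coupled solutions converge; the mode dynamics A_i, transition matrix P, and weights Q_i are illustrative placeholders, not the paper's example.

```python
# Minimal sketch: fixed-point iteration over standard discrete Lyapunov
# equations for a (non-singular) Markov jump linear system; positive-definite
# limits indicate mean-square (exponential) stability.
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = [np.array([[0.5, 0.1], [0.0, 0.4]]),   # mode-1 dynamics (illustrative)
     np.array([[0.3, 0.2], [0.1, 0.6]])]   # mode-2 dynamics (illustrative)
P = np.array([[0.9, 0.1],                   # Markov transition matrix
              [0.2, 0.8]])
Q = [np.eye(2), np.eye(2)]                  # positive-definite weights
X = [np.eye(2), np.eye(2)]                  # initial guesses

for _ in range(200):
    X_new = []
    for i, Ai in enumerate(A):
        # Couple in the other modes through the transition probabilities,
        # then solve one standard discrete Lyapunov equation for mode i.
        coupling = sum(P[i, j] * X[j] for j in range(len(A)) if j != i)
        a = np.sqrt(P[i, i]) * Ai.T
        q = Q[i] + Ai.T @ coupling @ Ai
        X_new.append(solve_discrete_lyapunov(a, q))
    if max(np.linalg.norm(Xn - Xo) for Xn, Xo in zip(X_new, X)) < 1e-10:
        break
    X = X_new

print([bool(np.all(np.linalg.eigvalsh(Xi) > 0)) for Xi in X])
```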
Abstract:
In a recent paper, "A combined tool for environmental scientists and decision makers: ternary diagrams and emergy accounting" [Giannetti BF, Barrella FA, Almeida CMVB. J Clean Prod, in press http://dx.doi.org/10.1016/j.jclepro.2004.09.002], ternary diagrams were proposed as a graphical tool to assist emergy analysis. The graphical representation of emergy accounting data makes it possible to compare processes and systems with and without ecosystem services, to evaluate improvements, and to follow system performance over time. The graphic tool is versatile and adaptable to represent products, processes, systems, countries, and different periods of time. The use and versatility of ternary diagrams for assisting emergy analyses are illustrated by means of five examples taken from the literature, which are presented and discussed. It is shown that the properties of emergy ternary diagrams assist the assessment of system efficiency, of its dependence upon renewable and non-renewable inputs, and of the environmental support for dilution and abatement of process emissions. With the aid of ternary diagrams, details such as the interactions between systems and between systems and the environment are recognized and evaluated. Such a tool for graphical analysis allows a transparent presentation of the results and can serve as an interface between emergy scientists and decision makers, provided the meaning of each line in the diagram is carefully explained and understood. (c) 2005 Elsevier Ltd. All rights reserved.
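The basic geometry of such a diagram is easy to reproduce: each system is reduced to three emergy fractions (renewable R, non-renewable N, purchased/feedback F) that sum to one and are plotted in barycentric coordinates. The sketch below, with made-up flows, illustrates only this coordinate mapping; it is not the authors' tool.

```python
# Minimal sketch of plotting emergy data on a ternary diagram.
# The three example systems are made-up numbers, not literature data.
import numpy as np
import matplotlib.pyplot as plt

systems = {"system A": (60.0, 25.0, 15.0),   # (R, N, F) emergy flows
           "system B": (20.0, 50.0, 30.0),
           "system C": (10.0, 20.0, 70.0)}

# Vertices of the triangle: R at bottom-left, N at bottom-right, F on top.
V = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])

fig, ax = plt.subplots()
ax.add_patch(plt.Polygon(V, fill=False))
for name, (r, n, f) in systems.items():
    w = np.array([r, n, f]) / (r + n + f)   # normalise to fractions
    x, y = w @ V                             # barycentric -> Cartesian
    ax.plot(x, y, "o")
    ax.annotate(name, (x, y))
for label, (x, y) in zip("RNF", V):
    ax.annotate(label, (x, y))
ax.set_aspect("equal")
ax.axis("off")
plt.show()
```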
Abstract:
Existing studies of on-line process control are concerned with economic aspects, optimizing the parameters of the process with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of corrective adjustment is evaluated, because the equipment is assumed to become "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are optimized jointly by minimizing the average cost per item. Adjustments are based on the location of the measurement of a quality characteristic of interest within one of three decision zones. Numerical examples illustrate the proposal. (c) 2012 Elsevier B.V. All rights reserved.
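The three-zone adjustment logic can be caricatured in a few lines of Python; the zone limits below are arbitrary placeholders, whereas the paper obtains them by minimizing the average cost per item.

```python
# Toy sketch of a three-decision-zone rule for a measured quality
# characteristic. Thresholds are hypothetical, not cost-optimal.
def decide(x, target=0.0, warn=1.5, action=3.0):
    """Classify a quality measurement x into one of three decision zones."""
    dev = abs(x - target)
    if dev <= warn:
        return "no intervention"        # central zone: keep producing
    if dev <= action:
        return "on-line adjustment"     # intermediate zone
    return "corrective maintenance"     # outer zone: restore 'as good as new'

for x in (0.4, -2.1, 3.7):
    print(x, "->", decide(x))
```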
Abstract:
The question of how we make, and how we should make, judgments and decisions has occupied thinkers for many centuries. This thesis aims to add new evidence to clarify the brain's mechanisms for decision-making. The cognitive and emotional processes of social actions and decisions are investigated with the aim of understanding which brain areas are most involved. Four experimental studies are presented. The first study (as well as Study III) involves a specific population: patients with lesions of the ventromedial prefrontal cortex (vmPFC). This region lies on the ventral surface of the frontal lobe and appears to play an important role in social and moral decisions by forecasting the negative emotional consequences of a choice. Study I examines whether emotions, specifically social emotions subserved by the vmPFC, affect people's willingness to trust others. Study II observes how incidental emotions can encourage trusting behaviour, especially when individuals are not aware of the emotive stimulation. Study III aims to gather direct psychophysiological evidence, both in healthy and in neurologically impaired individuals, that emotions are crucially involved in shaping moral judgment by preventing moral violations. Study IV explores how the moral meaning of a decision and its subsequent action can modulate a basic component of action such as the sense of agency.
Abstract:
This thesis is concerned with parameter estimation in discrete-time ergodic Markov processes in general and in the CIR model in particular. The CIR model is a stochastic differential equation proposed by Cox, Ingersoll and Ross (1985) to describe the dynamics of interest rates. The problem considered is the estimation of the parameters of the drift and diffusion coefficients from equidistant discrete observations of the CIR process. After a short introduction to the CIR model, we use the method of martingale estimating functions and estimating equations, studied in particular by Bibby and Sørensen, to treat the problem of parameter estimation in ergodic Markov processes in full generality. Following investigations by Sørensen (1999), sufficient conditions (in the sense of regularity assumptions on the estimating function) are given for the existence, strong consistency and asymptotic normality of solutions of a martingale estimating equation. Applied to the special case of likelihood estimation, these conditions also ensure local asymptotic normality of the model. Furthermore, a simple criterion for Godambe-Heyde optimality of estimating functions is given, and it is sketched how this can be used in important special cases to construct optimal estimating functions explicitly. The general results are then applied to the discretized CIR model. We analyse some estimators of the drift and diffusion coefficients proposed by Overbeck and Rydén (1997), which are defined as solutions of quadratic martingale estimating functions, and compute the optimal element of this class. Finally, we generalize results of Overbeck and Rydén (1997) by showing the existence of a strongly consistent and asymptotically normal solution of the likelihood equation and by proving local asymptotic normality for the CIR model without restrictions on the parameter space.
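For a concrete feel for the estimation problem, the sketch below simulates a discretized CIR path and applies conditional least squares, a simple (and non-optimal) member of the class of martingale estimating functions discussed in the thesis; it exploits the known conditional mean and variance of the CIR process. All parameter values are illustrative.

```python
# Simplified sketch: conditional least squares for the discretised CIR model
# dX = kappa*(theta - X) dt + sigma*sqrt(X) dW.
import numpy as np

rng = np.random.default_rng(0)
kappa, theta, sigma, dt, n = 2.0, 0.05, 0.15, 1 / 252, 50_000

# Euler simulation, reflected to keep the process positive.
x = np.empty(n)
x[0] = theta
for t in range(n - 1):
    x[t + 1] = abs(x[t] + kappa * (theta - x[t]) * dt
                   + sigma * np.sqrt(x[t] * dt) * rng.standard_normal())

# E[X_{t+dt} | X_t] = a*X_t + b with a = exp(-kappa*dt), b = theta*(1 - a),
# so regressing X_{t+1} on X_t recovers a and b.
a_hat, b_hat = np.polyfit(x[:-1], x[1:], 1)
kappa_hat = -np.log(a_hat) / dt
theta_hat = b_hat / (1 - a_hat)

# Match residual variance to the known conditional variance of the CIR model.
resid = x[1:] - (a_hat * x[:-1] + b_hat)
c = (x[:-1] * (a_hat - a_hat**2) / kappa_hat
     + theta_hat * (1 - a_hat)**2 / (2 * kappa_hat))
sigma_hat = np.sqrt(np.mean(resid**2 / c))

print(kappa_hat, theta_hat, sigma_hat)
```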
Abstract:
This thesis addresses the issue of generating texts in the style of an existing author that also satisfy structural constraints imposed by the genre of the text. Although Markov processes are known to be suitable for representing style, they are difficult to control so as to satisfy non-local properties, such as structural constraints, that require long-distance modeling. The framework of constrained Markov processes makes it possible to generate texts that are consistent with a corpus while being controllable in terms of rhyme and meter. Constrained Markov processes reformulate Markov processes in the context of constraint satisfaction. The thesis describes how to represent stylistic and structural properties as constraints in this framework and how this approach can be used for the generation of lyrics in the style of 60 different authors. An evaluation of the described method is provided by comparing it to both pure Markov and pure constraint-based approaches. Finally, the thesis describes the implementation of an augmented text editor, called Perec. Perec is intended to enhance creativity by helping the user to write lyrics and poetry, exploiting the techniques presented so far.
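A toy version of the underlying problem: a word-level Markov model generates fluent text, but a constraint on a distant position (here, forcing the final word) cannot be imposed on individual transitions. The thesis compiles such constraints into the transition structure exactly; the naive sketch below merely resamples until the constraint is satisfied, which illustrates why a principled constrained-Markov formulation is needed. The corpus and constraint are placeholders.

```python
# Toy sketch: word-level Markov generation with a non-local constraint
# enforced by rejection (not the thesis's exact constraint compilation).
import random
from collections import defaultdict

corpus = ("the night is long and the road is long "
          "and the song of the road is the song of the night").split()

chain = defaultdict(list)
for w1, w2 in zip(corpus, corpus[1:]):
    chain[w1].append(w2)

def generate(length):
    words = [random.choice(corpus)]
    while len(words) < length:
        nxt = chain.get(words[-1])
        if not nxt:                      # dead end: give up on this draw
            return None
        words.append(random.choice(nxt))
    return words

def constrained(length=8, must_end_with="night"):
    """Resample until the structural constraint (last word) is met."""
    while True:
        words = generate(length)
        if words and words[-1] == must_end_with:
            return " ".join(words)

random.seed(1)
print(constrained())
```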
Abstract:
Changepoint analysis is a well-established area of statistical research, but in the context of spatio-temporal point processes it is as yet relatively unexplored. Some substantial differences from standard changepoint analysis have to be taken into account: firstly, at every time point the datum is an irregular pattern of points; secondly, in real situations issues of spatial dependence between points and temporal dependence within time segments arise. Our motivating example concerns the monitoring and recovery of radioactive particles from Sandside beach, in the north of Scotland; there have been two major changes in the equipment used to detect the particles, representing known potential changepoints in the number of retrieved particles. In addition, offshore particle retrieval campaigns are believed to reduce the particle intensity onshore with an unknown temporal lag; in this latter case, the problem concerns multiple unknown changepoints. We therefore propose a Bayesian approach for detecting multiple changepoints in the intensity function of a spatio-temporal point process, allowing for spatial and temporal dependence within segments. We use log-Gaussian Cox processes, a very flexible class of models suitable for environmental applications, which can be implemented using the integrated nested Laplace approximation (INLA), a computationally efficient alternative to Markov chain Monte Carlo methods for approximating the posterior distribution of the parameters. Once the posterior curve is obtained, we propose several methods for detecting significant changepoints. We present a simulation study, which consists of generating spatio-temporal point pattern series under several scenarios; the performance of the methods is assessed in terms of type I and type II errors, detected changepoint locations, and accuracy of the segment intensity estimates. We finally apply the above methods to the motivating dataset and obtain sensible results concerning the presence and nature of changes in the process.
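The full spatio-temporal LGCP/INLA machinery is beyond a short example, but the core changepoint idea can be sketched in a drastically simplified form: a single changepoint in the rate of a purely temporal Poisson count series with conjugate Gamma priors, for which the posterior over the changepoint location is available in closed form. All numbers are illustrative.

```python
# Drastically simplified sketch of Bayesian changepoint detection:
# one changepoint in a Poisson count series, Gamma(alpha, beta) priors,
# exact posterior over the changepoint location.
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(3)
y = np.concatenate([rng.poisson(4.0, 60),    # segment 1: rate 4
                    rng.poisson(9.0, 40)])   # segment 2: rate 9
alpha, beta = 1.0, 1.0

def log_marginal(seg):
    """log marginal likelihood of i.i.d. Poisson counts under a
    Gamma(alpha, beta) prior, dropping the tau-independent factorials."""
    s, n = seg.sum(), len(seg)
    return (alpha * np.log(beta) - gammaln(alpha)
            + gammaln(alpha + s) - (alpha + s) * np.log(beta + n))

T = len(y)
logpost = np.array([log_marginal(y[:tau]) + log_marginal(y[tau:])
                    for tau in range(1, T)])
post = np.exp(logpost - logpost.max())
post /= post.sum()
print("posterior mode of changepoint:", 1 + int(np.argmax(post)))  # ~60
```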
Abstract:
The goal of modeling DNA strings is to formulate mathematical models that generate sequences of nucleotide bases compatible with existing genomes. This thesis examines those mathematical models that preserve an important property, discovered in 1952 by the biochemist Erwin Chargaff and known today as "Chargaff's second rule". Mathematical models that account for Chargaff's symmetries fall mainly into two strands: one regards the rule as a result of evolution acting on the genome, while the other assumes it to be a peculiarity of a primitive genome, left intact by the modifications introduced by evolution. This thesis sets out to analyze a model of the second type. In particular, we took inspiration from the model defined by Sobottka and Hart. After a critical analysis and study of the authors' work, we extended the model to a wider set of cases. We used stochastic processes such as Bernoulli schemes and Markov chains to construct a possible generalization of the structure proposed in the article, analyzing the conditions under which Chargaff's rule holds. The models examined consist of simple stationary processes or concatenations of stationary processes. The first chapter introduces some notions from biology. The second gives a critical and forward-looking description of the model proposed by Sobottka and Hart, introducing the formal definitions for the general case presented in the third chapter, where the theoretical apparatus of the general model is developed.
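Chargaff's second rule states that, within a single strand, each short word occurs about as often as its reverse complement. A minimal illustration, using the simplest model mentioned above (an i.i.d. Bernoulli scheme with complement-symmetric base probabilities, rather than the thesis's Markov-chain constructions), is sketched below; the probabilities are arbitrary, not fitted to a genome.

```python
# Toy check of Chargaff's second parity rule on a Bernoulli-scheme strand
# with p(A) = p(T) and p(C) = p(G).
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)
bases, probs = "ACGT", [0.3, 0.2, 0.2, 0.3]
seq = "".join(rng.choice(list(bases), size=200_000, p=probs))

comp = str.maketrans("ACGT", "TGCA")
def revcomp(w):
    return w.translate(comp)[::-1]

for k in (1, 2):
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    for w in sorted(counts):
        if w <= revcomp(w):   # print each word/reverse-complement pair once
            print(w, revcomp(w), counts[w], counts[revcomp(w)])
```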
Abstract:
The article focuses on the effects of Eastern enlargement on EU trade policy-making. Regarding the constellation of interests, the article makes the case that protectionist forces have been strengthened relative to liberal forces. This slight protectionist turn is mostly visible in the area of anti-dumping and with respect to the Doha trade round. Regarding preference aggregation, guided by a principal–agent framework, it is argued that the growth in the number of actors (principals and interest groups) has not constrained the role of the European Commission (agent). However, it has led to an increase in informal processes and has empowered large trading nations vis-à-vis smaller and less 'comitology-experienced' member states.
Abstract:
In Malani and Neilsen (1992) we proposed alternative estimates of the survival function (for time to disease) using a simple marker that describes the time to some intermediate stage in a disease process. In this paper we derive the asymptotic variance of one such proposed estimator using two different methods and compare terms of order 1/n when there is no censoring. In the absence of censoring, the asymptotic variance obtained using the Greenwood-type approach converges to the exact variance up to terms involving 1/n. However, the asymptotic variance obtained using the theory of counting processes and results from Voelkel and Crowley (1984) on semi-Markov processes has a different term of order 1/n. It is not clear to us at this point why the variance formulae using the latter approach give different results.
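For orientation, the classical Greenwood formula in the standard Kaplan-Meier setting (without the marker structure of the paper's estimator) approximates the variance as

\[
\widehat{\operatorname{Var}}\!\left[\hat S(t)\right] \;=\; \hat S(t)^{2} \sum_{t_i \le t} \frac{d_i}{n_i \,(n_i - d_i)},
\]

where d_i is the number of events and n_i the number at risk at time t_i; a "Greenwood-type" variance estimate for a marker-based estimator is derived by the same plug-in reasoning.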
Abstract:
Situationally adaptive behavior relies on the identification of relevant target stimuli, the evaluation of these with respect to the current context and the selection of an appropriate action. We used functional magnetic resonance imaging (fMRI) to disentangle the neural networks underlying these processes within a single task. Our results show that activation of mid-ventrolateral prefrontal cortex (PFC) reflects the perceived presence of a target stimulus regardless of context, whereas context-appropriate evaluation is subserved by mid-dorsolateral PFC. Enhancing demands on response selection by means of response conflict activated a network of regions, all of which are directly connected to motor areas. On the midline, rostral anterior paracingulate cortex was found to link target detection and response selection by monitoring for the presence of behaviorally significant conditions. In summary, we provide new evidence for process-specific functional dissociations in the frontal lobes. In target-centered processing, target detection in the VLPFC is separable from contextual evaluation in the DLPFC. Response-centered processing in motor-associated regions occurs partly in parallel to these processes, which may enhance behavioral efficiency, but it may also lead to reaction time increases when an irrelevant response tendency is elicited.
Abstract:
Amplifications and deletions of chromosomal DNA, as well as copy-neutral loss of heterozygosity, have been associated with disease processes. High-throughput single nucleotide polymorphism (SNP) arrays are useful for making genome-wide estimates of copy number and genotype calls. Because neighboring SNPs in high-throughput SNP arrays are likely to have dependent copy number and genotype, owing to the underlying haplotype structure and linkage disequilibrium, hidden Markov models (HMMs) may be useful for improving genotype calls and copy number estimates that do not incorporate information from nearby SNPs. We improve previous approaches that utilize an HMM framework for inference in high-throughput SNP arrays by integrating copy number, genotype calls, and the corresponding confidence scores when available. Using simulated data, we demonstrate how confidence scores control smoothing in a probabilistic framework. Software for fitting HMMs to SNP array data is available in the R package ICE.
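A schematic of the idea (not the ICE package itself): a three-state copy-number HMM along a chromosome in which each SNP's confidence score widens or narrows its emission distribution, so low-confidence SNPs contribute weaker evidence and are smoothed over by the chain. All numbers are illustrative.

```python
# Schematic confidence-weighted copy-number HMM with Viterbi decoding.
import numpy as np
from scipy.stats import norm

means = np.array([-0.6, 0.0, 0.4])   # expected log R ratio: del/normal/amp
stay = 0.999                          # prior persistence of a state
T = np.full((3, 3), (1 - stay) / 2)
np.fill_diagonal(T, stay)

rng = np.random.default_rng(5)
truth = np.r_[np.ones(300, int), np.full(50, 2), np.ones(250, int)]
conf = rng.uniform(0.3, 1.0, truth.size)   # per-SNP confidence in (0, 1]
sd = 0.25 / np.sqrt(conf)                  # high confidence -> tight emission
obs = rng.normal(means[truth], sd)

logB = norm.logpdf(obs[:, None], means[None, :], sd[:, None])
logT = np.log(T)
delta = np.log(np.full(3, 1 / 3)) + logB[0]
back = np.zeros((truth.size, 3), int)
for t in range(1, truth.size):
    scores = delta[:, None] + logT
    back[t] = scores.argmax(axis=0)
    delta = scores.max(axis=0) + logB[t]
path = [int(delta.argmax())]
for t in range(truth.size - 1, 0, -1):
    path.append(back[t][path[-1]])
path.reverse()
print("recovered amplification SNPs:", sum(s == 2 for s in path))
```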
Abstract:
Vietnam has developed rapidly over the past 15 years. However, progress has not been uniformly distributed across the country. Availability, adequate visualization, and analysis of spatially explicit data on socio-economic and environmental aspects can support both research and policy towards sustainable development. Applying appropriate mapping techniques allows gleaning important information from tabular socio-economic data. Spatial analysis of socio-economic phenomena can yield insights into locally specific patterns and processes that cannot be generated by non-spatial applications. This paper presents techniques and applications that develop and analyze spatially highly disaggregated socio-economic datasets. A number of examples show how such information can support informed decision-making and research in Vietnam.
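A minimal sketch of one such mapping technique: joining a tabular indicator onto administrative boundaries and drawing a choropleth with geopandas. The file and column names are hypothetical placeholders, not the paper's data.

```python
# Minimal choropleth sketch: tabular socio-economic data joined to
# administrative boundaries (all file/column names are hypothetical).
import geopandas as gpd
import pandas as pd

districts = gpd.read_file("vietnam_districts.shp")    # admin boundaries
stats = pd.read_csv("district_poverty.csv")           # tabular indicator
merged = districts.merge(stats, on="district_id")

ax = merged.plot(column="poverty_rate", cmap="OrRd", legend=True)
ax.set_axis_off()
ax.figure.savefig("poverty_map.png", dpi=200)
```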
Abstract:
During a project, managers encounter numerous contingencies and face the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and their processes are irreversible. It is therefore critical to apply a methodological approach to develop a few alternative management decision strategies during the planning phase, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on global project outcomes; (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made; (3) establishing a method to assess and improve management decision strategies; and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects with different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology to develop and identify schedules, accompanied by suitable decision strategies, for managing a project at the planning stage. The developed methodology also lays the foundation for an algorithm that continuously and automatically generates satisfactory schedules and strategies throughout the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies for labor, material, equipment, and space. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA has a previously developed emulator that duplicates the construction process, and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
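A conceptual sketch (far simpler than ICDMA) of how alternative decision strategies can be scored against randomly disrupted schedules by Monte Carlo simulation; the activities, the disruption model, and the policies are made-up placeholders.

```python
# Toy Monte Carlo comparison of decision strategies on a disrupted schedule.
import random

# (name, planned duration in days): placeholder activities in sequence.
activities = [("excavation", 5), ("foundation", 10), ("framing", 8),
              ("roofing", 4), ("finishing", 6)]

def simulate(policy, p_disrupt=0.15, seed=None):
    """One project run. Each day a disruption may hit the active activity;
    the strategy chooses to crash (pay extra, keep pace) or absorb (lose
    the day). Returns (project duration, total cost)."""
    rng = random.Random(seed)
    duration = cost = 0
    for name, planned in activities:
        remaining = planned
        while remaining > 0:
            duration += 1
            cost += 1
            if rng.random() < p_disrupt:
                if policy(name, remaining) == "crash":
                    cost += 2       # extra resources, schedule protected
                else:
                    continue        # day lost, no progress made
            remaining -= 1
    return duration, cost

always_crash = lambda name, remaining: "crash"
crash_late = lambda name, remaining: "crash" if remaining <= 2 else "absorb"

for label, policy in (("always crash", always_crash), ("crash late", crash_late)):
    runs = [simulate(policy, seed=s) for s in range(1000)]
    d = sum(r[0] for r in runs) / len(runs)
    c = sum(r[1] for r in runs) / len(runs)
    print(f"{label}: mean duration {d:.1f} days, mean cost {c:.1f}")
```

Comparing the mean duration and mean cost of each strategy over many disrupted runs is the simplest form of the assess-and-improve iteration the dissertation proposes.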
Abstract:
Decision-making and memory are fundamental processes for successful human behaviour. For eye movements, the frontal eye fields (FEF), the supplementary eye fields (SEF), the dorsolateral prefrontal cortex (DLPFC), the ventrolateral frontal cortex, and the anterior cingulum are important for these cognitive processes. The online approach of transcranial magnetic stimulation (TMS), i.e., the application of magnetic pulses during the planning and performance of saccades, makes it possible to interfere specifically with the information processing of the stimulated region within a very specific time interval (chronometry of cortical processing). This paper presents studies that revealed the different roles of the FEF and DLPFC in antisaccade control. The critical time interval of DLPFC control seems to lie before target onset, since TMS significantly increased the percentage of antisaccade errors in that interval. The FEF seems to be important for the triggering of correct antisaccades. Bilateral stimulation of the DLPFC demonstrated parallel information-processing transfer in spatial working memory during memory-guided saccades.