939 results for Markov process modeling
Abstract:
Stepwise uncertainty reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity, and the fact that they deliver a single point at each iteration. In this article we introduce several multipoint sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also manage to drastically reduce the computational cost of these strategies through the use of closed form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
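The sketch below is a minimal, illustrative complement to this abstract, not the paper's multipoint SUR criteria: it shows how a kriging (Gaussian process) posterior yields a plug-in estimate of the excursion volume above a threshold by averaging the pointwise excursion probabilities. The toy test function, the RBF kernel, the threshold T, and all parameters are assumptions.

```python
# Minimal sketch (not the paper's SUR criteria): given a GP posterior mean m(x)
# and standard deviation s(x) on a grid, a plug-in estimate of the excursion
# volume above a threshold T averages the probabilities Phi((m(x) - T) / s(x)).
import numpy as np
from scipy.stats import norm

def rbf_kernel(a, b, length=0.2, var=1.0):
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

# Toy 1-D test function and a handful of evaluations (illustrative only).
f = lambda x: np.sin(6 * x) + 0.5 * x
X_train = np.linspace(0.0, 1.0, 8)
y_train = f(X_train)

# Kriging (GP) posterior mean and variance on a dense grid, noise-free case.
X_grid = np.linspace(0.0, 1.0, 500)
K = rbf_kernel(X_train, X_train) + 1e-10 * np.eye(len(X_train))
K_s = rbf_kernel(X_grid, X_train)
mean = K_s @ np.linalg.solve(K, y_train)
var = 1.0 - np.sum(K_s * np.linalg.solve(K, K_s.T).T, axis=1)  # k(x, x) = 1 here
std = np.sqrt(np.clip(var, 1e-12, None))

T = 0.8  # excursion threshold (assumed for illustration)
excursion_prob = norm.cdf((mean - T) / std)   # pointwise P(f(x) > T | data)
expected_volume = excursion_prob.mean()       # plug-in excursion volume on [0, 1]
print(f"expected excursion volume above T={T}: {expected_volume:.3f}")
```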
Abstract:
Statistical methods are developed that assess survival data on two attributes: (1) prolongation of life and (2) quality of life. Health-state transition probabilities correspond to prolongation of life and are modeled as a discrete-time semi-Markov process. Embedded within the sojourn time of a particular health state are the quality-of-life transitions, which reflect events that differentiate perceptions of pain and suffering over a fixed time period. Quality-of-life transition probabilities are derived from the assumptions of a simple Markov process and depend on the health state currently occupied and on the next health state to which a transition is made. Using both attributes, the model can estimate the distribution of expected quality-adjusted life years, in addition to the distribution of expected survival times. The expected quality of life can also be estimated within the health-state sojourn time, making the assessment of utility preferences more flexible. The methods are demonstrated on a subset of follow-up data from the Beta Blocker Heart Attack Trial (BHAT). This model contains the structure necessary to make inferences when assessing a general survival problem with a two-dimensional outcome.
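As a hedged illustration of the kind of calculation the abstract describes, the toy sketch below simulates a discrete-time Markov chain over three hypothetical health states with per-cycle quality weights and estimates expected QALYs and expected survival time by Monte Carlo. The states, transition matrix, and quality weights are assumptions, not quantities from the BHAT data, and the paper's semi-Markov structure with within-sojourn quality transitions is not reproduced.

```python
# Hypothetical toy sketch: a discrete-time Markov chain over three health
# states with per-cycle quality weights; Monte Carlo estimates of expected
# quality-adjusted life years (QALYs) and expected survival time.
import numpy as np

states = ["well", "sick", "dead"]
P = np.array([[0.90, 0.08, 0.02],    # assumed transition probabilities per yearly cycle
              [0.20, 0.65, 0.15],
              [0.00, 0.00, 1.00]])   # "dead" is absorbing
quality = np.array([1.0, 0.6, 0.0])  # assumed quality weight accrued per cycle

rng = np.random.default_rng(0)

def simulate(n_patients=10_000, horizon=50):
    qalys, life_years = [], []
    for _ in range(n_patients):
        s, q, t = 0, 0.0, 0
        for _ in range(horizon):
            if states[s] == "dead":
                break
            q += quality[s]
            t += 1
            s = rng.choice(3, p=P[s])
        qalys.append(q)
        life_years.append(t)
    return np.mean(qalys), np.mean(life_years)

eq, el = simulate()
print(f"expected QALYs ~ {eq:.2f}, expected life years ~ {el:.2f}")
```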
Abstract:
Parallel recordings of spike trains of several single cortical neurons in behaving monkeys were analyzed as a hidden Markov process. The parallel spike trains were considered as a multivariate Poisson process whose vector firing rates change with time. As a consequence of this approach, the complete recording can be segmented into a sequence of a few statistically discriminated hidden states, whose dynamics are modeled as a first-order Markov chain. The biological validity and benefits of this approach were examined in several independent ways: (i) the statistical consistency of the segmentation and its correspondence to the behavior of the animals; (ii) direct measurement of the collective flips of activity, obtained by the model; and (iii) the relation between the segmentation and the pair-wise short-term cross-correlations between the recorded spike trains. Comparison with surrogate data was also carried out for each of the above examinations to assure their significance. Our results indicated the existence of well-separated states of activity, within which the firing rates were approximately stationary. With our present data we could reliably discriminate six to eight such states. The transitions between states were fast and were associated with concomitant changes of firing rates of several neurons. Different behavioral modes and stimuli were consistently reflected by different states of neural activity. Moreover, the pair-wise correlations between neurons varied considerably between the different states, supporting the hypothesis that these distinct states were brought about by the cooperative action of many neurons.
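The sketch below illustrates the basic ingredient of such a segmentation: the forward recursion of a hidden Markov model whose emissions are independent Poisson spike counts per neuron given the hidden state. The two states, their firing-rate vectors, the transition matrix, and the toy spike counts are assumptions; the full model fitting used in the study (e.g., Baum-Welch estimation) is not shown.

```python
# Minimal sketch of the forward recursion for a hidden Markov model with
# multivariate Poisson emissions (neurons independent given the state).
import numpy as np
from scipy.stats import poisson

A = np.array([[0.95, 0.05],           # assumed hidden-state transition matrix
              [0.10, 0.90]])
pi = np.array([0.5, 0.5])             # initial state distribution
rates = np.array([[2.0, 8.0, 1.0],    # assumed firing rates (spikes/bin), state 0
                  [6.0, 3.0, 5.0]])   # assumed firing rates, state 1

# Toy spike counts: 5 time bins x 3 neurons (assumed data).
counts = np.array([[2, 7, 1], [3, 9, 0], [5, 2, 6], [7, 3, 4], [6, 4, 5]])

def forward(counts, A, pi, rates):
    """Forward recursion; returns filtered state probabilities per time bin."""
    T, n_states = len(counts), len(pi)
    # log P(counts_t | state k), neurons independent given the state
    log_emis = np.array([poisson.logpmf(counts, rates[k]).sum(axis=1)
                         for k in range(n_states)]).T
    alpha = np.zeros((T, n_states))
    alpha[0] = pi * np.exp(log_emis[0])
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * np.exp(log_emis[t])
    # Per-step scaling, needed for long recordings, is omitted in this toy example.
    return alpha / alpha.sum(axis=1, keepdims=True)

filtered = forward(counts, A, pi, rates)
print("most likely hidden state per bin:", filtered.argmax(axis=1))
```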
Abstract:
One of the fundamental regulatory aspects of the Brazilian real estate market is the set of limits for obtaining financing through the Sistema Financeiro de Habitação (Housing Finance System). These limits can be set so as to increase or reduce the supply of credit in this market, changing the behavior of its agents and, with that, the market price of properties. In this work, we propose a price-formation model for the Brazilian real estate market based on the behavior of the agents that compose it. Seller agents behave heterogeneously and are influenced by historical demand, while buyer agents have their behavior determined by the availability of credit. This credit availability, in turn, is defined by the limits for granting financing in the Sistema Financeiro de Habitação. We verify that the Markov process describing the market price converges to a deterministic dynamical system as the number of agents grows, and we analyze the behavior of this dynamical system. We identify the family of random variables representing the behavior of the seller agents under which the system exhibits a nontrivial equilibrium price consistent with reality. We further verify that the equilibrium price depends not only on the financing rules of the Sistema Financeiro de Habitação, but also on the buyers' reservation price and on the sellers' memory and sensitivity to changes in demand. The sellers' memory and sensitivity can lead to price oscillations above or below the equilibrium price (typical of bubble-formation processes), or even to a Neimark-Sacker bifurcation, in which case the system exhibits stable oscillatory dynamics.
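The sketch below is a purely hypothetical toy, not the model developed in this work: a single aggregate price is updated by sellers with an exponentially weighted memory of past demand, while buyers purchase only when the price does not exceed a credit limit set by financing rules. It merely illustrates how such agent behavior yields a Markov price process that settles near a deterministic equilibrium as the number of agents grows; every parameter and functional form is an assumption.

```python
# Purely hypothetical toy dynamics (not the thesis's model): sellers adjust the
# price using a memory of past demand; buyers buy only if the price is within a
# credit limit set by the financing rules. All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)

credit_limit = 100.0   # maximum financeable price (regulatory limit, assumed)
reserve_price = 80.0   # buyers' reservation price (assumed)
memory = 0.8           # sellers' memory of past demand (0 = none, 1 = long)
sensitivity = 5.0      # price response to demand imbalance
n_agents = 5_000

price, demand_memory = 90.0, 0.5
for t in range(200):
    # Fraction of buyers willing and able to buy at the current price.
    willingness = np.clip((credit_limit - price) / (credit_limit - reserve_price), 0.0, 1.0)
    demand = rng.binomial(n_agents, willingness) / n_agents   # stochastic for finite n_agents
    demand_memory = memory * demand_memory + (1 - memory) * demand
    # Sellers raise the price when remembered demand exceeds 1/2, lower it otherwise.
    price += sensitivity * (demand_memory - 0.5)

print(f"price after 200 steps: {price:.2f}")
```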
Abstract:
The purpose of this work was to develop a process-simulation procedure reproducing the extractive distillation stage of a unit that extracts butadiene from a hydrocarbon stream in the four-carbon (C4) range, through the addition of the solvent n-methyl-2-pyrrolidone (NMP). The results obtained were compared with and validated against process data from an industrial butadiene extraction unit. A deeper grasp of the separation process, gained through a process-simulator tool capable of predicting operating conditions, allowed capacity-increase evaluations. The capacity of the internals of the equipment involved in the separation could be assessed, and the initial bottleneck of the unit could be identified. The proposed procedure also reduces the uncertainty in identifying new bottlenecks arising from a new configuration of the internals identified as inefficient as the processed load increases.
Abstract:
Society, as we know it today, depends completely on computer networks, the Internet, and distributed systems, which place at our disposal the services we need to perform our daily tasks. Moreover, often without users being aware of it, all of these services and distributed systems require network management systems. These systems allow us, in general, to maintain, manage, configure, scale, adapt, modify, edit, protect, or improve the main distributed systems. Their role is secondary, unknown to users, and transparent to them: they provide the support necessary to keep running the distributed systems whose services we use every day. If network management systems are not considered during the development of the main distributed systems, the consequences can be serious, up to total failure of the development effort. It is therefore necessary to consider system management within the design of distributed systems and to systematize its conception, so as to minimize the impact of network management on distributed-system projects. In this paper, we present a method for formalizing the conceptual modelling involved in the design of a network management system through the use of formal modelling tools, allowing the processes, and those responsible for them, to be identified from the process definitions. Finally, we propose a use case in which the conceptual model of a network intrusion detection system is designed.
Abstract:
Currently, higher education institutions, including the Escola Superior de Desporto de Rio Maior of the Instituto Politécnico de Santarém, face several questions and challenges related to their accreditation and that of their study cycles and, consequently, to improving the quality of their performance and their access to funding. This reality demands new approaches and a higher level of commitment from all those who contribute to the quality of the service provided. To respond to these challenges, the Gabinete de Avaliação e Qualidade (Evaluation and Quality Office) has developed initiatives and approaches of which the present work is an example. Starting from a Business Process Management approach, this work aimed to demonstrate the feasibility and operability of using a Business Process Management System tool in this context. To that end, the evaluation and accreditation process carried out by the Agência de Avaliação e Acreditação do Ensino Superior was modelled using Business Process Model and Notation. This proposal made it possible to model the institution's processes, demonstrating the use of a Business Process Management approach in an organization of this kind, with the goal of promoting its improvement.
Abstract:
Enterprise systems interoperability (ESI) is currently an important topic for business. This is evidenced, at least in part, by the number and extent of potential candidate protocols for such process interoperation, viz., ebXML, BPML, BPEL, and WSCI. Wide-ranging support for each of these candidate standards already exists. However, despite broad acceptance, a sound theoretical evaluation of these approaches has not yet been provided. We use the Bunge-Wand-Weber (BWW) models, in particular the representation model, to provide the basis for such a theoretical evaluation. We, and other researchers, have shown the usefulness of the representation model for analyzing, evaluating, and engineering techniques in the areas of traditional and structured systems analysis, object-oriented modeling, and process modeling. In this work, we address the question: what are the potential semantic weaknesses of using ebXML alone for process interoperation between enterprise systems? We find that users will lack important implementation information because of representational deficiencies; that ontological redundancy unnecessarily increases the complexity of the specification; and that users of the specification will have to bring in extra-model knowledge to understand constructs in the specification, owing to instances of ontological excess.
Abstract:
Consider a haploid population and, within its genome, a gene whose presence is vital for the survival of any individual. Each copy of this gene is subject to mutations which destroy its function. Suppose one member of the population somehow acquires a duplicate copy of the gene, where the duplicate is fully linked to the original gene's locus. Preservation is said to occur if eventually the entire population consists of individuals descended from this one which initially carried the duplicate. The system is modelled by a finite state-space Markov process which in turn is approximated by a diffusion process, whence an explicit expression for the probability of preservation is derived. The event of preservation can be compared to the fixation of a selectively neutral gene variant initially present in a single individual, the probability of which is the reciprocal of the population size. For very weak mutation, this and the probability of preservation are equal, while as mutation becomes stronger, the preservation probability tends to double this reciprocal. This is in excellent agreement with simulation studies.
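As a hedged illustration of the benchmark the abstract compares against, the sketch below uses a neutral Wright-Fisher simulation to estimate the fixation probability of a variant initially carried by a single individual in a haploid population of size N, which theory puts at 1/N. The duplicate-gene preservation model itself is not implemented; the population size and replicate count are assumptions.

```python
# Minimal sketch (benchmark only, not the duplicate-gene model): a neutral
# Wright-Fisher simulation estimating the probability that a variant initially
# carried by one individual fixes in a haploid population of size N (theory: 1/N).
import numpy as np

rng = np.random.default_rng(2)

def fixation_probability(N=100, replicates=20_000):
    fixed = 0
    for _ in range(replicates):
        count = 1                                # one initial carrier
        while 0 < count < N:
            count = rng.binomial(N, count / N)   # binomial resampling each generation
        fixed += (count == N)
    return fixed / replicates

N = 100
print(f"simulated fixation probability: {fixation_probability(N):.4f} "
      f"(theory 1/N = {1/N:.4f})")
```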
Abstract:
Biologists are increasingly conscious of the critical role that noise plays in cellular functions such as genetic regulation, often in connection with fluctuations in small numbers of key regulatory molecules. This has inspired the development of models that capture this fundamentally discrete and stochastic nature of cellular biology, most notably the Gillespie stochastic simulation algorithm (SSA). The SSA simulates a temporally homogeneous, discrete-state, continuous-time Markov process, and of course the corresponding probabilities and numbers of each molecular species must all remain positive. While accurately serving this purpose, the SSA can be computationally inefficient due to very small time steps, so faster approximations such as the Poisson and Binomial τ-leap methods have been suggested. This work places these leap methods in the context of numerical methods for the solution of stochastic differential equations (SDEs) driven by Poisson noise. This allows analogues of Euler-Maruyama, Milstein and even higher-order methods to be developed through Itô-Taylor expansions, as well as similar derivative-free Runge-Kutta approaches. Numerical results demonstrate that these novel methods compare favourably with existing techniques for simulating biochemical reactions, capturing crucial properties such as the mean and variance more accurately.
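The sketch below gives a minimal, hedged implementation of the exact Gillespie SSA and of a fixed-step Poisson τ-leap approximation for a toy first-order decay reaction X → ∅ with rate constant k. The reaction, rate, initial copy number, and leap size are assumptions chosen for illustration; the Itô-Taylor-based higher-order schemes of the paper are not reproduced.

```python
# Minimal sketch: exact Gillespie SSA and a fixed-step Poisson tau-leap for a
# toy decay reaction X -> 0 with rate constant k (all values assumed).
import numpy as np

rng = np.random.default_rng(3)
k, x0, t_end = 0.1, 1000, 30.0

def ssa(x0, k, t_end):
    t, x = 0.0, x0
    while x > 0:
        a = k * x                       # propensity of the decay reaction
        t += rng.exponential(1.0 / a)   # exponential waiting time to the next event
        if t > t_end:
            break
        x -= 1
    return x

def tau_leap(x0, k, t_end, tau=0.5):
    t, x = 0.0, x0
    while t < t_end and x > 0:
        n_events = rng.poisson(k * x * tau)   # Poisson number of firings in [t, t + tau)
        x = max(x - n_events, 0)              # keep the copy number non-negative
        t += tau
    return x

ssa_final = np.mean([ssa(x0, k, t_end) for _ in range(500)])
leap_final = np.mean([tau_leap(x0, k, t_end) for _ in range(500)])
print(f"mean X(t_end): SSA {ssa_final:.1f}, tau-leap {leap_final:.1f}, "
      f"deterministic {x0 * np.exp(-k * t_end):.1f}")
```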
Abstract:
In this paper, we present a top-down approach for integrated process modelling and distributed process execution. The integrated process model can be used for global monitoring and visualization, and the distributed process models for local execution. Our main focus in this paper is presenting how the approach supports automatic generation and linking of distributed process models from an integrated process definition.
Abstract:
This paper aims at the development of procedures and algorithms for applying artificial intelligence tools to acquire, process, and analyze various types of knowledge. The proposed environment integrates knowledge- and decision-process modeling techniques such as neural networks and fuzzy logic-based reasoning methods. The problem of identifying complex processes with the use of neuro-fuzzy systems is solved. The proposed classifier has been successfully applied to build a decision support system for solving a managerial problem.
Abstract:
Sustainable development support, balanced scorecard development, and business process modeling are viewed from the standpoint of systemology. Extensional, intentional, and potential properties of a system are considered necessary to satisfy the functional requirements of a meta-system. The correspondence between the extensional, intentional, and potential properties of a system and its sustainable, unsustainable, crisis, and catastrophic states is determined. The cause of the inaccessibility of the system mission is uncovered. The correspondence between the extensional, intentional, and potential properties of a system and the balanced scorecard perspectives is shown. The IDEF0 function modeling method is checked against the balanced scorecard perspectives. The correspondence between balanced scorecard perspectives and IDEF0 notations is considered.
Abstract:
Existing approaches to quality estimation of e-learning systems are analyzed. A “layered” approach to quality estimation of e-learning systems, enhanced with learning process modeling and simulation, is presented. A method of quality estimation using learning process modeling, together with quality criteria, is suggested. The learning process model, based on an extended colored stochastic Petri net, is described. The method has been implemented in “QuAdS”, an automated system for quality estimation of e-learning systems. Results of the validation of the developed method and quality criteria are shown. We argue that using learning process modeling for quality estimation makes it easier for an expert to identify the shortcomings of an e-learning system.
Abstract:
ACM Computing Classification System (1998): D.0, D.2.11.