939 results for Markov process modeling


Relevance: 90.00%

Publisher:

Abstract:

Existing studies of on-line process control are concerned with economic aspects, and the parameters of the processes are optimized with respect to the average cost per item produced. However, an equally important dimension is the adoption of an efficient maintenance policy. In most cases, only the frequency of the corrective adjustment is evaluated, because it is assumed that the equipment becomes "as good as new" after corrective maintenance. For this condition to be met, a sophisticated and detailed corrective adjustment system needs to be employed. The aim of this paper is to propose an integrated economic model incorporating two dimensions: on-line process control and a corrective maintenance program. Both are optimized jointly by minimizing the average cost per item produced. Adjustments are based on where the measurement of a quality characteristic of interest falls within three decision zones. Numerical examples illustrate the proposal. (c) 2012 Elsevier B.V. All rights reserved.
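
As a rough illustration (not the paper's model), the sketch below simulates an inspected process with three decision zones and estimates the average cost per item; all limits, costs and probabilities are invented placeholders.

```python
import random

def average_cost(n_items=200_000, inspect_every=50,
                 warn_limit=2.0, action_limit=3.0,
                 c_inspect=1.0, c_adjust=50.0, c_defect=10.0,
                 shift_prob=0.001, shift_size=2.0):
    """Three decision zones for an inspected measurement x:
    |x| < warn_limit: continue; warn_limit <= |x| < action_limit:
    also inspect the next item; |x| >= action_limit: adjust the process."""
    mean, cost, follow_up = 0.0, 0.0, False
    for i in range(1, n_items + 1):
        if random.random() < shift_prob:         # process drifts out of control
            mean = shift_size
        item = random.gauss(mean, 1.0)
        if abs(item) > 3.0:                      # cost of producing a defective
            cost += c_defect
        if follow_up or i % inspect_every == 0:  # a measurement is taken
            cost += c_inspect
            follow_up = warn_limit <= abs(item) < action_limit
            if abs(item) >= action_limit:        # corrective adjustment
                cost += c_adjust
                mean = 0.0
    return cost / n_items

print(f"estimated average cost per item: {average_cost():.3f}")
```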

Relevance: 90.00%

Publisher:

Abstract:

In electronic commerce, systems development is based on two fundamental types of models: business models and process models. A business model is concerned with value exchanges among business partners, while a process model focuses on operational and procedural aspects of business communication. Thus, a business model defines the what of an e-commerce system, while a process model defines the how. Business process design can be facilitated and improved by a method for systematically moving from a business model to a process model. Such a method would provide support for traceability, evaluation of design alternatives, and a seamless transition from analysis to realization. This work proposes a unified framework that can be used as a basis to analyze, interpret and understand the different concepts associated with different stages of e-commerce system development. In this thesis, we illustrate how UN/CEFACT's recommended metamodels for business and process design can be analyzed, extended and then integrated into final solutions based on the proposed unified framework. As an application of the framework, we also demonstrate how process-modeling tasks can be facilitated in e-commerce system design. The proposed methodology, BP3, stands for Business Process Patterns Perspective. BP3 uses a question-answer interface to capture different business requirements from designers. It is based on predefined process patterns, and the final solution is generated by applying the captured business requirements through a set of production rules that complete the inter-process communication among these patterns.
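
To make the question-answer-plus-production-rules mechanism concrete, here is a toy sketch in which answers to design questions fire rules that select predefined process patterns; the questions, rules and pattern names are invented for illustration and are not BP3's actual catalogue.

```python
answers = {"payment_before_delivery": True, "needs_notification": True}

RULES = [  # (condition on the captured answers, process pattern to include)
    (lambda a: a.get("payment_before_delivery"), "prepayment_pattern"),
    (lambda a: not a.get("payment_before_delivery"), "invoice_pattern"),
    (lambda a: a.get("needs_notification"), "notification_pattern"),
]

selected = [pattern for condition, pattern in RULES if condition(answers)]
print("selected process patterns:", selected)
```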

Relevance: 90.00%

Publisher:

Abstract:

In this thesis we consider systems of finitely many particles moving on paths given by a strong Markov process and undergoing branching and reproduction at random times. The branching rate of a particle, its number of offspring and their spatial distribution are allowed to depend on the particle's position and possibly on the configuration of coexisting particles. In addition there is immigration of new particles, with the rate of immigration and the distribution of immigrants possibly depending on the configuration of pre-existing particles as well. In the first two chapters of this work, we concentrate on the case that the joint motion of particles is governed by a diffusion with interacting components. The resulting process of particle configurations was studied by E. Löcherbach (2002, 2004) and is known as a branching diffusion with immigration (BDI). Chapter 1 contains a detailed introduction of the basic model assumptions, in particular an assumption of ergodicity which guarantees that the BDI process is positive Harris recurrent with finite invariant measure on the configuration space. This object and a closely related quantity, namely the invariant occupation measure on the single-particle space, are investigated in Chapter 2 where we study the problem of the existence of Lebesgue-densities with nice regularity properties. For example, it turns out that the existence of a continuous density for the invariant measure depends on the mechanism by which newborn particles are distributed in space, namely whether branching particles reproduce at their death position or their offspring are distributed according to an absolutely continuous transition kernel. In Chapter 3, we assume that the quantities defining the model depend only on the spatial position but not on the configuration of coexisting particles. In this framework (which was considered by Höpfner and Löcherbach (2005) in the special case that branching particles reproduce at their death position), the particle motions are independent, and we can allow for more general Markov processes instead of diffusions. The resulting configuration process is a branching Markov process in the sense introduced by Ikeda, Nagasawa and Watanabe (1968), complemented by an immigration mechanism. Generalizing results obtained by Höpfner and Löcherbach (2005), we give sufficient conditions for ergodicity in the sense of positive recurrence of the configuration process and finiteness of the invariant occupation measure in the case of general particle motions and offspring distributions.
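
A crude discrete-time sketch of such a particle system is given below, under strong simplifications (Gaussian motion steps, a position-dependent branching rate, offspring scattered near the parent, a subcritical mean offspring number, and immigration at the origin); it illustrates the objects involved rather than the thesis's construction.

```python
import random

def step(particles, dt=0.01, imm_rate=1.0):
    """One time step of a toy branching particle system with immigration."""
    new = []
    for x in particles:
        x += random.gauss(0.0, dt ** 0.5)           # diffusive motion
        branch_rate = 1.0 / (1.0 + x * x)           # depends on the position
        if random.random() < branch_rate * dt:      # the particle branches
            # mean offspring number 0.9 < 1 keeps the system from exploding
            k = random.choices([0, 1, 2], weights=[0.4, 0.3, 0.3])[0]
            new.extend(x + random.gauss(0.0, 0.1) for _ in range(k))
        else:
            new.append(x)
    if random.random() < imm_rate * dt:             # immigration at the origin
        new.append(0.0)
    return new

particles = [0.0]
for _ in range(10_000):
    particles = step(particles)
print(len(particles), "particles in the final configuration")
```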

Relevance: 90.00%

Publisher:

Abstract:

This thesis is situated within the study of stochastic models applied to DNA sequences. Random walks and Markov chains are among the random processes that have found the widest application, thanks to their ability to capture the salient features of many complex systems while keeping their description simple. Specifically, the discussion concentrates on their application in the statistical analysis of genomic sequences. To a first approximation, DNA can be represented as a sequence of nucleotides, which is well reproduced by a Markov chain model; this is the starting point for studying the statistical properties of DNA chains. The discussion is developed further by analyzing a study that sets out to characterize DNA sequences through the distributions of inter-dinucleotide distances. Its results are discussed in order to show the potential of these models to reveal relevant features in other fields, in this case biology.
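
A minimal sketch of the two ingredients mentioned above, run on a randomly generated stand-in sequence rather than genomic data: fitting a first-order Markov chain to a nucleotide sequence and collecting the inter-dinucleotide distances, here for the dinucleotide CG.

```python
import random
from collections import Counter, defaultdict

random.seed(0)
seq = "".join(random.choice("ACGT") for _ in range(100_000))  # stand-in for DNA

# Maximum-likelihood transition matrix: P[a][b] = count(ab) / count(a followed
# by anything)
pair_counts = Counter(zip(seq, seq[1:]))
P = defaultdict(dict)
for a in "ACGT":
    row_total = sum(pair_counts[(a, b)] for b in "ACGT")
    for b in "ACGT":
        P[a][b] = pair_counts[(a, b)] / row_total

# Empirical distribution of distances between successive "CG" occurrences
positions = [i for i in range(len(seq) - 1) if seq[i:i + 2] == "CG"]
distances = Counter(j - i for i, j in zip(positions, positions[1:]))

print("transition probabilities from C:",
      {b: round(P["C"][b], 3) for b in "ACGT"})
print("most common CG-to-CG distances:", distances.most_common(5))
```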

Relevance: 90.00%

Publisher:

Abstract:

The aim of our study was to develop a modeling framework suitable to quantify the incidence, absolute number and economic impact of osteoporosis-attributable hip, vertebral and distal forearm fractures, with a particular focus on change over time, and with application to the situation in Switzerland from 2000 to 2020. A Markov process model was developed and analyzed by Monte Carlo simulation. A demographic scenario provided by the Swiss Federal Statistical Office and various Swiss and international data sources were used as model inputs. Demographic and epidemiologic input parameters were reproduced correctly, confirming the internal validity of the model. The proportion of the Swiss population aged 50 years or over will rise from 33.3% in 2000 to 41.3% in 2020. At the total population level, osteoporosis-attributable incidence will rise from 1.16 to 1.54 per 1,000 person-years in the case of hip fracture, from 3.28 to 4.18 per 1,000 person-years in the case of radiographic vertebral fracture, and from 0.59 to 0.70 per 1,000 person-years in the case of distal forearm fracture. Osteoporosis-attributable hip fracture numbers will rise from 8,375 to 11,353, vertebral fracture numbers will rise from 23,584 to 30,883, and distal forearm fracture numbers will rise from 4,209 to 5,186. Population-level osteoporosis-related direct medical inpatient costs per year will rise from 713.4 million Swiss francs (CHF) to CHF946.2 million. These figures correspond to 1.6% and 2.2% of Swiss health care expenditures in 2000. The modeling framework described can be applied to a wide variety of settings. It can be used to assess the impact of new prevention, diagnostic and treatment strategies. In Switzerland incidences of osteoporotic hip, vertebral and distal forearm fracture will rise by 33%, 27%, and 19%, respectively, between 2000 and 2020, if current prevention and treatment patterns are maintained. Corresponding absolute fracture numbers will rise by 36%, 31%, and 23%. Related direct medical inpatient costs are predicted to increase by 33%; however, this estimate is subject to uncertainty due to limited availability of input data.
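
A stripped-down Monte Carlo sketch of this kind of cohort projection is shown below; the hazard function, cohort structure and cost figure are invented placeholders, not the Swiss model inputs.

```python
import random

def project(cohort=100_000, years=20, base_hazard=1e-4,
            cost_per_fracture=30_000):
    """Project yearly fracture counts and inpatient costs for an ageing cohort."""
    ages = [random.randint(50, 90) for _ in range(cohort)]
    fractures, costs = [0] * years, [0.0] * years
    for t in range(years):
        for i, age in enumerate(ages):
            # hypothetical age-specific hazard that doubles every ten years
            hazard = base_hazard * 2 ** ((age - 50) / 10)
            if random.random() < hazard:
                fractures[t] += 1
                costs[t] += cost_per_fracture
            ages[i] = age + 1
        # a full model would also apply mortality and a demographic scenario
    return fractures, costs

fractures, costs = project()
print(f"incidence per 1,000 person-years: "
      f"year 1 {1000 * fractures[0] / 100_000:.2f}, "
      f"year 20 {1000 * fractures[-1] / 100_000:.2f}")
print(f"year-20 inpatient costs: CHF {costs[-1] / 1e6:.1f} million")
```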

Relevance: 90.00%

Publisher:

Abstract:

With a steady increase of regulatory requirements for business processes, automation support for compliance management is a field garnering increasing attention in Information Systems research. Several approaches have been developed to support compliance checking of process models. One major challenge for such approaches is their ability to handle different modeling techniques and compliance rules, which is a prerequisite for widespread adoption and application. Applying a structured literature search strategy, we review and discuss compliance-checking approaches in order to provide insight into their generalizability and evaluation. The results imply that current approaches mainly focus on specific modeling techniques and/or a restricted set of compliance rule types. Most approaches abstain from real-world evaluation, which raises the question of their practical applicability. Based on the search results, we propose a roadmap for further research in model-based business process compliance checking.
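
To make the notion of a compliance rule concrete, here is a tiny trace-based check of one common rule type (activity B must be preceded by activity A); the rule and the traces are invented examples and do not come from any surveyed approach.

```python
def precedes(trace, a, b):
    """Compliance rule: every occurrence of b must be preceded by some a."""
    seen_a = False
    for activity in trace:
        if activity == a:
            seen_a = True
        elif activity == b and not seen_a:
            return False
    return True

traces = [["receive_order", "check_credit", "ship_goods"],
          ["receive_order", "ship_goods"]]   # violates the rule below
for trace in traces:
    print(trace, "->", precedes(trace, "check_credit", "ship_goods"))
```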

Relevance: 90.00%

Publisher:

Abstract:

In this paper, we extend the debate concerning Credit Default Swap valuation to include time-varying correlations and covariances. Traditional multivariate techniques treat the correlations between covariates as constant over time; however, this view is not supported by the data. Moreover, since financial data do not follow a normal distribution because of their heavy tails, modeling the data using a Generalized Linear Model (GLM) incorporating copulas emerges as a more robust technique than traditional approaches. This paper also includes an empirical analysis of the regime-switching dynamics of credit risk in the presence of liquidity, following the general practice of assuming that credit and market risk follow a Markov process. The study was based on Credit Default Swap data obtained from Bloomberg spanning the period January 1, 2004 to August 8, 2006. The empirical examination of the regime-switching tendencies provided quantitative support for the anecdotal view that liquidity decreases as credit quality deteriorates. The analysis also examined the joint probability distribution of the credit risk determinants across credit quality using a copula function, which disaggregates the behavior embedded in the marginal gamma distributions so as to isolate the level of dependence captured in the copula. The results suggest that the time-varying joint correlation matrix performed far better than the constant correlation matrix, the centerpiece of linear regression models.
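
A minimal sketch of two building blocks the abstract mentions, a two-state Markov regime-switching process and a Gaussian copula linking two marginals, is given below; all parameters are illustrative rather than estimates from the CDS data.

```python
import math
import random

TRANSITION = {0: {0: 0.95, 1: 0.05},   # state 0: high credit quality, liquid
              1: {0: 0.10, 1: 0.90}}   # state 1: low credit quality, illiquid

def next_state(state):
    return 0 if random.random() < TRANSITION[state][0] else 1

def gaussian_copula_pair(rho):
    """Two uniforms whose dependence comes from correlated standard normals;
    a full model would map them through inverse gamma CDFs for the marginals."""
    z1 = random.gauss(0.0, 1.0)
    z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * random.gauss(0.0, 1.0)
    cdf = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return cdf(z1), cdf(z2)

state, illiquid_days = 0, 0
for _ in range(250):                    # roughly one trading year
    state = next_state(state)
    illiquid_days += state
    rho = 0.8 if state else 0.3         # dependence rises in the bad regime
    u1, u2 = gaussian_copula_pair(rho)
print("days spent in the illiquid regime:", illiquid_days)
```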

Relevance: 90.00%

Publisher:

Abstract:

In recent decades, neuropsychological theories have tended to consider cognitive functions as the result of the workings of the whole brain rather than of individual local areas of its cortex. Studies based on neuroimaging techniques have multiplied in recent years, promoting exponential growth of the body of knowledge about the relations between cognitive functions and brain structures [1]. However, such rapid evolution makes it complicated to integrate these findings into verifiable theories and, even more so, to translate them into cognitive rehabilitation. The aim of this research work is to develop a cognitive process-modeling tool. The purpose of this system is, first, to represent multidimensional data from structural and functional connectivity, neuroimaging, lesion studies and clinical interventions [2][3]. This will make it possible to identify consolidated knowledge, hypotheses, experimental designs, new data from ongoing studies and emerging results from clinical interventions. Second, we aim to use Artificial Intelligence to assist decision making, allowing progress towards evidence-based and personalized treatments in cognitive rehabilitation. This work presents the knowledge base design of the knowledge representation tool. It is composed of two different taxonomies (structure and function) and a set of tags linking both taxonomies at different levels of structural and functional organization. The remainder of the abstract is organized as follows: Section 2 presents the web application used for gathering the information needed to generate the knowledge base, Section 3 describes the knowledge base structure, and Section 4 presents the conclusions reached.
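
A minimal data-structure sketch of the described knowledge base, two taxonomies plus cross-linking tags, might look as follows; the node names and evidence labels are invented placeholders, not the tool's actual ontology.

```python
from dataclasses import dataclass, field

@dataclass
class Taxonomy:
    """A taxonomy stored as a child -> parent map."""
    name: str
    parent: dict = field(default_factory=dict)

    def add(self, node, parent=None):
        self.parent[node] = parent

structure = Taxonomy("structure")
structure.add("brain")
structure.add("frontal_lobe", parent="brain")

function = Taxonomy("function")
function.add("cognition")
function.add("working_memory", parent="cognition")

# Tags link a structure node to a function node, with an evidence label.
tags = [("frontal_lobe", "working_memory", "functional neuroimaging")]
for struct_node, func_node, evidence in tags:
    print(f"{struct_node} <-> {func_node}  [{evidence}]")
```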

Relevance: 90.00%

Publisher:

Abstract:

Workflow systems have traditionally focused on so-called production processes, which are characterized by pre-definition, high volume, and repetitiveness. Recently, the deployment of workflow systems in non-traditional domains such as collaborative applications, e-learning and cross-organizational process integration has put forth new requirements for flexible and dynamic specification. However, this flexibility cannot be offered at the expense of control, a critical requirement of business processes. In this paper, we present a foundation set of constraints for flexible workflow specification. These constraints are intended to provide an appropriate balance between flexibility and control. The constraint specification framework is based on the concept of pockets of flexibility, which allows ad hoc changes and/or building of workflows for highly flexible processes. Basically, our approach is to provide the ability to execute on the basis of a partially specified model, where the full specification of the model is made at runtime and may be unique to each instance. The verification of dynamically built models is essential. Whereas ensuring that the model conforms to specified constraints does not pose great difficulty, ensuring that the constraint set itself is free of conflicts and redundancy is an interesting and challenging problem. In this paper, we discuss both the static and dynamic verification aspects. We also briefly present Chameleon, a prototype workflow engine that implements these concepts. (c) 2004 Elsevier Ltd. All rights reserved.
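
The flavor of the constraint-set verification problem can be sketched as follows: treating ordering constraints as a directed graph, a conflict shows up as a cycle and a redundancy as an edge already implied by the others. This is an illustration of the problem, not the paper's verification algorithm.

```python
def reachable(graph, src, dst):
    """Is dst reachable from src by following constraint edges?"""
    stack, seen = [src], set()
    while stack:
        node = stack.pop()
        if node == dst:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(graph.get(node, ()))
    return False

def verify(constraints):
    """Flag constraints that sit on a cycle (conflict) or are implied
    by the remaining constraints (redundancy)."""
    graph = {}
    for a, b in constraints:
        graph.setdefault(a, set()).add(b)
    conflicts = [(a, b) for a, b in constraints if reachable(graph, b, a)]
    redundant = []
    for a, b in constraints:
        graph[a].discard(b)             # is a->b still implied without it?
        if reachable(graph, a, b):
            redundant.append((a, b))
        graph[a].add(b)
    return conflicts, redundant

rules = [("approve", "ship"), ("pack", "ship"), ("approve", "pack"),
         ("ship", "approve")]           # the last edge closes a cycle
print(verify(rules))
```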

Relevance: 90.00%

Publisher:

Abstract:

Current initiatives in the field of Business Process Management (BPM) strive for the development of a BPM standard notation by pushing the Business Process Modeling Notation (BPMN). However, such a proposed standard notation needs to be carefully examined. Ontological analysis is an established theoretical approach to evaluating modelling techniques. This paper reports on the outcomes of an ontological analysis of BPMN and explores identified issues by reporting on interviews conducted with BPMN users in Australia. Complementing this analysis we consolidate our findings with previous ontological analyses of process modelling notations to deliver a comprehensive assessment of BPMN.

Relevance: 90.00%

Publisher:

Abstract:

2000 Mathematics Subject Classification: 60K15, 60K20, 60G20, 60J75, 60J80, 60J85, 60-08, 90B15.

Relevance: 90.00%

Publisher:

Abstract:

Rapid advances in electronic communication devices and technologies have resulted in a shift in the way communication applications are being developed. These new development strategies provide abstract views of the underlying communication technologies and lead to the so-called user-centric communication applications. One user-centric communication (UCC) initiative is the Communication Virtual Machine (CVM) technology, which uses the Communication Modeling Language (CML) for modeling communication services and the CVM for realizing these services. In communication-intensive domains such as telemedicine and disaster management, there is an increasing need for user-centric communication applications that are domain-specific and that support the dynamic coordination of communication services commonly found in collaborative communication scenarios. However, UCC approaches like the CVM offer little support for the dynamic coordination of communication services resulting from inherent dependencies between individual steps of a collaboration task. Users either have to coordinate communication services manually, or rely on a process modeling technique to build customized solutions for services in a specific domain, which are usually costly, rigidly defined and technology-specific. This dissertation proposes a domain-specific modeling approach to address this problem by extending the CVM technology with communication-specific abstractions of workflow concepts commonly found in business processes. The extension involves (1) the definition of the Workflow Communication Modeling Language (WF-CML), a superset of CML, and (2) the extension of the functionality of CVM to process communication-specific workflows. The definition of WF-CML includes the meta-model and the dynamic semantics for control constructs and concurrency. We also extended the CVM prototype to handle the modeling and realization of WF-CML models. A comparative study of the proposed approach with other workflow environments validates the claimed benefits of WF-CML and CVM.
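
A hypothetical sketch of coordinating communication services with workflow control constructs is shown below; the service names are invented and the constructs (sequencing via await, concurrency via gather) stand in for, but are not, WF-CML syntax.

```python
import asyncio

async def service(name, seconds):
    """Stand-in for invoking a real communication service."""
    print(f"start {name}")
    await asyncio.sleep(seconds)
    print(f"done  {name}")

async def main():
    # sequence: the session must be established before anything else
    await service("open_audio_session", 0.1)
    # concurrency: dependent steps of the collaboration run in parallel
    await asyncio.gather(service("share_patient_record", 0.2),
                         service("start_video", 0.2))
    # sequence: tear down last
    await service("close_session", 0.1)

asyncio.run(main())
```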

Relevance: 90.00%

Publisher:

Abstract:

The goal was to understand, document and model how information currently flows internally in the largest dairy organization in Finland. The organization has undergone radical changes in past years due to the economic sanctions between the European Union and Russia. The organization's ultimate goal is therefore to continue its growth by managing its sales process more efficiently. The thesis consists of a literature review and an empirical part. The literature review covers knowledge management and process modeling theories. First, the knowledge management part discusses how data, information and knowledge are exchanged in the process, and how knowledge management models and processes describe the way knowledge is created, exchanged and managed in an organization. Second, the process modeling part visualizes information flow through a discussion of modeling approaches and a presentation of different methods and techniques. Finally, the process documentation procedure is presented. A constructive research approach was used to identify the process's problems and bottlenecks, and possible solutions were presented on this basis. The empirical part of the study is based on 37 interviews, the organization's internal data sources and the theoretical framework. The acquired data and information were used to document and model the sales process in question with a flowchart diagram. Results were derived from the construction of the flowchart diagram and the analysis of the documentation; the answers to the research questions draw on both the empirical and theoretical parts. In total, 14 problems and two bottlenecks were identified in the process. The most important problems concern the lack of a standardized approach to information sharing, insufficient utilization of information technology tools, and the lack of systematic documentation. The bottlenecks are caused by the alarming number of changes made to files after their deadlines.

Relevance: 80.00%

Publisher:

Abstract:

We study a general stochastic rumour model in which an ignorant individual has a certain probability of becoming a stifler immediately upon hearing the rumour. We refer to this special kind of stifler as an uninterested individual. Our model also includes distinct rates for meetings between two spreaders in which both become stiflers or only one does, so that the classical Daley-Kendall and Maki-Thompson models are particular cases. We prove a Law of Large Numbers and a Central Limit Theorem for the proportions of those who ultimately remain ignorant and those who have heard the rumour but become uninterested in it.
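
A simulation sketch of the model is given below, simplified to the classical case in which a spreader-spreader or spreader-stifler meeting turns only the initiating spreader into a stifler; theta is our name for the probability of becoming uninterested.

```python
import random

def final_ignorant_fraction(n=10_000, theta=0.2):
    """Maki-Thompson-style dynamics: a spreader meets a uniformly chosen
    other individual; an ignorant hearer becomes an uninterested stifler
    with probability theta, otherwise a spreader; meeting a spreader or
    stifler turns the initiating spreader into a stifler."""
    ignorant, spreaders, stiflers = n - 1, 1, 0
    while spreaders > 0:
        if random.randrange(n - 1) < ignorant:   # spreader meets an ignorant
            ignorant -= 1
            if random.random() < theta:
                stiflers += 1                    # uninterested individual
            else:
                spreaders += 1
        else:                                    # meets a spreader or stifler
            spreaders -= 1
            stiflers += 1
    return ignorant / n

samples = [final_ignorant_fraction() for _ in range(20)]
print("mean final proportion of ignorants:", sum(samples) / len(samples))
```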

Relevance: 80.00%

Publisher:

Abstract:

The chemorheology (and thus process modeling) of the highly filled thermosets used in integrated circuit (IC) packaging is complicated by their high filler loading, fast curing kinetics, and viscoelastic nature. This article summarizes a more thorough chemorheological analysis of a typical IC packaging thermoset material, including novel isothermal and nonisothermal multiwave parallel-plate chemorheology. This new chemorheological analysis may be used to optimize existing IC packaging processes and to design new ones. (C) 1997 John Wiley & Sons, Inc.
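
The article's fitted model is not reproduced here, but a widely used chemorheological form, a Castro-Macosko-type cure-dependent viscosity with an Arrhenius prefactor, illustrates the kind of relation such an analysis estimates; all constants below are illustrative.

```python
import math

def viscosity(T, alpha, A_eta=1e-7, E_eta=5e4, R=8.314,
              alpha_gel=0.7, c1=1.5, c2=1.0):
    """Viscosity (Pa.s) as a function of temperature T (K) and cure
    conversion alpha; diverges as alpha approaches the gel point."""
    eta0 = A_eta * math.exp(E_eta / (R * T))            # Arrhenius part
    return eta0 * (alpha_gel / (alpha_gel - alpha)) ** (c1 + c2 * alpha)

for alpha in (0.0, 0.3, 0.6):
    print(f"alpha={alpha:.1f}: eta={viscosity(448.15, alpha):.2f} Pa.s")
```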