919 results for Bayesian Markov process
Abstract:
Crisis holds the potential for profound change in organizations and industries. The past 50 years of crisis management highlight key shifts in crisis practice, creating opportunities for multiple theories and research tracks. Defining crises such as the Tylenol tampering, the Exxon Valdez oil spill, and the September 11 terrorist attacks have influenced or challenged the best-practice principles of crisis communication in public relations. This study traces the development of crisis process and practice by identifying shifts in crisis research and models and mapping these against key management theories and practices. The findings define three crisis domains: crisis planning, building and testing predictive models, and mapping and measuring external environmental influences. These crisis domains mirror but lag the evolution of management theory, suggesting challenges for researchers to reshape the research agenda to close the gap and lead the next stage of development in the field of crisis communication for effective organizational outcomes.
Abstract:
Providing effective IT support for business processes has become crucial for enterprises to stay competitive. In response to this need, numerous process support paradigms (e.g., workflow management, service flow management, case handling), process specification standards (e.g., WS-BPEL, BPML, BPMN), process tools (e.g., ARIS Toolset, Tibco Staffware, FLOWer), and supporting methods have emerged in recent years. Summarized under the term “Business Process Management” (BPM), these paradigms, standards, tools, and methods have become a success-critical instrument for improving process performance.
Abstract:
Nowadays, business process management is an important approach for managing organizations from an operational perspective. As a consequence, it is common to see organizations develop collections of hundreds or even thousands of business process models. Such large collections of process models bring new challenges and provide new opportunities, as the knowledge that they encapsulate needs to be properly managed. Therefore, a variety of techniques for managing large collections of business process models are being developed. The goal of this paper is to provide an overview of the management techniques that currently exist, as well as the open research challenges that they pose.
Abstract:
The compressed gas industry and government agencies worldwide utilize "adiabatic compression" testing for qualifying high-pressure valves, regulators, and other related flow control equipment for gaseous oxygen service. This test methodology is known by various terms, the most common being adiabatic compression testing, gaseous fluid impact testing, pneumatic impact testing, and BAM testing. The test methodology will be described in greater detail throughout this document, but in summary it consists of pressurizing a test article (valve, regulator, etc.) with gaseous oxygen within 15 to 20 milliseconds (ms). Because the driven gas and the driving gas are rapidly compressed to the final test pressure at the inlet of the test article, they are rapidly heated by the sudden increase in pressure to sufficient temperatures (thermal energies) to sometimes result in ignition of the nonmetallic materials (seals and seats) used within the test article. In general, the more rapid the compression process, the more "adiabatic" the pressure surge is presumed to be and the more closely it is argued to approximate an isentropic process. Generally speaking, adiabatic compression is widely considered the most efficient ignition mechanism for directly kindling a nonmetallic material in gaseous oxygen and has been implicated in many fire investigations. Because of the ease of ignition of many nonmetallic materials by this heating mechanism, many industry standards prescribe this testing. However, the results between various laboratories conducting the testing have not always been consistent. Research into the test method indicated that the thermal profile achieved (i.e., temperature/time history of the gas) during adiabatic compression testing as required by the prevailing industry standards has not been fully modeled or empirically verified, although attempts have been made.
This research evaluated the following questions: 1) Can the rapid compression process required by the industry standards be modeled thermodynamically and fluid-dynamically so that predictions of the thermal profiles can be made? 2) Can the thermal profiles produced by the rapid compression process be measured, in order to validate the thermodynamic and fluid dynamic models and to estimate the severity of the test? 3) Can controlling parameters be recommended so that new guidelines may be established for the industry standards, resolving inconsistencies between various test laboratories conducting tests according to the present standards?
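The heating described above can be bounded with the textbook isentropic relation T2 = T1 · (P2/P1)^((γ−1)/γ). A minimal sketch of that upper-bound calculation; the initial temperature, the pressures, and the ratio of specific heats for oxygen used below are illustrative assumptions, not values taken from the standards, and real tests heat the gas less because the compression is never perfectly adiabatic:

```python
import math

def isentropic_final_temp(T1_K, P1, P2, gamma=1.40):
    """Idealized final gas temperature after adiabatic (isentropic)
    compression from pressure P1 to P2, per T2 = T1*(P2/P1)**((g-1)/g).
    gamma ~ 1.40 is an assumed value for diatomic oxygen."""
    return T1_K * (P2 / P1) ** ((gamma - 1.0) / gamma)

# Illustrative example: oxygen compressed from 0.1 MPa ambient to a
# 20.7 MPa test pressure, starting at 293 K
T2 = isentropic_final_temp(293.0, 0.1, 20.7)
```

Even this rough bound lands well above the autoignition temperatures of common nonmetallic seal materials, which is why the mechanism is considered such an efficient igniter.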
Abstract:
We describe the population pharmacokinetics of an acepromazine (ACP) metabolite (2-(1-hydroxyethyl)promazine) (HEPS) in horses for the estimation of likely detection times in plasma and urine. Acepromazine (30 mg) was administered to 12 horses, and blood and urine samples were taken at frequent intervals for chemical analysis. A Bayesian hierarchical model was fitted to describe concentration-time data and cumulative urine amounts for HEPS. The metabolite HEPS was modelled separately from the parent ACP as the half-life of the parent was considerably less than that of the metabolite. The clearance ($Cl/F_{PM}$) and volume of distribution ($V/F_{PM}$), scaled by the fraction of parent converted to metabolite, were estimated as 769 L/h and 6874 L, respectively. For a typical horse in the study, after receiving 30 mg of ACP, the upper limit of the detection time was 35 hours in plasma and 100 hours in urine, assuming an arbitrary limit of detection of 1 $\mu$g/L, and a small ($\approx 0.01$) probability of detection. The derived model allowed the probability of detection to be estimated at the population level. This analysis was conducted on data collected from only 12 horses, but we assume that this is representative of the wider population.
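As a rough cross-check of the reported point estimates, a naive mono-exponential elimination sketch (this is not the Bayesian hierarchical model itself; only the published clearance, volume, dose, and the 1 µg/L detection limit stated above are used, and the single-compartment bolus assumption is ours):

```python
import math

# Published population point estimates (apparent, scaled by the
# fraction of parent drug converted to the metabolite)
CL = 769.0          # clearance Cl/F_PM, L/h
V = 6874.0          # volume of distribution V/F_PM, L
DOSE_UG = 30_000.0  # 30 mg acepromazine dose, in micrograms
LOD = 1.0           # limit of detection, ug/L (the arbitrary value above)

k = CL / V                    # first-order elimination rate constant, 1/h
half_life = math.log(2) / k   # terminal half-life of the metabolite, h

# Naive initial concentration if the scaled dose mixed into V at once
C0 = DOSE_UG / V              # ug/L

# Time for C(t) = C0 * exp(-k * t) to decay to the detection limit
t_detect = math.log(C0 / LOD) / k   # hours
```

This gives a half-life near 6 h and a plasma detection time around 13 h for a typical horse, comfortably inside the quoted 35 h upper limit, which additionally reflects between-horse variability and the small (≈0.01) detection probability.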
Abstract:
The Texas Department of Transportation (TxDOT) is concerned about the widening gap between preservation needs and available funding. Funding levels are not adequate to meet the preservation needs of the roadway network; therefore, projects listed in the 4-Year Pavement Management Plan must be ranked to determine which projects should be funded now and which can be postponed until a later year. Currently, each district uses locally developed methods to prioritize projects. These ranking methods have relied on less formal qualitative assessments based on engineers’ subjective judgment. It is important for TxDOT to have a 4-Year Pavement Management Plan that uses a transparent, rational project ranking process. The objective of this study is to develop a conceptual framework that describes the development of the 4-Year Pavement Management Plan. It can be broadly divided into three steps: 1) a network-level project screening process, 2) a project-level project ranking process, and 3) economic analysis. A rational pavement management procedure and a project ranking method accepted by districts and the TxDOT administration will maximize efficiency in budget allocations and will potentially help improve pavement condition. As a part of the implementation of the 4-Year Pavement Management Plan, the Network-Level Project Screening (NLPS) tool, including the candidate project identification algorithm and the preliminary project ranking matrix, was developed. The NLPS has been used by the Austin District Pavement Engineer (DPE) to evaluate PMIS (Pavement Management Information System) data and to prepare a preliminary list of candidate projects for further evaluation.
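A toy sketch of the kind of weighted-score prioritization a project ranking matrix might encode; the attribute names, weights, and section identifiers below are entirely hypothetical illustrations, not TxDOT's actual criteria or PMIS fields:

```python
# Hypothetical weights: lower pavement scores and higher traffic
# exposure raise a section's priority for funding
WEIGHTS = {"distress_score": 0.5, "ride_score": 0.3, "traffic": 0.2}

def rank_candidates(sections):
    """Rank candidate sections by a weighted deficiency score
    (scores assumed on a 0-100 scale, 100 = best condition)."""
    def priority(s):
        return (WEIGHTS["distress_score"] * (100 - s["distress_score"])
                + WEIGHTS["ride_score"] * (100 - s["ride_score"])
                + WEIGHTS["traffic"] * s["traffic_percentile"])
    return sorted(sections, key=priority, reverse=True)

# Illustrative candidate list
candidates = [
    {"id": "US-183 A", "distress_score": 62, "ride_score": 70, "traffic_percentile": 90},
    {"id": "FM-969 B", "distress_score": 85, "ride_score": 88, "traffic_percentile": 40},
    {"id": "SH-71 C",  "distress_score": 55, "ride_score": 60, "traffic_percentile": 75},
]
ranked = [s["id"] for s in rank_candidates(candidates)]
```

A transparent scheme of this shape makes the trade-off between condition and traffic explicit, which is the property the study argues district-level qualitative rankings lack.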
Abstract:
A process evaluation enables understanding of critical issues that can inform the improved, ongoing implementation of an intervention program. This study describes the process evaluation of a comprehensive, multi-level injury prevention program for adolescents. The program targets change in injury associated with violence, transport and alcohol risks and incorporates two primary elements: an 8-week, teacher delivered attitude and behaviour change curriculum for Grade 8 students; and a professional development program for teachers on school level methods of protection, focusing on strategies to increase students’ connectedness to school.
Abstract:
This study examined the effect that temporal order within the entrepreneurial discovery-exploitation process has on the outcomes of venture creation. Consistent with sequential theories of discovery-exploitation, the general flow of venture creation was found to be directed from discovery toward exploitation in a random sample of nascent ventures. However, venture creation attempts which specifically follow this sequence yield poor outcomes. Moreover, simultaneous discovery-exploitation was the most prevalent temporal order observed, and venture attempts that proceed in this manner are more likely to become operational. These findings suggest that venture creation is a multi-scale phenomenon that is at once directional in time, and simultaneously driven by symbiotically coupled discovery and exploitation.
Abstract:
We apply Lazear’s jack-of-all-trades theory to investigate the effect of nascent entrepreneurs’ balanced skill set across various functional areas on the performance of nascent projects. Analyzing longitudinal data on innovative nascent projects, we find that nascent entrepreneurs with a more balanced skill set are more successful in that they progress faster in the venture creation process.
Abstract:
Discrete Markov random field models provide a natural framework for representing images or spatial datasets. They model the spatial association present while providing a convenient Markovian dependency structure and strong edge-preservation properties. However, parameter estimation for discrete Markov random field models is difficult due to the complex form of the associated normalizing constant for the likelihood function. For large lattices, the reduced dependence approximation to the normalizing constant is based on the concept of performing computationally efficient and feasible forward recursions on smaller sublattices which are then suitably combined to estimate the constant for the whole lattice. We present an efficient computational extension of the forward recursion approach for the autologistic model to lattices that have an irregularly shaped boundary and which may contain regions with no data; these lattices are typical in applications. Consequently, we also extend the reduced dependence approximation to these scenarios enabling us to implement a practical and efficient non-simulation based approach for spatial data analysis within the variational Bayesian framework. The methodology is illustrated through application to simulated data and example images. The supplemental materials include our C++ source code for computing the approximate normalizing constant and simulation studies.
Abstract:
Recent studies have started to explore context-awareness as a driver in the design of adaptable business processes. The emerging challenge of identifying and considering contextual drivers in the environment of a business process are well understood, however, typical methods used in business process modeling do not yet consider this additional contextual information in their process designs. In this chapter, we describe our research towards innovative and advanced process modeling methods that include mechanisms to incorporate relevant contextual drivers and their impacts on business processes in process design models. We report on our ongoing work with an Australian insurance provider and describe the design science we employed to develop these innovative and useful artifacts as part of a context-aware method framework. We discuss the utility of these artifacts in an application in the claims handling process at the case organization.
Abstract:
Though the value of a process-centred view for the understanding and (re-)design of corporations has been widely accepted, our understanding of the research process in Information Systems (IS) remains superficial. A process-centred view on IS research considers the conduct of a research project as a sequence of activities involving resources, data and research artifacts. As such, it helps to reflect on more effective ways to conduct IS research, to consolidate and compare diverse practices and to complement the focus on research methodologies with research project practices. This paper takes a first step towards the discipline of ‘Research Process Management’ by exploring the features of research processes and by presenting a preliminary approach for research process design that can facilitate modelling IS research. The case study method and the design science research method are used as examples to demonstrate the potential of such reference research process models.
Abstract:
Transcending traditional national borders, the Internet is an evolving technology that has opened up many new international market opportunities. However, ambiguity remains, with limited research and understanding of how the Internet influences the firm’s internationalisation process components. As a consequence, there has been a call for further investigation of the phenomenon. Thus, the purpose of this study was to investigate the Internet’s impact on the internationalisation process components, specifically, information availability, information usage, interactive communication and international market growth. Analysis was undertaken using structural equation modelling. Findings highlight the mediating impact of the Internet on information and knowledge transference in the internationalisation process. Contributions of the study test conceptualisations and give statistical validation of interrelationships, while illuminating the Internet’s impact on firm internationalisation.
Abstract:
PySSM is a Python package that has been developed for the analysis of time series using linear Gaussian state space models (SSM). PySSM is easy to use; models can be set up quickly and efficiently and a variety of different settings are available to the user. It also takes advantage of the scientific libraries NumPy and SciPy and other high-level features of the Python language. PySSM is also used as a platform for interfacing between optimised and parallelised Fortran routines. These Fortran routines heavily utilise Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) functions for maximum performance. PySSM contains classes for filtering, classical smoothing as well as simulation smoothing.
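PySSM's own interface is not reproduced here; as a generic illustration of the filtering such packages provide, a minimal Kalman filter for a linear Gaussian state space model in plain NumPy (the model matrices and the local-level usage below are illustrative assumptions, not PySSM's API):

```python
import numpy as np

def kalman_filter(y, T, Z, Q, H, a0, P0):
    """Minimal Kalman filter for the linear Gaussian state space model
        a_{t+1} = T a_t + eta_t,   eta_t ~ N(0, Q)   (state)
        y_t     = Z a_t + eps_t,   eps_t ~ N(0, H)   (observation)
    Returns the filtered state means E[a_t | y_1..y_t]."""
    a, P = a0, P0
    filtered = []
    for yt in y:
        # predict one step ahead
        a_pred = T @ a
        P_pred = T @ P @ T.T + Q
        # update with the new observation
        F = Z @ P_pred @ Z.T + H                 # innovation covariance
        K = P_pred @ Z.T @ np.linalg.inv(F)      # Kalman gain
        a = a_pred + K @ (yt - Z @ a_pred)
        P = P_pred - K @ Z @ P_pred
        filtered.append(a.copy())
    return filtered

# Usage: a local level model tracking a constant signal of 5.0
y = [np.array([5.0])] * 20
filtered = kalman_filter(y, np.eye(1), np.eye(1),
                         0.1 * np.eye(1), np.eye(1),
                         np.zeros(1), np.eye(1))
```

Classical and simulation smoothing then run backward passes over these filtered quantities; a production implementation would, as the abstract notes, push the linear algebra down into BLAS/LAPACK routines.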