296 results for Process parameters
Abstract:
Adiabatic compression testing of components in gaseous oxygen is a test method that is utilized worldwide and is commonly required to qualify a component for ignition tolerance under its intended service. This testing is required by many industry standards organizations and government agencies; however, a thorough evaluation of the test parameters and test system influences on the thermal energy produced during the test has not yet been performed. This paper presents a background for adiabatic compression testing and discusses an approach to estimating potential differences in the thermal profiles produced by different test laboratories. A “Thermal Profile Test Fixture” (TPTF) is described that is capable of measuring and characterizing the thermal energy of a typical pressure shock produced by any test system. The test systems at Wendell Hull & Associates, Inc. (WHA) in the USA and at the BAM Federal Institute for Materials Research and Testing in Germany are compared in this manner and some of the data obtained are presented. The paper also introduces a new way of comparing the test method to idealized processes in order to perform system-by-system comparisons. Thus, the paper introduces an “Idealized Severity Index” (ISI) of the thermal energy to characterize a rapid pressure surge. From the TPTF data a “Test Severity Index” (TSI) can also be calculated so that the thermal energies developed by different test systems can be compared to each other and to the ISI for the equivalent isentropic process. Finally, a “Service Severity Index” (SSI) is introduced to characterize the thermal energy of actual service conditions. This paper is the second in a series of publications planned on the subject of adiabatic compression testing.
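The "equivalent isentropic process" referenced by the ISI has a closed-form temperature rise for an ideal gas. The following sketch is illustrative only (the values are not from the paper) and uses the standard isentropic relation T2 = T1 · (P2/P1)^((γ−1)/γ):

```python
# Illustrative only: ideal-gas isentropic (reversible adiabatic) compression,
# the idealized process against which the abstract's severity indices are framed.

def isentropic_final_temp(t1_k: float, p1: float, p2: float, gamma: float = 1.4) -> float:
    """Final temperature T2 = T1 * (P2/P1)^((gamma-1)/gamma) in kelvin."""
    return t1_k * (p2 / p1) ** ((gamma - 1.0) / gamma)

# Hypothetical example: a diatomic gas such as oxygen (gamma ~ 1.4)
# pressure-shocked from 1 bar to 250 bar, starting at 293 K.
t2 = isentropic_final_temp(293.0, 1.0, 250.0)  # well above 1000 K
```

The steep temperature rise for even moderate pressure ratios illustrates why the thermal profile of such a shock matters for ignition-tolerance qualification.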
Abstract:
A configurable process model describes a family of similar process models in a given domain. Such a model can be configured to obtain a specific process model that is subsequently used to handle individual cases, for instance, to process customer orders. Process configuration is notoriously difficult as there may be all kinds of interdependencies between configuration decisions. In fact, an incorrect configuration may lead to behavioral issues such as deadlocks and livelocks. To address this problem, we present a novel verification approach inspired by the “operating guidelines” used for partner synthesis. We view the configuration process as an external service, and compute a characterization of all such services which meet particular requirements using the notion of a configuration guideline. As a result, we can characterize all feasible configurations (i.e., configurations without behavioral problems) at design time, instead of repeatedly checking each individual configuration while configuring a process model.
Abstract:
Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business process model repositories. For example, in some cases new process models may be derived from existing models, thus finding these models and adapting them may be more effective than developing them from scratch. As process model repositories may be large, query evaluation may be time consuming. Hence, we investigate the use of indexes to speed up this evaluation process. Experiments are conducted to demonstrate that our proposal achieves a significant reduction in query evaluation time.
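As an illustration of the kind of index the abstract alludes to (the data and structure here are our assumption, not the paper's design), a simple inverted index from task labels to model identifiers lets a query be answered by set intersection instead of a scan over every model in the repository:

```python
from collections import defaultdict

# Hypothetical repository: model id -> set of task labels appearing in that model.
repository = {
    "m1": {"receive order", "check credit", "ship goods"},
    "m2": {"receive order", "reject order"},
    "m3": {"check credit", "approve loan"},
}

# Build an inverted index: task label -> ids of models containing it.
index: defaultdict[str, set] = defaultdict(set)
for model_id, labels in repository.items():
    for label in labels:
        index[label].add(model_id)

# Query: which models contain both "receive order" and "check credit"?
# Evaluated via index intersection rather than inspecting each model.
hits = index["receive order"] & index["check credit"]  # {"m1"}
```

Real process-model indexes are more elaborate (they must account for structure and behavior, not just labels), but the principle of trading index construction time for faster query evaluation is the same.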
Abstract:
The literature identifies several models that describe inter-phase mass transfer, key to the emission process. While the emission process is complex and these models may be more or less successful at predicting mass transfer rates, they identify three key variables for a system involving a liquid and an air phase in contact with it:
• a concentration (or partial pressure) gradient driving force;
• the fluid dynamic characteristics within the liquid and air phases; and
• the chemical properties of the individual components within the system.
In three applied research projects conducted prior to this study, samples collected with two well-known sampling devices resulted in very different odour emission rates. It was not possible to adequately explain the differences observed. It appeared likely, however, that the sample collection device might have artefact effects on the emission of odorants, i.e. the sampling device appeared to have altered the mass transfer process. This raised the obvious question: where two different emission rates are reported for a single source (differing only in the selection of sampling device), and a credible explanation for the difference in emission rate cannot be provided, which emission rate is correct? This research project aimed to identify the factors that determine odour emission rates, the impact that the characteristics of a sampling device may exert on the key mass transfer variables, and ultimately, the impact of the sampling device on the emission rate itself. To meet these objectives, a series of targeted reviews, and laboratory and field investigations, were conducted. Two widely-used, representative devices were chosen to investigate the influence of various parameters on the emission process. These investigations provided insight into the odour emission process generally, and the influence of the sampling device specifically.
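The three variables listed above appear together in the classical two-film description of inter-phase mass transfer. The sketch below is a generic illustration (the coefficients and concentrations are purely invented, and the thesis itself does not prescribe this code): the driving force enters as the concentration difference, the fluid dynamics as the film coefficients, and the chemical properties as the Henry's-law constant.

```python
# Illustrative two-film model: flux J = K_L * (C_liquid - C_gas / H), where the
# overall coefficient K_L combines the liquid- and gas-side film resistances.

def overall_liquid_coefficient(k_l: float, k_g: float, henry: float) -> float:
    """1/K_L = 1/k_l + 1/(H * k_g): film resistances add in series."""
    return 1.0 / (1.0 / k_l + 1.0 / (henry * k_g))

def emission_flux(c_liquid: float, c_gas: float,
                  k_l: float, k_g: float, henry: float) -> float:
    """Net flux from liquid to gas; positive means net emission."""
    k_overall = overall_liquid_coefficient(k_l, k_g, henry)
    return k_overall * (c_liquid - c_gas / henry)

# Hypothetical values: flux is zero at equilibrium (C_liquid == C_gas / H)
# and positive when the liquid is supersaturated relative to the gas phase.
flux = emission_flux(1.0, 0.0, k_l=1e-5, k_g=1e-2, henry=0.5)
```

A sampling device that changes the air flow over the liquid surface effectively changes k_g in this picture, which is one mechanism by which it could alter the measured emission rate.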
Abstract:
In this thesis we are interested in financial risk and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data to choose the best model. The existing literature is very wide, maybe controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach to modelling simultaneously time-varying volatility (conditional variance) and skewness. The new tools are modifications of the Generalised Lambda Distributions (GLDs). They are four-parameter distributions which allow the first four moments to be modelled nearly independently: in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs.
Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate semi-parametrically the conditional mean and conditional variance. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness, implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
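The moving-window idea above can be caricatured with the historical-simulation benchmark the thesis compares against: VaR at level α is read off as the α-quantile of losses over the most recent window of returns. This sketch uses plain empirical percentiles on synthetic data as a simple stand-in for fitting GLDs in each window:

```python
import random

def rolling_var(returns, window=250, alpha=0.01):
    """Moving-window VaR by historical simulation: for each day after the
    warm-up, report the (alpha-quantile) loss over the previous `window`
    returns. (The thesis fits a GLD per window; the empirical percentile
    here is a simpler stand-in for that estimation step.)"""
    estimates = []
    for t in range(window, len(returns)):
        past = sorted(returns[t - window:t])      # most recent window only
        k = max(0, int(alpha * window) - 1)       # index of the alpha-quantile
        estimates.append(-past[k])                # VaR reported as a positive loss
    return estimates

# Synthetic returns for illustration (real data would be index returns).
random.seed(0)
fake_returns = [random.gauss(0.0, 0.01) for _ in range(500)]
var_series = rolling_var(fake_returns)  # one VaR estimate per post-warm-up day
```

Because each day re-estimates from its own window, the VaR series drifts as the return distribution drifts, which is exactly the time-variation in the conditional distribution the thesis sets out to model explicitly.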
Abstract:
Business Process Modelling is a fast growing field in business and information technology, which uses visual grammars to model and execute the processes within an organisation. However, many analysts present such models in a 2D, static and iconic manner that is difficult for many stakeholders to understand. Difficulties in understanding such grammars can impede the improvement of processes within an enterprise due to communication problems. In this chapter we present a novel framework for intuitively visualising animated business process models in interactive Virtual Environments. We also show that virtual environment visualisations can be performed with present 2D business process modelling technology, thus providing a low barrier to entry for business process practitioners. Two case studies are presented, from the film production and healthcare domains, that illustrate the ease with which these visualisations can be created. This approach can be generalised to other executable workflow systems, for any application domain being modelled.
Abstract:
Process modeling is a complex organizational task that requires many iterations and communication between the business analysts and the domain specialists involved in the process modeling. The challenge of process modeling is exacerbated, when the process of modeling has to be performed in a cross-organizational, distributed environment. Some systems have been developed to support collaborative process modeling, all of which use traditional 2D interfaces. We present an environment for collaborative process modeling, using 3D virtual environment technology. We make use of avatar instantiations of user ego centres, to allow for the spatial embodiment of the user with reference to the process model. We describe an innovative prototype collaborative process modeling approach, implemented as a modeling environment in Second Life. This approach leverages the use of virtual environments to provide user context for editing and collaborative exercises. We present a positive preliminary report on a case study, in which a test group modelled a business process using the system in Second Life.
Abstract:
Process models provide visual support for analyzing and improving complex organizational processes. In this paper, we discuss differences of process modeling languages using cognitive effectiveness considerations, to make statements about the ease of use and quality of user experience. Aspects of cognitive effectiveness are of importance for learning a modeling language, creating models, and understanding models. We identify the criteria representational clarity, perceptual discriminability, perceptual immediacy, visual expressiveness, and graphic parsimony to compare and assess the cognitive effectiveness of different modeling languages. We apply these criteria in an analysis of the routing elements of UML Activity Diagrams, YAWL, BPMN, and EPCs, to uncover their relative strengths and weaknesses from a quality of user experience perspective. We draw conclusions that are relevant to the usability of these languages in business process modeling projects.
Abstract:
Process models are used by information professionals to convey semantics about the business operations in a real world domain intended to be supported by an information system. The understandability of these models is vital to them actually being used. After all, what is not understood cannot be acted upon. Yet until now, understandability has primarily been defined as an intrinsic quality of the models themselves. Moreover, those studies that looked at understandability from a user perspective have mainly conceptualized users through rather arbitrary sets of variables. In this paper we advance an integrative framework to understand the role of the user in the process of understanding process models. Building on cognitive psychology, goal-setting theory and multimedia learning theory, we identify three stages of learning required to realize model understanding, these being Presage, Process, and Product. We define eight relevant user characteristics in the Presage stage of learning, three knowledge construction variables in the Process stage and three potential learning outcomes in the Product stage. To illustrate the benefits of the framework, we review existing process modeling work to identify where our framework can complement and extend existing studies.
Abstract:
The value of business process models is dependent not only on the choice of graphical elements in the model, but also on their annotation with additional textual and graphical information. This research discusses the use of text and icons for labeling the graphical constructs in a process model. We use two established verb classification schemes to examine the choice of activity labels in process modeling practice. Based on our findings, we synthesize a set of twenty-five activity label categories. We propose a systematic approach for graphically representing these label categories through the use of graphical icons, such that the resulting process models are easier and more readily understandable by end users. Our findings contribute to an ongoing stream of research investigating the practice of process modeling and thereby contribute to the body of knowledge about conceptual modeling quality overall.
Abstract:
Student learning research literature has shown that students' learning approaches are influenced by the learning context (Evans, Kirby, & Fabrigar, 2003). Of the many contextual factors, assessment has been found to have the most important influence on the way students go about learning. For example, assessment that is perceived to require a low level of cognitive ability will more likely elicit a learning approach that concentrates on reproductive learning activities. Moreover, assessment demands will also interact with learning approaches to determine academic performance. In this paper an assessment-specific model of learning comprising presage, process and product variables (Biggs, 2001) was proposed and tested against data obtained from a sample of introductory economics students (n=434). The model developed was used to empirically investigate the influence of learning inputs and learning approaches on academic performance across assessment types (essay assignment, multiple choice question exam and exam essay). By including learning approaches in the learning model, the mechanism through which learning inputs determine academic performance was examined. Methodological limitations of the study are also discussed.
Abstract:
As a result of the growing adoption of Business Process Management (BPM) technology different stakeholders need to understand and agree upon the process models that are used to configure BPM systems. However, BPM users have problems dealing with the complexity of such models. Therefore, the challenge is to improve the comprehension of process models. While a substantial amount of literature is devoted to this topic, there is no overview of the various mechanisms that exist to deal with managing complexity in (large) process models. It is thus hard to obtain comparative insight into the degree of support offered for various complexity reducing mechanisms by state-of-the-art languages and tools. This paper focuses on complexity reduction mechanisms that affect the abstract syntax of a process model, i.e. the structure of a process model. These mechanisms are captured as patterns, so that they can be described in their most general form and in a language- and tool-independent manner. The paper concludes with a comparative overview of the degree of support for these patterns offered by state-of-the-art languages and language implementations.
Abstract:
One of the primary treatment goals of adolescent idiopathic scoliosis (AIS) surgery is to achieve maximum coronal plane correction while maintaining coronal balance. However maintaining or restoring sagittal plane spinal curvature has become increasingly important in maintaining the long-term health of the spine. Patients with AIS are characterised by pre-operative thoracic hypokyphosis, and it is generally agreed that operative treatment of thoracic idiopathic scoliosis should aim to restore thoracic kyphosis to normal values while maintaining lumbar lordosis and good overall sagittal balance. The aim of this study was to evaluate CT sagittal plane parameters, with particular emphasis on thoracolumbar junctional alignment, in patients with AIS who underwent Video Assisted Thoracoscopic Spinal Fusion and Instrumentation (VATS). This study concluded that video-assisted thoracoscopic spinal fusion and instrumentation reliably increases thoracic kyphosis while preserving junctional alignment and lumbar lordosis in thoracic AIS.
Abstract:
This paper presents a novel matched rotation precoding (MRP) scheme to design a rate one space-frequency block code (SFBC) and a multirate SFBC for MIMO-OFDM systems with limited feedback. The proposed rate one MRP and multirate MRP can always achieve full transmit diversity and optimal system performance for an arbitrary number of antennas, subcarrier intervals, and subcarrier groupings, with limited channel knowledge required by the transmit antennas. The optimization process of the rate one MRP is simple and easily visualized, so that the optimal rotation angle can be derived explicitly, or even intuitively for some cases. The multirate MRP has a more complex optimization process, but it has better spectral efficiency and provides a relatively smooth balance between system performance and transmission rate. Simulations show that the proposed SFBC with MRP can overcome the diversity loss for specific propagation scenarios, always improve the system performance, and demonstrate flexible performance with large performance gains. Therefore the proposed SFBCs with MRP demonstrate flexibility and feasibility, making them more suitable for a practical MIMO-OFDM system with dynamic parameters.
Abstract:
This thesis consists of a confessional narrative, What My Mother Doesn’t Know, and an accompanying exegesis, And Why I Should (Maybe) Tell Her. The creative piece employs the confessional mode as a subversive device in three separate narratives, each of which situates the bed as a site of resistance. The exegesis investigates how this self-disclosure in a domestic space flouts the governing rules of self-representation, specifically: telling the truth, respecting privacy and displaying normalcy. The female confession, I argue, creates an alternative space in women’s autobiography where notions of truth-telling can be undermined, the political dimensions of personal experience can be uncovered and the discourse of normality can be negotiated. In particular, women’s confessions told in, on or about the bed dismantle the genre’s illusion of self and confirm the representative aspects of women’s experience. Framed within these parameters of power and powerlessness, the exegesis includes textual analyses of Charlotte Perkins Gilman’s The Yellow Wallpaper (1892), Tracey Emin’s My Bed (1999) and Lauren Slater’s Lying (2000), each of which exposes, in a bedroom space, the author’s most obscure, intimate and traumatic experiences. Situated firmly within and against the genre’s traditional masculine domain, the exegesis also includes meditations on the creative work that validate the bed as my fabric for confession.