758 results for Process resources


Relevance: 20.00%

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases, it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to a MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any discrepancies between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probability (NTCP). The work presented here addresses the first two aims.

Methods: (1a) Plan importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and the gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose simulation: The calculation of a dose distribution requires patient CT images, which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose comparison: TPS dose calculations can be obtained using either a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are independent of spatial resolution and can interpolate between dose grids for comparison.

Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes.

Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated. A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant.

Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing was made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
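The gamma evaluation cited above ([2]) combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1-D sketch of the standard global gamma index is given below; it is not the authors' implementation, and the tolerance values and array names are illustrative assumptions:

```python
import numpy as np

def gamma_index(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Simplified global 1-D gamma evaluation.

    dose_tol: dose-difference criterion as a fraction of the maximum
    reference dose; dist_tol: distance-to-agreement criterion (mm).
    For each reference point, gamma is the minimum over all evaluated
    points of the combined dose/distance metric; gamma <= 1 passes.
    """
    norm = dose_tol * ref_dose.max()
    gamma = np.empty(len(ref_dose))
    for i, (d_ref, x_ref) in enumerate(zip(ref_dose, positions)):
        dose_term = (eval_dose - d_ref) / norm        # scaled dose difference
        dist_term = (positions - x_ref) / dist_tol    # scaled distance
        gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gamma
```

Identical distributions yield gamma of zero everywhere; a uniform 1% dose shift (within a 3% criterion) passes with gamma well below 1.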

Relevance: 20.00%

Abstract:

Risk identification is one of the most challenging stages in the risk management process. Conventional risk management approaches provide little guidance, and companies often rely on the knowledge of experts for risk identification. In this paper we demonstrate how risk indicators can be used to predict process delays via a method for configuring so-called Process Risk Indicators (PRIs). The method learns suitable configurations from past process behaviour recorded in event logs. To validate the approach we have implemented it as a plug-in of the ProM process mining framework and have conducted experiments using various data sets from a major insurance company.
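As a rough illustration of the kind of configuration such a method learns from an event log, the sketch below picks a cut-off for one candidate indicator (e.g. an activity's waiting time) that best separates delayed from on-time cases in historical data. The function name, scoring rule, and data are invented for illustration; the paper's method and indicator definitions are richer:

```python
def configure_pri(indicator_values, delayed):
    """Learn a threshold for a single process risk indicator.

    indicator_values: one observed value per historical case.
    delayed: matching booleans, True if the case ended up delayed.
    Returns the threshold t maximising the accuracy of the rule
    'value > t predicts delay', together with that accuracy.
    """
    best_t, best_acc = None, -1.0
    for t in sorted(set(indicator_values)):
        preds = [v > t for v in indicator_values]
        acc = sum(p == d for p, d in zip(preds, delayed)) / len(indicator_values)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc
```

On cleanly separable data the learned threshold achieves perfect accuracy; real event logs would need cross-validation and multiple indicators.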

Relevance: 20.00%

Abstract:

This paper presents an approach to assess the resilience of a water supply system under the impacts of climate change. Changes to climate characteristics such as rainfall, evapotranspiration and temperature can alter the global hydrological cycle and thereby adversely impact the ability of water supply systems to meet service standards in the future. Changes to the frequency and characteristics of floods and droughts, as well as to the quality of water provided by groundwater and surface water resources, are further consequences of climate change that will affect water supply system functionality. The extent and significance of these changes underline the necessity of assessing the future functionality of water supply systems under the impacts of climate change. Resilience can be a tool for assessing the ability of a water supply system to meet service standards under future climate conditions. The study approach is based on defining resilience as the ability of a system to absorb pressure without entering a failure state, as well as its ability to return quickly to an acceptable level of function after failure. To express this definition as a mathematical function, a surrogate measure of resilience is proposed in this paper. In addition, a step-by-step approach to estimating the resilience of water storage reservoirs is presented. This approach enables a comprehensive understanding of the functioning of a water storage reservoir under future climate scenarios and can also serve as a robust tool to predict future challenges faced by water supply systems under climate change.
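The paper's own surrogate measure is not reproduced here, but one common time-series formulation in the water-resources literature (in the style of Hashimoto et al.) matches the definition above: resilience as the conditional probability that a system in a failure state recovers in the next period. A minimal sketch, assuming a boolean record of whether the service standard was met at each time step:

```python
def resilience(states):
    """states: sequence of booleans per time step, True = service standard met.

    Returns the fraction of failure steps that are immediately followed
    by a satisfactory step (1.0 if no failures occur), i.e. how quickly
    the system tends to recover once it has failed.
    """
    failures = sum(1 for s in states[:-1] if not s)  # last step has no successor
    if failures == 0:
        return 1.0
    recoveries = sum(1 for now, nxt in zip(states, states[1:]) if not now and nxt)
    return recoveries / failures
```

Applied to simulated reservoir states under future climate scenarios, a falling value of this measure would flag slower recovery from failure.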

Relevance: 20.00%

Abstract:

The ability to steer business operations in alignment with the true origins of costs, and to be informed about this on a real-time basis, allows businesses to increase profitability. In most organisations, however, high-level cost-based managerial decisions are still being made separately from process-related operational decisions. In this paper, we describe how process-related decisions at the operational level can be guided by cost considerations and how these cost-informed decision rules can be supported by a workflow management system. The paper presents the conceptual framework together with data requirements and technical challenges that need to be addressed to realise cost-informed workflow execution. The feasibility of our approach is demonstrated using a prototype implementation in the YAWL workflow environment.
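A cost-informed decision rule of the kind described could, for example, allocate a work item to the resource with the lowest expected cost. The sketch below is a hypothetical illustration; the resource names, rates, and cost model are invented, and the actual YAWL-based implementation is considerably richer:

```python
def allocate(task, resources, cost_rate, est_duration):
    """Cost-informed allocation sketch: pick the resource whose expected
    cost (hourly rate x estimated task duration) is lowest.

    cost_rate: resource -> hourly rate.
    est_duration: (resource, task) -> estimated hours.
    """
    return min(resources, key=lambda r: cost_rate[r] * est_duration[(r, task)])
```

A workflow engine applying such a rule at run time would steer routing and allocation decisions by cost rather than, say, round-robin availability.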

Relevance: 20.00%

Abstract:

Our task is to consider the evolving perspectives around curriculum documented in the Theory Into Practice (TIP) corpus to date. The 50 years in question, 1962–2012, account for approximately half the history of mass institutionalized schooling. Over this time, the upper age of compulsory schooling has crept up, stretching the school curriculum's reach, purpose, and clientele. These years also span remarkable changes in the social fabric, challenging deep senses of the nature and shelf-life of knowledge, whose knowledge counts, what science can and cannot deliver, and the very purpose of education. The school curriculum is a key social site where these challenges have to be addressed in a very practical sense, through a design on the future implemented within the resources and politics of the present. The task's metaphor of ‘evolution’ may invoke a sense of gradual cumulative improvement, but equally connotes mutation, hybridization, extinction, survival of the fittest, and environmental pressures. Viewed in this way, curriculum theory and practice cannot be isolated and studied in laboratory conditions—there is nothing natural, neutral, or self-evident about what knowledge gets selected into the curriculum. Rather, the process of selection unfolds as a series of messy, politically contaminated, lived experiments; thus curriculum studies require field work in dynamic open systems. We subscribe to Raymond Williams' approach to social change, which he argues is not absolute and abrupt, one set of ideas neatly replacing the other. For Williams, newly emergent ideas have to compete against the dominant mindset and residual ideas “still active in the cultural process” (Williams, 1977, p. 122). This means ongoing debates. For these reasons, we join Schubert (1992) in advocating “continuous reconceptualising of the flow of experience” (p. 238) by both researchers and practitioners.

Relevance: 20.00%

Abstract:

Process mining encompasses the research area which is concerned with knowledge discovery from event logs. One common process mining task focuses on conformance checking, comparing discovered or designed process models with actual real-life behavior as captured in event logs in order to assess the “goodness” of the process model. This paper introduces a novel conformance checking method to measure how well a process model performs in terms of precision and generalization with respect to the actual executions of a process as recorded in an event log. Our approach differs from related work in the sense that we apply the concept of so-called weighted artificial negative events towards conformance checking, leading to more robust results, especially when dealing with less complete event logs that only contain a subset of all possible process execution behavior. In addition, our technique offers a novel way to estimate a process model’s ability to generalize. Existing literature has focused mainly on the fitness (recall) and precision (appropriateness) of process models, whereas generalization has been much more difficult to estimate. The described algorithms are implemented in a number of ProM plugins, and a Petri net conformance checking tool was developed to inspect process model conformance in a visual manner.
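The unweighted core of the artificial-negative-event idea can be sketched as follows: for every activity prefix observed in the log, any activity from the log's alphabet that never follows that prefix anywhere in the log is induced as a negative event (behaviour the log suggests the model should forbid). The paper's contribution is a weighted, more robust variant of this; the code below shows only the naive base case:

```python
from collections import defaultdict

def artificial_negative_events(log):
    """Induce unweighted artificial negative events from an event log.

    log: list of traces, each a list of activity names.
    Returns a dict mapping each observed prefix (as a tuple) to the set
    of activities that never follow that prefix anywhere in the log.
    """
    alphabet = {act for trace in log for act in trace}
    observed = defaultdict(set)
    for trace in log:
        for i, act in enumerate(trace):
            observed[tuple(trace[:i])].add(act)   # act follows this prefix
    return {prefix: alphabet - seen for prefix, seen in observed.items()}
```

Checking whether the model accepts these induced negative events then yields a precision estimate; weighting them (as the paper does) guards against penalising behaviour merely absent from an incomplete log.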

Relevance: 20.00%

Abstract:

Business process management systems (BPMS) belong to a class of enterprise information systems that are characterized by the dependence on explicitly modeled process logic. Through the process logic, it is relatively easy to manage explicitly the routing and allocation of work items along a business process through the system. Inspired by the DeLone and McLean framework, we theorize that these process-aware system features are important attributes of system quality, which in turn will elevate key user evaluations such as perceived usefulness and usage satisfaction. We examine this theoretical model using data collected from four different, mostly mature BPM system projects. Our findings validate the importance of input quality as well as allocation and routing attributes as antecedents of system quality, which, in turn, determines both usefulness and satisfaction with the system. We further demonstrate how service quality and workflow dependency are significant precursors to perceived usefulness. Our results suggest the appropriateness of a multi-dimensional conception of system quality for future research, and provide important design-oriented advice for the design and configuration of BPMSs.

Relevance: 20.00%

Abstract:

Often, voltage rise along low voltage (LV) networks limits their capacity to accommodate more renewable energy (RE) sources. This paper proposes a robust and effective approach to coordinate customers' resources and control voltage rise in LV networks, where photovoltaics (PVs) are considered as the RE sources. The proposed coordination algorithm includes both localized and distributed control strategies. The localized strategy determines the PV inverter's active and reactive power output, while the distributed strategy coordinates customers' energy storage units (ESUs). To verify the effectiveness of the proposed approach, a typical residential LV network is simulated on the PSCAD/EMTDC platform.
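The localized strategy is described only at a high level above; a generic volt-var droop of the kind commonly used for PV inverters is sketched below. The droop form, gain, and per-unit values are illustrative assumptions, not the paper's algorithm:

```python
import math

def inverter_setpoints(v_pu, p_avail, s_rating, v_ref=1.0, droop=2.0):
    """Volt-var droop sketch (all quantities per-unit).

    Absorb reactive power in proportion to the local over-voltage,
    limited by the capability left after active power injection
    (q_max = sqrt(S^2 - P^2)).  Returns (P, Q) set-points.
    """
    q = -droop * s_rating * (v_pu - v_ref)               # negative = absorb vars
    q_max = math.sqrt(max(s_rating**2 - p_avail**2, 0.0))
    return p_avail, max(-q_max, min(q_max, q))
```

Under a 5% over-voltage the inverter absorbs reactive power to pull the voltage down, leaving active power export untouched while headroom remains.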

Relevance: 20.00%

Abstract:

Innovations are usually attributed to ideas generated in the minds of individuals. As we reflect upon the evolving design of an online project to engage students in learning science through hybridized writing activities, we propose a more distributed view of the process of innovative design. That is, our experience suggests ideas are generated in the activity of interacting with human and material resources that expand and constrain possibilities. This project is innovative in that it is a new educational response to the problem of disengagement of students in science, and has proven to be effective in changing classroom practice and improving students’ scientific literacy. In this chapter, we identify the antecedents and trace the evolution of the project. This account illuminates the innovative design process, presents a summary of the evidence for the effectiveness of the project, and identifies future directions for further development and research.

Keywords: Science learning, hybridized writing, case study, innovative approach

Relevance: 20.00%

Abstract:

The increased adoption of business process management approaches, tools and practices has led organizations to accumulate large collections of business process models. These collections can easily include hundreds to thousands of models, especially in the context of multinational corporations or as a result of organizational mergers and acquisitions. A concrete problem is thus how to maintain these large repositories in such a way that their complexity does not hamper their practical usefulness as a means to describe and communicate business operations. This paper proposes a technique to automatically infer suitable names for business process models and fragments thereof. This technique is useful for model abstraction scenarios, for instance when user-specific views of a repository are required, or as part of a refactoring initiative aimed at simplifying the repository’s complexity. The technique is grounded in an adaptation of the theory of meaning to the realm of business process models. We implemented the technique in a prototype tool and conducted an extensive evaluation using three process model collections from practice and a case study involving process modelers with different experience.
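The paper's theory-of-meaning instantiation is not reproduced here, but a naive baseline conveys the task: name a fragment from the most salient words of its activity labels. The stop-word list, scoring, and label data below are invented for illustration:

```python
from collections import Counter

def infer_name(activity_labels, top_n=2,
               stopwords=frozenset({"the", "a", "an", "of", "to"})):
    """Naive fragment-naming baseline: rank words across the fragment's
    activity labels by frequency and join the most common ones."""
    words = [w for label in activity_labels
             for w in label.lower().split() if w not in stopwords]
    return " ".join(w for w, _ in Counter(words).most_common(top_n))
```

For a fragment whose activities are "check invoice", "approve invoice", and "archive invoice", the dominant word is "invoice", which a human modeler would plausibly also put in the fragment's name.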

Relevance: 20.00%

Abstract:

Groundwater flow models are usually characterized as being either transient flow models or steady state flow models. Given that steady state groundwater flow conditions arise as a long-time asymptotic limit of a particular transient response, it is natural to seek a finite estimate of the amount of time required for a particular transient flow problem to effectively reach steady state. Here, we introduce the concept of mean action time (MAT) to address a fundamental question: how long does it take for a groundwater recharge or discharge process to effectively reach steady state? This concept relies on identifying a cumulative distribution function, $F(t;x)$, which increases from $F(0;x)=0$ to $F(t;x) \to 1$ as $t\to \infty$, thereby providing a measurement of the progress of the system towards steady state. The MAT corresponds to the mean of the associated probability density function $f(t;x) = \dfrac{dF}{dt}$, and we demonstrate that this framework provides useful analytical insight by explicitly showing how the MAT depends on the parameters in the model and the geometry of the problem. Additional theoretical results relating to the variance of $f(t;x)$, known as the variance of action time (VAT), are also presented. To test our theoretical predictions we include measurements from a laboratory-scale experiment describing flow through a homogeneous porous medium. The laboratory data confirm that the theoretical MAT predictions are in good agreement with measurements from the physical model.
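With $F$ and $f$ as defined above, the MAT and VAT admit the standard moment expressions (the integration by parts in the first identity assumes $1 - F(t;x)$ decays sufficiently fast as $t \to \infty$):

$$T(x) = \int_0^\infty t\, f(t;x)\, dt = \int_0^\infty \big[1 - F(t;x)\big]\, dt, \qquad V(x) = \int_0^\infty t^2 f(t;x)\, dt - T(x)^2.$$

The tail-integral form of $T(x)$ is often the easier one to evaluate analytically, since it avoids differentiating $F$.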

Relevance: 20.00%

Abstract:

Process-aware information systems (PAISs) can be configured using a reference process model, which is typically obtained via expert interviews. Over time, however, contextual factors and system requirements may cause the operational process to start deviating from this reference model. While a reference model should ideally be updated to remain aligned with such changes, this is a costly and often neglected activity. We present a new process mining technique that automatically improves the reference model on the basis of the observed behavior as recorded in the event logs of a PAIS. We discuss how to balance the four basic quality dimensions for process mining (fitness, precision, simplicity and generalization) and a new dimension, namely the structural similarity between the reference model and the discovered model. We demonstrate the applicability of this technique using a real-life scenario from a Dutch municipality.
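Balancing the five dimensions ultimately means scoring candidate improved models against each other. A generic weighted-average scorer is sketched below; the weights and function name are illustrative assumptions, not the paper's calibration:

```python
def model_score(fitness, precision, simplicity, generalization, similarity,
                weights=(0.4, 0.2, 0.1, 0.1, 0.2)):
    """Score a candidate model by a weighted average of the four classic
    process-mining quality dimensions plus structural similarity to the
    reference model.  All inputs are assumed to lie in [0, 1]."""
    dims = (fitness, precision, simplicity, generalization, similarity)
    return sum(w * d for w, d in zip(weights, dims)) / sum(weights)
```

Raising the similarity weight biases the search toward conservative repairs of the reference model; lowering it lets the discovered behaviour dominate.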

Relevance: 20.00%

Abstract:

Despite growing recognition of creativity's importance for young people, the creativity of adolescents remains a neglected field of study. Hence, grounded theory research was conducted with 20 adolescents from two Australian schools regarding their self-reported experiences of creativity in diverse domains. Four approaches to the creative process – adaptation, transfer, synthesis, and genesis – emerged from the research. These approaches, used by students across a range of domains, contribute to the literature in two key ways: (a) explaining how adolescents engage in the creative process, theorised from adolescent creators’ self-reports of their experiences, and (b) confirming hybrid theories that recognise that creativity has elements of both domain-generality and domain-specificity. The findings have educational implications for both students and teachers. For students, enhancing metacognitive awareness of their preferred approaches to creativity was reported as a valuable experience in itself, and might also enable adolescents to expand their creativity through experimenting with other ways of engaging in the creative process. For teachers, using these understandings to underpin their pedagogies can promote metacognitive awareness and experimentation, and also provide a framework for assessing students’ creative processes.

Relevance: 20.00%

Abstract:

Severe power quality problems can arise when a large number of single-phase distributed energy resources (DERs) are connected to a low-voltage power distribution system. Due to the random location and size of DERs, it may so happen that a particular phase generates more power than its load demand. In such an event, the excess power will be fed back to the distribution substation and will eventually find its way to the transmission network, causing undesirable voltage-current unbalance. As a solution to this problem, the article proposes the use of a distribution static compensator (DSTATCOM), which regulates voltage at the point of common coupling (PCC), thereby ensuring balanced current flow from and to the distribution substation. Additionally, this device can also support the distribution network in the absence of the utility connection, making the distribution system work as a microgrid. The proposals are validated through extensive digital computer simulation studies using PSCAD™.
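The balanced current that the compensated substation should see is the positive-sequence component of the load current; the DSTATCOM injects the difference between each phase current and that component. Extracting it via the symmetrical-component (Fortescue) transform is a textbook computation, not the article's specific controller; phasors are represented as complex numbers:

```python
import cmath

# Fortescue operator: 120-degree rotation in the complex plane
A = cmath.exp(2j * cmath.pi / 3)

def positive_sequence(ia, ib, ic):
    """Positive-sequence component of three phase-current phasors.

    A DSTATCOM balancing the feeder would inject (i_phase - i_pos) per
    phase so the substation supplies only the balanced component.
    """
    return (ia + A * ib + A * A * ic) / 3
```

A balanced a-b-c set maps to itself (the component equals the phase-a phasor), while a purely negative-sequence set maps to zero, which is exactly the unbalance the compensator removes.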

Relevance: 20.00%

Abstract:

It is common for organizations to maintain multiple variants of a given business process, such as multiple sales processes for different products or multiple bookkeeping processes for different countries. Conventional business process modeling languages do not explicitly support the representation of such families of process variants. This gap triggered significant research efforts over the past decade, leading to an array of approaches to business process variability modeling. This survey examines existing approaches in this field based on a common set of criteria and illustrates their key concepts using a running example. The analysis shows that existing approaches are characterized by the fact that they extend a conventional process modeling language with constructs that make it able to capture customizable process models. A customizable process model represents a family of process variants such that each variant can be derived by adding or deleting fragments according to configuration parameters or according to a domain model. The survey reveals an abundance of customizable process modeling languages, embodying a diverse set of constructs. In contrast, there is comparatively little tool support for analyzing and constructing customizable process models, as well as a scarcity of empirical evaluations of languages in the field.
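Deriving a variant from a customizable model by adding or deleting fragments can be reduced to a toy form: keep each optional fragment only when its configuration flag enables it. The fragment and flag names below are invented for illustration and ignore control-flow correctness issues that real configurable-language tooling must handle:

```python
def derive_variant(process, optional, config):
    """Derive one process variant from a customizable model.

    process: ordered list of fragment names in the base model.
    optional: the subset of fragments that is configurable.
    config: maps an optional fragment to True (include) or False (drop).
    Mandatory fragments are always kept.
    """
    return [f for f in process if f not in optional or config.get(f, False)]
```

For example, a sales variant for a trusted customer segment might disable a "credit_check" fragment while the base model's mandatory fragments survive unchanged.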