61 results for Process modelling

in CentAUR: Central Archive at the University of Reading, UK


Relevance:

100.00%

Abstract:

In recent years, various efforts have been made in air traffic control (ATC) to maintain traffic safety and efficiency in the face of increasing air traffic demands. ATC is a complex process that depends to a large degree on human capabilities, so understanding how controllers carry out their tasks is an important issue in the design and development of ATC systems. In particular, the human factor is considered a serious problem in ATC safety and has been identified as a causal factor in both major and minor incidents. There is, therefore, a need to analyse the mechanisms by which errors arise from complex factors and to develop systems that can deal with these errors. From the cognitive process perspective, it is essential that system developers understand the more complex working processes that involve the cooperative work of multiple controllers. Distributed cognition is a methodological framework for analysing cognitive processes that span multiple actors mediated by technology. In this research, we analyse air traffic controllers' tasks for en route ATC and model the controllers' cognitive processes and the interactions that take place in en route ATC systems, on the basis of distributed cognition. We examine the functional problems in an ATC system from a human factors perspective, and conclude by identifying measures to address these problems.

Relevance:

100.00%

Abstract:

Business process modelling can help an organisation better understand and improve its business processes. Most business process modelling methods adopt a task- or activity-based approach to identifying business processes. In our work, we use activity theory to categorise the elements of an organisation as human beings, activities or artefacts. The direct relationship between these three elements gives rise to an artefact-oriented approach to organisational analysis. Organisational semiotics highlights the ontological dependency between affordances within an organisation. We analyse the ontological dependency between organisational elements and produce an ontology chart for artefact-oriented business process modelling, in order to clarify the relationships between the elements of an organisation. We then adopt the semantic analysis and norm analysis techniques of organisational semiotics to develop an artefact-oriented method for business process modelling. The proposed method provides a novel perspective for identifying and analysing business processes, as well as agents and artefacts, since the artefact-oriented perspective captures the fundamental flow of an organisation. The modelling results enable an organisation to understand and model its processes from an artefact perspective, viewing the organisation as a network of artefacts. The information and practice captured and stored in artefacts can also be shared and reused between organisations that produce similar artefacts.
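To make the artefact-oriented idea concrete, the following minimal sketch treats an organisation as a network of artefacts whose ontological dependencies induce a process ordering. It is illustrative only: the class names and the topological-sort step are our assumptions, not the paper's ontology-chart notation.

```python
# Toy artefact network with ontological dependencies (illustrative only).
from dataclasses import dataclass, field


@dataclass
class Artefact:
    name: str
    produced_by: str                                 # responsible agent
    depends_on: list = field(default_factory=list)   # ontological antecedents


def process_order(artefacts):
    """Topologically sort artefacts so every antecedent comes first."""
    order, seen = [], set()

    def visit(a):
        if a.name in seen:
            return
        seen.add(a.name)
        for dep in a.depends_on:
            visit(dep)
        order.append(a.name)

    for a in artefacts:
        visit(a)
    return order


quote = Artefact("quote", produced_by="sales")
order_form = Artefact("order form", produced_by="customer", depends_on=[quote])
invoice = Artefact("invoice", produced_by="finance", depends_on=[order_form])

print(process_order([quote, order_form, invoice]))
# ['quote', 'order form', 'invoice']
```

Ordering artefacts by their antecedents is one simple way to read a business process off such a network.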

Relevance:

40.00%

Abstract:

Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested in numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. While these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define, from the literature, the key factors in assessing a model's quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes, and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
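As a rough illustration of the three quality factors, the sketch below (an assumed structure, not a standard the paper defines) attaches a semantic type to a model element and stores its dynamic state as a queryable temporal history.

```python
# Minimal model element combining semantic richness, dynamic state and
# temporal storage. Structure and names are assumptions for illustration.
import bisect


class TemporalElement:
    def __init__(self, name, semantic_type):
        self.name = name
        self.semantic_type = semantic_type   # semantic richness
        self._times, self._states = [], []   # temporal storage

    def set_state(self, time, state):        # dynamic state / behaviour
        i = bisect.bisect(self._times, time)
        self._times.insert(i, time)
        self._states.insert(i, state)

    def state_at(self, time):
        """Return the state in force at the given time, if any."""
        i = bisect.bisect_right(self._times, time) - 1
        return self._states[i] if i >= 0 else None


door = TemporalElement("door-42", semantic_type="ifc:Door")  # hypothetical type
door.set_state(0, "closed")
door.set_state(5, "open")
print(door.state_at(3))   # 'closed'
print(door.state_at(7))   # 'open'
```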

Relevance:

30.00%

Abstract:

Shelf and coastal seas are regions of exceptionally high biological productivity, high rates of biogeochemical cycling and immense socio-economic importance. They are, however, poorly represented by the present generation of Earth system models, in terms of both resolution and process representation. Hence, these models cannot be used to elucidate the role of the coastal ocean in global biogeochemical cycles or the effects that global change (both direct anthropogenic and climatic) is having on them. Here, we present a system for simulating all the coastal regions around the world (the Global Coastal Ocean Modelling System) in a systematic and practical fashion. It is based on automatically generating multiple nested model domains, using the Proudman Oceanographic Laboratory Coastal Ocean Modelling System coupled to the European Regional Seas Ecosystem Model. Preliminary results from the system are presented. These demonstrate the viability of the concept, and we discuss the prospects for using the system to explore key areas of global change in shelf seas, such as their role in the carbon cycle and climate-change effects on fisheries.
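The nesting idea can be sketched as simple bookkeeping. This is not the actual POLCOMS/ERSEM code; the Domain fields and refinement factor are assumptions. A child domain is clipped to its parent and given finer grid spacing.

```python
# Hedged sketch of automatic nested-domain generation (assumed structure).
from dataclasses import dataclass


@dataclass
class Domain:
    name: str
    lon: tuple      # (west, east) in degrees
    lat: tuple      # (south, north) in degrees
    dx: float       # grid spacing in degrees


def nest(parent, name, lon, lat, refinement=3):
    """Create a child domain clipped to its parent, with finer spacing."""
    assert parent.lon[0] <= lon[0] and lon[1] <= parent.lon[1]
    assert parent.lat[0] <= lat[0] and lat[1] <= parent.lat[1]
    return Domain(name, lon, lat, dx=parent.dx / refinement)


globe = Domain("global", lon=(-180, 180), lat=(-90, 90), dx=1.0)
nw_shelf = nest(globe, "NW European shelf", lon=(-15, 15), lat=(45, 62))
print(nw_shelf)   # child domain with dx = 1/3 degree
```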

Relevance:

30.00%

Abstract:

This review introduces the methods used to simulate the processes affecting dissolved oxygen (DO) in lowland rivers. The important processes are described, providing a framework for representing them in a mass-balance model. The process equations introduced all require (reaction) rate parameters, and a variety of common procedures for identifying those parameters are reviewed. This is important because there is a wide range of estimation techniques for many of the parameters; different techniques elicit different estimates of a parameter's value, so there is the potential for significant uncertainty in the model's inputs and therefore in its output. Finally, the data requirements for modelling DO in lowland rivers with a mass-balance model of the processes described in this review are summarised, with regard to what data are available and where they might be obtained.
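The review does not prescribe a single equation set, but the classic Streeter-Phelps mass balance illustrates the kind of process equations and rate parameters it discusses (here k_d for BOD decay and k_a for reaeration; all values below are invented):

```python
# Streeter-Phelps DO mass balance for a river reach (illustrative values).
def streeter_phelps(L0, D0, k_d=0.35, k_a=0.70, days=10.0, dt=0.01):
    """Integrate dL/dt = -k_d*L and dD/dt = k_d*L - k_a*D (mg/l, 1/day)."""
    L, D, t = L0, D0, 0.0
    series = [(t, D)]
    while t < days:
        dL = -k_d * L                # BOD decays
        dD = k_d * L - k_a * D       # deficit: decay source, reaeration sink
        L, D, t = L + dt * dL, D + dt * dD, t + dt
        series.append((t, D))
    return series

DO_SAT = 9.0                         # assumed saturation DO, mg/l
for t, D in streeter_phelps(L0=12.0, D0=1.0)[::200]:
    print(f"day {t:4.1f}  DO = {DO_SAT - D:.2f} mg/l")
```

The sensitivity of the simulated DO sag to the choice of k_d and k_a is exactly why the review's concern about divergent parameter-estimation techniques matters.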

Relevance:

30.00%

Abstract:

A wide variety of exposure models is currently employed for health risk assessments. Individual models have been developed to meet the chemical exposure assessment needs of Government, industry and academia. These existing exposure models can be broadly categorised according to the type of exposure source: environmental, dietary, consumer product, occupational, and aggregate and cumulative. Aggregate exposure models consider multiple exposure pathways, while cumulative models consider multiple chemicals. In this paper each of these basic types of exposure model is briefly described, along with its inherent strengths and weaknesses, with the UK as a case study. Examples are given of specific exposure models that are currently used, or that have the potential for future use, and key differences in the modelling approaches adopted are discussed. The use of exposure models is currently fragmentary. Organisations with exposure assessment responsibilities tend to use a limited range of models, and the modelling techniques adopted in current exposure models have evolved along distinct lines for the various types of source; indeed, different organisations may use different models for very similar exposure assessment situations. This lack of consistency between exposure modelling practices can make the exposure assessment process harder to understand, can lead to inconsistency between organisations in how critical modelling issues (e.g. variability and uncertainty) are addressed, and has the potential to communicate mixed messages to the general public. Further work should be conducted to integrate the various approaches and models, where possible and where regulatory remits allow, to achieve a coherent and consistent exposure modelling process. We recommend the development of an overall framework for exposure and risk assessment with common approaches and methodology, a screening tool for exposure assessment, the collection of better input data, probabilistic modelling, validation of model input and output, and closer working relationships between scientists, policy makers and staff from different Government departments. A much increased effort is required in the UK to address these issues; the result will be a more robust, transparent, valid and more comparable exposure and risk assessment process.
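The aggregate/cumulative distinction can be sketched as follows. All chemicals, pathways and figures are invented, and summing doses across chemicals is a crude stand-in for formal cumulative methods such as hazard indices.

```python
# Aggregate exposure: one chemical, several pathways.
# Cumulative exposure: repeat over several chemicals. Values are invented.
def daily_dose(concentration, intake_rate, body_weight=70.0):
    """Average daily dose in mg/kg/day from one pathway."""
    return concentration * intake_rate / body_weight

# chemical -> pathway -> (concentration in medium, daily intake of medium)
exposure_data = {
    "chemical_A": {"diet": (0.02, 1.5), "air": (0.001, 20.0)},
    "chemical_B": {"diet": (0.05, 1.5)},
}

for chemical, pathways in exposure_data.items():
    aggregate = sum(daily_dose(c, q) for c, q in pathways.values())
    print(f"{chemical}: aggregate dose = {aggregate:.5f} mg/kg/day")

cumulative = sum(
    daily_dose(c, q)
    for pathways in exposure_data.values()
    for c, q in pathways.values()
)
print(f"cumulative dose across chemicals = {cumulative:.5f} mg/kg/day")
```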

Relevance:

30.00%

Abstract:

Increased atmospheric deposition of inorganic nitrogen (N) may lead to increased leaching of nitrate (NO3-) to surface waters. The mechanisms responsible for, and controls on, this leaching are matters of debate. An experimental N addition was conducted at Gårdsjön, Sweden, within the EU-funded project NITREX, to determine the magnitude, and identify the mechanisms, of N leaching from forested catchments. The ability of INCA-N, a simple process-based model of catchment N dynamics, to simulate catchment-scale inorganic N dynamics in soil and stream water during the course of the experimental addition is evaluated. Simulations were performed for 1990-2002; experimental N addition began in 1991. INCA-N successfully reproduced stream and soil water dynamics before and during the experiment. While INCA-N did not correctly simulate the lag between the start of N addition and NO3- breakthrough, the model was able to simulate the state change resulting from increased N deposition. Sensitivity analysis showed that model behaviour was controlled primarily by parameters related to hydrology and vegetation dynamics, and secondarily by in-soil processes.
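The kind of one-at-a-time sensitivity analysis mentioned above can be sketched on a toy nitrate balance. This is not INCA-N; the model form and parameter values are invented.

```python
# One-at-a-time sensitivity analysis on a hypothetical one-box N model.
def annual_no3_leaching(deposition, uptake_rate, denit_rate, flow):
    """Toy N balance (kg N/ha/yr); form and parameters are invented."""
    retained = deposition * (uptake_rate + denit_rate)
    return max(0.0, deposition - retained) * flow

base = {"deposition": 15.0, "uptake_rate": 0.55,
        "denit_rate": 0.25, "flow": 0.8}
base_out = annual_no3_leaching(**base)

for name in base:
    perturbed = dict(base, **{name: base[name] * 1.10})   # +10% perturbation
    delta = annual_no3_leaching(**perturbed) - base_out
    print(f"+10% {name:12s} -> leaching changes by {delta:+.2f} kg N/ha/yr")
```

Ranking parameters by the size of these responses is the simplest way to identify, as the study did, which process groups dominate model behaviour.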

Relevance:

30.00%

Abstract:

A new model, RothPC-1, is described for the turnover of organic C in the top metre of soil. RothPC-1 is a version of RothC-26.3, an earlier model for the turnover of C in topsoils. In RothPC-1 two extra parameters are used to model turnover in the top metre of soil: one, p, which moves organic C down the profile by an advective process, and the other, s, which slows decomposition with depth. RothPC-1 is parameterized and tested using measurements (described in Part 1, this issue) of total organic C and radiocarbon on soil profiles from the Rothamsted long-term field experiments, collected over a period of more than 100 years. RothPC-1 gives fits to measurements of organic C and radiocarbon in the 0-23, 23-46, 46-69 and 69-92 cm layers of soil that are almost all within (or close to) measurement error in two areas of regenerating woodland (Geescroft and Broadbalk Wildernesses) and an area of cultivated land from the Broadbalk Continuous Wheat Experiment. The fits to old grassland (the Park Grass Experiment) are less close. Two other sites that provide the requisite pre- and post-bomb radiocarbon data are also fitted: a prairie Chernozem from Russia and an annual grassland from California. RothPC-1 gives a close fit to measurements of organic C and radiocarbon down the Chernozem profile, provided that allowance is made for soil age; with the annual grassland the fit is acceptable in the upper part of the profile, but not in the clay-rich Bt horizon below. Calculations suggest that treating the top metre of soil as a homogeneous unit will greatly overestimate the effects of global warming in accelerating the decomposition of soil C, and hence the enhanced release of CO2 from soil organic matter; more realistic estimates will be obtained from multi-layer models such as RothPC-1.
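A minimal sketch of the two ideas RothPC-1 adds, downward advection at rate p and decomposition slowing with depth via s, might look like the following. The single-pool structure, functional forms and values here are illustrative, not the published parameterization.

```python
# Toy multi-layer C turnover: advection moves C down, decomposition slows
# with depth. Not RothPC-1's pool structure; values are illustrative.
import math

def step_profile(carbon, inputs, k_top=0.05, p=0.1, s=0.5, dz=0.23, dt=1.0):
    """Advance layer C stocks (t C/ha) one year. carbon: topsoil first."""
    n = len(carbon)
    new = carbon[:]
    for i in range(n):
        k = k_top * math.exp(-s * i)        # decomposition slows with depth
        decomposed = k * carbon[i] * dt
        advected = p * carbon[i] * dt       # moved to the layer below
        new[i] += (inputs if i == 0 else 0.0) - decomposed - advected
        if i + 1 < n:
            new[i + 1] += advected          # bottom layer loses C below 1 m
    return new

profile = [30.0, 15.0, 8.0, 4.0]            # 23-cm layers, topsoil first
for year in range(100):
    profile = step_profile(profile, inputs=1.5)
print([round(c, 1) for c in profile])
```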

Relevance:

30.00%

Abstract:

A semi-distributed model, INCA, has been developed to determine the fate and distribution of nutrients in terrestrial and aquatic systems. The model simulates nitrogen and phosphorus processes in soils, groundwaters and river systems, and can be applied in a semi-distributed manner at a range of scales. In this study, the model has been applied at field, sub-catchment and whole-catchment scales to evaluate biosolid-derived losses of P in agricultural systems. It is shown that process-based models such as INCA, applied at a wide range of scales, reproduce field and catchment behaviour satisfactorily. The INCA model can also be used to generate generic information for risk assessment. By adjusting three key variables within the model (biosolid application rates, the hydrological connectivity of the catchment and the initial P status of the soils), a matrix of P loss rates can be generated to evaluate the behaviour of the model and, hence, of the catchment system. The results, which indicate the sensitivity of the catchment to flow paths, application rates and initial soil conditions, have been incorporated into a Nutrient Export Risk Matrix (NERM).
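The matrix-generation idea can be sketched as a loop over the three variables. The loss function and risk thresholds below are invented stand-ins for INCA runs and the published NERM classes.

```python
# Generate a risk matrix over application rate, connectivity and soil P.
# The loss model and class thresholds are hypothetical placeholders.
import itertools

def p_loss(application, connectivity, soil_p):
    """Hypothetical P loss rate (kg P/ha/yr), standing in for an INCA run."""
    return 0.01 * soil_p + connectivity * 0.02 * application

def risk_class(loss):
    return "low" if loss < 0.4 else "medium" if loss < 1.0 else "high"

rates = [0, 5, 10]              # biosolid application, t/ha (assumed)
connectivities = [0.2, 0.5, 0.8]
soil_p_levels = [20, 60, 100]   # initial soil P status (assumed units)

for app, conn, sp in itertools.product(rates, connectivities, soil_p_levels):
    loss = p_loss(app, conn, sp)
    print(f"app={app:2d} conn={conn:.1f} soilP={sp:3d} -> "
          f"{loss:.2f} kg P/ha/yr ({risk_class(loss)})")
```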

Relevance:

30.00%

Abstract:

The aim of this work was to couple a nitrogen (N) sub-model to existing lumped (LU4-N) and semi-distributed (LU4-R-N and SD4-R-N) conceptual hydrological models, to improve our understanding of the factors and processes controlling nitrogen cycling and losses in Mediterranean catchments. The N model adopted provides a simplified conceptualization of the soil nitrogen cycle, considering mineralization, nitrification, immobilization, denitrification, plant uptake, and ammonium adsorption/desorption. It also includes nitrification and denitrification in the shallow perched aquifer. We included a soil moisture threshold for all the soil biological processes considered. The results suggested that all the nitrogen processes were highly influenced by rain episodes and that soil microbial processes occurred in pulses stimulated by increasing soil moisture after rain. Our simulation highlighted the riparian zone as a possible source of nitrate, especially after the summer drought period, but it can also act as an important sink of nitrate due to denitrification, particularly during the wettest period of the year. The riparian zone was a key element in simulating the catchment's nitrate behaviour. The lumped LU4-N model (which does not include the riparian zone) could not be validated, while both semi-distributed models, LU4-R-N and SD4-R-N (which include the riparian zone), gave satisfactory results in calibration and acceptable results in temporal validation.
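The soil moisture threshold can be sketched as a simple switch-and-scale applied to each biological rate. The functional form and numbers are our assumptions, not the paper's.

```python
# Moisture-thresholded process rates: off below the threshold, scaled by
# wetness above it. Form and parameter values are assumptions.
def effective_rate(base_rate, moisture, threshold=0.25, saturation=0.45):
    """Scale a process rate by volumetric soil moisture."""
    if moisture <= threshold:
        return 0.0                               # process switched off
    scale = min(1.0, (moisture - threshold) / (saturation - threshold))
    return base_rate * scale

for theta in (0.15, 0.25, 0.30, 0.40, 0.50):
    nit = effective_rate(base_rate=2.0, moisture=theta)    # nitrification
    den = effective_rate(base_rate=0.8, moisture=theta)    # denitrification
    print(f"moisture={theta:.2f}: nitrification={nit:.2f}, "
          f"denitrification={den:.2f} (mg N/kg/day, assumed)")
```

A threshold of this kind is what produces the pulsed, rain-driven behaviour the simulations describe: rates jump from zero as wetting fronts push moisture past the threshold.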

Relevance:

30.00%

Abstract:

For the very large nonlinear dynamical systems that arise in a wide range of physical, biological and environmental problems, the data needed to initialize a numerical forecasting model are seldom available. To generate accurate estimates of the expected states of the system, both current and future, the technique of ‘data assimilation’ is used to combine the numerical model predictions with observations of the system measured over time. Assimilation of data is an inverse problem that, for very large-scale systems, is generally ill-posed. In four-dimensional variational assimilation schemes, the dynamical model equations provide constraints that act to spread information into data-sparse regions, enabling the state of the system to be reconstructed accurately. The mechanism for this is not well understood. Singular value decomposition techniques are applied here to the observability matrix of the system in order to analyse the critical features of this process. Simplified models are used to demonstrate how information is propagated from observed regions into unobserved areas. The impact of the size of the observational noise and of the temporal position of the observations is examined. The best signal-to-noise ratio needed to extract the most information from the observations is estimated using Tikhonov regularization theory.
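A small linear analogue captures the analysis chain described above: build the observability matrix, inspect its singular values, and recover the initial state by Tikhonov regularization. The system below is a toy of our own construction, not the paper's models.

```python
# Toy observability/Tikhonov demonstration for x_{k+1} = M x_k, y_k = H x_k.
import numpy as np

n, steps = 6, 8
rng = np.random.default_rng(0)
M = np.eye(n) + 0.1 * rng.standard_normal((n, n))   # toy dynamics
H = np.zeros((2, n)); H[0, 0] = H[1, 1] = 1.0       # observe 2 of 6 states

# Observability matrix G stacks H, HM, HM^2, ...: dynamics spread
# information from the 2 observed components into the other 4.
blocks, Mk = [], np.eye(n)
for _ in range(steps):
    blocks.append(H @ Mk)
    Mk = M @ Mk
G = np.vstack(blocks)

x0_true = rng.standard_normal(n)
y = G @ x0_true + 0.01 * rng.standard_normal(G.shape[0])   # noisy obs

print("singular values of G:", np.round(np.linalg.svd(G)[1], 3))

alpha = 0.05                                   # Tikhonov parameter (assumed)
x0_hat = np.linalg.solve(G.T @ G + alpha**2 * np.eye(n), G.T @ y)
print("recovery error:", np.linalg.norm(x0_hat - x0_true))
```

Small singular values of G mark directions of the initial state that the observations barely constrain; the regularization term alpha damps exactly those directions, which is the trade-off the signal-to-noise analysis quantifies.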

Relevance:

30.00%

Abstract:

Europe's widely distributed climate modelling expertise, now organized in the European Network for Earth System Modelling (ENES), is both a strength and a challenge. Recognizing this, the European Union's Program for Integrated Earth System Modelling (PRISM) infrastructure project aims to design a flexible and user-friendly environment in which to assemble, run and post-process Earth system models. PRISM started in December 2001 with a duration of three years. This paper presents the major stages of PRISM, including: (1) the definition and promotion of scientific and technical standards to increase component modularity; (2) the development of an end-to-end software environment (graphical user interface, coupling and I/O system, diagnostics, visualization) to launch, monitor and analyse complex Earth system models built around state-of-the-art community component models (atmosphere, ocean, atmospheric chemistry, ocean biochemistry, sea ice, land surface); and (3) testing and quality standards to ensure high performance on a variety of computing platforms. PRISM is emerging as a core strategic software infrastructure for building the European research area in Earth system sciences.
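The coupling idea, component models advancing independently while exchanging fields through a central coupler, can be caricatured as follows. This is purely conceptual and is not PRISM's actual coupler or its API.

```python
# Conceptual caricature of component coupling: models step independently
# and exchange boundary fields at fixed intervals. Names are invented.
class Component:
    def __init__(self, name, exports):
        self.name, self.fields = name, dict(exports)

    def step(self):                       # stand-in for real model physics
        for k in self.fields:
            self.fields[k] += 0.1

class Coupler:
    def __init__(self, atmosphere, ocean):
        self.atmosphere, self.ocean = atmosphere, ocean

    def run(self, steps, exchange_every=2):
        for t in range(steps):
            self.atmosphere.step()
            self.ocean.step()
            if t % exchange_every == 0:   # exchange boundary fields
                sst = self.ocean.fields["sst"]
                self.atmosphere.fields["sst_seen"] = sst
                print(f"t={t}: atmosphere receives SST={sst:.2f}")

atmos = Component("atmosphere", {"sst_seen": 0.0})
ocean = Component("ocean", {"sst": 15.0})
Coupler(atmos, ocean).run(steps=6)
```

Standardising that exchange interface, rather than each field transfer being hand-wired, is the modularity that point (1) of the paper targets.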

Relevance:

30.00%

Abstract:

Context: Learning can be regarded as knowledge construction, in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such knowledge construction has to take place continuously in order to sustain the learner's competence in a competitive working environment. As information consumers, individual users demand personalised information provision that meets their own specific purposes, goals and expectations.

Objectives: Current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. A user's requirements can be represented as a case in a defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled; however, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing an individual user's requirements and transforming them into personalised information provision specifications, so that the right information can be provided to the right user for the right purpose.

Method: An experiment was conducted using a qualitative approach. A medium-sized group of users participated to validate the method and its techniques, i.e. articulate, map, configure, and learning content. The results were used as feedback for improvement.

Result: The research has produced an ontology model with a set of techniques that support profiling users' requirements, reasoning over requirements patterns, generating workflows from norms, and formulating information provision specifications.

Conclusion: Current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance these approaches by modelling individual users' needs and discovering users' requirements.
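One toy rendering of the requirements-to-provision transformation follows. The case structure, catalogue fields and matching rule are invented for illustration; the paper's ontology charts and norms are far richer.

```python
# A user requirement captured as a structured case, matched against content
# metadata to yield a provision specification. All names are hypothetical.
user_case = {
    "goal": "learn process modelling",
    "prior_knowledge": ["UML"],
    "preferred_format": "tutorial",
}

catalogue = [
    {"title": "BPMN basics", "format": "tutorial", "requires": []},
    {"title": "Petri nets", "format": "paper", "requires": ["maths"]},
    {"title": "From UML to BPMN", "format": "tutorial", "requires": ["UML"]},
]

def provision_spec(case, items):
    """Select items the user can consume, preferred format first."""
    usable = [i for i in items
              if set(i["requires"]) <= set(case["prior_knowledge"])]
    usable.sort(key=lambda i: i["format"] != case["preferred_format"])
    return {"goal": case["goal"], "deliver": [i["title"] for i in usable]}

print(provision_spec(user_case, catalogue))
# {'goal': 'learn process modelling',
#  'deliver': ['BPMN basics', 'From UML to BPMN']}
```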