922 results for Marine systems modelling
Abstract:
A large and complex IT project may involve multiple organizations and be constrained within a temporal period. An organization is a system comprising people, activities, processes, information, resources and goals. Understanding and modelling such a project and its interrelationships with the relevant organizations are essential for organizational project planning. This paper introduces the problem articulation method (PAM) as a semiotic method for organizational infrastructure modelling. PAM offers a suite of techniques which enables the articulation of business, technical and organizational requirements, delivering an infrastructural framework to support the organization. It works by eliciting and formalizing key organizational elements (e.g. processes, activities, relationships, responsibilities, communications, resources, agents, dependencies and constraints) and mapping these abstractions to represent the manifestation of the "actual" organization. Many analysts forgo organizational modelling methods in favour of localized, ad hoc point solutions, but these are not suited to modelling organizational infrastructures. A case study of the infrared atmospheric sounding interferometer (IASI) is used to demonstrate the applicability of PAM and to examine its relevance and significance in dealing with innovation and change in organizations.
Abstract:
A multivariable hyperstable robust adaptive decoupling control algorithm based on a neural network is presented for the control of nonlinear multivariable coupled systems with unknown parameters and structure. The Popov theorem is used in the design of the controller. The modelling errors, coupling action and other uncertainties of the system are identified on-line by a neural network, and the identified quantities are used as compensation signals so that robust adaptive control of the nonlinear system is achieved. Simulation results are given.
Abstract:
This paper discusses how the use of computer-based modelling tools has aided the design of a telemetry unit for use in oil well logging. With the aid of modern computer-based simulation techniques, the new design is capable of operating at data rates 2.5 times higher than those of previous designs.
Abstract:
Acquiring a mechanistic understanding of the role of biotic feedbacks in the links between atmospheric CO2 concentrations and temperature is essential for trustworthy climate predictions. Currently, computer-based simulations are the only available tool for estimating the global impact of biotic feedbacks on future atmospheric CO2 and temperatures. Here we propose an alternative and complementary approach using materially closed, energetically open analogue/physical models of the carbon cycle. We argue that there is potential in using a materially closed approach to improve our understanding of the magnitude and sign of many biotic feedbacks, and that recent technological advances make this feasible. We also suggest how such systems could be designed and discuss the advantages and limitations of establishing physical models of the global carbon cycle.
Abstract:
High rates of nutrient loading from agricultural and urban development have resulted in surface water eutrophication and groundwater contamination in regions of Ontario. In Lake Simcoe (Ontario, Canada), anthropogenic nutrient inputs have contributed to increased algal growth, low hypolimnetic oxygen concentrations, and impaired fish reproduction. An ambitious programme has been initiated to reduce phosphorus loads to the lake, aiming to achieve at least a 40% reduction in phosphorus loads by 2045. Achieving this target requires effective remediation strategies, which in turn rely on an improved understanding of the controls on nutrient export from the tributaries of Lake Simcoe and of the importance of phosphorus cycling within the lake. In this paper, we describe a new model structure for the integrated, dynamic and process-based model INCA-P, which allows fully-distributed applications suited to branched river networks. We demonstrate application of this model to the Black River, a tributary of Lake Simcoe, and use INCA-P to simulate the fluxes of P entering the lake system, to apportion phosphorus among different sources in the catchment, and to explore future scenarios of land-use change and nutrient management in order to identify high-priority sites for the implementation of watershed best management practices.
Abstract:
We develop a complex-valued (CV) B-spline neural network approach for efficient identification and inversion of CV Wiener systems. The CV nonlinear static function in the Wiener system is represented using the tensor product of two univariate B-spline neural networks. With the aid of a least squares parameter initialisation, the Gauss-Newton algorithm effectively estimates the model parameters, which include the CV linear dynamic model coefficients and the B-spline neural network weights. The identification algorithm naturally incorporates the efficient De Boor algorithm with both the B-spline curve and first-order derivative recursions. An accurate inverse of the CV Wiener system is then obtained, in which the inverse of the CV nonlinear static function is calculated efficiently using the Gauss-Newton algorithm based on the estimated B-spline neural network model, again with the aid of the De Boor recursions. The effectiveness of our approach for identification and inversion of CV Wiener systems is demonstrated in the design of digital predistorters for high power amplifiers with memory.
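As background for the B-spline machinery referred to above, the following short Python sketch (an illustration only, not the authors' code; the knot vector, degree and variable names are assumptions) shows how the Cox-de Boor recursion at the heart of De Boor's algorithm evaluates all basis functions of a given degree at one real-valued input. The tensor-product CV network described in the abstract combines two such univariate bases.

    import numpy as np

    def bspline_basis(knots, degree, x):
        """All B-spline basis functions of `degree` evaluated at x via the
        Cox-de Boor recursion (the core of De Boor's algorithm)."""
        knots = np.asarray(knots, dtype=float)
        # degree-0 (piecewise-constant) basis: one function per knot span
        B = np.where((knots[:-1] <= x) & (x < knots[1:]), 1.0, 0.0)
        for d in range(1, degree + 1):          # elevate the degree step by step
            nxt = np.zeros(len(B) - 1)
            for i in range(len(nxt)):
                left = right = 0.0
                if knots[i + d] != knots[i]:
                    left = (x - knots[i]) / (knots[i + d] - knots[i]) * B[i]
                if knots[i + d + 1] != knots[i + 1]:
                    right = (knots[i + d + 1] - x) / (knots[i + d + 1] - knots[i + 1]) * B[i + 1]
                nxt[i] = left + right
            B = nxt
        return B                                # length = len(knots) - degree - 1

    # usage: a clamped cubic knot vector and one scalar input
    knots = [0, 0, 0, 0, 1, 2, 3, 3, 3, 3]
    phi = bspline_basis(knots, 3, 1.5)          # basis vector of length 6
    # a spline model output is then a weighted sum of these basis values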
Abstract:
Communication signal processing applications often involve complex-valued (CV) functional representations of signals and systems. CV artificial neural networks have been studied theoretically and applied widely in nonlinear signal and data processing [1–11]. Note that most artificial neural networks cannot be automatically extended from the real-valued (RV) domain to the CV domain, because the resulting model would in general violate the Cauchy-Riemann conditions, which means that the standard training algorithms can no longer be applied. A number of analytic functions were introduced for fully CV multilayer perceptrons (MLPs) [4], and a fully CV radial basis function (RBF) network was introduced in [8] for regression and classification applications. Alternatively, the problem can be avoided by using two RV artificial neural networks, one processing the real part and the other the imaginary part of the CV signal/system. An even more challenging problem is the inversion of a CV system.
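The split real/imaginary workaround mentioned above can be illustrated with a minimal Python sketch (a generic illustration under assumed layer sizes, not code from the cited works): two ordinary real-valued networks take the stacked real and imaginary parts of the CV input and produce the real and imaginary parts of the CV output respectively, so no complex-differentiable activation is needed.

    import numpy as np

    def rv_network(x, W1, W2):
        """A small real-valued feed-forward network with one tanh hidden layer."""
        return W2 @ np.tanh(W1 @ x)

    def split_cv_model(z, params_re, params_im):
        """Split-complex approach: two independent RV networks model the real
        and imaginary parts of a complex-valued mapping."""
        x = np.concatenate([z.real, z.imag])    # stack Re and Im of the CV input
        return rv_network(x, *params_re) + 1j * rv_network(x, *params_im)

    # usage with illustrative sizes: CV input of dimension 2, hidden layer of 8 units
    rng = np.random.default_rng(0)
    params_re = (rng.standard_normal((8, 4)), rng.standard_normal((2, 8)))
    params_im = (rng.standard_normal((8, 4)), rng.standard_normal((2, 8)))
    z = rng.standard_normal(2) + 1j * rng.standard_normal(2)
    y = split_cv_model(z, params_re, params_im)  # complex-valued output of dimension 2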
Abstract:
Understanding how and why the capability of one set of business resources, with its structural arrangements and mechanisms, works better than another can provide competitive advantage in terms of new business processes and product and service development. However, most business models of capability are descriptive and lack a formal modelling language with which to compare capabilities qualitatively and quantitatively. Gibson’s theory of affordance, the potential for action, provides a formal basis for a more robust and quantitative model, but most formal affordance models are complex and abstract and lack support for real-world applications. We aim to understand the ‘how’ and ‘why’ of business capability by developing a quantitative and qualitative model that underpins earlier work on Capability-Affordance Modelling (CAM). This paper integrates an affordance-based capability model with the formalism of Coloured Petri Nets to develop a simulation model. Using the model, we show how capability depends on the space-time path of interacting resources, the mechanism of transition and specific critical affordance factors relating to the values of the variables for resources, people and physical objects. We show how the model can identify the resource capabilities required to inject a drug and anaesthetise a patient.
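To make the Coloured Petri Net formalism concrete, a minimal Python sketch of one coloured transition for the paper's drug-injection example is given below (an illustration of the general idea only; the place names, token attributes and guard condition are assumptions, and a dedicated CPN tool would be used in practice). Tokens carry data (their "colour"), and a transition fires only when it is enabled and its guard over the input tokens holds.

    # places of a toy coloured net; each token is a dict of attributes (its "colour")
    places = {
        "syringe": [{"drug": "anaesthetic", "volume_ml": 5}],
        "patient": [{"state": "conscious"}],
        "anaesthetised": [],
    }

    def fire_inject(places):
        """Fire the 'inject' transition when it is enabled and its guard holds:
        the syringe holds an anaesthetic and the patient is conscious."""
        if not (places["syringe"] and places["patient"]):
            return False                                   # transition not enabled
        syringe, patient = places["syringe"][0], places["patient"][0]
        if syringe["drug"] != "anaesthetic" or patient["state"] != "conscious":
            return False                                   # guard fails
        places["syringe"].pop(0)                           # consume input tokens
        places["patient"].pop(0)
        places["anaesthetised"].append({"state": "anaesthetised"})  # produce output token
        return True

    fire_inject(places)   # True: the capability transition fires and the marking changes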
Abstract:
This paper proposes a novel adaptive multiple-modelling algorithm for non-linear and non-stationary systems. This simple modelling paradigm comprises K candidate sub-models, all of which are linear. With data arriving in an online fashion, the performance of every candidate sub-model is monitored on the most recent data window, and the M best sub-models are selected from the K candidates. The weight coefficients of the selected sub-model are adapted via the recursive least square (RLS) algorithm, while the coefficients of the remaining sub-models are left unchanged. These M model predictions are then optimally combined to produce the multi-model output. We propose to minimise the mean square error over a recent data window and to apply a sum-to-one constraint to the combination parameters, leading to a closed-form solution, so that maximal computational efficiency can be achieved. In addition, at each time step the model prediction is chosen from either the resulting multiple model or the best sub-model, whichever performs better. Simulation results are given in comparison with some typical alternatives, including the linear RLS algorithm and a number of online non-linear approaches, in terms of modelling performance and computation time.
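The two computational ingredients described above, RLS adaptation of a linear sub-model and closed-form sum-to-one combination of the M retained predictions, can be sketched in Python as follows (a generic illustration, not the authors' code; the forgetting factor, window length and variable names are assumptions). Because the weights sum to one, the combined prediction error is the same weighted sum of the individual sub-model errors, which is what yields the closed form.

    import numpy as np

    def rls_update(theta, P, x, y, lam=0.99):
        """One recursive least square (RLS) step for a linear sub-model y ~ theta @ x."""
        Px = P @ x
        k = Px / (lam + x @ Px)                 # gain vector
        theta = theta + k * (y - theta @ x)     # coefficient update
        P = (P - np.outer(k, Px)) / lam         # inverse-correlation matrix update
        return theta, P

    def combination_weights(errors):
        """Closed-form combination weights minimising the mean square error of the
        combined prediction over a recent window, subject to summing to one.
        errors: (window_length, M) matrix of the M sub-models' prediction errors."""
        C = errors.T @ errors / len(errors)     # windowed error covariance
        Cinv = np.linalg.pinv(C)
        ones = np.ones(C.shape[0])
        return Cinv @ ones / (ones @ Cinv @ ones)

    # usage with illustrative sizes: M = 3 sub-models, window of 20 samples
    rng = np.random.default_rng(1)
    beta = combination_weights(rng.standard_normal((20, 3)))
    # multi-model output = beta @ [vector of the M sub-model predictions]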
Abstract:
In this work, thermodynamic models for fitting the phase equilibrium of binary systems were applied with the aim of predicting the high-pressure phase equilibrium of multicomponent systems of interest in food engineering, comparing the results generated by the models with new experimental data and with data from the literature. Two mixing rules were used with the Peng-Robinson equation of state: the van der Waals mixing rule and the composition-dependent mixing rule of Mathias et al. The systems chosen are of fundamental importance in the food industry, namely the binary systems CO2-limonene, CO2-citral and CO2-linalool, and the ternary systems CO2-limonene-citral and CO2-limonene-linalool, for which knowledge of high-pressure phase equilibria is important for extracting and fractionating citrus fruit essential oils. For the CO2-limonene system, some experimental data were also measured in this work. The results showed that the model with the composition-dependent mixing rule is highly capable of describing the phase equilibrium behavior of these systems.
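For reference, the Peng-Robinson equation of state and the classical van der Waals (quadratic) mixing rules take the standard textbook form below (stated generically, not reproduced from the paper; x_i are mole fractions and k_ij is the binary interaction parameter fitted to data):

    P = \frac{RT}{v - b_m} - \frac{a_m(T)}{v(v + b_m) + b_m(v - b_m)},
    \qquad
    a_m = \sum_i \sum_j x_i x_j \sqrt{a_i a_j}\,(1 - k_{ij}),
    \qquad
    b_m = \sum_i x_i b_i .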
Abstract:
Determining the provenance of data, i.e. the process that led to that data, is vital in many disciplines. For example, in science, the process that produced a given result must be demonstrably rigorous for the result to be deemed reliable. A provenance system supports applications in recording adequate documentation about process executions to answer queries regarding provenance, and provides functionality to perform those queries. Several provenance systems are being developed, but all focus on systems in which the components are reactive, for example Web Services that act on the basis of a request, job submission systems, etc. This limitation means that questions regarding the motives of autonomous actors, or agents, in such systems remain unanswerable in the general case. Such questions include: who was ultimately responsible for a given effect, what was their reason for initiating the process, and does the effect of a process match what those initiating it intended? In this paper, we address this limitation by integrating two solutions: a generic, re-usable framework for representing the provenance of data in service-oriented architectures, and a model for describing the goal-oriented delegation and engagement of agents in multi-agent systems. Using these solutions, we present algorithms to answer common questions regarding the responsibility for and success of a process, and we evaluate the approach with a simulated healthcare example.