123 results for modelling the robot
Abstract:
During the last few years Enterprise Architecture (EA) has received increasing attention in industry and academia. By adopting EA, organisations may gain a number of benefits such as better decision making, increased revenues and cost reduction, and alignment of business and IT. However, EA adoption has been found to be difficult. In this paper a model to explain resistance during the EA adoption process (REAP) is introduced and validated. The model reveals relationships between the strategic level of EA, the resulting organisational changes, and sources of resistance. By utilising the REAP model, organisations may anticipate and prepare for organisational change resistance during EA adoption.
Abstract:
We present a general approach based on nonequilibrium thermodynamics for bridging the gap between a well-defined microscopic model and the macroscopic rheology of particle-stabilised interfaces. Our approach is illustrated by starting with a microscopic model of hard ellipsoids confined to a planar surface, which is intended to simply represent a particle-stabilised fluid–fluid interface. More complex microscopic models can be readily handled using the methods outlined in this paper. From the aforementioned microscopic starting point, we obtain the macroscopic, constitutive equations using a combination of systematic coarse-graining, computer experiments and Hamiltonian dynamics. Exemplary numerical solutions of the constitutive equations are given for a variety of experimentally relevant flow situations to explore the rheological behaviour of our model. In particular, we calculate the shear and dilatational moduli of the interface over a wide range of surface coverages, ranging from the dilute isotropic regime, to the concentrated nematic regime.
Abstract:
This chapter presents a simple econometric model of the medieval English economy, focusing on the relationship between money, prices and incomes. The model is estimated using annual data for the period 1263-1520 obtained from various sources. The start date is determined by the availability of continuous runs of annual data, while the finishing date immediately precedes the take-off of Tudor price inflation. Accounts from the ecclesiastical and monastic estates have survived in great numbers for this period, thereby ensuring that crop yields can be estimated from a regionally representative set of estates.
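The abstract does not give the model's functional form. As a hedged illustration only, the sketch below fits a quantity-theory-style regression of log prices on log money and log real income, the kind of money-prices-incomes relationship such a model might contain; all series are synthetic placeholders for the 1263-1520 annual data, not the paper's sources.

```python
# Hedged sketch: a quantity-theory-style regression of the kind such a model
# might use (the paper's actual specification is not given in the abstract).
# All data here are synthetic placeholders for the 1263-1520 annual series.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1263, 1521)
log_money = np.cumsum(rng.normal(0.01, 0.05, years.size))    # placeholder money stock
log_income = np.cumsum(rng.normal(0.002, 0.03, years.size))  # placeholder real income
log_prices = 0.8 * log_money - 0.5 * log_income + rng.normal(0.0, 0.1, years.size)

# OLS of log prices on log money and log real income (plus a constant)
X = np.column_stack([np.ones(years.size), log_money, log_income])
beta, *_ = np.linalg.lstsq(X, log_prices, rcond=None)
print(dict(zip(["const", "money_elasticity", "income_elasticity"], beta.round(3))))
```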
Abstract:
New models for estimating bioaccumulation of persistent organic pollutants in the agricultural food chain were developed using recent improvements to plant uptake and cattle transfer models. One model, named AgriSim, was based on K_OW regressions of bioaccumulation in plants and cattle, while the other, AgriCom, was a steady-state mechanistic model. The two developed models and the European Union System for the Evaluation of Substances (EUSES), as a benchmark, were applied to four reported food chain (soil/air-grass-cow-milk) scenarios to evaluate the performance of each model simulation against the observed data. The four scenarios considered were as follows: (1) polluted soil and air, (2) polluted soil, (3) highly polluted soil surface and polluted subsurface, and (4) polluted soil and air at different mountain elevations. AgriCom reproduced observed milk bioaccumulation well for all four scenarios, as did AgriSim for scenarios 1 and 2, but EUSES only did so for scenario 1. The main causes of the deviation for EUSES and AgriSim were the lack of the soil-air-plant pathway and the ambient air-plant pathway, respectively. Based on the results, it is recommended that the soil-air-plant and ambient air-plant pathways be calculated separately and that the K_OW regression of the transfer factor to milk used in EUSES be avoided. AgriCom satisfied these recommendations, which led to low residual errors between the simulated and the observed bioaccumulation in the agricultural food chain for the four scenarios considered. It is therefore recommended that this model be incorporated into regulatory exposure assessment tools. The model uncertainty of the three models should be noted, since the simulated concentration in milk from the 5th to the 95th percentile of the uncertainty analysis often varied over two orders of magnitude. Using a measured value of soil organic carbon content was effective in reducing this uncertainty by one order of magnitude.
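As a hedged illustration of the K_OW-regression approach attributed to AgriSim (neither the published regressions nor the AgriCom mechanistic equations are reproduced here), the sketch below chains a hypothetical soil-to-grass uptake factor and feed-to-milk transfer factor; every coefficient is a placeholder.

```python
# Hedged sketch of a generic K_OW-regression food-chain transfer of the kind
# AgriSim is described as using; the regression coefficients below are
# hypothetical placeholders, not the published values.

def plant_uptake_factor(log_kow, a=-0.5, b=1.0):
    """Soil-to-grass bioaccumulation factor from a log K_OW regression (illustrative)."""
    return 10 ** (a * log_kow + b)

def milk_transfer_factor(log_kow, c=1.0, d=-8.0):
    """Feed-to-milk transfer factor (d/kg) from a log K_OW regression (illustrative)."""
    return 10 ** (c * log_kow + d)

def milk_concentration(c_soil, log_kow, grass_intake_kg_per_day=60.0):
    c_grass = c_soil * plant_uptake_factor(log_kow)        # mg/kg grass
    daily_intake = c_grass * grass_intake_kg_per_day       # mg/day ingested by the cow
    return daily_intake * milk_transfer_factor(log_kow)    # mg/L milk

print(milk_concentration(c_soil=0.1, log_kow=6.5))
```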
Abstract:
Objective: To evaluate the effect of robot-mediated therapy on arm dysfunction post stroke. Design: A series of single-case studies using a randomized multiple baseline design with ABC or ACB order. Subjects (n = 20) had a baseline length of 8, 9 or 10 data points. They continued measurement during the B (robot-mediated therapy) and C (sling suspension) phases. Setting: Physiotherapy department, teaching hospital. Subjects: Twenty subjects with varying degrees of motor and sensory deficit completed the study. Subjects attended three times a week, with each phase lasting three weeks. Interventions: In the robot-mediated therapy phase they practised three functional exercises with haptic and visual feedback from the system. In the sling suspension phase they practised three single-plane exercises. Each treatment phase was three weeks long. Main measures: The range of active shoulder flexion, the Fugl-Meyer motor assessment and the Motor Assessment Scale were measured at each visit. Results: Each subject had a varied response to the measurement and intervention phases. The rate of recovery was greater during the robot-mediated therapy phase than in the baseline phase for the majority of subjects. The rate of recovery during the robot-mediated therapy phase was also greater than that during the sling suspension phase for most subjects. Conclusion: The positive treatment effect for both groups suggests that robot-mediated therapy can have a treatment effect greater than the same duration of non-functional exercises. Further studies investigating the optimal duration of treatment in the form of a randomized controlled trial are warranted.
Abstract:
Since 1998, the Aurora project has been investigating the use of a robotic platform as a tool for use in therapy with children with autism. A key issue in this project is the evaluation of the interactions, which are unconstrained and involve the child moving freely. Additionally, the response of the children is an important factor which must emerge from the robot trial sessions and the evaluation methodology, in order to guide further development work.
Abstract:
This paper outlines some rehabilitation applications of manipulators and identifies that new approaches demand that the robot make intimate contact with the user. New generations of manipulators with programmable compliance, together with higher-level controllers that can set the compliance appropriately for the task, are both feasible propositions. We must thus gain a greater insight into the way in which a person interacts with a machine, particularly given that the interaction may be non-passive. We are primarily interested in the change in wrist and arm dynamics as the person co-contracts his/her muscles. It is observed that this leads to a change in stiffness that can push an actuated interface into a limit cycle. We use both experimental results gathered from a PHANToM haptic interface and a mathematical model to observe this effect. The results are relevant to the fields of rehabilitation and therapy robots, haptic interfaces, and telerobotics.
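The paper's own model and PHANToM experiments are not reproduced in the abstract. The sketch below is a minimal, assumed sampled-data coupling of a haptic device with a spring-damper arm model and a saturated, sampled-and-held virtual spring, offered only as one way to explore how changing arm stiffness interacts with such a loop; all parameter values are illustrative placeholders.

```python
# Hedged sketch: a 1-DOF device driven by a sampled, zero-order-held and
# saturated virtual spring, coupled to a spring-damper arm model. Sweeping the
# arm stiffness lets one explore whether the response decays or settles into a
# sustained oscillation (limit cycle). Parameters are illustrative only.

def simulate(k_arm, k_virtual=2000.0, f_max=5.0, m=0.1, b_dev=0.2, b_arm=0.3,
             T_sample=1e-3, dt=1e-5, t_end=2.0, x0=0.01):
    """Peak |displacement| (m) over the last 0.5 s: a sustained value suggests a limit cycle."""
    x, v, f_held = x0, 0.0, 0.0
    steps_per_sample = int(round(T_sample / dt))
    n_steps = int(t_end / dt)
    tail = []
    for i in range(n_steps):
        if i % steps_per_sample == 0:
            # virtual spring force: sampled, held between samples, saturated at f_max
            f_held = max(-f_max, min(f_max, -k_virtual * x))
        a = (f_held - (b_dev + b_arm) * v - k_arm * x) / m   # device + arm dynamics
        v += a * dt
        x += v * dt
        if i >= n_steps - int(0.5 / dt):
            tail.append(abs(x))
    return max(tail)

for k_arm in (50.0, 500.0, 5000.0):   # increasing co-contraction stiffness (N/m)
    print(k_arm, round(simulate(k_arm), 4))
```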
Abstract:
The robot control problem is discussed with regard to controller implementation on a multitransputer array. Some high-performance aspects required of such controllers are described, with particular reference to robot force control. The implications for the architecture required for controllers based on computed torque are discussed and an example is described. The idea of treating a transputer array as a virtual bus is put forward for the implementation of fast real-time controllers. An example is given of controlling a Puma 560 industrial robot. Some of the practical considerations for using transputers for such control are described.
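The abstract names computed torque as the controller basis. The sketch below shows the standard computed-torque control law with placeholder dynamics terms; it is not the paper's Puma 560 implementation or its transputer mapping.

```python
# Hedged sketch of the standard computed-torque law referred to in the abstract;
# M, C and g below are constant placeholder dynamics terms, not the Puma 560 model.
import numpy as np

def computed_torque(q, qd, q_des, qd_des, qdd_des, M, C, g, Kp, Kd):
    """tau = M(q) (qdd_des + Kd (qd_des - qd) + Kp (q_des - q)) + C(q, qd) qd + g(q)."""
    e, ed = q_des - q, qd_des - qd
    return M(q) @ (qdd_des + Kd @ ed + Kp @ e) + C(q, qd) @ qd + g(q)

# Illustrative 2-joint placeholders (constant inertia, no Coriolis, no gravity):
M = lambda q: np.diag([2.0, 1.0])
C = lambda q, qd: np.zeros((2, 2))
g = lambda q: np.zeros(2)
Kp, Kd = np.diag([100.0, 100.0]), np.diag([20.0, 20.0])

tau = computed_torque(q=np.zeros(2), qd=np.zeros(2),
                      q_des=np.array([0.5, -0.2]), qd_des=np.zeros(2),
                      qdd_des=np.zeros(2), M=M, C=C, g=g, Kp=Kp, Kd=Kd)
print(tau)
```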
Abstract:
To bridge the gaps between traditional mesoscale modelling and microscale modelling, the National Center for Atmospheric Research, in collaboration with other agencies and research groups, has developed an integrated urban modelling system coupled to the weather research and forecasting (WRF) model as a community tool to address urban environmental issues. The core of this WRF/urban modelling system consists of the following: (1) three methods with different degrees of freedom to parameterize urban surface processes, ranging from a simple bulk parameterization to a sophisticated multi-layer urban canopy model with an indoor–outdoor exchange sub-model that directly interacts with the atmospheric boundary layer, (2) coupling to fine-scale computational fluid dynamic Reynolds-averaged Navier–Stokes and Large-Eddy simulation models for transport and dispersion (T&D) applications, (3) procedures to incorporate high-resolution urban land use, building morphology, and anthropogenic heating data using the National Urban Database and Access Portal Tool (NUDAPT), and (4) an urbanized high-resolution land data assimilation system. This paper provides an overview of this modelling system; addresses the daunting challenges of initializing the coupled WRF/urban model and of specifying the potentially vast number of parameters required to execute the WRF/urban model; explores the model sensitivity to these urban parameters; and evaluates the ability of WRF/urban to capture urban heat islands, complex boundary-layer structures aloft, and urban plume T&D for several major metropolitan regions. Recent applications of this modelling system illustrate its promising utility, as a regional climate-modelling tool, to investigate impacts of future urbanization on regional meteorological conditions and on air quality under future climate change scenarios.
Abstract:
Previous studies have argued that the autocorrelation of the winter North Atlantic Oscillation (NAO) index provides evidence of unusually persistent intraseasonal dynamics. We demonstrate that the autocorrelation on intraseasonal time-scales of 10–30 days is sensitive to the presence of interannual variability, part of which arises from the sampling of intraseasonal variability and the remainder of which we consider to be “externally forced”. Modelling the intraseasonal variability of the NAO as a red noise process, we estimate that, for winter, ~70% of the interannual variability is externally forced, whereas for summer, sampling accounts for almost all of the interannual variability. Correcting for the externally forced interannual variability has a major impact on the autocorrelation function for winter. When externally forced interannual variability is taken into account, the intrinsic persistence of the NAO is very similar in summer and winter (~5 days). This finding has implications for understanding the dynamics of the NAO.
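As a hedged sketch of the variance-partitioning idea described here, the code below generates a red-noise (AR(1)) daily index with a ~5-day decorrelation time plus a synthetic year-to-year forced offset, and estimates the externally forced fraction of the interannual variance of the seasonal mean; all numbers are illustrative, not the paper's.

```python
# Hedged sketch: partition interannual variance of a 90-day seasonal mean into a
# red-noise sampling component and an "externally forced" component, using a
# synthetic stand-in for the daily NAO index.
import numpy as np

rng = np.random.default_rng(1)
tau, n_days, n_years = 5.0, 90, 1000      # decorrelation time (days), season length, winters
phi = np.exp(-1.0 / tau)                  # equivalent daily lag-1 autocorrelation

# Daily red-noise (AR(1)) index plus a year-to-year "externally forced" offset
forced = rng.normal(0.0, 0.5, n_years)
daily = np.empty((n_years, n_days))
for y in range(n_years):
    x = 0.0
    for d in range(n_days):
        x = phi * x + np.sqrt(1.0 - phi**2) * rng.normal()
        daily[y, d] = x + forced[y]

seasonal_mean = daily.mean(axis=1)
total_var = seasonal_mean.var()

# Interannual variance expected from sampling the red noise alone (known here by
# construction; in practice it would come from the fitted red-noise model)
sampling_var = (daily - forced[:, None]).mean(axis=1).var()

print("externally forced fraction of interannual variance ≈",
      round(1.0 - sampling_var / total_var, 2))
```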
Abstract:
This review introduces the methods used to simulate the processes affecting dissolved oxygen (DO) in lowland rivers. The important processes are described, and this provides a framework for modelling those processes in the context of a mass-balance model. The process equations that are introduced all require (reaction) rate parameters, and a variety of common procedures for identifying those parameters are reviewed. This is important because there is a wide range of estimation techniques for many of the parameters. These different techniques elicit different estimates of the parameter value, so there is the potential for significant uncertainty in the model's inputs and therefore in the output too. Finally, the data requirements for modelling DO in lowland rivers with a mass-balance model of the processes described in this review are summarised, with regard to what data are available and from where they might be obtained.
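As a minimal illustration of the kind of mass-balance formulation reviewed here, the sketch below implements a classical Streeter-Phelps-style balance with only two processes, BOD decay and reaeration; the rate constants and boundary values are illustrative placeholders, not recommended parameter estimates.

```python
# Hedged sketch: a minimal Streeter-Phelps-style DO balance (BOD decay +
# reaeration only). Rate constants and boundary values are illustrative.
import numpy as np

def do_sag(bod0=8.0, deficit0=1.0, k_bod=0.3, k_rea=0.6, do_sat=10.0):
    """Dissolved-oxygen sag curve versus downstream travel time (days)."""
    t = np.linspace(0.0, 10.0, 101)
    deficit = (k_bod * bod0 / (k_rea - k_bod)
               * (np.exp(-k_bod * t) - np.exp(-k_rea * t))
               + deficit0 * np.exp(-k_rea * t))
    return t, do_sat - deficit

t, do = do_sag()
print("minimum DO (mg/l):", round(float(do.min()), 2),
      "at ~", round(float(t[do.argmin()]), 1), "days travel time")
```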
Abstract:
Recent severe flooding in the UK has highlighted the need for better information on flood risk, increasing the pressure on engineers to enhance the capabilities of computer models for flood prediction. This paper evaluates the benefits to be gained from the use of remotely sensed data to support flood modelling. The remotely sensed data available can be used either to produce high-resolution digital terrain models (DTMs) from light detection and ranging (Lidar) data, or to generate accurate inundation maps of past flood events from airborne synthetic aperture radar (SAR) data and aerial photography. The paper reports on the modelling of real flood events that occurred at two UK sites on the rivers Severn and Ouse. At these sites a combination of remotely sensed data and recorded hydrographs was available. It is concluded, first, that Lidar-generated DTMs support the generation of considerably better models and enhance the visualisation of model results and, second, that flood outlines obtained from airborne SAR or aerial images help develop an appreciation of the hydraulic behaviour of important model components and facilitate model validation. The need for further research is highlighted by a number of limitations, namely: the difficulties in obtaining an adequate representation of hydraulically important features such as embankment crests and walls; uncertainties in the validation data; and difficulties in extracting flood outlines from airborne SAR images in urban areas.
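One common way of using SAR-derived flood outlines for model validation, consistent with the role described here, is an intersection-over-union style fit score between modelled and observed wet areas; the sketch below computes such a score on synthetic wet/dry grids (the choice of statistic is an assumption, not necessarily the measure used in the paper).

```python
# Hedged sketch: comparing a modelled inundation map with a SAR-derived flood
# outline using an intersection-over-union style fit score. The wet/dry grids
# below are synthetic placeholders.
import numpy as np

def flood_fit(modelled_wet, observed_wet):
    """F = cells wet in both / cells wet in either (1.0 = perfect agreement)."""
    both = np.logical_and(modelled_wet, observed_wet).sum()
    either = np.logical_or(modelled_wet, observed_wet).sum()
    return both / either

rng = np.random.default_rng(2)
observed = rng.random((200, 200)) < 0.3        # placeholder SAR-derived outline
modelled = observed.copy()
flip = rng.random(observed.shape) < 0.05       # 5% of cells disagree
modelled[flip] = ~modelled[flip]
print("fit score:", round(flood_fit(modelled, observed), 3))
```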
Abstract:
Context: Learning can be regarded as knowledge construction in which prior knowledge and experience serve as the basis for learners to expand their knowledge base. Such a process of knowledge construction has to take place continuously in order to enhance the learners' competence in a competitive working environment. As information consumers, individual users demand personalised information provision which meets their own specific purposes, goals, and expectations. Objectives: The current methods in requirements engineering are capable of modelling the common user's behaviour in the domain of knowledge construction. The users' requirements can be represented as a case in the defined structure, which can be reasoned over to enable requirements analysis. Such analysis needs to be enhanced so that personalised information provision can be tackled and modelled. However, there is a lack of suitable modelling methods to achieve this end. This paper presents a new ontological method for capturing individual users' requirements and transforming those requirements into personalised information provision specifications. Hence the right information can be provided to the right user for the right purpose. Method: An experiment was conducted based on a qualitative method. A medium-sized group of users participated to validate the method and its techniques, i.e. articulates, maps, configures, and learning content. The results were used as feedback for improvement. Result: The research work has produced an ontology model with a set of techniques which support the functions for profiling users' requirements, reasoning over requirements patterns, generating workflow from norms, and formulating information provision specifications. Conclusion: The current requirements engineering approaches provide the methodical capability for developing solutions. Our research outcome, i.e. the ontology model with its techniques, can further enhance requirements engineering approaches for modelling individual users' needs and discovering users' requirements.
Abstract:
Process-based integrated modelling of weather and crop yield over large areas is becoming an important research topic. The production of the DEMETER ensemble hindcasts of weather allows this work to be carried out in a probabilistic framework. In this study, ensembles of crop yield (groundnut, Arachis hypogaea L.) were produced for ten 2.5° × 2.5° grid cells in western India using the DEMETER ensembles and the general large-area model (GLAM) for annual crops. Four key issues are addressed by this study. First, crop model calibration methods for use with weather ensemble data are assessed. Calibration using yield ensembles was more successful than calibration using reanalysis data (the European Centre for Medium-Range Weather Forecasts 40-yr reanalysis, ERA40). Secondly, the potential for probabilistic forecasting of crop failure is examined. The hindcasts show skill in the prediction of crop failure, with more severe failures being more predictable. Thirdly, the use of yield ensemble means to predict interannual variability in crop yield is examined and their skill assessed relative to baseline simulations using ERA40. The accuracy of multi-model yield ensemble means is equal to or greater than the accuracy using ERA40. Fourthly, the impact of two key uncertainties, sowing window and spatial scale, is briefly examined. The impact of uncertainty in the sowing window is greater with ERA40 than with the multi-model yield ensemble mean. Subgrid heterogeneity affects model accuracy: where correlations are low on the grid scale, they may be significantly positive on the subgrid scale. The implications of the results of this study for yield forecasting on seasonal time-scales are as follows. (i) There is the potential for probabilistic forecasting of crop failure (defined by a threshold yield value); forecasting of yield terciles shows less potential. (ii) Any improvement in the skill of climate models has the potential to translate into improved deterministic yield prediction. (iii) Whilst model input uncertainties are important, uncertainty in the sowing window may not require specific modelling. The implications of the results of this study for yield forecasting on multidecadal (climate change) time-scales are as follows. (i) The skill in the ensemble mean suggests that the perturbation, within uncertainty bounds, of crop and climate parameters, could potentially average out some of the errors associated with mean yield prediction. (ii) For a given technology trend, decadal fluctuations in the yield-gap parameter used by GLAM may be relatively small, implying some predictability on those time-scales.
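As a hedged sketch of the threshold-based crop-failure forecasting described here, the code below computes the forecast probability of failure as the fraction of ensemble yield members falling below a failure threshold and compares it against synthetic observations; the ensemble size, threshold and yield distributions are illustrative placeholders, not the DEMETER/GLAM values.

```python
# Hedged sketch: probabilistic crop-failure forecasts from a yield ensemble,
# with failure defined by a threshold yield. All yields here are synthetic.
import numpy as np

rng = np.random.default_rng(3)
n_years, n_members = 30, 63                      # illustrative hindcast length and ensemble size
year_signal = rng.normal(1.0, 0.4, n_years)      # year-to-year growing conditions (t/ha scale)
ensemble = np.maximum(0.0, year_signal[:, None] + rng.normal(0.0, 0.3, (n_years, n_members)))
observed = np.maximum(0.0, year_signal + rng.normal(0.0, 0.2, n_years))

failure_threshold = 0.6                          # t/ha, illustrative
p_failure = (ensemble < failure_threshold).mean(axis=1)   # forecast probability per year
failed = observed < failure_threshold

# Simple check: forecast probabilities should be higher in observed-failure years
print("observed failure years:", int(failed.sum()), "of", n_years)
print("mean P(failure) in failure years:    ", round(float(p_failure[failed].mean()), 2))
print("mean P(failure) in non-failure years:", round(float(p_failure[~failed].mean()), 2))
```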