913 results for Model predictive control
Abstract:
Building information models have created a paradigm shift in how buildings are built and managed by providing a dynamic repository for building data that is useful in many new operational scenarios. This change has also created an opportunity to use building information models as an integral part of security operations, and especially as a tool to facilitate fine-grained access control to building spaces in smart buildings and critical infrastructure environments. In this paper, we identify the requirements for a security policy model for such an access control system and discuss why existing policy models are not suitable for this application. We propose a new policy language extension to XACML, with BIM-specific data types and functions based on the IFC specification, which we call BIM-XACML.
Abstract:
The functions of the volunteer functions inventory were combined with the constructs of the theory of planned behaviour (i.e., attitudes, subjective norms, and perceived behavioural control) to establish whether a stronger, single explanatory model prevailed. Undertaken in the context of episodic, skilled volunteering by individuals who were retired or approaching retirement (N = 186), the research advances prior studies, which either examined the predictive capacity of each model independently or compared their explanatory value. Using hierarchical regression analysis, the functions of the volunteer functions inventory (when controlling for demographic variables) explained an additional 7.0% of the variability in individuals’ willingness to volunteer over and above that accounted for by the theory of planned behaviour. Significant predictors in the final model included attitudes, subjective norms and perceived behavioural control from the theory of planned behaviour, and the understanding function from the volunteer functions inventory. It is proposed that the items comprising the understanding function may represent a deeper psychological construct (e.g., self-actualisation) not accounted for by the theory of planned behaviour. The findings highlight the potential benefit of combining these two prominent models in terms of improving understanding of volunteerism and providing a single parsimonious model for raising rates of this important behaviour.
Abstract:
A study was undertaken to examine further the effects of perceived work control on employee adjustment. On the basis of the stress antidote model, it was proposed that high levels of prediction, understanding, and control of work-related events would have direct, indirect, and interactive effects on levels of employee adjustment. These hypotheses were tested in a short-term longitudinal study of 137 employees of a large retail organization. The stress antidote measures appeared to be indirectly related to employee adjustment, via their effects on perceptions of work stress. There was weak evidence for the proposal that prediction, understanding, and control would buffer the negative effects of work stress. Additional analyses indicated that the observed effects of prediction, understanding, and control were independent of employees' generalized control beliefs. However, there was no support for the proposal that the effects of the stress antidote measures would be dependent on employees' generalized control beliefs.
Abstract:
Objective: The aim of this study was to develop a model capable of predicting variability in the mental workload experienced by frontline operators under routine and nonroutine conditions. Background: Excess workload is a risk that needs to be managed in safety-critical industries. Predictive models are needed to manage this risk effectively yet are difficult to develop. Much of the difficulty stems from the fact that workload prediction is a multilevel problem. Method: A multilevel workload model was developed in Study 1 with data collected from an en route air traffic management center. Dynamic density metrics were used to predict variability in workload within and between work units while controlling for variability among raters. The model was cross-validated in Studies 2 and 3 with the use of a high-fidelity simulator. Results: Reported workload generally remained within the bounds of the 90% prediction interval in Studies 2 and 3. Workload crossed the upper bound of the prediction interval only under nonroutine conditions. Qualitative analyses suggest that nonroutine events caused workload to cross the upper bound of the prediction interval because the controllers could not manage their workload strategically. Conclusion: The model performed well under both routine and nonroutine conditions and over different patterns of workload variation. Application: Workload prediction models can be used to support both strategic and tactical workload management. Strategic uses include the analysis of historical and projected workflows and the assessment of staffing needs. Tactical uses include the dynamic reallocation of resources to meet changes in demand.
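The prediction-interval check described above can be illustrated with a deliberately simplified, single-level sketch (a normal-theory 90% interval for a new observation; the paper's actual model is multilevel and controls for rater variability, which this does not attempt):

```python
import math
import statistics


def prediction_interval_90(history):
    """Normal-theory 90% prediction interval for a new workload rating,
    given a history of past ratings (simplified single-level version)."""
    n = len(history)
    mean = statistics.fmean(history)
    sd = statistics.stdev(history)
    # z = 1.645 for a two-sided 90% interval; sqrt(1 + 1/n) widens the
    # interval to account for predicting a *new* observation.
    half_width = 1.645 * sd * math.sqrt(1 + 1 / n)
    return mean - half_width, mean + half_width


def exceeds_upper_bound(history, observed):
    """Flag a workload rating that crosses the upper prediction bound."""
    return observed > prediction_interval_90(history)[1]
```

In the study's terms, crossing the upper bound would correspond to the nonroutine conditions under which controllers could not manage workload strategically.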
Abstract:
Understanding the effects of different types and quality of data on bioclimatic modeling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae), were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that T. scrupulosa models exhibited greater accuracy, with a progressive improvement from seasonal dynamics data, to the model based on overseas distribution, and finally the model combining the two data types. In contrast, O. scabripennis models were of low accuracy, and showed no clear trends across the various model types. These case studies demonstrate the importance of high quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allows the modeller to focus on the species response to climatic trends, while distributional data enables easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of species distribution.
Abstract:
The results of drying trials show that vacuum drying produces material of the same or better quality than is currently being produced by conventional methods, within 41 to 66% of the drying time, depending on the species. Economic analysis indicates positive or negative results depending on the species and the size of the drying operation. Definite economic benefits exist for vacuum drying over conventional drying across all operation sizes, in terms of drying quality, time and economic viability, for E. marginata and E. pilularis. The same applies to vacuum drying C. citriodora and E. obliqua in larger drying operations (kiln capacity 50 m³ or above), but not for smaller operations at this stage. Further schedule refinement could reduce drying times and may improve the vacuum-drying viability of the latter species in smaller operations.
Abstract:
There is increased interest in the use of Unmanned Aerial Vehicles (UAVs) for wildlife and feral animal monitoring around the world. This paper describes a novel system in which a predictive dynamic application places the UAV ahead of a user; the UAV carries a low-cost thermal camera and a small onboard computer that identifies heat signatures of a target animal from a predetermined altitude and transmits that target’s GPS coordinates. A map is generated, and various data sets and graphs are displayed using a GUI designed for easy use. The paper describes the hardware and software architecture and the probabilistic model for the downward-facing camera for the detection of an animal. Behavioral dynamics of target movement inform the design of a Kalman filter and a Markov-model-based prediction algorithm used to place the UAV ahead of the user. Geometrical concepts and the Haversine formula are applied to the maximum-likelihood case in order to make a prediction regarding a future state of the user, thus delivering a new waypoint for autonomous navigation. Results show that the system is capable of autonomously locating animals from a predetermined height and generating a map showing the location of the animals ahead of the user.
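The Haversine formula mentioned above is a standard great-circle distance computation; a generic sketch follows (the function name and the spherical-Earth radius constant are illustrative choices, not taken from the paper):

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in metres (spherical model)


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    # Haversine of the central angle between the two points.
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
```

In a system like the one described, distances between successive GPS fixes of the user would feed the prediction of the next waypoint.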
Abstract:
Extensive resources are allocated to managing vertebrate pests, yet spatial understanding of pest threats, and how they respond to management, is limited at the regional scale where much decision-making is undertaken. We provide regional-scale spatial models and management guidance for European rabbits (Oryctolagus cuniculus) in a 260,791 km² region in Australia by determining habitat suitability, habitat susceptibility and the effects of the primary rabbit management options (barrier fencing, shooting and baiting, and warren ripping) or of changing predation or disease control levels. A participatory modelling approach was used to develop a Bayesian network which captured the main drivers of suitability and spread, which in turn was linked spatially to develop high-resolution risk maps. Policy-makers, rabbit managers and technical experts were responsible for defining the questions the model needed to address, and for subsequently developing and parameterising the model. Habitat suitability was determined by the conditions required for warren-building and by above-ground requirements, such as food and harbour; habitat susceptibility was determined by the distance from current distributions, habitat suitability, and the costs of traversing habitats of different quality. At least one-third of the region had a high probability of being highly suitable (i.e., able to support high rabbit densities), and the model was supported by validation. Habitat susceptibility was largely restricted by the currently known rabbit distribution. Warren ripping was the most effective control option, as warrens were considered essential for rabbit persistence. The anticipated increase in disease resistance was predicted to increase the probability of moderately suitable habitat becoming highly suitable, but not to increase the at-risk area.
We demonstrate that it is possible to build spatial models to guide regional-level management of vertebrate pests which use the best available knowledge and capture fine spatial-scale processes.
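The Bayesian-network core of such a model can be sketched in miniature. The node names, probability values, and the independence assumption between the two parent nodes below are illustrative only, not taken from the paper:

```python
# Conditional probability table: P(suitable=True | warren_conditions, food)
# The numbers are made up for illustration.
cpt = {
    (True, True): 0.90,
    (True, False): 0.50,
    (False, True): 0.30,
    (False, False): 0.05,
}


def p_suitable(p_warren, p_food):
    """Marginal probability of high habitat suitability, marginalizing
    over two (assumed independent) parent nodes."""
    return sum(
        cpt[(w, f)]
        * (p_warren if w else 1 - p_warren)
        * (p_food if f else 1 - p_food)
        for w in (True, False)
        for f in (True, False)
    )
```

A participatory process, as the abstract describes, would elicit the table entries from experts; linking such a node to raster layers then yields the spatial risk maps.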
Abstract:
Self-tuning is applied to the control of nonlinear systems represented by the Hammerstein model, in which the nonlinearity is any odd-order polynomial. However, control costing is not feasible in general. Initial relay control is employed to contain the deviations.
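A Hammerstein model of the kind described, a static odd-order polynomial nonlinearity followed by linear dynamics, can be simulated in a few lines (the first-order linear part and the coefficient values here are chosen purely for illustration):

```python
def hammerstein_response(u_seq, poly_coeffs, a, b):
    """Simulate a Hammerstein system: v[k] = f(u[k]) through a static
    odd polynomial f(u) = c1*u + c3*u**3 + ... (coefficients in
    poly_coeffs), followed by first-order linear dynamics
    y[k] = a*y[k-1] + b*v[k]."""
    y, out = 0.0, []
    for u in u_seq:
        # Static nonlinearity: only odd powers, as in the abstract.
        v = sum(c * u ** (2 * k + 1) for k, c in enumerate(poly_coeffs))
        # Linear dynamic block.
        y = a * y + b * v
        out.append(y)
    return out
```

A self-tuning controller would identify `poly_coeffs`, `a`, and `b` online; the sketch only shows the plant structure being assumed.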
Abstract:
Based on dynamic inversion, a relatively straightforward approach is presented in this paper for nonlinear flight control design of high-performance aircraft, which does not require the normal and lateral acceleration commands to be first transferred to body rates before computing the required control inputs. This leads to substantial improvement of the tracking response. Promising results are obtained from six-degree-of-freedom simulation studies of the F-16 aircraft, and they are found to be superior to an existing approach (which is also based on dynamic inversion). The new approach has two potential benefits, namely reduced oscillatory response (including elimination of non-minimum-phase behavior) and reduced control magnitude. Next, a model-following neuron-adaptive design is added to augment the nominal design in order to assure robust performance in the presence of parameter inaccuracies in the model. Note that in this approach the model update takes place adaptively online, so it is philosophically similar to indirect adaptive control. However, unlike a typical indirect adaptive control approach, there is no need to update the individual parameters explicitly. Instead, the inaccuracy in the system output dynamics is captured directly and then used to modify the control. This leads to faster adaptation, which helps stabilize the unstable plant more quickly. A robustness study based on a large number of simulations shows that the adaptive design is quite robust to the expected parameter inaccuracies in the model.
Abstract:
Designing and optimizing high performance microprocessors is an increasingly difficult task due to the size and complexity of the processor design space, the high cost of detailed simulation, and several constraints that a processor design must satisfy. In this paper, we propose the use of empirical non-linear modeling techniques to assist processor architects in making design decisions and resolving complex trade-offs. We propose a procedure for building accurate non-linear models that consists of the following steps: (i) selection of a small set of representative design points spread across the processor design space using Latin hypercube sampling, (ii) obtaining performance measures at the selected design points using detailed simulation, (iii) building non-linear models for performance using the function approximation capabilities of radial basis function networks, and (iv) validating the models using an independently and randomly generated set of design points. We evaluate our model building procedure by constructing non-linear performance models for programs from the SPEC CPU2000 benchmark suite with a microarchitectural design space that consists of 9 key parameters. Our results show that the models, built using a relatively small number of simulations, achieve high prediction accuracy (only 2.8% error in CPI estimates on average) across a large processor design space. Our models can potentially replace detailed simulation for common tasks such as the analysis of key microarchitectural trends or searches for optimal processor design points.
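Step (i), Latin hypercube sampling, can be sketched as a generic unit-cube implementation (in real use, each fraction would be mapped onto one of the paper's 9 microarchitectural parameter ranges; the function and its seeding are illustrative):

```python
import random


def latin_hypercube(n_samples, n_dims, seed=0):
    """Latin hypercube sample of the unit cube: each dimension is cut
    into n_samples equal strata, and each stratum is hit exactly once."""
    rng = random.Random(seed)
    pts = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # independent stratum order per dimension
        for i, s in enumerate(strata):
            # One uniform draw inside stratum [s/n, (s+1)/n).
            pts[i][d] = (s + rng.random()) / n_samples
    return pts
```

The stratification guarantees good one-dimensional coverage with far fewer points than a full grid, which is why it suits expensive simulation-based design-space exploration.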
Abstract:
A class of model reference adaptive control systems that makes use of an augmented error signal was introduced by Monopoli. Convergence problems in this attractive class of systems are investigated in this paper using concepts from hyperstability theory. It is shown that the condition on the linear part of the system has to be stronger than the one given earlier. A boundedness condition on the input to the linear part of the system is taken into account in the analysis; this condition appears to have been missed in previous applications of hyperstability theory. Sufficient conditions for the convergence of the adaptive gain to the desired value are also given.
Abstract:
This paper is concerned with the optimal flow control of an ATM switching element in a broadband integrated services digital network. We model the switching element as a stochastic fluid flow system with a finite buffer, a constant-output-rate server, and a Gaussian process to characterize the input, which is a heterogeneous set of traffic sources. The fluid level should be maintained between two levels, namely b1 and b2, with b1 < b2.
Abstract:
We develop a Markov model for a TCP CUBIC connection. Next, we use it to obtain approximate expressions for throughput when there may be queuing in the network. Finally, we provide the throughputs that different TCP CUBIC and TCP NewReno connections obtain while sharing a channel when they have different round-trip delays and packet loss probabilities.
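For context, the CUBIC window-growth law that such a Markov model builds on (as standardized in RFC 8312) can be written out directly; the constants C = 0.4 and beta = 0.7 below are the RFC defaults, not values from the paper:

```python
def cubic_window(t, w_max, c=0.4, beta=0.7):
    """TCP CUBIC congestion window at time t seconds after a loss
    (RFC 8312): W(t) = C*(t - K)**3 + W_max, where K is the time at
    which the window regrows to W_max, and the window drops to
    beta*W_max immediately after the loss."""
    k = (w_max * (1 - beta) / c) ** (1 / 3)
    return c * (t - k) ** 3 + w_max
```

A Markov model of the connection would track the window across loss events, using this deterministic growth curve between losses.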
Abstract:
In order to reduce motion artifacts in digital subtraction angiography (DSA), non-rigid image registration is commonly used before subtracting the mask from the contrast image. Since DSA registration requires a set of spatially non-uniform control points, a conventional Markov random field (MRF) model is not very efficient. In this paper, we introduce the concept of pivotal and non-pivotal control points to address this, and propose a non-uniform MRF for DSA registration. We use quad-trees in a novel way to generate the non-uniform grid of control points. Our MRF formulation produces a smooth displacement field and therefore results in better artifact reduction than registering the control points independently. We achieve improved computational performance using pivotal control points without compromising the artifact reduction. We have tested our approach on several clinical data sets, and we present the results of quantitative analysis, clinical assessment and performance improvement on a GPU.
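Quad-tree generation of a non-uniform control-point grid can be sketched generically. The splitting criterion is left abstract, and the paper's pivotal/non-pivotal distinction is not modelled here; this only shows how adaptive subdivision concentrates control points where needed:

```python
def quadtree_control_points(x0, y0, size, needs_split, min_size, points=None):
    """Collect control-point coordinates from an adaptive quad-tree:
    every block contributes its four corners, and a block is subdivided
    into four children while needs_split(x0, y0, size) holds and the
    block is still larger than min_size."""
    if points is None:
        points = set()
    # Corners of the current block become control points.
    for cx in (x0, x0 + size):
        for cy in (y0, y0 + size):
            points.add((cx, cy))
    if size > min_size and needs_split(x0, y0, size):
        half = size // 2
        for ox in (0, half):
            for oy in (0, half):
                quadtree_control_points(x0 + ox, y0 + oy, half,
                                        needs_split, min_size, points)
    return points
```

In a DSA setting, `needs_split` would test something like local residual motion or intensity variance, so that smooth regions stay coarse while moving structures get a dense grid.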