917 results for Hexarotor. Dynamic modeling. Robust backstepping control. EKF Attitude Estimation
Abstract:
Negative correlations between task performance in dynamic control tasks and verbalizable knowledge, as assessed by a post-task questionnaire, have been interpreted as dissociations that indicate two antagonistic modes of learning, one being “explicit”, the other “implicit”. This paper views the control tasks as finite-state automata and offers an alternative interpretation of these negative correlations. It is argued that “good controllers” observe fewer different state transitions and, consequently, can answer fewer post-task questions about system transitions than can “bad controllers”. Two experiments demonstrate the validity of the argument by showing the predicted negative relationship between control performance and the number of explored state transitions, and the predicted positive relationship between the number of explored state transitions and questionnaire scores. However, the experiments also elucidate important boundary conditions for the critical effects. We discuss the implications of these findings, and of other problems arising from the process control paradigm, for conclusions about implicit versus explicit learning processes.
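The transition-counting argument can be illustrated with a toy finite-state system (an invented ring automaton with made-up policies and parameters, not the authors' experimental tasks): a controller that steers the system toward a target confines it to few states and therefore observes few distinct transitions, while a wandering controller samples many.

```python
import random

def run_controller(policy, steps=200, n_states=10, seed=0):
    """Walk a toy finite-state system and collect the distinct
    (state, input, next_state) transitions the controller observes.
    The system is a ring: the input moves the state up or down,
    perturbed by noise."""
    rng = random.Random(seed)
    state = n_states // 2
    seen = set()
    for _ in range(steps):
        u = policy(state)
        nxt = (state + u + rng.choice([-1, 0, 1])) % n_states  # noisy dynamics
        seen.add((state, u, nxt))
        state = nxt
    return seen

target = 5
# "Good" controller: always steers toward the target state.
good = run_controller(lambda s: 1 if s < target else -1)
# "Bad" controller: a fixed but arbitrary input per state, so the
# noisy dynamics carry it all over the ring.
bad = run_controller(lambda s: random.Random(s).choice([-1, 1]))

print(len(good), len(bad))  # the good controller typically sees fewer
```

With the good controller pinned near the target, post-task questions about transitions elsewhere in the state space are unanswerable from its experience, which is exactly the confound the paper raises.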
Abstract:
We analyze how the characteristics of the El Niño–Southern Oscillation (ENSO) are changed in coupled ocean–atmosphere simulations of the mid-Holocene (MH) and the Last Glacial Maximum (LGM) performed as part of the Paleoclimate Modeling Intercomparison Project phase 2 (PMIP2). Comparison of the model results with present-day observations shows that most of the models reproduce the large-scale features of the tropical Pacific, such as the SST gradient, the mean SST and the mean seasonal cycles. All models simulate ENSO variability, although with different skill. Our analyses show that several relationships between El Niño amplitude and the mean state across the different control simulations remain valid for simulations of the MH and the LGM. Results for the MH show a consistent decrease in El Niño amplitude, which can be related to large-scale atmospheric circulation changes. Because the Northern Hemisphere receives more insolation during summer, the Asian summer monsoon system is strengthened, which leads to an enhancement of the Walker circulation. Easterlies prevailing over the central-eastern Pacific induce equatorial upwelling that damps El Niño development. Results are less conclusive for the LGM (21 ka): large-scale dynamics compete with changes in local heat fluxes, so that the models show a wide range of responses, as is the case in future climate projections.
Abstract:
Planning is a vital element of project management, but it is still not recognized as a process variable. Its objective should be to outperform the initially defined processes and to foresee and overcome possible undesirable events. Detailed task-level master planning is unrealistic, since one cannot accurately predict all the requirements and obstacles before work has even started. The process planning methodology (PPM) has thus been developed to overcome common problems of overwhelming project complexity. The essential elements of the PPM are the process planning group (PPG), which includes a control team that dynamically links the production/site and management, and the planning algorithm embodied within two continuous-improvement loops. The methodology was tested on a factory project in Slovenia and in four successive projects of a similar nature. In addition to a number of improvement ideas and enhanced communication, the applied PPM resulted in 32% higher total productivity, 6% total savings and a synergistic project environment.
Abstract:
Smooth flow of production in construction is hampered by the disparity between individual trade teams' goals and the goal of stable production flow for the project as a whole. This is exacerbated by the difficulty of visualizing the flow of work in a construction project. Building information modeling (BIM) provides a powerful platform for visualizing work flow in control systems that also enable pull flow and deeper collaboration between teams on and off site. The requirements for implementation of a BIM-enabled pull flow construction management software system based on the Last Planner System™, called 'KanBIM', have been specified, and a set of functional mock-ups of the proposed system has been implemented and evaluated in a series of three focus group workshops. The requirements cover the areas of maintenance of work flow stability, enabling negotiation and commitment between teams, lean production planning with sophisticated pull flow control, and effective communication and visualization of flow. The evaluation results show that the system holds the potential to improve work flow and reduce waste by providing both process and product visualization at the work face.
Abstract:
Asynchronous Optical Sampling (ASOPS) [1,2] and frequency comb spectrometry [3] based on dual Ti:sapphire resonators operated in a master/slave mode have the potential to improve the signal-to-noise ratio in THz transient and IR spectrometry. The multimode Brownian oscillator time-domain response function described by state-space models is a mathematically robust framework that can be used to describe the dispersive phenomena governed by Lorentzian, Debye and Drude responses. In addition, the optical properties of an arbitrary medium can be expressed as a linear combination of simple multimode Brownian oscillator functions. The suitability of a range of signal processing schemes adopted from the systems identification and control theory community for further processing the recorded THz transients in the time or frequency domain will be outlined [4,5]. Since a femtosecond-duration pulse is capable of persistent excitation of the medium within which it propagates, such an approach is fully justifiable. Several de-noising routines based on system identification will be shown. Furthermore, specifically developed apodization structures will be discussed; these are necessary because, owing to dispersion, the time-domain background and sample interferograms are non-symmetrical [6-8]. These procedures can lead to a more precise estimation of the complex insertion loss function. The algorithms are applicable to femtosecond spectroscopies across the EM spectrum. Finally, a methodology for femtosecond pulse shaping using genetic algorithms, aiming to map and control molecular relaxation processes, will be mentioned.
Abstract:
A dynamic, deterministic, economic simulation model was developed to estimate the costs and benefits of controlling Mycobacterium avium subsp. paratuberculosis (Johne's disease) in a suckler beef herd. The model is intended as a demonstration tool for veterinarians to use with farmers. The model design process involved user consultation and participation, and the model is freely accessible on a dedicated website. The user-friendly model interface allows the input of key assumptions and farm-specific parameters, enabling model simulations to be tailored to individual farm circumstances. The model simulates the effect of Johne's disease and various measures for its control in terms of herd prevalence and the shedding states of animals within the herd, the financial costs of the disease and of any control measures, and the likely benefits of control of Johne's disease for the beef suckler herd over a 10-year period. The model thus helps to make more transparent the 'hidden costs' of Johne's disease in a herd and the likely benefits to be gained from controlling the disease. The control strategies considered within the model are 'no control', 'testing and culling of diagnosed animals', 'improving management measures' or a dual strategy of 'testing and culling in association with improving management measures'. An example run of the model shows that the strategy 'improving management measures', which reduces infection routes during the early stages, results in a marked fall in herd prevalence and total costs. Testing and culling does little to reduce prevalence and does not reduce total costs over the 10-year period.
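The skeleton of such a deterministic prevalence-and-cost projection can be sketched in a few lines. This is a toy model with invented transmission rates, costs and control effects, purely to show the structure of comparing strategies over a 10-year horizon; it is not the published model or its parameters.

```python
def simulate_johnes(years=10, herd=100, prev0=0.10,
                    transmission=0.25, cull_sens=0.0, mgmt_reduction=0.0,
                    cost_per_case=300.0, test_cost=6.0, mgmt_cost=500.0):
    """Toy deterministic projection of Johne's disease in a herd.

    transmission:   new infections per infected animal per year (no control)
    cull_sens:      fraction of infected animals detected and culled yearly
    mgmt_reduction: fractional cut in transmission from management measures
    Returns (final_prevalence, total_cost_over_horizon). All figures are
    hypothetical illustration values.
    """
    prev, total = prev0, 0.0
    beta = transmission * (1.0 - mgmt_reduction)
    for _ in range(years):
        infected = prev * herd
        new_inf = beta * infected * (1.0 - prev)   # density-dependent spread
        culled = cull_sens * infected              # test-and-cull removals
        infected = max(infected + new_inf - culled, 0.0)
        prev = min(infected / herd, 1.0)
        total += infected * cost_per_case          # production losses
        total += herd * test_cost if cull_sens > 0 else 0.0
        total += mgmt_cost if mgmt_reduction > 0 else 0.0
    return prev, total

no_control = simulate_johnes()
mgmt = simulate_johnes(mgmt_reduction=0.6)   # 'improving management measures'
cull = simulate_johnes(cull_sens=0.3)        # 'testing and culling'
```

Because management measures shrink the transmission term itself, they compound over the 10-year horizon, which is the mechanism behind the marked prevalence fall the abstract describes.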
Abstract:
Purpose – To evaluate the control strategy for a hybrid system of natural-ventilation wind catchers and air conditioning, and to assess the contribution of the wind catchers to the indoor air environment and to energy savings, if any. Design/methodology/approach – Most of the modeling techniques for assessing wind catcher performance are theoretical. Post-occupancy evaluation (POE) studies of buildings provide an insight into the operation of these building components and help to inform facilities managers. A POE case study is presented in this paper. Findings – Monitoring of the summer and winter month operations showed that the indoor air quality parameters were kept within the design target range. The design control strategy failed to record data regarding the operation, opening time and position of the wind catcher system. Though the implemented control strategy worked effectively in monitoring the operation of the mechanical ventilation system, i.e. the air handling unit (AHU), it did not integrate the wind catchers with the mechanical ventilation system. Research limitations/implications – Owing to shortfalls in the control strategy implemented in this project, it was difficult to quantify and verify the contribution of the wind catchers to the internal conditions and, hence, to energy savings. Practical implications – Controlling the operation of the wind catchers via the AHU will lead to isolation of the wind catchers in the event of a malfunction of the AHU. Wind catchers will contribute to the ventilation of the space, particularly in the summer months. Originality/value – This paper demonstrates the value of POE as an indispensable tool for FM professionals. It further provides insight into the application of natural ventilation systems in buildings for healthier indoor environments at lower energy cost. The design of the control strategy for natural ventilation and air conditioning should be considered at the design stage, involving the FM personnel.
Abstract:
Quantitative control of aroma generation during the Maillard reaction presents great scientific and industrial interest. Although there have been many studies conducted in simplified model systems, the results are difficult to apply to complex food systems, where the presence of other components can have a significant impact. In this work, an aqueous extract of defatted beef liver was chosen as a simplified food matrix for studying the kinetics of the Maillard reaction. Aliquots of the extract were heated under different time and temperature conditions and analyzed for sugars, amino acids, and methylbutanals, which are important Maillard-derived aroma compounds formed in cooked meat. Multiresponse kinetic modeling, based on a simplified mechanistic pathway, gave a good fit with the experimental data, but only when additional steps were introduced to take into account the interactions of glucose and glucose-derived intermediates with protein and other amino compounds. This emphasizes the significant role of the food matrix in controlling the Maillard reaction.
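The structure of such a multiresponse kinetic model can be sketched as a small set of coupled rate equations integrated over time. The pathway and rate constants below are hypothetical placeholders chosen only to show the setup, including a matrix-interaction step of the kind the study found necessary; they are not the paper's fitted mechanism or values.

```python
def maillard_toy(k1=0.01, k2=0.005, k3=0.002, dt=1.0, t_end=600):
    """Forward-Euler integration of a toy Maillard pathway:
        glucose + amino acid   -k1-> intermediate
        intermediate           -k2-> methylbutanal (aroma)
        intermediate + protein -k3-> bound (matrix interaction)
    Concentrations are in arbitrary units; rates are illustrative.
    Returns (remaining_glucose, aroma_formed)."""
    glc, amino, inter, aroma, protein = 10.0, 5.0, 0.0, 0.0, 20.0
    t = 0.0
    while t < t_end:
        r1 = k1 * glc * amino        # condensation step
        r2 = k2 * inter              # aroma formation
        r3 = k3 * inter * protein    # loss of intermediate to the matrix
        glc -= r1 * dt
        amino -= r1 * dt
        inter += (r1 - r2 - r3) * dt
        aroma += r2 * dt
        protein -= r3 * dt
        t += dt
    return glc, aroma

g, a = maillard_toy()
```

In multiresponse fitting, the measured concentrations of sugars, amino acids and methylbutanals would all be fitted simultaneously to such a scheme, with the k values as free parameters; the r3-type matrix step is what distinguishes the complex-matrix model from a pure model-system pathway.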
Abstract:
Event-related functional magnetic resonance imaging (efMRI) has emerged as a powerful technique for detecting the brain's responses to presented stimuli. A primary goal in efMRI data analysis is to estimate the Hemodynamic Response Function (HRF) and to locate activated regions of the human brain when specific tasks are performed. This paper develops new methodologies that are important improvements not only to parametric but also to nonparametric estimation and hypothesis testing of the HRF. First, an effective and computationally fast scheme for estimating the error covariance matrix for efMRI is proposed. Second, methodologies for estimation and hypothesis testing of the HRF are developed. Simulations support the effectiveness of our proposed methods. When applied to an efMRI dataset from an emotional control study, our method reveals more meaningful findings than the popular methods offered by AFNI and FSL. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
In this paper, new robust nonlinear model construction algorithms for a large class of linear-in-the-parameters models are introduced to enhance model robustness: three algorithms that respectively combine A-optimality, D-optimality, or the PRESS statistic (Predicted REsidual Sum of Squares) with the regularised orthogonal least squares algorithm. A common characteristic of these algorithms is that the inherent computational efficiency associated with the orthogonalisation scheme in orthogonal least squares or regularised orthogonal least squares has been extended, such that the new algorithms are computationally efficient. A numerical example is included to demonstrate the effectiveness of the algorithms. Copyright (C) 2003 IFAC.
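For linear-in-the-parameters models, the PRESS statistic does not require refitting the model once per left-out point: the standard leave-one-out identity e_loo,i = e_i / (1 - h_ii), with H the hat matrix, gives it in closed form. A minimal sketch of that identity (not the paper's combined regularised-OLS algorithm, just the underlying statistic):

```python
import numpy as np

def press_statistic(X, y):
    """PRESS for the least-squares fit y ≈ X @ theta, computed from the
    full-data residuals via the hat matrix H = X (X'X)^-1 X'. The i-th
    leave-one-out residual is e_i / (1 - h_ii)."""
    H = X @ np.linalg.solve(X.T @ X, X.T)
    resid = y - H @ y
    loo = resid / (1.0 - np.diag(H))
    return float(loo @ loo)

# Sanity check against brute-force leave-one-out refitting.
rng = np.random.default_rng(0)
X = rng.normal(size=(30, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=30)

brute = 0.0
for i in range(len(y)):
    mask = np.arange(len(y)) != i
    theta = np.linalg.lstsq(X[mask], y[mask], rcond=None)[0]
    brute += float((y[i] - X[i] @ theta) ** 2)
```

This closed form is what makes PRESS-based term selection cheap enough to embed inside an orthogonalisation sweep, which is the kind of computational efficiency the abstract refers to.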
Abstract:
Multi-agent systems have been adopted to build intelligent environments in recent years. It has been claimed that energy efficiency and occupants' comfort are the most important factors for evaluating the performance of modern work environments, and that multi-agent systems present a viable solution for handling the complexity of dynamic building environments. While previous research has made significant advances in some aspects, the proposed systems or models are often not applicable in a "shared environment". This paper introduces an ongoing project on multi-agent systems for building control, which aims to achieve both energy efficiency and occupants' comfort in a shared environment.
Abstract:
In this paper we propose an enhanced relay-enabled distributed coordination function (rDCF) for wireless ad hoc networks. The idea of rDCF is to use high-data-rate nodes as relays for the low-data-rate nodes. The relay helps to increase the throughput and to lower the overall blocking time of nodes, owing to the faster dual-hop transmission. rDCF achieves higher throughput than the IEEE 802.11 distributed coordination function (DCF). The protocol is further enhanced for higher throughput and reduced energy consumption. These enhancements result from the use of a dynamic preamble (i.e. using a short preamble for the relay transmission) and from reducing unnecessary overhearing (by other nodes not involved in the transmission). We have modeled the energy consumption of rDCF, showing that rDCF provides an energy efficiency gain of 21.7% at 50 nodes over 802.11 DCF. Compared with the existing rDCF, the enhanced rDCF (ErDCF) scheme proposed in this paper yields a throughput improvement of 16.54% (at a packet length of 1000 bytes) and an energy saving of 53% at 50 nodes.
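The airtime argument behind dual-hop relaying can be checked with simple arithmetic: two fast hops beat one slow direct hop whenever their combined transmission time is smaller. The sketch below uses illustrative 802.11b-style rates and ignores protocol overheads (preambles, ACKs, backoff), so it shows only the core trade-off, not the paper's full model.

```python
def dual_hop_faster(direct_rate, r1, r2, payload_bits=8000.0):
    """Compare the airtime of one direct transmission at `direct_rate`
    against two relay hops at rates r1 and r2 (all in bit/s).
    Returns (relay_is_faster, t_direct, t_relay) in seconds."""
    t_direct = payload_bits / direct_rate
    t_relay = payload_bits / r1 + payload_bits / r2
    return t_relay < t_direct, t_direct, t_relay

# A 1000-byte packet: 1 Mbit/s direct vs two 11 Mbit/s hops via a relay.
better, td, tr = dual_hop_faster(1e6, 11e6, 11e6)
```

Here the direct hop occupies the channel for 8 ms while the two relay hops together take under 1.5 ms, which is why relaying improves both throughput and, by shortening airtime, energy consumption.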
Abstract:
Dense deployments of wireless local area networks (WLANs) are fast becoming a permanent feature of all developed cities around the world. While this increases capacity and coverage, the problem of increased interference, which is exacerbated by the limited number of channels available, can severely degrade the performance of WLANs if an effective channel assignment scheme is not employed. In an earlier work, an asynchronous, distributed and dynamic channel assignment scheme has been proposed that (1) is simple to implement, (2) does not require any knowledge of the throughput function, and (3) allows asynchronous channel switching by each access point (AP). In this paper, we present extensive performance evaluation of this scheme when it is deployed in the more practical non-uniform and dynamic topology scenarios. Specifically, we investigate its effectiveness (1) when APs are deployed in a non-uniform fashion, resulting in some APs suffering from higher levels of interference than others, and (2) when APs are effectively switched 'on/off' due to the availability/lack of traffic at different times, which creates a dynamically changing network topology. Simulation results based on actual WLAN topologies show that robust performance gains over other channel assignment schemes can still be achieved even in these realistic scenarios.
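The flavour of an asynchronous, distributed, throughput-model-free channel update can be sketched with a greedy local rule: each AP independently counts interfering neighbours per channel and moves to the least-used one. This is a hypothetical illustration of the general approach, not the exact scheme evaluated in the paper.

```python
def least_interfered_channel(ap, neighbors, channels=(1, 6, 11)):
    """One asynchronous update step: AP `ap` counts how many of its
    interfering neighbours sit on each channel and picks the least
    loaded one. `neighbors` maps AP id -> current channel; no
    throughput function is needed, only local interference counts."""
    load = {c: sum(1 for n, ch in neighbors.items() if n != ap and ch == c)
            for c in channels}
    return min(channels, key=lambda c: load[c])

# Three mutually interfering APs start on the same channel; a few
# asynchronous updates spread them onto the three non-overlapping
# 2.4 GHz channels.
chans = {"A": 1, "B": 1, "C": 1}
for ap in ["A", "B", "C", "A", "B", "C"]:
    chans[ap] = least_interfered_channel(ap, chans)
```

Because each AP updates on its own schedule using only local observations, the rule tolerates APs appearing and disappearing, which is the dynamic-topology setting the paper evaluates.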
Abstract:
Given that next-generation and current-generation networks will coexist for a considerable period of time, it is important to improve the performance of existing networks. One such improvement recently proposed is to enhance the throughput of ad hoc networks by using dual-hop relay-based transmission schemes. Since in ad hoc networks throughput is normally related to energy consumption, it is important to examine the impact of using relay-based transmissions on energy consumption. In this paper, we present an analytical energy consumption model for dual-hop relay-based medium access control (MAC) protocols. Based on the recently reported relay-enabled Distributed Coordination Function (rDCF), we have shown the efficacy of the proposed analytical model. This is a generalized model and can be used to predict energy consumption in saturated relay-based ad hoc networks, both in an ideal environment and in the presence of transmission errors. It is shown that using a relay results in not only better throughput but also better energy efficiency. Copyright (C) 2009 Rizwan Ahmad et al.