884 results for sliding mode control theory
Abstract:
Quantified real constraints (QRCs) form a mathematical formalism used to model a large number of physical problems involving systems of nonlinear equations over real variables, some of which may be quantified. QRCs appear in numerous contexts, such as control engineering or biology. Solving QRCs is a very active research area in which two different approaches have been proposed: symbolic quantifier elimination and approximate methods. Even so, solving large-scale problems and the general case remain open problems. This thesis proposes a new approximate methodology based on Modal Interval Analysis, a mathematical theory that makes it possible to solve problems involving logical quantifiers over real variables. Finally, two applications to control engineering are presented. The first concerns the fault detection problem, and the second consists of a controller for a sailing boat.
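To make the formalism concrete, here is a tiny Python sketch (entirely illustrative: the function name and the sampled check are assumptions of this sketch, and the thesis itself relies on Modal Interval Analysis rather than sampling) of testing a quantified real constraint of the form ∀x ∈ X ∃y ∈ Y : f(x, y) = 0:

```python
import numpy as np

def holds_forall_exists(f, x_box, y_box, n=201, tol=1e-2):
    """Sampled check of the QRC  (forall x in X)(exists y in Y): f(x, y) = 0.
    Sampling only illustrates the shape of the problem; a sound solver
    would use interval arithmetic instead."""
    xs = np.linspace(*x_box, n)
    ys = np.linspace(*y_box, n)
    for x in xs:
        # is there some y in Y that (approximately) satisfies the constraint?
        if not np.any(np.abs(f(x, ys)) < tol):
            return False
    return True
```

For example, `holds_forall_exists(lambda x, ys: ys - x**2, (0, 1), (0, 2))` holds, while shrinking Y to (0, 0.5) makes it fail, since x² can exceed 0.5. Replacing the sampled inner search with a guaranteed interval test is exactly where Modal Interval Analysis comes in.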
Abstract:
The level of insolvencies in the construction industry is high compared to other industry sectors. Given the management expertise and experience that is available to the construction industry, it seems strange that, according to the literature, the major causes of failure are lack of financial control and poor management. This indicates that with good cash flow management, companies could be kept operating and financially healthy. It is possible to prevent failure. Although there are financial models that can be used to predict failure, they are based on company accounts, which have been shown to be an unreliable source of data. There are models available for cash flow management and forecasting, and these could be used as a starting point for managers in rethinking their cash flow management practices. The research reported here has reached the stage of formulating researchable questions for an in-depth study, including issues such as how contractors manage their cash flow, how payment practices can be managed without damaging others in the supply chain, and the relationships between companies' financial structures and the payment regimes to which they are subjected.
Abstract:
A study of the formation and propagation of volume anomalies in North Atlantic Mode Waters is presented, based on 100 yr of monthly mean fields taken from the control run of the Third Hadley Centre Coupled Ocean-Atmosphere GCM (HadCM3). Analysis of the temporal and spatial variability in the thickness between pairs of isothermal surfaces bounding the central temperature of the three main North Atlantic subtropical mode waters shows that large-scale variability in formation occurs over time scales ranging from 5 to 20 yr. The largest formation anomalies are associated with a southward shift in the mixed layer isothermal distribution, possibly due to changes in the gyre dynamics and/or changes in the overlying wind field and air-sea heat fluxes. The persistence of these anomalies is shown to result from their subduction beneath the winter mixed layer base, where they recirculate around the subtropical gyre in the background geostrophic flow. Anomalies in the warmest mode (18 degrees C) formed on the western side of the basin persist for up to 5 yr. They are removed by mixing transformation to warmer classes and are returned to the seasonal mixed layer near the Gulf Stream, where the stored heat may be released to the atmosphere. Anomalies in the cooler modes (16 degrees and 14 degrees C) formed on the eastern side of the basin persist for up to 10 yr. There is no clear evidence of significant transformation of these cooler mode anomalies to adjacent classes. It has been proposed that the eastern anomalies are removed through a tropical-subtropical water mass exchange mechanism beneath the trade wind belt (south of 20 degrees N). The analysis shows that anomalous mode water formation plays a key role in the long-term storage of heat in the model, and that the release of heat associated with these anomalies suggests a predictable climate feedback mechanism.
Abstract:
The modelled El Nino-mean state-seasonal cycle interactions in 23 coupled ocean-atmosphere GCMs, including the recent IPCC AR4 models, are assessed and compared to observations and theory. The models show a clear improvement over previous generations in simulating the tropical Pacific climatology. Systematic biases still include too-strong mean and seasonal cycles of the trade winds. El Nino amplitude is shown to be an inverse function of the mean trade winds, in agreement with the observed shift of 1976 and with theoretical studies. El Nino amplitude is further shown to be an inverse function of the relative strength of the seasonal cycle. When most of the energy is within the seasonal cycle, little is left for inter-annual signals, and vice versa. An interannual coupling strength (ICS) is defined and its relation with the modelled El Nino frequency is compared to that predicted by theoretical models. An assessment of the modelled El Nino in terms of SST mode (S-mode) or thermocline mode (T-mode) shows that most models are locked into an S-mode and that only a few models exhibit a hybrid mode, as in observations. It is concluded that several basic El Nino-mean state-seasonal cycle relationships proposed by either theory or analysis of observations seem to be reproduced by CGCMs. This is especially true for the amplitude of El Nino and is less clear for its frequency. Most of these relationships, first established for the pre-industrial control simulations, hold for the double and quadruple CO2 stabilized scenarios. The models that exhibit the largest El Nino amplitude change in these greenhouse gas (GHG) increase scenarios are those that exhibit a mode change towards a T-mode (either from S-mode to hybrid or hybrid to T-mode). This follows the observed 1976 climate shift in the tropical Pacific, and supports the (still debated) finding of studies that associated this shift with increased GHGs.
In many respects, these models are also among those that best simulate the tropical Pacific climatology (ECHAM5/MPI-OM, GFDL-CM2.0, GFDL-CM2.1, MRI-CGM2.3.2, UKMO-HadCM3). Results from this large subset of models suggest the likelihood of increased El Nino amplitude in a warmer climate, though there is considerable spread of El Nino behaviour among the models and the changes in the subsurface thermocline properties that may be important for El Nino change could not be assessed. There are no clear indications of an El Nino frequency change with increased GHG.
Abstract:
We measured the movements of soccer players heading a football in a fully immersive virtual reality environment. In mid-flight the ball’s trajectory was altered from its normal quasi-parabolic path to a linear one, producing a jump in the rate of change of the angle of elevation of gaze (α) from player to ball. One reaction time later the players adjusted their speed so that the rate of change of α increased when it had been reduced and reduced it when it had been increased. Since the result of the player’s movement was to regain a value of the rate of change close to that before the disturbance, the data suggest that the players have an expectation of, and memory for, the pattern that the rate of change of α will follow during the flight. The results support the general claim that players intercepting balls use servo control strategies and are consistent with the particular claim of Optic Acceleration Cancellation theory that the servo strategy is to allow α to increase at a steadily decreasing rate.
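The Optic Acceleration Cancellation claim has a simple geometric core that can be checked numerically. In the idealised textbook kinematics below (an assumption of this sketch, not the study's virtual-reality setup), a drag-free ball flies toward a stationary catcher, and the tangent of the gaze elevation angle grows linearly in time, i.e. its optical acceleration is zero:

```python
import numpy as np

# drag-free parabolic flight toward a stationary catcher (illustrative numbers)
g, D, T = 9.81, 30.0, 2.0        # gravity (m/s^2), range (m), flight time (s)
vz = g * T / 2.0                 # vertical launch speed so the ball lands at t = T

t = np.linspace(0.05, T - 0.05, 200)
height = vz * t - 0.5 * g * t**2        # ball height above the ground
horiz = D * (1.0 - t / T)               # horizontal ball-to-catcher distance
tan_alpha = height / horiz              # tangent of the gaze elevation angle

rate = np.gradient(tan_alpha, t)        # d(tan alpha)/dt is constant here
```

Algebraically, tan α = (g T / 2D) · t for this geometry, so a catcher who regulates movement to keep the rate of change of tan α constant is on course to intercept; a mid-flight trajectory disturbance shows up as a jump in that rate, as in the experiment above.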
Abstract:
Many pathogens transmit to new hosts by both infection (horizontal transmission) and transfer to the infected host's offspring (vertical transmission). These two transmission modes require specific adaptations of the pathogen that can be mutually exclusive, resulting in a trade-off between horizontal and vertical transmission. We show that in mathematical models such trade-offs can lead to the simultaneous existence of two evolutionary stable states (evolutionary bi-stability) of allocation of resources to the two modes of transmission. We also show that jumping between evolutionary stable states can be induced by gradual environmental changes. Using quantitative PCR-based estimates of abundance in seed and vegetative parts, we show that the pathogen of wheat, Phaeosphaeria nodorum, has jumped between two distinct states of transmission mode twice in the past 160 years, which, based on published evidence, we interpret as adaptation to environmental change. The finding of evolutionary bi-stability has implications for human, animal and other plant diseases. An ill-judged change in a disease control programme could cause the pathogen to evolve a new, and possibly more damaging, combination of transmission modes. Similarly, environmental changes can shift the balance between transmission modes, with adverse effects on human, animal and plant health.
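The bi-stability argument can be illustrated with a toy fitness function (the quadratic form and the parameter values are assumptions of this sketch, not the paper's model): when each mode's payoff accelerates with investment, total transmission success is convex in the allocation, so both pure strategies are local optima and a population can be trapped at either one.

```python
import numpy as np

def fitness(v, h=1.0, w=0.8):
    """Toy total transmission success when a fraction v of resources goes to
    vertical transmission and (1 - v) to horizontal transmission.  Squaring
    makes each mode's payoff accelerate with investment (the trade-off)."""
    return h * (1.0 - v) ** 2 + w * v ** 2

v = np.linspace(0.0, 1.0, 101)
f = fitness(v)
v_worst = v[np.argmin(f)]   # the worst strategy is an intermediate mix
```

Because the interior minimum separates the two boundary optima, a gradual environmental change (shifting h relative to w) can flip which pure strategy is globally best, producing the "jump" between states described above.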
Abstract:
Networks are ubiquitous in natural, technological and social systems. They are of increasing relevance for improved understanding and control of infectious diseases of plants, animals and humans, given the interconnectedness of today's world. Recent modelling work on disease development in complex networks shows: the relative rapidity of pathogen spread in scale-free compared with random networks, unless there is high local clustering; the theoretical absence of an epidemic threshold in scale-free networks of infinite size, which implies that diseases with low infection rates can spread in them, but the emergence of a threshold when realistic features are added to networks (e.g. finite size, household structure or deactivation of links); and the influence on epidemic dynamics of asymmetrical interactions. Models suggest that control of pathogens spreading in scale-free networks should focus on highly connected individuals rather than on mass random immunization. A growing number of empirical applications of network theory in human medicine and animal disease ecology confirm the potential of the approach, and suggest that network thinking could also benefit plant epidemiology and forest pathology, particularly in human-modified pathosystems linked by commercial transport of plant and disease propagules. Potential consequences for the study and management of plant and tree diseases are discussed.
Abstract:
Mathematical models have been vitally important in the development of technologies in building engineering. A literature review identifies that linear models are the most widely used building simulation models. The advent of intelligent buildings has added new challenges to the application of the existing models, as an intelligent building requires learning and self-adjusting capabilities based on environmental and occupants' factors. It is therefore argued that linearity is an inappropriate basis for any model of either complex building systems or occupant behaviours, for control or any other purpose. Chaos and complexity theory reflects the nonlinear dynamic properties of intelligent systems exercised by occupants and the environment, and has been used widely in modelling various engineering, natural and social systems. It is proposed that chaos and complexity theory be applied to the study of intelligent buildings. This paper gives a brief description of chaos and complexity theory, presents its current positioning and recent developments in building engineering research, and outlines future potential applications to intelligent building studies, providing a bridge between chaos and complexity theory and intelligent building research.
Abstract:
Firms form consortia in order to win contracts. Once a project has been awarded to a consortium, each member then concentrates on his or her own contract with the client. Therefore, consortia are marketing devices, which present the impression of teamworking, but the production process is just as fragmented as under conventional procurement methods. In this way, the consortium forms a barrier between the client and the actual construction production process. Firms form consortia, not as a simple development of normal ways of working, but because the circumstances for specific projects make it a necessary vehicle. These circumstances include projects that are too large or too complex to undertake alone, or projects that require ongoing services which cannot be provided by the individual firms in-house. It is not a preferred way of working, because participants carry extra risk in the form of liability for the actions of their partners in the consortium. The behaviour of members of consortia is determined by their relative power, based on several factors, including financial commitment and ease of replacement. The level of supply chain visibility to the public sector client and to the industry is reduced by the existence of a consortium, because the consortium forms an additional obstacle between the client and the firms undertaking the actual construction work. Supply chain visibility matters to the client, who otherwise loses control over the process of construction or service provision while remaining accountable for cost overruns. To overcome this separation there is a convincing argument in favour of adopting the approach put forward in the Project Partnering Contract 2000 (PPC2000) Agreement. Members of consortia do not necessarily go on to work in the same consortia again, because members need to respond flexibly to opportunities as and when they arise. Decision-making processes within consortia tend to be on an ad hoc basis.
Construction risk is taken by the contractor and the construction supply chain, but the reputational risk is carried by all the firms associated with a consortium. There is wide variation in the way consortia are formed, determined by the individual circumstances of each project: its requirements, size and complexity, and the attitude of individual project leaders. However, there are a number of close working relationships based on generic models of consortium-like arrangements for the purpose of building production, such as the Housing Corporation Guidance Notes and the PPC2000.
Abstract:
This paper describes some of the preliminary outcomes of a UK project looking at control education. The focus is on two aspects: (i) the most important control concepts and theories for students taking just one or two courses and (ii) the effective use of software to improve student learning and engagement. There is also some discussion of the correct balance between teaching theory and practice. The paper gives examples from numerous UK universities and some industrial comment.
Abstract:
Wireless Personal Area Networks (WPANs) offer high data rates suitable for interconnecting high-bandwidth personal consumer devices (Wireless HD streaming, Wireless USB and Bluetooth EDR). ECMA-368 is the Physical (PHY) and Media Access Control (MAC) backbone of many of these wireless devices. WPAN devices tend to operate in an ad hoc network, so it is important to successfully latch onto the network and become part of one of the available piconets. This paper presents a new algorithm for detecting the Packet/Frame Sync (PFS) signal in ECMA-368 to identify piconets and aid symbol timing. The algorithm correlates the received PFS symbols with the expected locally stored symbols over the 24 or 12 PFS symbols, but selects the likely TFC based on the highest statistical mode of the 24 or 12 best correlation results. The results are very favorable, showing an improvement margin in the order of 11.5 dB in reference sensitivity tests between the required performance using this algorithm and the performance of comparable systems.
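The selection step can be sketched as follows (the helper name and the template model are assumptions of this sketch; the actual ECMA-368 preamble sequences are not reproduced here): each received sync symbol votes for the template it correlates best with, and the statistical mode of the votes picks the TFC.

```python
import numpy as np

def detect_tfc(received, templates):
    """Pick the most likely Time-Frequency Code (TFC) for a burst of sync
    symbols: correlate each received symbol against every locally stored
    template, then take the statistical mode of the per-symbol winners."""
    votes = []
    for sym in received:                              # e.g. 24 or 12 PFS symbols
        scores = [abs(np.dot(sym, np.conj(t))) for t in templates]
        votes.append(int(np.argmax(scores)))          # best match for this symbol
    counts = np.bincount(votes, minlength=len(templates))
    return int(np.argmax(counts))                     # mode of the best matches
```

Voting over the whole burst rather than trusting a single correlation is what buys robustness at low SNR: a few symbols mislabelled by noise are outvoted by the majority.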
Abstract:
This paper introduces a procedure for filtering electromyographic (EMG) signals. Its key element is the Empirical Mode Decomposition, a novel digital signal processing technique that can decompose any time-series into a set of functions designated as intrinsic mode functions. The procedure for EMG signal filtering is compared to a related approach based on the wavelet transform. Results obtained from the analysis of synthetic and experimental EMG signals show that our method can be successfully and easily applied in practice to the attenuation of background activity in EMG signals.
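A bare-bones sketch of the sifting idea behind the Empirical Mode Decomposition (illustrative only: a fixed iteration count stands in for the standard stopping criteria, and the paper's implementation will differ): each intrinsic mode function is obtained by repeatedly subtracting the mean of the envelopes through the local maxima and minima.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def local_extrema(x):
    """Indices of strict local maxima and minima of a 1-D array."""
    d = np.diff(x)
    maxima = np.where((d[:-1] > 0) & (d[1:] < 0))[0] + 1
    minima = np.where((d[:-1] < 0) & (d[1:] > 0))[0] + 1
    return maxima, minima

def sift(x, t, n_iter=10):
    """Extract one intrinsic mode function by repeatedly subtracting the
    mean of the cubic-spline envelopes through the maxima and minima."""
    h = x.copy()
    for _ in range(n_iter):
        maxima, minima = local_extrema(h)
        if len(maxima) < 4 or len(minima) < 4:
            break
        upper = CubicSpline(t[maxima], h[maxima])(t)
        lower = CubicSpline(t[minima], h[minima])(t)
        h = h - (upper + lower) / 2.0
    return h

def emd(x, t, max_imfs=5):
    """Decompose x into intrinsic mode functions plus a residual trend."""
    imfs, residual = [], x.copy()
    for _ in range(max_imfs):
        maxima, minima = local_extrema(residual)
        if len(maxima) < 4 or len(minima) < 4:   # residual ~ monotone trend
            break
        imf = sift(residual, t)
        imfs.append(imf)
        residual = residual - imf
    return imfs, residual

# demo: a fast oscillation riding on a slow trend
t = np.linspace(0.0, 1.0, 1000)
x = np.sin(2 * np.pi * 20.0 * t) + t
imfs, residual = emd(x, t)
```

Filtering then amounts to reconstructing the signal from a subset of the IMFs, discarding those dominated by background activity; by construction the IMFs plus the residual sum back to the original signal.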
Abstract:
Tremor is a clinical feature characterized by oscillations of a part of the body. The detection and study of tremor is an important step in investigations seeking to explain underlying control strategies of the central nervous system under natural (or physiological) and pathological conditions. It is well established that tremorous activity is composed of deterministic and stochastic components. For this reason, the use of digital signal processing techniques (DSP) which take into account the nonlinearity and nonstationarity of such signals may bring new information into the signal analysis which is often obscured by traditional linear techniques (e.g. Fourier analysis). In this context, this paper introduces the application of the empirical mode decomposition (EMD) and Hilbert spectrum (HS), which are relatively new DSP techniques for the analysis of nonlinear and nonstationary time-series, for the study of tremor. Our results, obtained from the analysis of experimental signals collected from 31 patients with different neurological conditions, showed that the EMD could automatically decompose acquired signals into basic components, called intrinsic mode functions (IMFs), representing tremorous and voluntary activity. The identification of a physical meaning for IMFs in the context of tremor analysis suggests an alternative and new way of detecting tremorous activity. These results may be relevant for those applications requiring automatic detection of tremor. Furthermore, the energy of IMFs was visualized as a function of time and frequency by means of the HS. This analysis showed that the variation of energy of tremorous and voluntary activity could be distinguished and characterized on the HS. Such results may be relevant for those applications aiming to identify neurological disorders. 
In general, both the HS and EMD proved very useful for performing objective analysis of any kind of tremor, and can therefore potentially be used for functional assessment.
Abstract:
One of the major aims of BCI research is devoted to achieving faster and more efficient control of external devices. The identification of individual tap events in a motor imagery BCI is therefore a desirable goal. EEG is recorded from subjects performing and imagining finger taps with their left and right hands. A Differential Evolution based feature selection wrapper is used in order to identify optimal features in the spatial and frequency domains for tap identification. Channel-frequency band combinations are found which allow differentiation of tap vs. no-tap control conditions for executed and imagined taps. Left vs. right hand taps may also be differentiated with features found in this manner. A sliding time window is then used to accurately identify individual taps in the executed tap and imagined tap conditions. Highly statistically significant classification accuracies are achieved with time windows of 0.5 s and more allowing taps to be identified on a single trial basis.
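The sliding-window step can be sketched as follows (the helper is an assumption of this sketch; the classifier applied to each window is omitted): the EEG is cut into overlapping segments of the 0.5 s length mentioned above, and each segment is classified independently to localise individual taps.

```python
import numpy as np

def sliding_windows(signal, fs, win_s=0.5, step_s=0.1):
    """Yield (start_time_s, segment) pairs: win_s-second windows advanced
    by step_s seconds over a 1-D signal sampled at fs Hz."""
    win, step = int(win_s * fs), int(step_s * fs)
    for start in range(0, len(signal) - win + 1, step):
        yield start / fs, signal[start:start + win]
```

For a 2 s recording at 100 Hz this yields 16 half-second windows starting every 0.1 s; a per-window tap/no-tap classifier then turns the window scores into single-trial tap detections.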
Abstract:
A theory-based healthy eating leaflet was evaluated against an existing publicly available standard leaflet. The intervention leaflet was designed to encourage healthy eating in 18-30 year olds and was developed by modifying an existing British Nutrition Foundation leaflet. The intervention leaflet targeted attitudes and self-efficacy. Participants (n=104) were randomly assigned to the intervention, the Foundation or a local food leaflet control condition. Cognitions were measured pre-intervention, immediately after reading the corresponding leaflet, and again at two-week follow-up. Critically, intentions to eat healthily were significantly greater at follow-up in the intervention group compared to the other two groups, with the former leaflet also being perceived as more persuasive. The intervention group also showed evidence of healthier eating at two weeks compared to the other two groups. Collectively, the results illustrate the utility of a targeted, theory-based approach.