141 results for constraint
Abstract:
A novel Neuropredictive Teleoperation (NPT) scheme is presented. The design results from two key ideas: the exploitation of the measured or estimated neural input to the human arm, or its electromyogram (EMG), as the system input, and the employment of a predictor of the arm movement, based on this neural signal and an arm model, to compensate for time delays in the system. Although a multitude of such models, as well as measuring devices for the neural signals and the EMG, have been proposed, current telemanipulator research has considered only highly simplified arm models. In the present design, the bilateral constraint that the master and slave are simultaneously compliant to each other's state (equal positions and forces) is abandoned, thus obtaining a simple-to-analyze succession of only locally controlled modules, and robustness to time delays of up to 500 ms. The proposed designs were inspired by well-established physiological evidence that the brain, rather than controlling the movement on-line, programs the arm with an action plan of a complete movement, which is then executed largely in open loop, regulated only by local reflex loops. As a model of the human arm, the well-established Stark model is employed, whose mathematical representation is modified to make it suitable for an engineering application. The proposed scheme is, however, valid for any arm model. BIBO-stability and passivity results for a variety of local control laws are reported. Simulation results and comparisons with traditional designs also highlight the advantages of the proposed design.
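To make the delay-compensation idea concrete, the following minimal Python sketch propagates a generic second-order arm model forward over the communication delay using the known input, so the remote side can act on a predicted rather than a delayed state. The mass-spring-damper model and all numerical values are illustrative placeholders, not the modified Stark model or the control laws analyzed in the paper.

import numpy as np

def arm_step(state, u, dt, m=1.0, b=2.0, k=10.0):
    # One Euler step of a generic second-order (mass-spring-damper) arm model.
    x, v = state
    a = (u - b * v - k * x) / m
    return np.array([x + dt * v, v + dt * a])

dt, delay = 0.001, 0.5                         # 500 ms one-way delay
n_d = int(delay / dt)
t = np.arange(0.0, 2.0, dt)
u = np.where(t < 0.8, 5.0, 0.0)                # pre-programmed "action plan" input

# True arm trajectory on the master side.
states = [np.zeros(2)]
for ui in u[:-1]:
    states.append(arm_step(states[-1], ui, dt))
states = np.array(states)

# The slave only has the state from n_d steps ago; the predictor integrates the
# model over the delay using the transmitted input to estimate the current state.
i = 1500
pred = states[i - n_d].copy()
for ui in u[i - n_d:i]:
    pred = arm_step(pred, ui, dt)

print("true position     :", states[i][0])
print("predicted position:", pred[0])          # identical here, since model == plant

In this idealized setting the prediction is exact because the predictor uses the same model and input as the plant; the interest of the scheme lies in how robust this compensation remains when the model and the measured neural/EMG signal are imperfect.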
Abstract:
The blind minimum output energy (MOE) adaptive detector for code division multiple access (CDMA) signals requires exact knowledge of the received spreading code of the desired user. This requirement can be relaxed by constraining the so-called surplus energy of the adaptive tap-weight vector, but the ideal constraint value is not easily obtained in practice. An algorithm is proposed to adaptively track this value and hence to approach the best possible performance for this class of CDMA detector.
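As a point of reference, the surplus-energy-constrained blind MOE criterion is commonly written in the canonical form below (generic notation that may differ from the paper's): the tap-weight vector w is decomposed into the desired user's spreading code s plus an orthogonal adaptive component x, the output energy is minimized, and the surplus energy ‖x‖² is capped at a value χ.

\[
\mathbf{w} = \mathbf{s} + \mathbf{x}, \qquad \mathbf{x}^{T}\mathbf{s} = 0, \qquad
\min_{\mathbf{x}} \ \mathbb{E}\big[(\mathbf{w}^{T}\mathbf{r})^{2}\big]
\quad \text{subject to} \quad \|\mathbf{x}\|^{2} \le \chi ,
\]

where r is the received signal vector; the constraint value χ is the quantity that the proposed algorithm tracks adaptively.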
Abstract:
Three potential explanations of past reforms of the Common Agricultural Policy (CAP) can be identified in the literature: a budget constraint, pressure from General Agreement on Tariffs and Trade/World Trade Organization (GATT/WTO) negotiations or commitments, and a paradigm shift emphasising agriculture’s provision of public goods. This discussion on the driving forces of CAP reform links to broader theoretical questions on the role of budgetary politics, globalisation of public policy and paradigm shift in explaining policy change. In this article, the Health Check reforms of 2007/2008 are assessed. The reforms were probably more ambitious than first supposed, although the package agreed by ministers in November 2008 was watered down. We conclude that the Health Check was not primarily driven by budget concerns or by the supposed switch from the state-assisted to the multifunctional policy paradigm. The European Commission’s wish to adopt an offensive negotiating stance in the closing phases of the Doha Round was a more likely explanatory factor. The shape and purpose of the CAP post-2013 remain contested, with divergent views among the Member States.
Abstract:
This paper reports on a survey of 17 value management exercises recently carried out within the UK construction industry. Twelve leading value management practitioners were asked to describe an example of a value management study which ‘worked well’ and one which ‘did not work well’. They were further asked to explain the underlying factors which they considered had influenced the eventual outcome of the value management study. The subsequent analysis of the interview transcripts reveals six recurring themes which were held to have had a significant influence: expectations, implementation, participation, power, time constraint and uncertainty. Whilst caution is necessary in extracting the themes from their individual contexts, they do provide a valuable insight into the factors which influence the outcome of value management studies.
Abstract:
Strokes affect thousands of people worldwide, leaving sufferers with severe disabilities affecting their daily activities. In recent years, new rehabilitation techniques have emerged, such as constraint-induced therapy, biofeedback therapy and robot-aided therapy. In particular, robotic techniques allow precise recording of movements and application of forces to the affected limb, making them a valuable tool for motor rehabilitation. In addition, robot-aided therapy can utilise visual cues conveyed on a computer screen to convert repetitive movement practice into an engaging task such as a game. Visual cues can also be used to control the information sent to the patient about exercise performance and to potentially address psychosomatic variables influencing therapy. This paper overviews the current state of the art in upper limb robot-mediated therapy, with a focal point on the technical requirements of robotic therapy devices, leading to the development of upper limb rehabilitation techniques that facilitate reach-to-touch, fine motor control and whole-arm movements, and that promote rehabilitation beyond the hospital stay. The reviewed literature suggests that, while there is evidence supporting the use of this technology to reduce functional impairment, the challenge ahead, beyond the technological push, lies in providing effective assessment of outcomes and modalities that have a stronger impact in transferring functional gains into functional independence.
Abstract:
Identifying a periodic time-series model from environmental records, without imposing the positivity of the growth rate, does not necessarily respect the time order of the data observations. Consequently, subsequent observations, sampled in the environmental archive, can be inverted on the time axis, resulting in a non-physical signal model. In this paper an optimization technique with linear constraints on the signal model parameters is proposed that prevents time inversions. The activation conditions for this constrained optimization are based upon the physical constraint on the growth rate, namely, that it cannot take values smaller than zero. The actual constraints are defined for polynomials and first-order splines as basis functions for the nonlinear contribution in the distance-time relationship. The method is compared with an existing method that eliminates the time inversions, and its noise sensitivity is tested by means of Monte Carlo simulations. Finally, the usefulness of the method is demonstrated on measurements of the vessel density in a mangrove tree, Rhizophora mucronata, and of Mg/Ca ratios in a bivalve, Mytilus trossulus.
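As a rough illustration of this type of constraint (a generic sketch, not the authors' algorithm), the Python snippet below fits a polynomial distance-time relationship by least squares while imposing linear inequality constraints that keep the fitted growth rate non-negative on a grid of time points, so that the model cannot produce time inversions. The data and the polynomial degree are synthetic placeholders.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)                      # time axis (e.g. years)
d_true = 0.5 * t + 0.3 * np.sin(t)                  # monotone "true" growth
d_obs = d_true + 0.1 * rng.standard_normal(t.size)  # noisy distance samples

deg = 5
V = np.vander(t, deg + 1, increasing=True)          # design matrix for d(t)
t_grid = np.linspace(t.min(), t.max(), 100)
# rows map polynomial coefficients to the growth rate d'(t_j) on the grid
D = np.array([[k * tj ** (k - 1) if k > 0 else 0.0
               for k in range(deg + 1)] for tj in t_grid])

obj = lambda c: np.sum((V @ c - d_obs) ** 2)        # least-squares misfit
cons = {"type": "ineq", "fun": lambda c: D @ c}     # d'(t_j) >= 0 for all j

c0 = np.linalg.lstsq(V, d_obs, rcond=None)[0]       # unconstrained start
res = minimize(obj, c0, method="SLSQP", constraints=[cons])
print("minimum fitted growth rate:", (D @ res.x).min())  # should be >= ~0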
Abstract:
Background. With diffusion-tensor imaging (DTI) it is possible to estimate the structural characteristics of fiber bundles in vivo. This study used DTI to infer damage to the corticospinal tract (CST) and relates this parameter to (a) the level of residual motor ability at least 1 year poststroke and (b) the outcome of intensive motor rehabilitation with constraint-induced movement therapy (CIMT). Objective. To explore the role of CST damage in recovery and CIMT efficacy. Methods. Ten patients with low-functioning hemiparesis were scanned and tested at baseline, before and after CIMT. Lesion overlap with the CST was indexed as reduced anisotropy compared with a CST variability map derived from 26 controls. Residual motor ability was measured through the Wolf Motor Function Test (WMFT) and the Motor Activity Log (MAL) acquired at baseline. CIMT benefit was assessed through the pre- versus post-treatment comparison of WMFT and MAL performance. Results. Lesion overlap with the CST correlated with residual motor ability at baseline, with greater deficits observed in patients with more extended CST damage. Infarct volume showed no systematic association with residual motor ability. CIMT led to significant improvements in motor function, but outcome was not associated with the extent of CST damage or infarct volume. Conclusion. The study gives in vivo support for the proposition that structural CST damage, not infarct volume, is a major predictor of residual functional ability in the chronic stage. The results provide initial evidence for positive effects of CIMT in patients with varying, including more severe, CST damage.
Abstract:
Background: Poor diet quality is a major public health concern that has prompted governments to introduce a range of measures to promote healthy eating. For these measures to be effective, they should target segments of the population with messages relevant to their needs, aspirations and circumstances. The present study investigates the extent to which attitudes and constraints influence healthy eating, as well as how these vary by demographic characteristics of the UK population. It further considers how such information may be used in segmented diet and health policy messages. Methods: A survey of 250 UK adults elicited information on conformity to dietary guidelines, attitudes towards healthy eating, constraints to healthy eating and demographic characteristics. Ordered logit regressions were estimated to determine the importance of attitudes and constraints in determining how closely respondents follow healthy eating guidelines. Further regressions explored the demographic characteristics associated with the attitudinal and constraint variables. Results: People who attach high importance to their own health and appearance eat more healthily than those who do not. Risk-averse people and those able to resist temptation also eat more healthily. Shortage of time is considered an important barrier to healthy eating, although the cost of a healthy diet is not. These variables are associated with a number of demographic characteristics of the population; for example, young adults are more motivated to eat healthily by concerns over their appearance than their health. Conclusions: The approach employed in the present study could be used to inform future healthy eating campaigns. For example, messages to encourage the young to eat more healthily could focus on the impact of diets on their appearance rather than health.
Abstract:
Water vapour modulates energy flows in Earth's climate system through the transfer of latent heat by evaporation and condensation and by modifying the flows of radiative energy in both the longwave and shortwave portions of the electromagnetic spectrum. This article summarizes the role of water vapour in Earth's energy flows, with particular emphasis on (1) the powerful thermodynamic constraint of the Clausius–Clapeyron equation, (2) dynamical controls on humidity above the boundary layer (the free troposphere), (3) uncertainty in continuum absorption in the relatively transparent "window" regions of the radiative spectrum and (4) implications for changes in the atmospheric hydrological cycle.
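For reference, the Clausius–Clapeyron constraint in (1) can be written in its standard form (numerical value approximate):

\[
\frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T} \;=\; \frac{L_v}{R_v T^{2}} \;\approx\; 6\text{–}7\,\%\ \mathrm{K^{-1}}
\]

for temperatures typical of the lower troposphere, where e_s is the saturation vapour pressure, L_v the latent heat of vaporization and R_v the specific gas constant for water vapour. At roughly constant relative humidity this translates into an increase in atmospheric water vapour of about 7% per kelvin of warming.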
Abstract:
During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to a low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms – brine-induced stratification, stratification-dependent diffusion and iron fertilization – to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.
Abstract:
We consider the linear equality-constrained least squares problem (LSE) of minimizing $\|c - Gx\|_2$, subject to the constraint $Ex = p$. A preconditioned conjugate gradient method is applied to the Kuhn–Tucker equations associated with the LSE problem. We show that our method is well suited for structural optimization problems in reliability analysis and optimal design. Numerical tests are performed on an Alliant FX/8 multiprocessor and a Cray X-MP using some practical structural analysis data.
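One standard way of writing the Kuhn–Tucker equations for this problem is the saddle-point system below, with λ the vector of Lagrange multipliers for the constraint $Ex = p$ (an equivalent augmented form that keeps the residual $r = c - Gx$ as an unknown is often preferred numerically):

\[
\begin{pmatrix} G^{T}G & E^{T} \\ E & 0 \end{pmatrix}
\begin{pmatrix} x \\ \lambda \end{pmatrix}
=
\begin{pmatrix} G^{T}c \\ p \end{pmatrix} .
\]

It is to this (indefinite) linear system, or a reduced form of it, that a preconditioned conjugate gradient iteration can be applied.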
Abstract:
Models which define fitness in terms of per capita rate of increase of phenotypes are used to analyse patterns of individual growth. It is shown that sigmoid growth curves are an optimal strategy (i.e. maximize fitness) if (Assumption 1a) mortality decreases with body size; (2a) mortality is a convex function of specific growth rate, viewed from above; (3) there is a constraint on growth rate, which is attained in the first phase of growth. If the constraint is not attained then size should increase at a progressively reducing rate. These predictions are biologically plausible. Catch-up growth, for retarded individuals, is generally not an optimal strategy though in special cases (e.g. seasonal breeding) it might be. Growth may be advantageous after first breeding if birth rate is a convex function of G (the fraction of production devoted to growth) viewed from above (Assumption 5a), or if mortality rate is a convex function of G, viewed from above (Assumption 6c). If assumptions 5a and 6c are both false, growth should cease at the age of first reproduction. These predictions could be used to evaluate the incidence of indeterminate versus determinate growth in the animal kingdom though the data currently available do not allow quantitative tests. In animals with invariant adult size a method is given which allows one to calculate whether an increase in body size is favoured given that fecundity and developmental time are thereby increased.
Abstract:
For data assimilation in numerical weather prediction, the initial forecast-error covariance matrix Pf is required. For variational assimilation it is particularly important to prescribe an accurate initial matrix Pf, since Pf is either static (in the 3D-Var case) or constant at the beginning of each assimilation window (in the 4D-Var case). At large scales the atmospheric flow is well approximated by hydrostatic balance and this balance is strongly enforced in the initial matrix Pf used in operational variational assimilation systems such as that of the Met Office. However, at convective scales this balance does not necessarily hold any more. Here we examine the extent to which hydrostatic balance is valid in the vertical forecast-error covariances for high-resolution models in order to determine whether there is a need to relax this balance constraint in convective-scale data assimilation. We use the Met Office Global and Regional Ensemble Prediction System (MOGREPS) and a 1.5 km resolution version of the Unified Model for a case study characterized by the presence of convective activity. An ensemble of high-resolution forecasts valid up to three hours after the onset of convection is produced. We show that at 1.5 km resolution hydrostatic balance does not hold for forecast errors in regions of convection. This indicates that in the presence of convection hydrostatic balance should not be enforced in the covariance matrix used for variational data assimilation at this scale. The results show the need to investigate covariance models that may be better suited for convective-scale data assimilation. Finally, we give a measure of the balance present in the forecast perturbations as a function of the horizontal scale (from 3–90 km) using a set of diagnostics. Copyright © 2012 Royal Meteorological Society and British Crown Copyright, the Met Office
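For reference, the hydrostatic balance examined here is the approximation that the vertical pressure-gradient force balances gravity,

\[
\frac{\partial p}{\partial z} = -\rho g ,
\]

so a covariance model that enforces it couples mass-field errors vertically. In convective regions, where vertical accelerations are no longer negligible, forecast errors need not satisfy this relation, which is what the 1.5 km ensemble diagnostics above indicate.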
Abstract:
We present molecular dynamics (MD) and slip-springs model simulations of the chain segmental dynamics in entangled linear polymer melts. The time-dependent behavior of the segmental orientation autocorrelation functions and mean-square segmental displacements are analyzed for both flexible and semiflexible chains, with particular attention paid to the scaling relations among these dynamic quantities. Effective combination of the two simulation methods at different coarse-graining levels allows us to explore the chain dynamics for chain lengths ranging from Z ≈ 2 to 90 entanglements. For a given chain length of Z ≈ 15, the time scales accessed span more than 10 decades, covering all of the interesting relaxation regimes. The obtained time dependence of the monomer mean-square displacements, g1(t), is in good agreement with the tube theory predictions. Results on the first- and second-order segmental orientation autocorrelation functions, C1(t) and C2(t), demonstrate a clear power-law relationship of C2(t) ∝ C1(t)^m with m = 3, 2, and 1 in the initial, free Rouse, and entangled (constrained Rouse) regimes, respectively. The return-to-origin hypothesis, which leads to inverse proportionality between the segmental orientation autocorrelation functions and g1(t) in the entangled regime, is convincingly verified by the simulation result of C1(t) ∝ g1(t)^(-1) ∝ t^(-1/4) in the constrained Rouse regime, where for well-entangled chains both C1(t) and g1(t) are rather insensitive to the constraint release effects. However, the second-order correlation function, C2(t), shows much stronger sensitivity to the constraint release effects and experiences a protracted crossover from the free Rouse to entangled regime. This crossover region extends for at least one decade in time longer than that of C1(t). The predicted time scaling behavior of C2(t) ∝ t^(-1/4) is observed in slip-springs simulations only at a chain length of 90 entanglements, whereas shorter chains show higher scaling exponents. The reported simulation work can be applied to understand the observations of NMR experiments.
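Written compactly, the scaling relations reported above read:

\[
C_2(t) \propto \big[C_1(t)\big]^{m}, \qquad m = 3,\ 2,\ 1 \quad \text{(initial, free Rouse and constrained Rouse regimes)},
\]
\[
C_1(t) \propto g_1(t)^{-1} \propto t^{-1/4} \qquad \text{(constrained Rouse regime)} .
\]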
Abstract:
The Allied bombing of France between 1940 and 1945 has received comparatively little attention from historians, although the civilian death toll, at about 60,000, was comparable to that of German raids on the UK. This article considers how Allied, and particularly British, bombing policy towards France was developed, what its objectives were and how French concerns about attacks on their territory were (or were not) addressed. It argues that while British policymakers were sensitive to the delicate political implications of attacking France, perceived military necessities tended to trump political misgivings; that Vichy, before November 1942, was a stronger constraint on Allied bombing than the Free French at any time and that the bombing programme largely escaped political control from May 1944.