978 results for Normalization constraint
Abstract:
This paper reports on a survey of 17 value management exercises recently carried out within the UK construction industry. Twelve leading value management practitioners were asked to describe an example of a value management study which ‘worked well’ and one which ‘did not work well’. They were further asked to explain the underlying factors which they considered had influenced the eventual outcome of the value management study. The subsequent analysis of the interview transcripts reveals six recurring themes which were held to have had a significant influence: expectations, implementation, participation, power, time constraint and uncertainty. Whilst caution is necessary in extracting the themes from their individual contexts, they do provide a valuable insight into the factors which influence the outcome of value management studies.
Abstract:
Strokes affect thousands of people worldwide, leaving sufferers with severe disabilities that affect their daily activities. In recent years, new rehabilitation techniques have emerged such as constraint-induced therapy, biofeedback therapy and robot-aided therapy. In particular, robotic techniques allow precise recording of movements and application of forces to the affected limb, making them a valuable tool for motor rehabilitation. In addition, robot-aided therapy can utilise visual cues conveyed on a computer screen to convert repetitive movement practice into an engaging task such as a game. Visual cues can also be used to control the information sent to the patient about exercise performance and to potentially address psychosomatic variables influencing therapy. This paper reviews the current state of the art in upper limb robot-mediated therapy, with a focus on the technical requirements of robotic therapy devices leading to the development of upper limb rehabilitation techniques that facilitate reach-to-touch, fine motor control and whole-arm movements, and that promote rehabilitation beyond the hospital stay. The reviewed literature suggests that, while there is evidence supporting the use of this technology to reduce functional impairment, besides the technological push the challenge ahead lies in providing effective outcome assessment and therapy modalities that have a stronger impact in transferring functional gains into functional independence.
Abstract:
Identifying a periodic time-series model from environmental records, without imposing the positivity of the growth rate, does not necessarily respect the time order of the data observations. Consequently, subsequent observations sampled in the environmental archive can be inverted on the time axis, resulting in a non-physical signal model. In this paper an optimization technique with linear constraints on the signal model parameters is proposed that prevents time inversions. The activation conditions for this constrained optimization are based upon the physical constraint on the growth rate, namely that it cannot take values smaller than zero. The actual constraints are defined for polynomials and first-order splines as basis functions for the nonlinear contribution in the distance-time relationship. The method is compared with an existing method that eliminates the time inversions, and its noise sensitivity is tested by means of Monte Carlo simulations. Finally, the usefulness of the method is demonstrated on measurements of vessel density in a mangrove tree, Rhizophora mucronata, and of Mg/Ca ratios in a bivalve, Mytilus trossulus.
Abstract:
Background. With diffusion-tensor imaging (DTI) it is possible to estimate the structural characteristics of fiber bundles in vivo. This study used DTI to infer damage to the corticospinal tract (CST) and relates this parameter to (a) the level of residual motor ability at least 1 year poststroke and (b) the outcome of intensive motor rehabilitation with constraint-induced movement therapy (CIMT). Objective. To explore the role of CST damage in recovery and CIMT efficacy. Methods. Ten patients with low-functioning hemiparesis were scanned and tested at baseline, before and after CIMT. Lesion overlap with the CST was indexed as reduced anisotropy compared with a CST variability map derived from 26 controls. Residual motor ability was measured through the Wolf Motor Function Test (WMFT) and the Motor Activity Log (MAL) acquired at baseline. CIMT benefit was assessed through the pre- versus post-treatment comparison of WMFT and MAL performance. Results. Lesion overlap with the CST correlated with residual motor ability at baseline, with greater deficits observed in patients with more extended CST damage. Infarct volume showed no systematic association with residual motor ability. CIMT led to significant improvements in motor function, but outcome was not associated with the extent of CST damage or infarct volume. Conclusion. The study gives in vivo support for the proposition that structural CST damage, not infarct volume, is a major predictor of residual functional ability in the chronic stage. The results provide initial evidence for positive effects of CIMT in patients with varying, including more severe, CST damage.
Abstract:
Background: Poor diet quality is a major public health concern that has prompted governments to introduce a range of measures to promote healthy eating. For these measures to be effective, they should target segments of the population with messages relevant to their needs, aspirations and circumstances. The present study investigates the extent to which attitudes and constraints influence healthy eating, as well as how these vary by demographic characteristics of the UK population. It further considers how such information may be used in segmented diet and health policy messages. Methods: A survey of 250 UK adults elicited information on conformity to dietary guidelines, attitudes towards healthy eating, constraints to healthy eating and demographic characteristics. Ordered logit regressions were estimated to determine the importance of attitudes and constraints in determining how closely respondents follow healthy eating guidelines. Further regressions explored the demographic characteristics associated with the attitudinal and constraint variables. Results: People who attach high importance to their own health and appearance eat more healthily than those who do not. Risk-averse people and those able to resist temptation also eat more healthily. Shortage of time is considered an important barrier to healthy eating, although the cost of a healthy diet is not. These variables are associated with a number of demographic characteristics of the population; for example, young adults are more motivated to eat healthily by concerns over their appearance than their health. Conclusions: The approach employed in the present study could be used to inform future healthy eating campaigns. For example, messages to encourage the young to eat more healthily could focus on the impact of diets on their appearance rather than health.
Abstract:
Water vapour modulates energy flows in Earth's climate system through the transfer of latent heat by evaporation and condensation, and by modifying the flows of radiative energy in both the longwave and shortwave portions of the electromagnetic spectrum. This article summarizes the role of water vapour in Earth's energy flows with particular emphasis on (1) the powerful thermodynamic constraint of the Clausius-Clapeyron equation, (2) dynamical controls on humidity above the boundary layer (the free troposphere), (3) uncertainty in continuum absorption in the relatively transparent "window" regions of the radiative spectrum and (4) implications for changes in the atmospheric hydrological cycle.
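The thermodynamic constraint in point (1) of this abstract can be illustrated numerically: the Clausius-Clapeyron relation implies that saturation vapour pressure rises roughly 6-7% per kelvin near surface temperatures. The sketch below uses standard approximate constants (latent heat of vaporization, water-vapour gas constant); these values are illustrative assumptions, not taken from the article.

```python
import math

# Approximate physical constants (assumed for illustration).
L_V = 2.5e6   # latent heat of vaporization, J/kg
R_V = 461.5   # specific gas constant for water vapour, J/(kg K)

def saturation_vapour_pressure(T, es0=611.0, T0=273.15):
    """Integrated Clausius-Clapeyron relation with constant latent heat:
    es(T) = es0 * exp((L/Rv) * (1/T0 - 1/T)), es0 in Pa at T0."""
    return es0 * math.exp((L_V / R_V) * (1.0 / T0 - 1.0 / T))

def fractional_increase_per_kelvin(T):
    """d(ln es)/dT = L / (Rv * T^2): the ~7%/K thermodynamic constraint."""
    return L_V / (R_V * T * T)
```

At T = 288 K this gives a fractional increase of about 6.5% per kelvin, the scaling commonly quoted for atmospheric moistening under warming.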
Abstract:
It is well known that gut bacteria contribute significantly to host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interaction between the host and its gut microbiota is becoming an important challenge of modern biology [1-4]. Colonization (also referred to as the normalization process) designates the establishment of micro-organisms in a former germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on host metabolism. A common procedure to control the colonization process is to use the gavage method with a single micro-organism or a mixture of micro-organisms. This method results in a very quick colonization and presents the disadvantage of being extremely stressful [5]. It is therefore useful to minimize the stress and to obtain a slower colonization process in order to observe gradually the impact of bacterial establishment on host metabolism. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected by the urinary excretion of microbial co-metabolites, using 1H NMR-based metabolic profiling. This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem usually assessed by monitoring fecal bacteria by DGGE (denaturing gradient gel electrophoresis) [6]. The colonization takes place in a conventional open environment and is initiated by dirty litter soiled by conventional animals, which serve as controls. Since rodents are coprophagous, this ensures a homogeneous colonization, as previously described [7]. Hepatic metabolic profiling is measured directly from an intact liver biopsy using 1H High Resolution Magic Angle Spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, the major metabolites such as triglycerides, glucose and glycogen, in order to further estimate the complex interaction between the colonization process and hepatic metabolism [7-10]. This method can also be applied to any tissue biopsy [11,12].
Abstract:
During the Last Glacial Maximum (LGM, ∼21,000 years ago) the cold climate was strongly tied to low atmospheric CO2 concentration (∼190 ppm). Although it is generally assumed that this low CO2 was due to an expansion of the oceanic carbon reservoir, simulating the glacial level has remained a challenge, especially with the additional δ13C constraint. Indeed, the LGM carbon cycle was also characterized by a modern-like δ13C in the atmosphere and a higher surface-to-deep Atlantic δ13C gradient, indicating probable changes in the thermohaline circulation. Here we show, with a model of intermediate complexity, that adding three oceanic mechanisms: brine-induced stratification, stratification-dependent diffusion and iron fertilization to the standard glacial simulation (which includes sea level drop, temperature change, carbonate compensation and terrestrial carbon release) decreases CO2 down to the glacial value of ∼190 ppm and simultaneously matches glacial atmospheric and oceanic δ13C inferred from proxy data. LGM CO2 and δ13C can at last be successfully reconciled.
Abstract:
The technique of constructing a transformation, or regrading, of a discrete data set such that the histogram of the transformed data matches a given reference histogram is commonly known as histogram modification. The technique is widely used for image enhancement and normalization. A method which has been previously derived for producing such a regrading is shown to be “best” in the sense that it minimizes the error between the cumulative histogram of the transformed data and that of the given reference function, over all single-valued, monotone, discrete transformations of the data. Techniques for smoothed regrading, which provide a means of balancing the error in matching a given reference histogram against the information lost with respect to a linear transformation are also examined. The smoothed regradings are shown to optimize certain cost functionals. Numerical algorithms for generating the smoothed regradings, which are simple and efficient to implement, are described, and practical applications to the processing of LANDSAT image data are discussed.
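The regrading described in this abstract, a single-valued, monotone, discrete transform chosen so that the cumulative histogram of the transformed data approximates a reference, can be sketched in a few lines. The function names and the simple nearest-CDF matching rule below are illustrative assumptions, not the paper's exact error-minimizing construction.

```python
def cumulative_hist(values, levels):
    """Cumulative (normalized) histogram of integer data in [0, levels)."""
    counts = [0] * levels
    for v in values:
        counts[v] += 1
    total = len(values)
    cum, running = [], 0
    for c in counts:
        running += c
        cum.append(running / total)
    return cum

def match_histogram(data, reference, levels=8):
    """Build a single-valued, monotone, discrete regrading of `data`
    whose cumulative histogram approximates that of `reference`."""
    cd = cumulative_hist(data, levels)
    cr = cumulative_hist(reference, levels)
    mapping = []
    for level in range(levels):
        # map each data level to the reference level with the closest CDF value
        target = min(range(levels), key=lambda r: abs(cr[r] - cd[level]))
        mapping.append(target)
    # enforce monotonicity of the transform
    for i in range(1, levels):
        mapping[i] = max(mapping[i], mapping[i - 1])
    return [mapping[v] for v in data]
```

With a uniform reference this reduces to histogram equalization, the "equidistributing" special case discussed in the companion abstract below.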
Abstract:
We consider the linear equality-constrained least squares problem (LSE) of minimizing $\|c - Gx\|_2$, subject to the constraint $Ex = p$. A preconditioned conjugate gradient method is applied to the Kuhn–Tucker equations associated with the LSE problem. We show that our method is well suited for structural optimization problems in reliability analysis and optimal design. Numerical tests are performed on an Alliant FX/8 multiprocessor and a Cray X-MP using some practical structural analysis data.
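The LSE problem in this abstract can be illustrated by forming the Kuhn–Tucker (KKT) system explicitly. The paper applies a preconditioned conjugate gradient method; the sketch below instead solves the same KKT system with a small direct Gaussian elimination, purely to keep the example short and self-contained.

```python
def solve_lse(G, c, E, p):
    """Solve min ||c - G x||_2 subject to E x = p via the KKT system
    [[G^T G, E^T], [E, 0]] [x; lam] = [G^T c; p].
    G, E are lists of rows; returns x only."""
    rows, n, m = len(G), len(G[0]), len(E)
    GtG = [[sum(G[k][i] * G[k][j] for k in range(rows)) for j in range(n)]
           for i in range(n)]
    Gtc = [sum(G[k][i] * c[k] for k in range(rows)) for i in range(n)]
    size = n + m
    A = [[0.0] * size for _ in range(size)]
    b = [0.0] * size
    for i in range(n):                     # normal-equations block
        for j in range(n):
            A[i][j] = GtG[i][j]
        for j in range(m):                 # E^T block
            A[i][n + j] = E[j][i]
        b[i] = Gtc[i]
    for i in range(m):                     # E block (zero lower-right corner)
        for j in range(n):
            A[n + i][j] = E[i][j]
        b[n + i] = p[i]
    # Gaussian elimination with partial pivoting (KKT matrix is indefinite)
    for col in range(size):
        piv = max(range(col, size), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, size):
            f = A[r][col] / A[col][col]
            for k in range(col, size):
                A[r][k] -= f * A[col][k]
            b[r] -= f * b[col]
    x = [0.0] * size
    for r in range(size - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][k] * x[k] for k in range(r + 1, size))) / A[r][r]
    return x[:n]
```

For example, minimizing the distance from c = (1, 2) subject to x1 + x2 = 1 (with G the identity) yields the constrained projection x = (0, 1).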
Abstract:
Methods for producing nonuniform transformations, or regradings, of discrete data are discussed. The transformations are useful in image processing, principally for enhancement and normalization of scenes. Regradings which “equidistribute” the histogram of the data, that is, which transform it into a constant function, are determined. Techniques for smoothing the regrading, dependent upon a continuously variable parameter, are presented. Generalized methods for constructing regradings such that the histogram of the data is transformed into any prescribed function are also discussed. Numerical algorithms for implementing the procedures and applications to specific examples are described.
Abstract:
Models which define fitness in terms of per capita rate of increase of phenotypes are used to analyse patterns of individual growth. It is shown that sigmoid growth curves are an optimal strategy (i.e. maximize fitness) if (Assumption 1a) mortality decreases with body size; (2a) mortality is a convex function of specific growth rate, viewed from above; (3) there is a constraint on growth rate, which is attained in the first phase of growth. If the constraint is not attained then size should increase at a progressively reducing rate. These predictions are biologically plausible. Catch-up growth, for retarded individuals, is generally not an optimal strategy though in special cases (e.g. seasonal breeding) it might be. Growth may be advantageous after first breeding if birth rate is a convex function of G (the fraction of production devoted to growth) viewed from above (Assumption 5a), or if mortality rate is a convex function of G, viewed from above (Assumption 6c). If assumptions 5a and 6c are both false, growth should cease at the age of first reproduction. These predictions could be used to evaluate the incidence of indeterminate versus determinate growth in the animal kingdom though the data currently available do not allow quantitative tests. In animals with invariant adult size a method is given which allows one to calculate whether an increase in body size is favoured given that fecundity and developmental time are thereby increased.
Abstract:
For data assimilation in numerical weather prediction, the initial forecast-error covariance matrix Pf is required. For variational assimilation it is particularly important to prescribe an accurate initial matrix Pf, since Pf is either static (in the 3D-Var case) or constant at the beginning of each assimilation window (in the 4D-Var case). At large scales the atmospheric flow is well approximated by hydrostatic balance and this balance is strongly enforced in the initial matrix Pf used in operational variational assimilation systems such as that of the Met Office. However, at convective scales this balance does not necessarily hold any more. Here we examine the extent to which hydrostatic balance is valid in the vertical forecast-error covariances for high-resolution models in order to determine whether there is a need to relax this balance constraint in convective-scale data assimilation. We use the Met Office Global and Regional Ensemble Prediction System (MOGREPS) and a 1.5 km resolution version of the Unified Model for a case study characterized by the presence of convective activity. An ensemble of high-resolution forecasts valid up to three hours after the onset of convection is produced. We show that at 1.5 km resolution hydrostatic balance does not hold for forecast errors in regions of convection. This indicates that in the presence of convection hydrostatic balance should not be enforced in the covariance matrix used for variational data assimilation at this scale. The results show the need to investigate covariance models that may be better suited for convective-scale data assimilation. Finally, we give a measure of the balance present in the forecast perturbations as a function of the horizontal scale (from 3–90 km) using a set of diagnostics. Copyright © 2012 Royal Meteorological Society and British Crown Copyright, the Met Office
Abstract:
We present molecular dynamics (MD) and slip-springs model simulations of the chain segmental dynamics in entangled linear polymer melts. The time-dependent behavior of the segmental orientation autocorrelation functions and mean-square segmental displacements is analyzed for both flexible and semiflexible chains, with particular attention paid to the scaling relations among these dynamic quantities. Effective combination of the two simulation methods at different coarse-graining levels allows us to explore the chain dynamics for chain lengths ranging from Z ≈ 2 to 90 entanglements. For a given chain length of Z ≈ 15, the time scales accessed span more than 10 decades, covering all of the interesting relaxation regimes. The obtained time dependence of the monomer mean-square displacements, g1(t), is in good agreement with the tube theory predictions. Results on the first- and second-order segmental orientation autocorrelation functions, C1(t) and C2(t), demonstrate a clear power-law relationship C2(t) ∝ C1(t)^m with m = 3, 2, and 1 in the initial, free Rouse, and entangled (constrained Rouse) regimes, respectively. The return-to-origin hypothesis, which leads to inverse proportionality between the segmental orientation autocorrelation functions and g1(t) in the entangled regime, is convincingly verified by the simulation result C1(t) ∝ g1(t)^(−1) ∝ t^(−1/4) in the constrained Rouse regime, where for well-entangled chains both C1(t) and g1(t) are rather insensitive to constraint release effects. However, the second-order correlation function, C2(t), shows much stronger sensitivity to constraint release effects and experiences a protracted crossover from the free Rouse to the entangled regime. This crossover region extends for at least one decade in time longer than that of C1(t). The predicted time-scaling behavior C2(t) ∝ t^(−1/4) is observed in slip-springs simulations only at a chain length of 90 entanglements, whereas shorter chains show higher scaling exponents. The reported simulation work can be applied to understand the observations of NMR experiments.
Abstract:
The Allied bombing of France between 1940 and 1945 has received comparatively little attention from historians, although the civilian death toll, at about 60,000, was comparable to that of German raids on the UK. This article considers how Allied, and particularly British, bombing policy towards France was developed, what its objectives were and how French concerns about attacks on their territory were (or were not) addressed. It argues that while British policymakers were sensitive to the delicate political implications of attacking France, perceived military necessities tended to trump political misgivings; that Vichy, before November 1942, was a stronger constraint on Allied bombing than the Free French were at any time; and that the bombing programme largely escaped political control from May 1944.