13 results for low back problems

in CentAUR: Central Archive University of Reading - UK


Relevance:

80.00%

Publisher:

Abstract:

Objective: To examine the interpretation of the verbal anchors used in the Borg rating of perceived exertion (RPE) scales in different clinical groups and a healthy control group. Design: Prospective experimental study. Setting: Rehabilitation center. Participants: Nineteen subjects with brain injury, 16 with chronic low back pain (CLBP), and 20 healthy controls. Interventions: Not applicable. Main Outcome Measures: Subjects used a visual analog scale (VAS) to rate their interpretation of the verbal anchors from the Borg RPE 6-20 scale and the newer 10-point category ratio scale. Results: All groups placed the verbal anchors in the order that they occur on the scales. There were significant within-group differences (P < .05) between VAS scores for 4 verbal anchors in the control group, 8 in the CLBP group, and 2 in the brain injury group. There was no significant difference in the rating of each verbal anchor between the groups (P > .05). Conclusions: All subjects rated the verbal anchors in the order they occur on the scales, but there was less agreement in the rating of each verbal anchor among subjects in the brain injury group. Clinicians should consider the possibility of small discrepancies in the meaning of the verbal anchors to subjects, particularly those recovering from brain injury, when they evaluate exercise perceptions.
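
The abstract does not say which statistical tests were used; purely as an illustration of the kind of between-group comparison described, the sketch below (hypothetical VAS values, a non-parametric test chosen only for illustration) compares ratings of a single verbal anchor across the three groups.

    from scipy import stats

    # Hypothetical VAS ratings (0-100 mm) of one verbal anchor, e.g. "hard",
    # for the three groups in the study design described above.
    vas_controls = [62, 58, 65, 70, 61]   # healthy controls
    vas_clbp = [55, 66, 60, 72, 59]       # chronic low back pain group
    vas_brain = [48, 75, 52, 80, 63]      # brain injury group (wider spread)

    # Between-group comparison for this anchor; p > .05 would indicate no
    # significant difference between groups, as reported in the abstract.
    h_stat, p_value = stats.kruskal(vas_controls, vas_clbp, vas_brain)
    print(f"H = {h_stat:.2f}, p = {p_value:.3f}")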

Relevance:

30.00%

Publisher:

Abstract:

Indicators are commonly recommended as tools for assessing the attainment of development, and the current vogue is for aggregating a number of indicators together into a single index. It is claimed that such indices of development help facilitate maximum impact in policy terms by appealing to those who may not necessarily have technical expertise in data collection, analysis and interpretation. In order to help counter criticisms of over-simplification, those advocating such indices also suggest that the raw data be provided so as to allow disaggregation into component parts and hence facilitate a more subtle interpretation if a reader so desires. This paper examines the problems involved in interpreting indices of development by focusing on the United Nations Development Programme's (UNDP) Human Development Index (HDI), published each year in the Human Development Reports (HDRs). The HDI was intended to provide an alternative to the more economically based indices, such as GDP, commonly used within neo-liberal development agendas. The paper explores the use of the HDI as a gauge of human development by making comparisons between two major political and economic communities in Africa (ECOWAS and SADC). While the HDI did help highlight important changes in human development over 10 years, it is concluded that the HDI and its components are difficult to interpret, as methodologies have changed significantly and the 'averaging' nature of the HDI could hide information unless care is taken. The paper discusses the applicability of alternative models to the HDI, such as the more neo-populist centred methods commonly advocated for indicators of sustainable development. (C) 2003 Elsevier Ltd. All rights reserved.
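
The abstract does not reproduce the HDI formula; as a rough sketch of the 'averaging' problem it raises, the code below implements the arithmetic-mean HDI used in older Human Development Reports (goalposts as in the pre-2010 methodology, stated here as an assumption) and shows two hypothetical countries with quite different component profiles but nearly identical index values.

    import math

    def hdi_pre2010(life_expectancy, adult_literacy, gross_enrolment, gdp_per_capita_ppp):
        # Arithmetic-mean HDI as used in older HDRs (sketch); goalposts assumed:
        # life expectancy 25-85 years, GDP per capita $100-$40,000 PPP.
        life_index = (life_expectancy - 25) / (85 - 25)
        education_index = (2 / 3) * adult_literacy + (1 / 3) * gross_enrolment
        gdp_index = (math.log(gdp_per_capita_ppp) - math.log(100)) / (math.log(40_000) - math.log(100))
        # The simple average is exactly the 'averaging' the paper warns about:
        # very different component profiles can yield almost the same HDI.
        return (life_index + education_index + gdp_index) / 3

    # Two hypothetical countries with different profiles but similar index values
    print(round(hdi_pre2010(55, 0.80, 0.60, 3000), 3))
    print(round(hdi_pre2010(70, 0.55, 0.50, 2500), 3))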

Relevance:

30.00%

Publisher:

Abstract:

Airborne scanning laser altimetry (LiDAR) is an important new data source for river flood modelling. LiDAR can give dense and accurate Digital Terrain Models (DTMs) of floodplains for use as model bathymetry. Spatial resolutions of 0.5m or less are possible, with a height accuracy of 0.15m. LiDAR gives a Digital Surface Model (DSM), so vegetation removal software (e.g. TERRASCAN) must be used to obtain a DTM. An example used to illustrate the current state of the art is the LiDAR data provided by the EA, which has been processed by their in-house software to convert the raw data to a ground DTM and a separate vegetation height map. Their method distinguishes trees from buildings on the basis of object size. EA data products include the DTM with or without buildings removed, a vegetation height map, a DTM with bridges removed, etc. Most vegetation removal software ignores short vegetation less than, say, 1m high. We have attempted to extend vegetation height measurement to short vegetation using local height texture. Typically most of a floodplain may be covered in such vegetation. The idea is to assign friction coefficients depending on local vegetation height, so that friction is spatially varying (a minimal sketch follows this abstract); this obviates the need to calibrate a global floodplain friction coefficient. It is not clear at present if the method is useful, but it is worth testing further. The LiDAR DTM is usually determined by looking for local minima in the raw data, then interpolating between these to form a space-filling height surface. This is a low-pass filtering operation, in which objects of high spatial frequency such as buildings, river embankments and walls may be incorrectly classed as vegetation. The problem is particularly acute in urban areas. A solution may be to apply pattern recognition techniques to LiDAR height data fused with other data types such as LiDAR intensity or multispectral CASI data. We are attempting to use digital map data (Mastermap structured topography data) to help distinguish buildings from trees, and roads from areas of short vegetation. The problems involved in doing this will be discussed. A related problem of how best to merge historic river cross-section data with a LiDAR DTM will also be considered. LiDAR data may also be used to help generate a finite element mesh. In rural areas we have decomposed a floodplain mesh according to taller vegetation features such as hedges and trees, so that e.g. hedge elements can be assigned higher friction coefficients than those in adjacent fields. We are attempting to extend this approach to urban areas, so that the mesh is decomposed in the vicinity of buildings, roads, etc. as well as trees and hedges. A dominant points algorithm is used to identify points of high curvature on a building or road, which act as initial nodes in the meshing process. A difficulty is that the resulting mesh may contain a very large number of nodes. However, the mesh generated may be useful as a high-resolution FE benchmark for a more practical lower-resolution model. A further problem discussed will be how best to exploit data redundancy due to the high resolution of the LiDAR compared to that of a typical flood model. Problems occur if features have dimensions smaller than the model cell size: for a 5m-wide embankment within a raster grid model with 15m cell size, the maximum height of the embankment locally could be assigned to each cell covering the embankment, but how could a 5m-wide ditch be represented? This redundancy has also been exploited to improve wetting/drying algorithms, using the sub-grid-scale LiDAR heights within finite elements at the waterline.
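
As a sketch of the spatially varying friction idea described above (the height thresholds and Manning's n values are illustrative assumptions, not the project's actual look-up table), a LiDAR-derived vegetation height grid could be mapped to a per-cell friction coefficient as follows.

    import numpy as np

    # Map a vegetation height grid (metres) to Manning's n values per cell.
    # Thresholds and roughness values are illustrative only.
    def friction_from_vegetation(veg_height_m):
        n = np.empty_like(veg_height_m, dtype=float)
        n[veg_height_m < 0.1] = 0.03                             # bare earth / very short grass
        n[(veg_height_m >= 0.1) & (veg_height_m < 1.0)] = 0.05   # short vegetation (grass, crops)
        n[(veg_height_m >= 1.0) & (veg_height_m < 5.0)] = 0.10   # hedges, shrubs
        n[veg_height_m >= 5.0] = 0.15                            # trees
        return n

    veg = np.array([[0.05, 0.4, 2.0], [0.8, 6.5, 0.02]])
    print(friction_from_vegetation(veg))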

Relevance:

30.00%

Publisher:

Abstract:

In this paper we are mainly concerned with the development of efficient computer models capable of accurately predicting the propagation of low-to-middle frequency sound in the sea, in axially symmetric (2D) and in fully 3D environments. The major physical features of the problem, i.e. a variable bottom topography, elastic properties of the subbottom structure, volume attenuation and other range inhomogeneities, are efficiently treated. The computer models presented are based on normal mode solutions of the Helmholtz equation on the one hand, and on various types of numerical schemes for parabolic approximations of the Helmholtz equation on the other. A new coupled mode code is introduced to model sound propagation in range-dependent ocean environments with variable bottom topography, where the effects of an elastic bottom, volume attenuation, and surface and bottom roughness are taken into account. New computer models based on finite difference and finite element techniques for the numerical solution of parabolic approximations are also presented. They include efficient modeling of the bottom influence via impedance boundary conditions, and they cover wide-angle propagation, elastic bottom effects, variable bottom topography and reverberation effects. All the models are validated on several benchmark problems and against experimental data, and the results obtained are compared with analogous results from standard codes in the literature.
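
For reference, the narrow-angle parabolic approximation that underlies such range-marching schemes can be written as follows (a standard textbook form, not taken from the paper, which also treats wide-angle and elastic-bottom variants). Starting from the Helmholtz equation \nabla^2 p + k_0^2 n^2(r,z)\, p = 0, with reference wavenumber k_0 = \omega/c_0 and index of refraction n = c_0/c, substituting p(r,z) = \psi(r,z)\, e^{i k_0 r}/\sqrt{r} and neglecting \partial^2 \psi / \partial r^2 (the paraxial, far-field assumption) gives

    2 i k_0 \frac{\partial \psi}{\partial r} + \frac{\partial^2 \psi}{\partial z^2} + k_0^2 \left( n^2(r,z) - 1 \right) \psi = 0,

which is then integrated outward in range r by finite-difference or finite-element schemes of the kind described above.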

Relevance:

30.00%

Publisher:

Abstract:

We present a novel approach to calculating Low-Energy Electron Diffraction (LEED) intensities for ordered molecular adsorbates. First, the intra-molecular multiple scattering is computed to obtain a non-diagonal molecular T-matrix. This is then used to represent the entire molecule as a single scattering object in a conventional LEED calculation, where the Layer Doubling technique is applied to assemble the different layers, including the molecular ones. A detailed comparison with conventional layer-type LEED calculations is provided to ascertain the accuracy of this scheme of calculation. Advantages of this scheme for problems involving ordered arrays of molecules adsorbed on surfaces are discussed.
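
For orientation, the Layer Doubling step mentioned above combines the plane-wave reflection (r) and transmission (t) matrices of two stacked layers; in one common textbook convention (an assumption, not taken from this paper), the composite matrices for a wave incident from above are

    R_{12} = r_1^{+} + t_1^{-} \, r_2^{+} \left( I - r_1^{-} r_2^{+} \right)^{-1} t_1^{+}, \qquad T_{12} = t_2^{+} \left( I - r_1^{-} r_2^{+} \right)^{-1} t_1^{+},

where the superscripts denote the side of incidence and the matrix inverse resums all orders of inter-layer multiple scattering. Representing the whole molecule by its own r and t matrices, built from the non-diagonal molecular T-matrix, lets the molecular layer enter this recursion like any other layer.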

Relevance:

30.00%

Publisher:

Abstract:

Recent literature has described a “transition zone” between the average top of deep convection in the Tropics and the stratosphere. Here transport across this zone is investigated using an offline trajectory model. Particles were advected by the resolved winds from the European Centre for Medium-Range Weather Forecasts reanalyses. For each boreal winter, clusters of particles were released in the upper troposphere over the four main regions of tropical deep convection (Indonesia, central Pacific, South America, and Africa). Most particles remain in the troposphere, descending on average for every cluster. The horizontal components of 5-day trajectories are strongly influenced by the El Niño–Southern Oscillation (ENSO), but the Lagrangian average descent does not have a clear ENSO signature. Tropopause crossing locations are first identified by recording events when trajectories from the same release regions cross the World Meteorological Organization lapse rate tropopause. Most crossing events occur 5–15 days after release, and 30-day trajectories are sufficiently long to estimate crossing number densities. In a further two experiments, slight excursions across the lapse rate tropopause are differentiated from drift deeper into the stratosphere by defining the “tropopause zone” as a layer bounded by the average potential temperature of the lapse rate tropopause and the profile temperature minimum. Transport upward across this zone is studied using forward trajectories released from the lower bound and back trajectories arriving at the upper bound. Histograms of particle potential temperature (θ) show marked differences between the transition zone, where there is a slow spread in θ values about a peak that shifts slowly upward, and the troposphere below 350 K. There, forward trajectories experience slow radiative cooling interspersed with bursts of convective heating, resulting in a well-mixed distribution. In contrast, θ histograms for back trajectories arriving in the stratosphere have two distinct peaks, just above 300 and 350 K, indicating the sharp change from rapid convective heating in the well-mixed troposphere to slow ascent in the transition zone. Although trajectories slowly cross the tropopause zone throughout the Tropics, all three experiments show that most trajectories reaching the stratosphere from the lower troposphere within 30 days do so over the west Pacific warm pool. This preferred location moves about 30°–50° farther east in an El Niño year (1982/83) and about 30° farther west in a La Niña year (1988/89). These results could have important implications for upper-troposphere–lower-stratosphere pollution and chemistry studies.
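
As a minimal sketch of the crossing-event bookkeeping described above (the arrays and the constant tropopause pressure are hypothetical; the actual analysis uses the WMO lapse-rate tropopause interpolated to each particle position), the code below flags the times at which one trajectory crosses upward through the local tropopause.

    import numpy as np

    # A crossing event is recorded when the particle pressure first becomes lower
    # (i.e. the particle is higher) than the local tropopause pressure.
    def crossing_times(times_h, p_particle_hpa, p_tropopause_hpa):
        above = p_particle_hpa < p_tropopause_hpa
        crossings = np.flatnonzero(~above[:-1] & above[1:]) + 1
        return times_h[crossings]

    t = np.arange(0, 30 * 24, 6)              # 6-hourly positions over 30 days (hours)
    p_part = np.linspace(300, 80, t.size)     # particle pressure, slowly ascending (hypothetical)
    p_trop = np.full(t.size, 100.0)           # tropopause pressure along the track (hypothetical)
    print(crossing_times(t, p_part, p_trop))  # hours after release at which crossings occur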

Relevance:

30.00%

Publisher:

Abstract:

Numerical forecasts of the atmosphere based on the fundamental dynamical and thermodynamical equations have now been carried out for almost 30 years. The very first models used were drastic simplifications of the governing equations, permitting only the prediction of the geostrophic wind in the middle of the troposphere based on the conservation of absolute vorticity. Since then we have seen a remarkable development in models predicting the large-scale synoptic flow. Verification carried out at NMC Washington indicates an improvement of about 40% in 24h forecasts for the 500mb geopotential since the end of the 1950s. The most advanced models of today use the equations of motion in their more original form (i.e. primitive equations), which are better suited to predicting the atmosphere at low latitudes as well as small-scale systems. The model which we have developed at the Centre, for instance, will be able to predict weather systems from a scale of 500-1000 km, with a vertical extension of a few hundred millibars, up to global weather systems extending through the whole depth of the atmosphere. With a grid resolution of 1.5° and 15 vertical levels covering the whole globe, it is possible to describe rather accurately the thermodynamical processes associated with cyclone development. It is further possible to incorporate sub-grid-scale processes such as radiation, exchange of sensible heat, release of latent heat, etc., in order to predict the development of new weather systems and the decay of old ones. Later in this introduction I will exemplify this by showing some results of forecasts by the Centre's model.
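
For reference, the early barotropic model alluded to above conserves absolute vorticity following the geostrophic flow at a single mid-tropospheric level, so that the 500 mb geopotential \Phi is the only prognostic field (standard textbook form, not reproduced in the text itself):

    \frac{D_g}{Dt} \left( \zeta_g + f \right) = 0, \qquad \zeta_g = \frac{1}{f_0} \nabla^2 \Phi, \qquad \mathbf{v}_g = \frac{1}{f_0} \, \hat{\mathbf{k}} \times \nabla \Phi,

where f is the Coriolis parameter, f_0 a reference value, and D_g/Dt the material derivative following the geostrophic wind \mathbf{v}_g.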

Relevance:

30.00%

Publisher:

Abstract:

The behavior of the Sun and near-Earth space during grand solar minima is not understood; however, the recent long and low minimum of the decadal-scale solar cycle gives some important clues, with implications for understanding the solar dynamo and predicting space weather conditions. The speed of the near-Earth solar wind and the strength of the interplanetary magnetic field (IMF) embedded within it can be reliably reconstructed for the period before the advent of spacecraft monitoring using observations of geomagnetic activity that extend back to the mid-19th century. We show that during the solar cycle minima around 1879 and 1901 the average solar wind speed was exceptionally low, implying that the Earth remained within the streamer belt of slow solar wind flow for extended periods. This is consistent with a broader streamer belt, which was also a feature of the recent low minimum (2009), and yields a prediction that the low near-Earth IMF during the Maunder minimum (1640-1700), as derived from models and deduced from cosmogenic isotopes, was accompanied by a persistent and relatively constant solar wind with a speed roughly half the average for the modern era.

Relevance:

30.00%

Publisher:

Abstract:

Objectives: To investigate whether sleep disturbances previously found to characterise high-risk infants: (a) persist into childhood; (b) are influenced by early maternal settling strategies; and (c) predict cognitive and emotional/behavioural functioning. Methods: Mothers experiencing high and low levels of psychosocial adversity (risk) were recruited antenatally and longitudinally assessed with their children. Mothers completed measures of settling strategies and infant sleep postnatally and at 12 and 18 months of infant age. At five years, child sleep characteristics were measured via actigraphy and maternal report; IQ and child adjustment were also assessed. Results: Sleep disturbances observed in high-risk infants persisted at five years. Maternal involvement in infant settling was greater in high-risk mothers, and predicted less optimal sleep at five years. Poorer sleep at five years was associated with concurrent child anxiety/depression and aggression, but there was limited evidence for an influence of early sleep problems. Associations between infant/child sleep characteristics and IQ were also limited. Conclusions: Early maternal over-involvement in infant settling is associated with less optimal sleep in children, which, in turn, is related to child adjustment. The findings highlight the importance of supporting parents in the early development of good settling practices, particularly in high-risk populations.

Relevance:

30.00%

Publisher:

Relevance:

30.00%

Publisher:

Abstract:

Background: Mothers' self-reported stroking of their infants over the first weeks of life modifies the association between prenatal depression and physiological and emotional reactivity at 7 months, consistent with animal studies of the effects of tactile stimulation. We now investigate whether the effects of maternal stroking persist to 2.5 years. Given animal and human evidence for sex differences in the effects of prenatal stress, we compare associations in boys and girls. Method: From a general population sample of 1233 first-time mothers recruited at 20 weeks gestation we drew a random sample of 316 for assessment at 32 weeks, stratified by reported inter-partner psychological abuse, a risk indicator for child development. Of these mothers, 243 reported at 5 and 9 weeks how often they stroked their infants, and completed the Child Behavior Checklist (CBCL) at 2.5 years post-delivery. Results: There was a significant interaction between prenatal anxiety and maternal stroking in the prediction of CBCL internalizing (p = 0.001) and anxious/depressed scores (p < 0.001). The effects were stronger in females than males, and the three-way interaction of prenatal anxiety × maternal stroking × sex of infant was significant for internalizing symptoms (p = 0.003). The interactions arose from an association between prenatal anxiety and internalizing symptoms only in the presence of low maternal stroking. Conclusions: The findings are consistent with stable epigenetic effects, many sex specific, reported in animal studies. While epigenetic mechanisms may underlie the associations, it remains to be established whether stroking affects gene expression in humans.
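
The abstract does not give the model specification; as a sketch of the kind of moderation analysis reported (all variable names and the synthetic data below are hypothetical stand-ins), the two- and three-way interaction terms could be estimated as follows.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic data standing in for the cohort (one row per mother-infant pair).
    rng = np.random.default_rng(0)
    n = 243
    df = pd.DataFrame({
        "prenatal_anxiety": rng.normal(size=n),
        "stroking": rng.normal(size=n),              # self-reported stroking frequency
        "sex": rng.choice(["male", "female"], size=n),
    })
    df["internalizing"] = rng.normal(size=n)         # CBCL internalizing score stand-in

    # Three-way moderation model of the kind reported in the abstract.
    model = smf.ols("internalizing ~ prenatal_anxiety * stroking * sex", data=df).fit()
    print(model.params.filter(like=":"))             # the interaction coefficients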

Relevance:

30.00%

Publisher:

Abstract:

Atmospheric pollution over South Asia attracts special attention due to its effects on regional climate, the water cycle and human health. These effects are potentially growing owing to rising trends in anthropogenic aerosol emissions. In this study, the spatio-temporal aerosol distributions over South Asia from seven global aerosol models are evaluated against aerosol retrievals from NASA satellite sensors and ground-based measurements for the period 2000–2007. Overall, substantial underestimations of aerosol loading over South Asia are found systematically in most model simulations. Averaged over the entire South Asia region, the annual mean aerosol optical depth (AOD) is underestimated by 15 to 44% across models compared to MISR (Multi-angle Imaging SpectroRadiometer), which is the lowest bound among the various satellite AOD retrievals (from MISR, SeaWiFS (Sea-Viewing Wide Field-of-View Sensor), and MODIS (Moderate Resolution Imaging Spectroradiometer) Aqua and Terra). In particular, during the post-monsoon and wintertime periods (i.e., October–January), when agricultural waste burning and anthropogenic emissions dominate, models fail to capture AOD and aerosol absorption optical depth (AAOD) over the Indo–Gangetic Plain (IGP) compared to ground-based Aerosol Robotic Network (AERONET) sunphotometer measurements. The underestimations of aerosol loading in the models generally occur in the lower troposphere (below 2 km), based on comparisons of aerosol extinction profiles calculated by the models with those from Cloud–Aerosol Lidar with Orthogonal Polarization (CALIOP) data. Furthermore, surface concentrations of all aerosol components (sulfate, nitrate, organic aerosol (OA) and black carbon (BC)) from the models are found to be much lower than in situ measurements in winter. Several possible causes for these common problems of underestimating aerosols during the post-monsoon and wintertime periods are identified: aerosol hygroscopic growth and the formation of secondary inorganic aerosol are suppressed because relative humidity (RH) is biased far too low in the boundary layer, so that foggy conditions are poorly represented in current models; nitrate aerosol is either missing or inadequately accounted for; and emissions from agricultural waste burning and biofuel usage are too low in the emission inventories. These common problems and possible causes found in multiple models point out directions for future model improvements in this important region.
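
As a sketch of the bias metric implied by the comparison above (the AOD values are hypothetical and the real analysis is gridded and seasonally resolved), the percentage underestimation of a regional annual-mean AOD relative to MISR could be computed as follows.

    import numpy as np

    def percent_bias(model_aod, misr_aod):
        # Relative difference of regional annual means; negative = underestimation.
        return 100.0 * (np.nanmean(model_aod) - np.nanmean(misr_aod)) / np.nanmean(misr_aod)

    model_aod = np.array([0.28, 0.31, 0.25])   # hypothetical regional annual means from one model
    misr_aod = np.array([0.40, 0.42, 0.38])    # corresponding MISR values
    print(f"{percent_bias(model_aod, misr_aod):.0f}%")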

Relevance:

30.00%

Publisher:

Abstract:

Many young children appear to have skills sufficient to engage in basic elements of cognitive behaviour therapy (CBT). Previous research has, however, typically used children from non-clinical populations. It is important to assess children with mental health problems on cognitive skills relevant to CBT and to compare their performance to that of children who are not identified as having mental health difficulties. In this study, 193 children aged 6 and 7 years were assessed using a thought–feeling–behaviour discrimination task [Quakley et al. Behav. Res. Therapy 42 (2004) 343] and a brief IQ test (the WASI). Children were assigned to groups (at risk, borderline, low risk) according to ratings of their mental health made by their teachers and parents on the Strengths and Difficulties Questionnaire [Goodman, J. Am. Acad. Child Adolescent Psych. 40 (2001) 1337]. After controlling for IQ, children 'at risk' of mental health problems performed significantly less well than children at 'low risk' of mental health problems. Before receiving CBT, children's meta-cognitive development should be assessed and additional help provided to those with meta-cognitive difficulties.
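
The abstract describes comparing the risk groups on task performance after controlling for IQ; a minimal sketch of such an analysis (variable names and synthetic data are hypothetical stand-ins, not the study's data) is given below.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # One row per child: task score, SDQ-based risk group, and WASI IQ (synthetic).
    rng = np.random.default_rng(1)
    n = 193
    df = pd.DataFrame({
        "risk_group": rng.choice(["at_risk", "borderline", "low_risk"], size=n),
        "iq": rng.normal(100, 15, size=n),
    })
    df["task_score"] = 0.05 * df["iq"] + rng.normal(size=n)   # arbitrary relationship

    # Group comparison adjusted for IQ (ANCOVA-style regression).
    ancova = smf.ols("task_score ~ C(risk_group) + iq", data=df).fit()
    print(ancova.summary())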