151 results for balance scales


Relevance:

20.00%

Publisher:

Abstract:

Patients want and need comprehensive and accurate information about their medicines so that they can participate in decisions about their healthcare. In particular, they require information about the likely risks and benefits associated with the different treatment options. However, providing this information in a form that people can readily understand and use is a considerable challenge for healthcare professionals. One recent attempt to standardise the language of risk has been to produce sets of verbal descriptors that correspond to specific probability ranges, such as those outlined in the European Commission (EC) Pharmaceutical Committee guidelines of 1998 for describing the incidence of adverse effects. This paper provides an overview of a number of studies, involving members of the general public, patients, and hospital doctors, that evaluated the utility of the EC guideline descriptors (very common, common, uncommon, rare, very rare). All of the studies found that people significantly over-estimated the likelihood of adverse effects occurring, given specific verbal descriptors. This in turn resulted in significantly higher ratings of the perceived risk to health and significantly lower ratings of the likelihood of taking the medicine. Such problems of interpretation are not restricted to the EC guideline descriptors: similar levels of misinterpretation have also been demonstrated with two other recently advocated risk scales (Calman's verbal descriptor scale and Barclay, Costigan and Davies' lottery scale). In conclusion, the challenge for risk communicators and for future research is to produce a language of risk that is sufficiently flexible to take into account different perspectives, as well as changing circumstances and contexts of illness and its treatment. In the meantime, we urge the EC and other legislative bodies to stop recommending the use of specific verbal labels or phrases until there is a stronger evidence base to support their use.

Relevance:

20.00%

Publisher:

Abstract:

The THz water content index of a sample is defined, and the advantages of using such a metric to estimate a sample's relative water content are discussed. The errors from reflectance measurements performed at two different THz frequencies using a quasi-optical null-balance reflectometer are propagated to the error in the estimated sample water content index.
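The propagation step can be sketched as follows. The abstract does not give the index formula, the reflectometer details, or any numerical values, so a normalized-difference form and the reflectance values below are assumed purely for illustration:

```python
import math

def water_index(r1, r2):
    # Hypothetical index form: a normalized difference of the reflectances
    # at the two THz frequencies; the paper's actual definition may differ.
    return (r2 - r1) / (r2 + r1)

def water_index_error(r1, r2, s1, s2):
    # First-order (Gaussian) propagation of independent reflectance errors:
    # sigma_W^2 = (dW/dr1 * s1)^2 + (dW/dr2 * s2)^2
    d1 = -2.0 * r2 / (r1 + r2) ** 2  # dW/dr1
    d2 = 2.0 * r1 / (r1 + r2) ** 2   # dW/dr2
    return math.hypot(d1 * s1, d2 * s2)

# Illustrative reflectances with an absolute error of 0.01 on each
w = water_index(0.20, 0.30)
sw = water_index_error(0.20, 0.30, 0.01, 0.01)
```

The same partial-derivative recipe applies to whatever functional form the index actually takes; only `d1` and `d2` change.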

Relevance:

20.00%

Publisher:

Abstract:

Measuring pollinator performance has become increasingly important with emerging needs for risk assessment in conservation and sustainable agriculture that require multi-year and multi-site comparisons across studies. However, comparing pollinator performance across studies is difficult because of the diversity of concepts and disparate methods in use. Our review of the literature shows many unresolved ambiguities. Two different assessment concepts predominate: the first estimates stigmatic pollen deposition and the underlying pollinator behaviour parameters, while the second estimates the pollinator’s contribution to plant reproductive success, for example in terms of seed set. Both concepts include a number of parameters combined in diverse ways and named under a diversity of synonyms and homonyms. However, these concepts are overlapping because pollen deposition success is the most frequently used proxy for assessing the pollinator’s contribution to plant reproductive success. We analyse the diverse concepts and methods in the context of a new proposed conceptual framework with a modular approach based on pollen deposition, visit frequency, and contribution to seed set relative to the plant’s maximum female reproductive potential. A system of equations is proposed to optimize the balance between idealised theoretical concepts and practical operational methods. Our framework permits comparisons over a range of floral phenotypes, and spatial and temporal scales, because scaling up is based on the same fundamental unit of analysis, the single visit.

Relevance:

20.00%

Publisher:

Abstract:

The Stokes drift induced by surface waves distorts turbulence in the wind-driven mixed layer of the ocean, leading to the development of streamwise vortices, or Langmuir circulations, on a wide range of scales. We investigate the structure of the resulting Langmuir turbulence, and contrast it with the structure of shear turbulence, using rapid distortion theory (RDT) and kinematic simulation of turbulence. Firstly, these linear models show clearly why elongated streamwise vortices are produced in Langmuir turbulence, when Stokes drift tilts and stretches vertical vorticity into horizontal vorticity, whereas elongated streaky structures in streamwise velocity fluctuations (u) are produced in shear turbulence, because there is a cancellation in the streamwise vorticity equation and instead it is vertical vorticity that is amplified. Secondly, we develop scaling arguments, illustrated by analysing data from LES, that indicate that Langmuir turbulence is generated when the deformation of the turbulence by mean shear is much weaker than the deformation by the Stokes drift. These scalings motivate a quantitative RDT model of Langmuir turbulence that accounts for deformation of turbulence by Stokes drift and blocking by the air–sea interface, and that is shown to yield profiles of the velocity variances in good agreement with LES. The physical picture that emerges, at least in the LES, is as follows. Early in the life cycle of a Langmuir eddy, initial turbulent disturbances of vertical vorticity are amplified algebraically by the Stokes drift into elongated streamwise vortices, the Langmuir eddies. The turbulence is thus in a near two-component state, with the streamwise fluctuations suppressed and the cross-stream fluctuations enhanced. Near the surface, over a depth of the order of the integral length scale of the turbulence, the vertical velocity (w) is brought to zero by blocking at the air–sea interface. Since the turbulence is nearly two-component, this vertical energy is transferred into the spanwise fluctuations, considerably enhancing the spanwise variance at the interface. After a time of order half the eddy decorrelation time, nonlinear processes, such as distortion by the strain field of the surrounding eddies, arrest the deformation and the Langmuir eddy decays. Presumably, Langmuir turbulence then consists of a statistically steady state of such Langmuir eddies. The analysis then provides a dynamical connection between the flow structures in LES of Langmuir turbulence and the dominant balance between Stokes production and dissipation in the turbulent kinetic energy budget found by previous authors.
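The competition between shear deformation and Stokes-drift deformation described above is commonly summarised in the LES literature by the turbulent Langmuir number La_t = sqrt(u*/U_s) (McWilliams et al.); this diagnostic and the numerical values below are standard illustrations, not quantities taken from the abstract:

```python
import math

def turbulent_langmuir_number(u_star, u_stokes):
    """La_t = sqrt(u*/U_s): friction velocity over surface Stokes drift.
    Small La_t means deformation by the Stokes drift dominates deformation
    by mean shear, the regime in which Langmuir turbulence is expected."""
    return math.sqrt(u_star / u_stokes)

# Illustrative open-ocean values: u* = 6.1e-3 m/s, U_s = 6.8e-2 m/s
la_t = turbulent_langmuir_number(6.1e-3, 6.8e-2)
print(round(la_t, 2))  # 0.3, inside the Langmuir-turbulence regime
```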

Relevance:

20.00%

Publisher:

Abstract:

The Kodar Mountains in eastern Siberia accommodate 30 small, cold-based glaciers with a combined surface area of about 19 km². Very little is known about these glaciers; the first survey was conducted in the late 1950s. In this paper, we use terrestrial photogrammetry to calculate changes in surface area, elevation, volume and geodetic mass balance of the Azarova Glacier between 1979 and 2007, and relate these to meteorological data from the nearby Chara weather station (1938–2007). The glacier surface area declined by 20±6.9% and the surface lowered on average by 20±1.8 m (mean thinning: 0.71 m a⁻¹), giving a strongly negative cumulative and average mass balance of −18±1.6 m w.e. and −640±60 mm w.e. a⁻¹ respectively. The July–August air temperature increased at a rate of 0.036 °C a⁻¹ between 1979 and 2007, and the 1980–2007 period was, on average, around 1 °C warmer than 1938–1979. Regional climate projections for the A2 and B2 CO₂ emission scenarios, developed using the PRECIS regional climate model, indicate that summer temperatures in 2071–2100 will increase by 2.6–4.7 °C and 4.9–6.2 °C respectively in comparison with 1961–1990. The annual total of solid precipitation will increase by 20% under the B2 scenario but decline by 3% under the A2 scenario. The ablation season will extend from July–August to June–September. The Azarova Glacier exhibits high sensitivity to climatic warming because of its low elevation, its exposure to comparatively high summer temperatures, and the absence of a compensating impact of cold-season precipitation. Further summer warming and the decline of solid precipitation projected under the A2 scenario will force Azarova to retreat further, while the impacts of the increase in solid precipitation projected under the B2 scenario require further investigation.
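The headline figures are internally consistent, as a minimal check shows. The ice-to-water-equivalent conversion below assumes the conventional ice density of 0.9 Mg m⁻³, which the abstract does not state explicitly:

```python
ICE_DENSITY = 0.9   # Mg m^-3: conventional assumption, not stated in the abstract

years = 2007 - 1979            # 28-year interval between the two surveys
mean_lowering = 20.0           # mean surface lowering, m of ice

thinning_rate = mean_lowering / years           # m a^-1
cumulative_mb = -mean_lowering * ICE_DENSITY    # m w.e.
annual_mb = cumulative_mb / years * 1000.0      # mm w.e. a^-1

print(round(thinning_rate, 2))  # 0.71 m a^-1
print(round(cumulative_mb))     # -18 m w.e.
print(round(annual_mb))         # -643 mm w.e. a^-1, i.e. about -640
```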

Relevance:

20.00%

Publisher:

Abstract:

Several previous studies have attempted to assess the sublimation depth-scales of ice particles falling from clouds into clear air. Upon examining these depth-scales in the Met Office Unified Model (MetUM), it was found that the MetUM has evaporation depth-scales 2–3 times larger than radar observations. Similar results are seen in the European Centre for Medium-Range Weather Forecasts (ECMWF), Regional Atmospheric Climate Model (RACMO) and Météo-France models. In this study, we use radar simulation (converting model variables into radar observables) and one-dimensional explicit microphysics modelling to test and diagnose the cause of the overly deep sublimation depth-scales in the forecast model. The MetUM data and parametrization scheme are used to predict terminal velocity, which can be compared with the observed Doppler velocity; this allows us to test the candidate explanations for why the sublimation depth-scale is too large within the MetUM: turbulence could lead to dry-air entrainment and higher evaporation rates; the particle density may be wrong; the particle capacitance may be too high, leading to incorrect evaporation rates; or the humidity within the sublimating layer may be incorrectly represented. We show that the most likely cause of deep sublimation zones is an incorrect representation of model humidity in the layer. This is tested further using a one-dimensional explicit microphysics model, which probes the sensitivity of ice sublimation to key atmospheric variables and can ingest sonde and radar measurements to simulate real cases. Results suggest that the MetUM grid resolution at ice-cloud altitudes is not sufficient to maintain the sharp drop in humidity that is observed in the sublimation zone.

Relevance:

20.00%

Publisher:

Abstract:

A large number of urban surface energy balance models now exist with different assumptions about the important features of the surface and exchange processes that need to be incorporated. To date, no comparison of these models has been conducted; in contrast, models for natural surfaces have been compared extensively as part of the Project for Intercomparison of Land-surface Parameterization Schemes. Here, the methods and first results from an extensive international comparison of 33 models are presented. The aim of the comparison overall is to understand the complexity required to model energy and water exchanges in urban areas. The degree of complexity included in the models is outlined and impacts on model performance are discussed. During the comparison there have been significant developments in the models with resulting improvements in performance (root-mean-square error falling by up to two-thirds). Evaluation is based on a dataset containing net all-wave radiation, sensible heat, and latent heat flux observations for an industrial area in Vancouver, British Columbia, Canada. The aim of the comparison is twofold: to identify those modeling approaches that minimize the errors in the simulated fluxes of the urban energy balance and to determine the degree of model complexity required for accurate simulations. There is evidence that some classes of models perform better for individual fluxes but no model performs best or worst for all fluxes. In general, the simpler models perform as well as the more complex models based on all statistical measures. Generally the schemes have best overall capability to model net all-wave radiation and least capability to model latent heat flux.
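The evaluation metric quoted above, root-mean-square error over observed fluxes, can be sketched as follows; the flux series below are illustrative numbers, not values from the Vancouver dataset:

```python
import math

def rmse(modelled, observed):
    """Root-mean-square error between modelled and observed flux series (W m^-2)."""
    assert len(modelled) == len(observed) and observed
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(modelled, observed)) / len(observed))

# Illustrative sensible-heat fluxes: a revised model cuts RMSE by two-thirds
obs = [100.0, 150.0, 200.0]
before = rmse([160.0, 210.0, 260.0], obs)  # constant 60 W m^-2 bias -> RMSE 60.0
after = rmse([120.0, 170.0, 220.0], obs)   # constant 20 W m^-2 bias -> RMSE 20.0
```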

Relevance:

20.00%

Publisher:

Abstract:

The retention of peatland carbon (C) and the ability to continue to draw down and store C from the atmosphere is not only important for the UK terrestrial carbon inventory, but also for a range of ecosystem services, the landscape value and the ecology and hydrology of ~15% of the land area of the UK. Here we review the current state of knowledge on the C balance of UK peatlands using several studies which highlight not only the importance of making good flux measurements, but also the spatial and temporal variability of different flux terms that characterise a landscape affected by a range of natural and anthropogenic processes and threats. Our data emphasise the importance of measuring (or accurately estimating) all components of the peatland C budget. We highlight the role of the aquatic pathway and suggest that fluxes are higher than previously thought. We also compare the contemporary C balance of several UK peatlands with historical rates of C accumulation measured using peat cores, thus providing a long-term context for present-day measurements and their natural year-on-year variability. Contemporary measurements from 2 sites suggest that current accumulation rates (−56 to −72 g C m⁻² yr⁻¹) are at the lower end of those seen over the last 150 yr in peat cores (−35 to −209 g C m⁻² yr⁻¹). Finally, we highlight significant current gaps in knowledge and identify where levels of uncertainty are high, as well as emphasise the research challenges that need to be addressed if we are to improve the measurement and prediction of change in the peatland C balance over future decades.
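The "all components" point can be made concrete with a toy budget. The individual flux values below are hypothetical, chosen only so that the total lands inside the −56 to −72 g C m⁻² yr⁻¹ range quoted above; note how the aquatic export terms partly offset the net CO₂ uptake:

```python
# Annual peatland C budget terms, g C m^-2 yr^-1; negative = C gained by the peat
fluxes = {
    "net_ecosystem_exchange": -80.0,  # net CO2 uptake (photosynthesis minus respiration)
    "methane_emission": 5.0,          # CH4 efflux to the atmosphere
    "doc_export": 12.0,               # dissolved organic C, the aquatic pathway
    "poc_export": 3.0,                # particulate organic C
}

net_c_balance = sum(fluxes.values())
print(net_c_balance)  # -60.0: net accumulation
```

Omitting the aquatic terms here would overstate accumulation by 25%, which is why the review stresses measuring every pathway.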

Relevance:

20.00%

Publisher:

Abstract:

A 24-member ensemble of 1-h high-resolution forecasts over the southern United Kingdom is used to study short-range forecast error statistics. The initial conditions are found from perturbations generated by an ensemble transform Kalman filter. Forecasts from this system are assumed to lie within the bounds of forecast error of an operational forecast system. Although noisy, the system is capable of producing physically reasonable statistics, which are analysed and compared with statistics implied by a variational assimilation system. The variances of temperature errors, for instance, show structures that reflect convective activity. Some variables, notably potential temperature and specific humidity perturbations, have autocorrelation functions that deviate from 3-D isotropy at the convective scale (horizontal scales less than 10 km). Other variables, notably the velocity potential for horizontal divergence perturbations, maintain 3-D isotropy at all scales. Geostrophic and hydrostatic balance are studied by examining correlations between terms in the divergence and vertical momentum equations respectively. Both balances are found to decay as the horizontal scale decreases: it is estimated that geostrophic balance becomes less important at scales smaller than 75 km, and hydrostatic balance at scales smaller than 35 km, although more work is required to validate these findings. The implications of these results for high-resolution data assimilation are discussed.
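The balance diagnostics described above reduce to correlating paired terms of the governing equations across the ensemble; a minimal sketch, with illustrative series rather than model output:

```python
import math

def pearson(x, y):
    """Correlation between two balance terms (e.g. pressure-gradient vs Coriolis
    for geostrophy, or buoyancy vs vertical pressure gradient for hydrostasy).
    Values near -1 or +1 indicate that the terms balance at that scale."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Perfectly balancing terms correlate at -1 (one term equals minus the other)
print(round(pearson([1.0, 2.0, 4.0], [-1.0, -2.0, -4.0]), 6))  # -1.0
```

Repeating this calculation on fields filtered to successively smaller horizontal scales is what yields the 75 km and 35 km breakdown scales quoted above.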