53 results for Mean occupancy time


Relevance: 30.00%

Publisher:

Abstract:

Previous assessments of the impacts of climate change on heat-related mortality use the "delta method" to create temperature projection time series that are applied to temperature-mortality models to estimate future mortality impacts. The delta method ensures that climate model bias in the modelled present does not influence the temperature projection time series or the impacts. However, the delta method assumes that climate change will result only in a change in the mean temperature, whereas there is evidence that the variability of temperature will also change with climate change. The aim of this paper is to demonstrate the importance of considering changes in temperature variability with climate change in impacts assessments of future heat-related mortality. We investigate future heat-related mortality impacts in six cities (Boston, Budapest, Dallas, Lisbon, London and Sydney) by applying temperature projections from the UK Meteorological Office HadCM3 climate model to the temperature-mortality models constructed and validated in Part 1. We investigate the impacts for four cases based on various combinations of mean and variability changes in temperature with climate change. The results demonstrate that higher mortality is attributed to increases in both the mean and variability of temperature with climate change than to a change in mean temperature alone. This has implications for interpreting existing impacts estimates that have used the delta method. We present a novel method for the creation of temperature projection time series that includes changes in both the mean and variability of temperature with climate change and is not influenced by climate model bias in the modelled present. The method should be useful for future impacts assessments. Few studies consider the implications that the limitations of the climate model may have on the heat-related mortality impacts. Here, we demonstrate the importance of considering this by conducting an evaluation of the daily and extreme temperatures from HadCM3, which demonstrates that the estimates of future heat-related mortality for Dallas and Lisbon may be overestimated due to positive climate model bias. Likewise, estimates for Boston and London may be underestimated due to negative climate model bias. Finally, we briefly consider uncertainties in the impacts associated with greenhouse gas emissions and acclimatisation. The uncertainties in the mortality impacts due to different future greenhouse gas emissions scenarios varied considerably by location. Allowing for acclimatisation to an extra 2°C in mean temperature reduced future heat-related mortality in each city to approximately half the level estimated with no acclimatisation.
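
The distinction between the delta method and a mean-plus-variability adjustment can be illustrated directly. In the sketch below, the first function is the classic delta method and the second rescales the observed anomalies by the modelled change in standard deviation; the second form is a common construction for including variability change, not necessarily the novel method presented here, and all series are synthetic:

```python
import numpy as np

def delta_method(obs, gcm_present, gcm_future):
    """Classic delta method: shift the observed series by the modelled
    change in the mean, so model bias in the present cancels out."""
    return obs + (gcm_future.mean() - gcm_present.mean())

def mean_and_variability(obs, gcm_present, gcm_future):
    """Shift the mean and rescale anomalies by the modelled change in
    standard deviation (a common construction; the paper's novel
    method may differ in detail)."""
    shift = gcm_future.mean() - gcm_present.mean()
    scale = gcm_future.std() / gcm_present.std()
    return obs.mean() + shift + scale * (obs - obs.mean())

# Synthetic daily summer temperatures (deg C), for illustration only
rng = np.random.default_rng(0)
obs = rng.normal(22.0, 3.0, 1000)    # observed present
gcm_p = rng.normal(24.0, 3.0, 1000)  # modelled present (warm bias)
gcm_f = rng.normal(27.0, 4.0, 1000)  # modelled future (warmer, more variable)

print(delta_method(obs, gcm_p, gcm_f).std())       # variance unchanged
print(mean_and_variability(obs, gcm_p, gcm_f).std())  # variance increased
```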

Relevance: 30.00%

Publisher:

Abstract:

The sensitivity of the UK Universities Global Atmospheric Modelling Programme (UGAMP) General Circulation Model (UGCM) to two very different approaches to convective parametrization is described. Comparison is made between a Kuo scheme, which is constrained by large-scale moisture convergence, and a convective-adjustment scheme, which relaxes to observed thermodynamic states. Results from 360-day integrations with perpetual January conditions are used to describe the model's tropical time-mean climate and its variability. Both convection schemes give reasonable simulations of the time-mean climate, but the representation of the main modes of tropical variability is markedly different. The Kuo scheme has much weaker variance, confined to synoptic frequencies near 4 days, and a poor simulation of intraseasonal variability. In contrast, the convective-adjustment scheme has much more transient activity at all time-scales. The various aspects of the two schemes which might explain this difference are discussed. The particular closure on moisture convergence used in this version of the Kuo scheme is identified as being inappropriate.
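
For context, a Kuo-type closure ties convective rainfall (and hence heating) to the vertically integrated large-scale moisture convergence. A schematic form, after Kuo (1974) and indicative only, since the UGCM implementation may differ in detail:

```latex
% Convective precipitation P_c as a fraction (1 - b) of the column
% moisture convergence M_q; b is the moistening parameter.
% Schematic Kuo-type closure, not the UGCM's exact formulation.
P_c = (1 - b)\, M_q,
\qquad
M_q = -\frac{1}{g} \int_{p_t}^{p_s} \nabla \cdot (q\,\mathbf{v}) \,\mathrm{d}p
```

This direct coupling of convection to resolved moisture convergence is the closure property identified above as inappropriate.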

Relevance: 30.00%

Publisher:

Abstract:

The Robert–Asselin time filter is widely used in numerical models of weather and climate. It successfully suppresses the spurious computational mode associated with the leapfrog time-stepping scheme. Unfortunately, it also weakly suppresses the physical mode and severely degrades the numerical accuracy. These two concomitant problems are shown to occur because the filter does not conserve the mean state, averaged over the three time slices on which it operates. The author proposes a simple modification to the Robert–Asselin filter, which does conserve the three-time-level mean state. When used in conjunction with the leapfrog scheme, the modification vastly reduces the impacts on the physical mode and increases the numerical accuracy for amplitude errors by two orders, yielding third-order accuracy. The modified filter could easily be incorporated into existing general circulation models of the atmosphere and ocean. In principle, it should deliver more faithful simulations at almost no additional computational expense. Alternatively, it may permit the use of longer time steps with no loss of accuracy, reducing the computational expense of a given simulation.
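
The proposed modification, often called the RAW (Robert-Asselin-Williams) filter, is simple enough to sketch in code. A minimal sketch follows, using the standard leapfrog notation; the test problem, step size and parameter values are illustrative, and alpha = 1 recovers the unmodified Robert-Asselin filter:

```python
def leapfrog_raw(f, x0, dt, nsteps, nu=0.2, alpha=0.53):
    """Leapfrog integration of dx/dt = f(x) with the Robert-Asselin-
    Williams (RAW) filter. alpha = 1 gives the classic Robert-Asselin
    filter; alpha = 0.5 exactly conserves the three-time-level mean
    (values slightly above 0.5 are often used in practice)."""
    x_prev = x0
    x_curr = x0 + dt * f(x0)  # simple first step (illustrative)
    for _ in range(nsteps):
        x_next = x_prev + 2.0 * dt * f(x_curr)
        # Filter displacement shared between the current and new levels
        d = 0.5 * nu * (x_prev - 2.0 * x_curr + x_next)
        x_curr = x_curr + alpha * d
        x_next = x_next + (alpha - 1.0) * d
        x_prev, x_curr = x_curr, x_next
    return x_curr

# Example: oscillation dx/dt = i*omega*x, exact solution exp(i*omega*t)
omega = 1.0
sol = leapfrog_raw(lambda x: 1j * omega * x, 1.0 + 0j, 0.01, 1000)
print(abs(sol))  # amplitude should stay close to 1
```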

Relevance: 30.00%

Publisher:

Abstract:

Direct numerical simulations of turbulent flow over regular arrays of urban-like, cubical obstacles are reported. Results are analysed in terms of a formal spatial averaging procedure to enable interpretation of the flow within the arrays as a canopy flow, and of the flow above as a rough-wall boundary layer. Spatial averages of the mean velocity, turbulent stresses and pressure drag are computed. The statistics compare very well with data from wind-tunnel experiments. Within the arrays the time-averaged flow structure gives rise to significant 'dispersive stress', whereas above the arrays the Reynolds stress dominates. The mean flow structure and turbulence statistics depend significantly on the layout of the cubes. Unsteady effects are important, especially in the lower canopy layer, where turbulent fluctuations dominate over the mean flow.
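
The 'dispersive stress' arises from the spatial-averaging decomposition: the time-averaged velocity is split into its horizontal spatial average and a spatial deviation, and the covariance of the deviations is the dispersive contribution. A minimal sketch, assuming the time-averaged velocity components on one horizontal plane are available as NumPy arrays (names and data are illustrative):

```python
import numpy as np

def dispersive_stress(u_bar, w_bar):
    """Dispersive stress <u~ w~> on one horizontal plane: the spatial
    covariance of the deviations of the time-averaged velocity from
    its horizontal spatial average."""
    u_tilde = u_bar - u_bar.mean()
    w_tilde = w_bar - w_bar.mean()
    return (u_tilde * w_tilde).mean()

# Illustrative synthetic time-mean fields on a 64 x 64 plane
rng = np.random.default_rng(1)
u_bar = 2.0 + 0.3 * rng.standard_normal((64, 64))
w_bar = 0.1 * rng.standard_normal((64, 64))
print(dispersive_stress(u_bar, w_bar))
```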

Relevance: 30.00%

Publisher:

Abstract:

Objective: To assess the impact of a closed-loop electronic prescribing and automated dispensing system on the time spent providing a ward pharmacy service and the activities carried out.

Setting: Surgical ward, London teaching hospital.

Method: All data were collected two months pre- and one year post-intervention. First, the ward pharmacist recorded the time taken each day for four weeks. Second, an observational study was conducted over 10 weekdays, using two-dimensional work sampling, to identify the ward pharmacist's activities. Finally, medication orders were examined to identify pharmacists' endorsements that should have been, and were actually, made.

Key findings: Mean time to provide a weekday ward pharmacy service increased from 1 h 8 min to 1 h 38 min per day (P = 0.001; unpaired t-test). There were significant increases in time spent on prescription monitoring, recommending changes in therapy/monitoring, giving advice or information, and non-productive time. There were decreases for supply, looking for charts and checking patients' own drugs. There was an increase in the amount of time spent with medical and pharmacy staff, and with 'self'. Seventy-eight per cent of patients' medication records could be assessed for endorsements pre-intervention and 100% post-intervention. Endorsements were required for 390 (50%) of 787 medication orders pre-intervention and 190 (21%) of 897 afterwards (P < 0.0001; chi-square test). Endorsements were made for 214 (55%) of endorsement opportunities pre-intervention and 57 (30%) afterwards (P < 0.0001; chi-square test).

Conclusion: The intervention increased the overall time required to provide a ward pharmacy service and changed the types of activity undertaken. Contact time with medical and pharmacy staff increased. There was no significant change in time spent with patients. Fewer pharmacy endorsements were required post-intervention, but a lower percentage were actually made. The findings have important implications for the design, introduction and use of similar systems.
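
The endorsement comparisons above are two-proportion chi-square tests and can be reproduced from the counts quoted in the abstract; a short check using SciPy (the library choice is ours, not the study's):

```python
from scipy.stats import chi2_contingency

# Medication orders requiring endorsement, pre vs post (from the abstract):
# 390 of 787 pre-intervention, 190 of 897 post-intervention.
table = [[390, 787 - 390],
         [190, 897 - 190]]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")  # p < 0.0001, as reported
```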

Relevance: 30.00%

Publisher:

Abstract:

A multivariate fit to the variation in global mean surface air temperature anomaly over the past half century is presented. The fit procedure allows for the effect of response time on the waveform, amplitude and lag of each radiative forcing input, and each is allowed to have its own time constant. It is shown that the contribution of solar variability to the temperature trend since 1987 is small and downward; the best estimate is -1.3% and the 2σ confidence level sets the uncertainty range at -0.7 to -1.9%. The result is the same whether one quantifies the solar variation using galactic cosmic ray fluxes (for which the analysis can be extended back to 1953) or the most accurate total solar irradiance data composite. The rise in global mean surface air temperatures is predominantly associated with a linear increase that represents the combined effects of changes in anthropogenic well-mixed greenhouse gases and aerosols, although, in recent decades, there is also a considerable contribution from a relative lack of major volcanic eruptions. The best estimate is that the anthropogenic factors contribute 75% of the rise since 1987, with an uncertainty range (set by the 2σ confidence level using an AR(1) noise model) of 49–160%; thus, the uncertainty is large, but we can state that at least half of the temperature trend comes from the linear term and that this term could explain the entire rise. The results are consistent with the Intergovernmental Panel on Climate Change (IPCC) estimates of the changes in radiative forcing (given for 1961–1995) and are here combined with those estimates to find the response times, equilibrium climate sensitivities and pertinent heat capacities (i.e. the depth into the oceans to which a given radiative forcing variation penetrates) of the quasi-periodic (decadal-scale) input forcing variations. As shown by previous studies, the decadal-scale variations do not penetrate as deeply into the oceans as the longer-term drifts and have shorter response times. Hence, conclusions about the response to century-scale forcing changes (and hence the associated equilibrium climate sensitivity and the temperature rise commitment) cannot be drawn from studies of the response to shorter-period forcing changes.
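
The fitting procedure can be sketched as a convolution of each forcing with its own lagged response followed by a linear fit for the amplitudes. The sketch below assumes exponential impulse responses and ordinary least squares, which is a plausible reading of the description rather than the authors' exact procedure, and the synthetic data are invented:

```python
import numpy as np

def lagged_response(forcing, tau, dt=1.0):
    """Convolve a forcing time series with a normalised exponential
    impulse response of time constant tau (same units as dt)."""
    t = np.arange(0, 10 * tau, dt)
    kernel = np.exp(-t / tau)
    kernel /= kernel.sum()
    return np.convolve(forcing, kernel)[: len(forcing)]

def fit_amplitudes(temperature, forcings, taus, dt=1.0):
    """Least-squares amplitudes for each forcing after applying its own
    response time; a hedged sketch of the multivariate fit described."""
    X = np.column_stack(
        [lagged_response(f, tau, dt) for f, tau in zip(forcings, taus)]
        + [np.ones_like(temperature)]  # intercept term
    )
    coef, *_ = np.linalg.lstsq(X, temperature, rcond=None)
    return coef

# Illustrative synthetic example: linear anthropogenic term + solar cycle
n = 600  # months
trend = np.linspace(0.0, 1.0, n)
solar = np.sin(2 * np.pi * np.arange(n) / 132)  # ~11-year cycle, in months
temp = 0.8 * lagged_response(trend, 120) + 0.05 * lagged_response(solar, 12)
print(fit_amplitudes(temp, [trend, solar], taus=[120, 12]))
```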

Relevance: 30.00%

Publisher:

Abstract:

Time to flowering and maturity is an important adaptive feature in annual crops, including cowpeas (Vigna unguiculata (L.) Walp.). In West and Central Africa, photoperiod is the most important environmental variable affecting time to flowering in cowpea. The inheritance of time from sowing to flowering (f) in cowpeas was studied by crossing the photoperiod-sensitive genotype Kanannado to the photoperiod-insensitive variety IT97D-941-1. Sufficient seed of the F1, F2, F3 and backcross populations was generated. The parental, F1, F2, F3 and backcross populations were screened for f under long natural days (mean daylength 13.4 h per day) in the field, and the parents, F1, F2 and backcross populations under short-day (10 h per day) conditions. The screening showed that photoperiod in the field was long enough to delay flowering of photoperiod-sensitive genotypes. Photoperiod sensitivity was found to be partially dominant to insensitivity. The frequency distribution of the trait in the various populations indicated quantitative inheritance. Additive (d) and additive × dominance (j) interactions were the most important gene actions conditioning time to flowering. A narrow-sense heritability of 86% was estimated for this trait, predicting a gain of 26 days in time to flowering at a 5% selection intensity from the F2 to the F3 generation. At least seven major gene pairs, each delaying flowering by an average of 6 days, were estimated to control time to flowering in this cross.
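
The quoted selection gain follows from the standard breeder's equation, R = i h² σ_P, with heritability h² = 0.86 from the abstract and a standardised selection intensity of i ≈ 2.06 at 5% selected. The phenotypic standard deviation is not stated, so the sketch below back-calculates the value implied by the reported 26-day gain:

```python
# Breeder's equation: R = i * h2 * sigma_p
i = 2.06     # standardised selection intensity at 5% selected
h2 = 0.86    # narrow-sense heritability (from the abstract)
sigma_p = 26 / (i * h2)  # phenotypic SD implied by the reported 26-day gain
print(f"implied sigma_p = {sigma_p:.1f} days")
print(f"predicted response R = {i * h2 * sigma_p:.1f} days")
```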

Relevance: 30.00%

Publisher:

Abstract:

Objective: To explore the extent and nature of change in cognitive-motor interference (CMI) among rehabilitating stroke patients who showed dual-task gait decrement at initial assessment.

Design: Experimental, within-subjects, repeated-measures design.

Setting: Rehabilitation centre for adults with acquired, non-progressive brain injury.

Subjects: Ten patients with unilateral stroke, available for reassessment 1-9 months following their participation in a study of CMI after brain injury.

Measures: Median stride duration; mean word generation.

Methods: Two one-minute walking trials, two one-minute word generation trials and two one-minute trials of simultaneous walking and word generation; 10-metre walking time; Barthel ADL Scale score.

Results: Seven of ten patients showed a reduction over time in dual-task gait decrement. Three of ten showed a reduction in cognitive decrement. Only one showed a concomitant reduction in both gait and word generation decrement.

Conclusion: The extent of CMI during relearning to walk after a stroke reduced over time in the majority of patients. Effects were more evident in improved stride duration than in improved cognitive performance. Measures of multiple task performance should be included in assessment of functional recovery.
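
The abstract does not define how the dual-task decrement is computed; the conventional formulation is the proportional change from single-task to dual-task performance, sketched below with invented values:

```python
def dual_task_decrement(single, dual):
    """Proportional dual-task cost relative to single-task performance.
    Sign convention depends on the measure: for stride duration an
    increase is a decrement, for word generation a decrease is.
    Common convention only; the study's computation may differ."""
    return (dual - single) / single

# Illustrative stride durations in seconds (not taken from the study):
print(f"{dual_task_decrement(single=1.20, dual=1.38):.0%}")  # 15% decrement
```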

Relevance: 30.00%

Publisher:

Abstract:

The climatology of a stratosphere-resolving version of the Met Office’s climate model is studied and validated against ECMWF reanalysis data. Ensemble integrations are carried out at two different horizontal resolutions. Along with a realistic climatology and annual cycle in zonal mean zonal wind and temperature, several physical effects are noted in the model. The time of final warming of the winter polar vortex is found to descend monotonically in the Southern Hemisphere, as would be expected for purely radiative forcing. In the Northern Hemisphere, however, the time of final warming is driven largely by dynamical effects in the lower stratosphere and radiative effects in the upper stratosphere, leading to the earliest transition to westward winds being seen in the midstratosphere. A realistic annual cycle in stratospheric water vapor concentrations—the tropical “tape recorder”—is captured. Tropical variability in the zonal mean zonal wind is found to be in better agreement with the reanalysis for the model run at higher horizontal resolution because the simulated quasi-biennial oscillation has a more realistic amplitude. Unexpectedly, variability in the extratropics becomes less realistic under increased resolution because of reduced resolved wave drag and increased orographic gravity wave drag. Overall, the differences in climatology between the simulations at high and moderate horizontal resolution are found to be small.

Relevance: 30.00%

Publisher:

Abstract:

Time correlation functions yield profound information about the dynamics of a physical system and hence are frequently calculated in computer simulations. For systems whose dynamics span a wide range of time scales, currently used methods require significant computer time and memory. In this paper, we discuss the multiple-tau correlator method for the efficient calculation of accurate time correlation functions on the fly during computer simulations. The multiple-tau correlator is efficient in terms of its computational requirements and can be tuned to the desired level of accuracy. Further, we derive estimates for the error arising from the use of the multiple-tau correlator and extend it for use in the calculation of mean-square particle displacements and dynamic structure factors. The method described here, in hardware implementation, is routinely used in light scattering experiments but has not yet found widespread use in computer simulations.
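
The hierarchical averaging at the heart of the multiple-tau scheme can be sketched compactly. The following is a minimal on-the-fly autocorrelator in the spirit of the method described; the buffer size p, averaging factor m and level count are tunable assumptions, and the paper's error estimates are not reproduced:

```python
import numpy as np

class MultipleTauCorrelator:
    """Minimal on-the-fly multiple-tau autocorrelator. Each level keeps
    the last p values; every m samples a block average is pushed one
    level up, so the lag resolution coarsens geometrically while
    memory stays logarithmic in the trajectory length."""

    def __init__(self, levels=8, p=16, m=2):
        self.p, self.m, self.levels = p, m, levels
        self.buf = [[] for _ in range(levels)]   # recent values per level
        self.acc = [[] for _ in range(levels)]   # block accumulators
        self.corr = np.zeros((levels, p))        # correlation sums
        self.count = np.zeros((levels, p))       # sample counts

    def _push(self, level, value):
        if level >= self.levels:
            return
        buf = self.buf[level]
        buf.append(value)
        if len(buf) > self.p:
            buf.pop(0)
        # Correlate the newest value against the stored history
        for lag in range(len(buf)):
            self.corr[level, lag] += value * buf[-1 - lag]
            self.count[level, lag] += 1
        # Block-average m values and push to the coarser level
        self.acc[level].append(value)
        if len(self.acc[level]) == self.m:
            self._push(level + 1, np.mean(self.acc[level]))
            self.acc[level] = []

    def add(self, value):
        self._push(0, value)

    def result(self):
        """Return (lags, C(lag)), skipping lags already covered by a
        finer level."""
        lags, vals = [], []
        for level in range(self.levels):
            step = self.m ** level
            start = 0 if level == 0 else self.p // self.m
            for k in range(start, self.p):
                if self.count[level, k] > 0:
                    lags.append(k * step)
                    vals.append(self.corr[level, k] / self.count[level, k])
        return np.array(lags), np.array(vals)

# Quick check on white noise: C(0) ~ 1, C(k > 0) ~ 0
rng = np.random.default_rng(2)
c = MultipleTauCorrelator()
for x in rng.standard_normal(20000):
    c.add(x)
lags, acf = c.result()
print(lags[:5], acf[:3])
```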

Relevance: 30.00%

Publisher:

Abstract:

We show that an analysis of the mean and variance of discrete wavelet coefficients of coaveraged time-domain interferograms can be used as a specification for determining when to stop coaveraging. We also show that, if a prediction model built in the wavelet domain is used to determine the composition of unknown samples, a stopping criterion for the coaveraging process can be developed with respect to the uncertainty tolerated in the prediction.
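
A stopping rule of this kind can be sketched with PyWavelets: track the spread of the discrete wavelet coefficients of the running coaverage and stop once it falls below a tolerance. The wavelet, decomposition level, three-scan window and tolerance below are all illustrative assumptions rather than the paper's specification:

```python
import numpy as np
import pywt

def coaverage_until_stable(scans, wavelet="db4", level=4, tol=1e-3):
    """Coaverage interferogram scans one at a time; after each scan,
    take the DWT of the running average and stop when the spread of
    the approximation coefficients over the last three averages drops
    below tol. A hedged sketch of the described stopping criterion."""
    running_sum = np.zeros_like(scans[0])
    coeff_history = []
    for n, scan in enumerate(scans, start=1):
        running_sum += scan
        avg = running_sum / n
        approx = pywt.wavedec(avg, wavelet, level=level)[0]
        coeff_history.append(approx)
        if n >= 3:
            spread = np.std(coeff_history[-3:], axis=0).max()
            if spread < tol:
                return avg, n
    return running_sum / len(scans), len(scans)

# Illustrative synthetic interferograms: fixed signal plus noise
rng = np.random.default_rng(3)
signal = np.sin(np.linspace(0, 20, 1024)) * np.exp(-np.linspace(0, 5, 1024))
scans = [signal + 0.05 * rng.standard_normal(1024) for _ in range(200)]
avg, n_used = coaverage_until_stable(scans, tol=5e-3)
print(f"stopped after {n_used} scans")
```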

Relevance: 30.00%

Publisher:

Abstract:

In an adaptive equaliser, the time lag is an important parameter that significantly influences performance. Only with the optimum time lag, i.e. the lag corresponding to the best minimum mean-square-error (MMSE) performance, can the available resources be put to best use. Many designs, however, choose the time lag either from prior assumptions about the channel or simply from average experience. The relation between the MMSE performance and the time lag is investigated using a new interpretation of the MMSE equaliser, and a novel adaptive time-lag algorithm is then proposed based on gradient search. The proposed algorithm converges to the optimum time lag in the mean, as verified by the numerical simulations provided.
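
The dependence of MMSE performance on the time lag can be made concrete via the Wiener solution of a linear equaliser. The sketch below computes the MMSE at every candidate lag by brute force; it illustrates why an optimum lag exists but does not reproduce the paper's gradient-search algorithm, and the channel taps and noise variance are invented:

```python
import numpy as np

def mmse_vs_lag(h, n_taps=11, noise_var=0.01):
    """MMSE of a linear equaliser as a function of the decision delay
    (time lag), via the Wiener solution w = R^-1 p. Assumes unit-power
    uncorrelated symbols and white noise. Illustrative only."""
    L = len(h)
    # Channel convolution matrix: row i holds the taps seen by the
    # i-th equaliser input sample.
    H = np.zeros((n_taps, n_taps + L - 1))
    for i in range(n_taps):
        H[i, i:i + L] = h
    R = H @ H.T + noise_var * np.eye(n_taps)   # received autocorrelation
    mmse = []
    for lag in range(n_taps + L - 1):
        p = H[:, lag]                  # cross-correlation at this delay
        w = np.linalg.solve(R, p)      # Wiener equaliser taps
        mmse.append(1.0 - p @ w)       # residual error for unit symbols
    return np.array(mmse)

h = np.array([0.4, 1.0, 0.6, 0.2])     # illustrative channel taps
errs = mmse_vs_lag(h)
print("optimum lag:", errs.argmin(), "MMSE:", errs.min().round(4))
```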

Relevance: 30.00%

Publisher:

Abstract:

A time-dependent climate-change experiment with a coupled ocean–atmosphere general circulation model has been used to study changes in the occurrence of drought in summer in southern Europe and central North America. In both regions, precipitation and soil moisture are reduced in a climate of greater atmospheric carbon dioxide. A detailed investigation of the hydrology of the model shows that the drying of the soil comes about through an increase in evaporation in winter and spring, caused by higher temperatures and reduced snow cover, and a decrease in the net input of water in summer. Evaporation is reduced in summer because of the drier soil, but the reduction in precipitation is larger. Three extreme statistics are used to define drought, namely the frequency of low summer precipitation, the occurrence of long dry spells, and the probability of dry soil. The last of these is arguably of the greatest practical importance but, because it is based on soil moisture, for which there are very few observations, it is the statistic in whose simulation the authors have least confidence. Furthermore, long time series of daily observed precipitation are not readily available from a sufficient number of stations to enable a thorough evaluation of the model simulation, especially for the frequency of long dry spells, and this increases the systematic uncertainty of the model predictions. All three drought statistics show marked increases owing to the sensitivity of extreme statistics to changes in their distributions. However, the greater likelihood of long dry spells is caused by a tendency in the character of daily rainfall toward fewer events, rather than by the reduction in mean precipitation. The results should not be taken as firm predictions because extreme statistics for small regions cannot be calculated reliably from the output of the current generation of GCMs, but they point to the possibility of large increases in the severity of drought conditions as a consequence of climate change caused by increased CO2.
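
Of the three drought statistics, the occurrence of long dry spells is the most direct to compute from a daily precipitation series; a minimal sketch (the wet-day threshold and the synthetic data are illustrative, not the paper's definitions):

```python
import numpy as np

def longest_dry_spell(precip, wet_threshold=0.1):
    """Length of the longest run of consecutive days with
    precipitation below wet_threshold (mm/day)."""
    longest = current = 0
    for p in precip:
        current = current + 1 if p < wet_threshold else 0
        longest = max(longest, current)
    return longest

# Illustrative synthetic daily summer rainfall (mm/day), ~40% wet days
rng = np.random.default_rng(4)
rain = rng.exponential(2.0, 92) * (rng.random(92) < 0.4)
print("longest dry spell:", longest_dry_spell(rain), "days")
```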

Relevance: 30.00%

Publisher:

Abstract:

Synoptic-scale air flow variability over the United Kingdom is measured on a daily time scale by following previous work to define three indices: geostrophic flow strength, vorticity and direction. Comparing the observed distribution of air flow index values with those from a simulation with the Hadley Centre's global climate model (HadCM2) identifies some minor systematic biases in the model's synoptic circulation but demonstrates that the major features are well simulated. The relationship between temperature and precipitation from parts of the United Kingdom and these air flow indices (either singly or in pairs) is found to be very similar in both the observations and the model output; indeed, the simulated and observed precipitation relationships are found to be almost interchangeable in a quantitative sense. These encouraging results imply that some reliability can be assumed for single grid-box and regional output from this climate model; this applies only to those grid boxes evaluated here (which do not have high or complex orography), only to the portion of variability that is controlled by synoptic air flow variations, and only to those surface variables considered here (temperature and precipitation).
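
Indices of this family are conventionally derived from gridded mean-sea-level pressure via geostrophic flow components. The sketch below shows only the final combination step, for assumed westerly and southerly components; sign and direction conventions vary between implementations, so this is indicative rather than the paper's exact definition:

```python
import numpy as np

def airflow_indices(W, S, ZW, ZS):
    """Daily air flow indices from geostrophic components:
    W, S   - westerly and southerly geostrophic flow
    ZW, ZS - westerly and southerly shear (vorticity) terms
    Returns (flow strength, direction the flow comes from in degrees
    from north, total vorticity). Schematic combination step only."""
    strength = np.hypot(W, S)
    direction = (np.degrees(np.arctan2(W, S)) + 180.0) % 360.0
    vorticity = ZW + ZS
    return strength, direction, vorticity

# Illustrative component values for a single day
print(airflow_indices(W=8.0, S=-3.0, ZW=2.0, ZS=-1.0))
```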

Relevance: 30.00%

Publisher:

Abstract:

Decision theory is the study of models of judgement involved in, and leading to, deliberate and (usually) rational choice. In real estate investment there are normative models for the allocation of assets. These asset allocation models suggest an optimum allocation between the respective asset classes based on the investors' judgements of performance and risk. Real estate is selected, like other assets, on the basis of some criterion, commonly its marginal contribution to the production of a mean-variance efficient multi-asset portfolio, subject to the investor's objectives and capital rationing constraints. However, decisions are made relative to current expectations and current business constraints. While a decision maker may believe in the optimum exposure levels dictated by an asset allocation model, the final decision may well be influenced by factors outside the parameters of the mathematical model. This paper discusses investors' perceptions of, and attitudes toward, real estate and highlights the important difference between theoretical exposure levels and pragmatic business considerations. It develops a model to identify "soft" parameters in decision making that will influence the optimal allocation for that asset class. This "soft" information may relate to behavioural issues such as the tendency to mirror competitors, a desire to meet weight-of-money objectives, a desire to retain the status quo and many other non-financial considerations. The paper aims to establish the place of property in multi-asset portfolios in the UK and to examine the asset allocation process in practice, with a view to understanding the decision-making process; it examines investors' perceptions through an historic analysis of market expectations, a comparison with historic data and an analysis of actual performance.
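
The tension between a model-optimal allocation and a business-constrained one can be made concrete with a small mean-variance example in which a "soft" cap is imposed on the real estate weight; all covariances and the cap are invented for illustration:

```python
import numpy as np

def min_variance_weights(cov):
    """Closed-form minimum-variance weights under full investment
    (no short-sale constraint; illustrative only)."""
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

def capped_weights(cov, cap, asset):
    """Fix one asset's weight at `cap` (a 'soft' business constraint)
    and solve the equality-constrained minimum-variance problem for
    the remaining assets. Illustrative only."""
    idx = [i for i in range(cov.shape[0]) if i != asset]
    C = cov[np.ix_(idx, idx)]
    c = cov[idx, asset]
    Cinv = np.linalg.inv(C)
    ones = np.ones(len(idx))
    lam = (1 - cap + cap * ones @ Cinv @ c) / (ones @ Cinv @ ones)
    w_rest = Cinv @ (lam * ones - cap * c)
    w = np.zeros(cov.shape[0])
    w[idx] = w_rest
    w[asset] = cap
    return w

# Invented covariance matrix for [equities, bonds, real estate]
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.010, 0.004],
                [0.010, 0.004, 0.020]])
print("model-optimal:", min_variance_weights(cov).round(3))
print("RE capped at 10%:", capped_weights(cov, cap=0.10, asset=2).round(3))
```

With these invented figures the unconstrained optimum puts roughly a quarter of the portfolio in real estate, so the 10% cap forces a materially different allocation, which is precisely the gap between normative model and business practice discussed above.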