836 results for Random time change
Abstract:
Change within the construction sector has been a central concern of governments and a select few private-sector clients for a considerable time. The discourse of change emanating from organizations concerned with reform in the construction sector reflects these ongoing concerns for change in the sector. The underlying assumptions of the content of change and appropriate change mechanisms in the UK are critically examined and challenged. In particular, the limitations of measurement and best practice are explored. The allegiance to approaches based on measurement and best practice is acontextual, unreflective and insufficient in providing wholly reliable explanations for the relationship between practice and performance. Claims for the use of measurement and best practice by the reform movement must therefore be understood to have limitations and their use approached with caution. The emphasis on best practice is also understood to direct attention away from understanding the legitimacy of current practice and change within the UK construction sector. An agenda for change in the UK construction sector will need to engage with and be more reflective of current managerial practice and past change initiatives. Contextual approaches such as structuration theory offer a way in which to underpin a research framework that could support the reform movement in setting such an agenda.
Abstract:
This research examines dynamics associated with new representational technologies in complex organizations through a study of the use of a Single Model Environment, prototyping and simulation tools in the mega-project to construct Terminal 5 at Heathrow Airport, London. The ambition of the client, BAA, was to change industrial practices, reducing project costs and time to delivery through new contractual arrangements and new digitally-enabled collaborative ways of working. The research highlights changes over time and addresses two areas of 'turbulence' in the use of: 1) technologies, where there is a dynamic tension between desires to constantly improve, change and update digital technologies and the need to standardise practices, maintaining and defending the overall integrity of the system; and 2) representations, where dynamics result from the responsibilities and liabilities associated with the sharing of digital representations and a lack of trust in the validity of data from other firms. These dynamics are tracked across three stages of this well-managed and innovative project and indicate the generic need to treat digital infrastructure as an ongoing strategic issue.
Abstract:
Objective: To explore the extent and nature of change in cognitive-motor interference (CMI) among rehabilitating stroke patients who showed dual-task gait decrement at initial assessment. Design: Experimental, within-subjects, repeated-measures design. Setting: Rehabilitation centre for adults with acquired, nonprogressive brain injury. Subjects: Ten patients with unilateral stroke, available for reassessment 1-9 months following their participation in a study of CMI after brain injury. Measures: Median stride duration; mean word generation. Methods: Two x one-minute walking trials, two x one-minute word generation trials, two x one-minute trials of simultaneous walking and word generation; 10-metre walking time; Barthel ADL Scale score. Results: Seven out of ten patients showed reduction over time in dual-task gait decrement. Three out of ten showed reduction in cognitive decrement. Only one showed concomitant reduction in gait and word generation decrement. Conclusion: Extent of CMI during relearning to walk after a stroke reduced over time in the majority of patients. Effects were more evident in improved stride duration than improved cognitive performance. Measures of multiple task performance should be included in assessment for functional recovery.
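Dual-task decrement of this kind is typically expressed as a proportional cost relative to single-task performance. A minimal sketch, assuming a standard proportional dual-task-cost formula (the exact metric used in the study is not stated in the abstract); the function name and example values are illustrative:

```python
def dual_task_decrement(single, dual, higher_is_better):
    """Proportional dual-task decrement (positive = worse under dual task)."""
    if higher_is_better:
        return (single - dual) / single
    return (dual - single) / single

# Stride duration (s): lower is better, so lengthening under dual task is a decrement.
gait_dec = dual_task_decrement(single=1.4, dual=1.8, higher_is_better=False)
# Word generation (words/min): higher is better.
cog_dec = dual_task_decrement(single=14.0, dual=11.0, higher_is_better=True)
```

A reduction in these decrements on reassessment, as seen in seven of the ten patients for gait, corresponds to the dual-task values moving back towards the single-task baseline.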
Abstract:
Methane is the second most important anthropogenic greenhouse gas in the atmosphere next to carbon dioxide. Its global warming potential (GWP) for a time horizon of 100 years is 25, which makes it an attractive target for climate mitigation policies. Although the methane GWP traditionally includes the methane indirect effects on the concentrations of ozone and stratospheric water vapour, it does not take into account the production of carbon dioxide from methane oxidation. We argue here that this CO2-induced effect should be included for fossil sources of methane, which results in slightly larger GWP values for all time horizons. If the global temperature change potential is used as an alternative climate metric, then the impact of the CO2-induced effect is proportionally much larger. We also discuss what the correction term should be for methane from anthropogenic biogenic sources.
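The size of the CO2-induced correction can be bounded with a back-of-envelope molar-mass calculation, assuming every fossil methane carbon atom is promptly oxidised to CO2 and the resulting CO2 pulse is counted with its defining GWP of 1. This is an illustrative upper bound, not the paper's derivation:

```python
# Molar masses in g/mol.
M_CH4, M_CO2 = 16.04, 44.01
GWP_CH4_100 = 25.0  # 100-year GWP of methane, as quoted in the abstract

# Complete oxidation of 1 kg CH4 yields M_CO2/M_CH4 kg of CO2 (~2.74 kg).
co2_per_ch4 = M_CO2 / M_CH4

# CO2 has GWP 1 by definition, so the correction adds at most ~2.74 units,
# consistent with the abstract's "slightly larger GWP values".
GWP_corrected_upper_bound = GWP_CH4_100 + co2_per_ch4
```

The true correction is smaller than this bound because the CO2 appears only gradually as the methane decays, which is why the correction term differs between metrics and between fossil and biogenic sources.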
Abstract:
A distributed Lagrangian moving-mesh finite element method is applied to problems involving changes of phase. The algorithm uses a distributed conservation principle to determine nodal mesh velocities, which are then used to move the nodes. The nodal values are obtained from an ALE (Arbitrary Lagrangian-Eulerian) equation, which represents a generalization of the original algorithm presented in Applied Numerical Mathematics, 54:450--469 (2005). Having described the details of the generalized algorithm it is validated on two test cases from the original paper and is then applied to one-phase and, for the first time, two-phase Stefan problems in one and two space dimensions, paying particular attention to the implementation of the interface boundary conditions. Results are presented to demonstrate the accuracy and the effectiveness of the method, including comparisons against analytical solutions where available.
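Comparisons against analytical solutions for the one-phase Stefan problem typically use the Neumann similarity solution, in which the interface position is s(t) = 2λ√(κt) with λ the root of a transcendental equation in the Stefan number. A minimal sketch of that reference solution (not the paper's moving-mesh method); the function names are illustrative:

```python
import math

def stefan_lambda(stefan_number):
    """Root of lam * exp(lam**2) * erf(lam) = St / sqrt(pi), by bisection.

    The left-hand side is monotone increasing on (0, inf), so bisection
    on a bracketing interval converges reliably.
    """
    target = stefan_number / math.sqrt(math.pi)
    f = lambda lam: lam * math.exp(lam * lam) * math.erf(lam) - target
    lo, hi = 1e-8, 5.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

def interface_position(t, kappa, stefan_number):
    """Neumann similarity solution: s(t) = 2 * lam * sqrt(kappa * t)."""
    return 2.0 * stefan_lambda(stefan_number) * math.sqrt(kappa * t)
```

The √t scaling of the interface is the standard check: a numerical scheme that tracks the front accurately should reproduce both λ and this growth law.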
Abstract:
Blanket peatlands are rain-fed mires that cover the landscape almost regardless of topography. The geographical extent of this type of peatland is highly sensitive to climate. We applied a global process-based bioclimatic envelope model, PeatStash, to predict the distribution of British blanket peatlands. The model captures the present areal extent (Kappa = 0.77) and is highly sensitive to both temperature and precipitation changes. When the model is run using the UKCIP02 climate projections for the time periods 2011–2040, 2041–2070 and 2071–2100, the geographical distribution of blanket peatlands gradually retreats towards the north and the west. In the UKCIP02 high emissions scenario for 2071–2100, the blanket peatland bioclimatic space is ~84% smaller than contemporary conditions (1961–1990); only parts of the west of Scotland remain inside this space. Increasing summer temperature is the main driver of the projected changes in areal extent. Simulations using seven climate model outputs resulted in generally similar patterns of declining areal extent of the bioclimatic space, although differing in degree. The results presented in this study should be viewed as a first step towards understanding the trends likely to affect the blanket peatland distribution in Great Britain. The eventual fate of existing blanket peatlands left outside their bioclimatic space remains uncertain.
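The reported agreement score (Kappa = 0.77) is Cohen's kappa between the modelled and observed presence/absence maps: observed agreement corrected for the agreement expected by chance. A minimal sketch for binary maps flattened to 0/1 sequences; the function name is illustrative:

```python
def cohens_kappa(obs, pred):
    """Cohen's kappa for two equal-length binary presence/absence sequences."""
    n = len(obs)
    # Observed fraction of cells where model and observation agree.
    po = sum(o == p for o, p in zip(obs, pred)) / n
    # Chance agreement from the marginal presence frequencies.
    p_obs1 = sum(obs) / n
    p_pred1 = sum(pred) / n
    pe = p_obs1 * p_pred1 + (1 - p_obs1) * (1 - p_pred1)
    return (po - pe) / (1 - pe)
```

Kappa of 1 indicates perfect agreement and 0 indicates agreement no better than chance, so 0.77 represents substantial skill in reproducing the present-day extent.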
Abstract:
Geographic distributions of pathogens are the outcome of dynamic processes involving host availability, susceptibility and abundance, suitability of climate conditions, and historical contingency including evolutionary change. Distributions have changed fast and are changing fast in response to many factors, including climatic change. The response time of arable agriculture is intrinsically fast, but perennial crops and especially forests are unlikely to adapt easily. Predictions of many of the variables needed to predict changes in pathogen range are still rather uncertain, and their effects will be profoundly modified by changes elsewhere in the agricultural system, including both economic changes affecting growing systems and hosts and evolutionary changes in pathogens and hosts. Tools to predict changes based on environmental correlations depend on good primary data, which is often absent, and need to be checked against the historical record, which remains very poor for almost all pathogens. We argue that at present the uncertainty in predictions of change is so great that the important adaptive response is to monitor changes and to retain the capacity to innovate, both by access to economic capital with reasonably long-term rates of return and by retaining wide scientific expertise, including currently less fashionable specialisms.
Abstract:
A time-dependent climate-change experiment with a coupled ocean–atmosphere general circulation model has been used to study changes in the occurrence of drought in summer in southern Europe and central North America. In both regions, precipitation and soil moisture are reduced in a climate of greater atmospheric carbon dioxide. A detailed investigation of the hydrology of the model shows that the drying of the soil comes about through an increase in evaporation in winter and spring, caused by higher temperatures and reduced snow cover, and a decrease in the net input of water in summer. Evaporation is reduced in summer because of the drier soil, but the reduction in precipitation is larger. Three extreme statistics are used to define drought, namely the frequency of low summer precipitation, the occurrence of long dry spells, and the probability of dry soil. The last of these is arguably of the greatest practical importance, but since it is based on soil moisture, of which there are very few observations, the authors’ simulation of it has the least confidence. Furthermore, long time series for daily observed precipitation are not readily available from a sufficient number of stations to enable a thorough evaluation of the model simulation, especially for the frequency of long dry spells, and this increases the systematic uncertainty of the model predictions. All three drought statistics show marked increases owing to the sensitivity of extreme statistics to changes in their distributions. However, the greater likelihood of long dry spells is caused by a tendency in the character of daily rainfall toward fewer events, rather than by the reduction in mean precipitation. 
The results should not be taken as firm predictions because extreme statistics for small regions cannot be calculated reliably from the output of the current generation of GCMs, but they point to the possibility of large increases in the severity of drought conditions as a consequence of climate change caused by increased CO2.
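Two of the three drought statistics above can be computed directly from a precipitation series. A minimal sketch, assuming a 0.1 mm/day wet-day threshold for dry spells (the paper's exact thresholds are not given in the abstract):

```python
def longest_dry_spell(daily_precip_mm, wet_threshold=0.1):
    """Maximum run of consecutive days with precipitation below wet_threshold."""
    longest = run = 0
    for p in daily_precip_mm:
        run = run + 1 if p < wet_threshold else 0
        longest = max(longest, run)
    return longest

def low_precip_frequency(seasonal_totals_mm, low_threshold_mm):
    """Fraction of seasons whose total precipitation falls below a threshold."""
    return sum(t < low_threshold_mm for t in seasonal_totals_mm) / len(seasonal_totals_mm)
```

The third statistic, the probability of dry soil, would require a soil-moisture series from the model's land-surface scheme rather than precipitation alone, which is why the authors attach the least confidence to it.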
Abstract:
In this study we quantify the relationship between the aerosol optical depth increase from a volcanic eruption and the severity of the subsequent surface temperature decrease. This investigation is made by simulating 10 different sizes of eruption in a global circulation model (GCM) by changing stratospheric sulfate aerosol optical depth at each time step. The sizes of the simulated eruptions range from Pinatubo‐sized up to the magnitude of supervolcanic eruptions around 100 times the size of Pinatubo. From these simulations we find that there is a smooth monotonic relationship between the global mean maximum aerosol optical depth anomaly and the global mean temperature anomaly and we derive a simple mathematical expression which fits this relationship well. We also construct similar relationships between global mean aerosol optical depth and the temperature anomaly at every individual model grid box to produce global maps of best‐fit coefficients and fit residuals. These maps are used with caution to find the eruption size at which a local temperature anomaly is clearly distinct from the local natural variability and to approximate the temperature anomalies which the model may simulate following a Tambora‐sized eruption. To our knowledge, this is the first study which quantifies the relationship between aerosol optical depth and resulting temperature anomalies in a simple way, using the wealth of data that is available from GCM simulations.
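A smooth, saturating relationship of this kind can be fitted to (optical depth anomaly, temperature anomaly) pairs by least squares. The paper's actual expression is not given in the abstract, so the logarithmic form below is purely an illustrative assumption: it is near-linear for Pinatubo-sized eruptions and saturates for supervolcanic ones:

```python
import math

def fit_log_response(aod_anomaly, temp_anomaly):
    """Least-squares coefficient a in dT ~ a * ln(1 + aod), no intercept.

    With a single coefficient and no intercept, the least-squares
    solution is the closed-form ratio sum(x*y) / sum(x*x).
    """
    x = [math.log(1.0 + tau) for tau in aod_anomaly]
    return (sum(xi * yi for xi, yi in zip(x, temp_anomaly))
            / sum(xi * xi for xi in x))
```

Repeating such a fit independently at every model grid box, as the study does, yields global maps of the best-fit coefficient and of the fit residuals.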
Abstract:
This paper assesses the implications of climate policy for exposure to water resources stresses. It compares a Reference scenario which leads to an increase in global mean temperature of 4°C by the end of the 21st century with a Mitigation scenario which stabilises greenhouse gas concentrations at around 450 ppm CO2e and leads to a 2°C increase in 2100. Associated changes in river runoff are simulated using a global hydrological model, for four spatial patterns of change in temperature and rainfall. There is a considerable difference in hydrological change between these four patterns, but the percentages of change avoided at the global scale are relatively robust. By the 2050s, the Mitigation scenario typically avoids between 16 and 30% of the change in runoff under the Reference scenario, and by 2100 it avoids between 43 and 65%. Two different measures of exposure to water resources stress are calculated, based on resources per capita and the ratio of withdrawals to resources. Using the first measure, the Mitigation scenario avoids 8-17% of the impact in 2050 and 20-31% in 2100; with the second measure, the avoided impacts are 5-21% and 15-47% respectively. However, at the same time, the Mitigation scenario also reduces the positive impacts of climate change on water scarcity in other areas. The absolute numbers and locations of people affected by climate change and climate policy vary considerably between the four climate model patterns.
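The "percentage of change avoided" metric reduces to simple arithmetic on the two scenarios. A minimal sketch, with illustrative numbers:

```python
def avoided_impact_fraction(reference_change, mitigation_change):
    """Fraction of the Reference-scenario change avoided under Mitigation."""
    return (reference_change - mitigation_change) / reference_change

# Illustrative values only: if runoff change is 100 units under Reference
# and 70 under Mitigation, 30% of the change is avoided.
avoided = avoided_impact_fraction(100.0, 70.0)
```

Because the metric is a ratio of changes, it can be robust across climate-model patterns even when the absolute changes differ substantially, as the abstract reports.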
Abstract:
Increased tidal levels and storm surges related to climate change are projected to result in extremely adverse effects on coastal regions. Predictions of such extreme and small-scale events, however, are exceedingly challenging, even for relatively short time horizons. Here we use data from observations, ERA-40 reanalysis, climate scenario simulations, and a simple feature model to find that the frequency of extreme storm surge events affecting Venice is projected to decrease by about 30% by the end of the twenty-first century. In addition, through a trend assessment based on tidal observations we found a reduction in extreme tidal levels. Extrapolating the current +17 cm/century sea level trend, our results suggest that the frequency of extreme tides in Venice might largely remain unaltered under the projected twenty-first century climate simulations.
Abstract:
Public water supplies in England and Wales are provided by around 25 private-sector companies, regulated by an economic regulator (Ofwat) and an environmental regulator (Environment Agency). As part of the regulatory process, companies are required periodically to review their investment needs to maintain safe and secure supplies, and this involves an assessment of the future balance between water supply and demand. The water industry and regulators have developed an agreed set of procedures for this assessment. Climate change has been incorporated into these procedures since the late 1990s, although it has been taken increasingly seriously over time, and considering climate change has been an effective legal requirement since the 2003 Water Act. In the most recent assessment in 2009, companies were required explicitly to plan for a defined amount of climate change, taking into account climate change uncertainty. A “medium” climate change scenario was defined, together with “wet” and “dry” extremes, based on scenarios developed from a number of climate models. The water industry and its regulators are now gearing up to exploit the new UKCP09 probabilistic climate change projections – but these pose significant practical and conceptual challenges. This paper outlines how the procedures for incorporating climate change information into water resources planning have evolved, and explores the issues currently facing the industry in adapting to climate change.
Abstract:
The recent solar minimum was the longest and deepest of the space age, with the lowest average sunspot numbers for nearly a century. The Sun appears to be exiting a grand solar maximum (GSM) of activity which has persisted throughout the space age, and is headed into a significantly quieter period. Indeed, initial observations of solar cycle 24 (SC24) continue to show a relatively low heliospheric magnetic field strength and sunspot number (R), despite the average latitude of sunspots and the inclination of the heliospheric current sheet showing the rise to solar maximum is well underway. We extrapolate the available SC24 observations forward in time by assuming R will continue to follow a similar form to previous cycles, despite the end of the GSM, and predict a very weak cycle 24, with R peaking at ∼65–75 around the middle/end of 2012. Similarly, we estimate the heliospheric magnetic field strength will peak around 6 nT. We estimate that average galactic cosmic ray fluxes above 1 GV rigidity will be ∼10% higher in SC24 than SC23 and that the probability of a large SEP event during this cycle is 0.8, compared to 0.5 for SC23. Comparison of the SC24 R estimates with previous ends of GSMs inferred from 9300 years of cosmogenic isotope data places the current evolution of the Sun and heliosphere in the lowest 5% of cases, suggesting Maunder Minimum conditions are likely within the next 40 years.
Abstract:
Climate change projections are usually presented as 'snapshots' of change at a particular time in the future. Instead, we consider the key question 'when will specific temperature thresholds be exceeded?'. Framing the question as "when might something happen (either permanently or temporarily)?" rather than "what might happen?" demonstrates that lowering future emissions will delay the crossing of temperature thresholds and buy valuable time for planning adaptation. For example, in higher greenhouse gas emission scenarios, a global average 2°C warming threshold is likely to be crossed by 2060, whereas in a lower emissions scenario, the crossing of this threshold is delayed up to several decades. On regional scales, however, the 2°C threshold will probably be exceeded over large parts of Eurasia, North Africa and Canada by 2040 if emissions continue to increase – well within the lifetime of many people living now.
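The distinction between temporary and permanent crossing of a warming threshold can be made precise on an annual-mean temperature series. A minimal sketch, using assumed definitions (the paper's exact criteria are not given in the abstract): a temporary crossing is the first year above the threshold, while a permanent crossing is the first year after which the series never drops back below it.

```python
def first_exceedance(years, temps, threshold):
    """First year the warming series exceeds the threshold (temporary crossing)."""
    for year, temp in zip(years, temps):
        if temp > threshold:
            return year
    return None  # threshold not crossed within the record

def permanent_exceedance(years, temps, threshold):
    """First year after which the series never drops back below the threshold."""
    last_below = None
    for i, temp in enumerate(temps):
        if temp <= threshold:
            last_below = i
    if last_below is None:
        return years[0]  # above the threshold throughout the record
    if last_below + 1 < len(temps):
        return years[last_below + 1]
    return None  # still dips below the threshold at the end of the record
```

Under this framing, mitigation shifts both dates later, which is the "buying time" argument of the abstract.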
Abstract:
The time at which the signal of climate change emerges from the noise of natural climate variability (Time of Emergence, ToE) is a key variable for climate predictions and risk assessments. Here we present a methodology for estimating ToE for individual climate models, and use it to make maps of ToE for surface air temperature (SAT) based on the CMIP3 global climate models. Consistent with previous studies we show that the median ToE occurs several decades sooner in low latitudes, particularly in boreal summer, than in mid-latitudes. We also show that the median ToE in the Arctic occurs sooner in boreal winter than in boreal summer. A key new aspect of our study is that we quantify the uncertainty in ToE that arises not only from inter-model differences in the magnitude of the climate change signal, but also from large differences in the simulation of natural climate variability. The uncertainty in ToE is at least 30 years in the regions examined, and as much as 60 years in some regions. Alternative emissions scenarios lead to changes in both the median ToE (by a decade or more) and its uncertainty. The SRES B1 scenario is associated with a very large uncertainty in ToE in some regions. Our findings have important implications for climate modelling and climate policy which we discuss.
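A common operational definition of ToE is the first year from which the signal-to-noise ratio stays above a chosen threshold, often S/N = 1 or 2. A minimal sketch under that assumption (the paper's exact criterion is not stated in the abstract); the signal would typically be a smoothed SAT anomaly and the noise the standard deviation of natural variability from a control simulation:

```python
def time_of_emergence(years, signal, noise_std, ratio=2.0):
    """First year from which signal / noise_std stays at or above `ratio`.

    A later dip below the ratio resets the candidate emergence year,
    so the returned year marks a permanent, not transient, emergence.
    """
    emerged = None
    for year, s in zip(years, signal):
        if s / noise_std >= ratio:
            if emerged is None:
                emerged = year
        else:
            emerged = None
    return emerged  # None if the signal has not (yet) permanently emerged
```

Because ToE depends on both the signal magnitude and the simulated noise level, inter-model spread in either term feeds directly into the multi-decade ToE uncertainty the study reports.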