58 results for Yan, Can, fl. 1248.
in CentAUR: Central Archive University of Reading - UK
Abstract:
Several global quantities are computed from the ERA40 reanalysis for the period 1958-2001 and explored for trends. These are discussed in the context of changes to the global observing system. Temperature, integrated water vapor (IWV), and kinetic energy are considered. The ERA40 global mean temperature in the lower troposphere has a trend of +0.11 K per decade over the period 1979-2001, which is slightly higher than the MSU measurements but within the estimated error limit. For the period 1958-2001 the warming trend is 0.14 K per decade, but this is likely to be an artifact of changes in the observing system. When this is corrected for, the warming trend is reduced to 0.10 K per decade. The global trend in IWV for the period 1979-2001 is +0.36 mm per decade. This is about twice as high as the trend determined from the Clausius-Clapeyron relation assuming conservation of relative humidity. It is also larger than results from free climate model integrations driven by the same observed sea surface temperature as used in ERA40. It is suggested that the large trend in IWV does not represent a genuine climate trend but is an artifact caused by changes in the global observing system, such as the use of SSM/I and more satellite soundings in later years. Recent results are in good agreement with GPS measurements. The IWV trend for the period 1958-2001 is still higher but is reduced to +0.16 mm per decade when corrected for changes in the observing systems. Total kinetic energy shows an increasing global trend. Results from data assimilation experiments strongly suggest that this trend is also incorrect and is mainly caused by the huge changes in the global observing system in 1979. When this is corrected for, no significant change in global kinetic energy from 1958 onward can be found.
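As a rough consistency check on the Clausius-Clapeyron argument above, the sketch below scales an assumed global-mean IWV of about 25 mm by a nominal 7% per K and by the quoted temperature trend. Both the 7% per K figure and the 25 mm value are illustrative assumptions, not numbers taken from the abstract.

```python
# Back-of-the-envelope Clausius-Clapeyron check on the IWV trend (illustrative only).
# Assumptions: ~7% per K change in saturation vapour pressure at typical lower-
# tropospheric temperatures and a global-mean IWV of ~25 mm; neither value is from the abstract.

cc_scaling = 0.07         # fractional IWV change per K at fixed relative humidity (assumed)
global_mean_iwv = 25.0    # mm, assumed climatological global mean
temp_trend = 0.11         # K per decade, ERA40 lower-troposphere trend for 1979-2001

cc_implied_iwv_trend = cc_scaling * global_mean_iwv * temp_trend
print(f"Clausius-Clapeyron implied IWV trend: {cc_implied_iwv_trend:.2f} mm per decade")
print("ERA40 IWV trend:                      +0.36 mm per decade")
```

Under these assumptions the Clausius-Clapeyron estimate is roughly 0.19 mm per decade, consistent with the statement that the ERA40 IWV trend of +0.36 mm per decade is about twice as large.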
Abstract:
Two wavelet-based control variable transform schemes are described and are used to model some important features of forecast error statistics for use in variational data assimilation. The first is a conventional wavelet scheme and the other is an approximation of it. Their ability to capture the position- and scale-dependent aspects of covariance structures is tested in a two-dimensional latitude-height context. This is done by comparing the covariance structures implied by the wavelet schemes with those found from the explicit forecast error covariance matrix, and with a non-wavelet-based covariance scheme used currently in an operational assimilation scheme. Qualitatively, the wavelet-based schemes show potential for modeling forecast error statistics well without giving preference to either position- or scale-dependent aspects. The degree of spectral representation can be controlled by changing the number of spectral bands in the schemes, and the smallest number of bands that achieves adequate results is found for the model domain used. Evidence is found of a trade-off between the localization of features in positional and spectral spaces when the number of bands is changed. By examining implied covariance diagnostics, the wavelet-based schemes are found, on the whole, to give results that are closer to diagnostics found from the explicit matrix than those from the non-wavelet-based scheme. Even though the nature of the covariances has the right qualities in spectral space, variances are found to be too low at some wavenumbers and vertical correlation length scales are found to be too long at most scales. The wavelet schemes are found to be good at resolving variations in position- and scale-dependent horizontal length scales, although the length scales reproduced are usually too short. The second of the wavelet-based schemes is often found to be better than the first in some important respects, but, unlike the first, it has no exact inverse transform.
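The construction underlying such schemes can be illustrated with a minimal sketch: the implied background error covariance is B = U Uᵀ, where U synthesises wavelet-space control variables, each scaled by a per-band standard deviation, back to grid-point space. The one-dimensional Haar example below (NumPy plus PyWavelets) only illustrates this general construction; it is not either of the schemes described in the paper, and the band standard deviations are arbitrary.

```python
import numpy as np
import pywt

# Minimal sketch of an implied covariance from a wavelet control variable transform:
# B_implied = U U^T, where U maps wavelet coefficients (scaled by a per-band standard
# deviation) back to grid-point space. Band standard deviations are arbitrary.

n = 64                                  # 1-D grid (the paper works in a 2-D latitude-height domain)
wavelet, level = "haar", 3
template = pywt.wavedec(np.zeros(n), wavelet, level=level)
band_sizes = [len(c) for c in template]
band_std = [2.0, 1.5, 1.0, 0.5]         # one standard deviation per wavelet band (assumed)

# Build U column by column: place a unit control variable in one wavelet coefficient,
# scale it by its band's standard deviation, and synthesise back to grid space.
columns = []
for b, size in enumerate(band_sizes):
    for k in range(size):
        coeffs = [np.zeros(s) for s in band_sizes]
        coeffs[b][k] = band_std[b]
        columns.append(pywt.waverec(coeffs, wavelet))
U = np.column_stack(columns)

B_implied = U @ U.T                     # implied forecast (background) error covariance
print(B_implied.shape, "symmetric:", np.allclose(B_implied, B_implied.T))
```

Diagnostics such as implied variances and correlation length scales can then be read off B_implied and compared with those of an explicit covariance matrix, which is the kind of comparison the abstract describes.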
Abstract:
Ventilation of the boundary layer has an important effect on local and regional air quality and is a prerequisite for long-range pollution transport. Once in the free troposphere, pollutants can alter the chemical composition of the troposphere and impact on the Earth's radiative forcing. Idealised baroclinic life cycles, LC1 and LC2, have been simulated in a three-dimensional dry hemispheric model in the presence of boundary-layer turbulent fluxes. A passive tracer is added to the simulations to represent pollution emitted at, or near, the surface. A simple conveyor-belt diagnostic is developed to objectively identify regions of the boundary layer that can be ventilated by either warm or cold conveyor belts. Transport of pollutants within and above the boundary layer is examined on synoptic scales. Three different physical mechanisms are found to interact with each other to ventilate pollutants out of the boundary layer. These mechanisms are turbulent mixing within the boundary layer, horizontal advection due to Ekman convergence and divergence within the boundary layer, and advection by the warm conveyor belt. The mass of tracer ventilated by the two life cycles is remarkably similar given the differences in frontal structure, suggesting that the large-scale baroclinicity is an effective constraint on ventilation.
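A minimal piece of the bookkeeping implied above can be sketched by estimating the ventilated tracer mass as the tracer mass residing above the diagnosed boundary-layer top. All fields below (grid, density, tracer, boundary-layer height, cell area) are synthetic placeholders, and the conveyor-belt diagnostic itself is not reproduced.

```python
import numpy as np

# Sketch of a ventilation diagnostic: total mass of a passive tracer found above the
# diagnosed boundary-layer top. All fields are synthetic placeholders.

nz, ny, nx = 40, 90, 180
z = np.linspace(0.0, 12000.0, nz)                 # model levels (m), assumed
dz = np.gradient(z)                               # layer thicknesses (m)
rho = 1.2 * np.exp(-z / 8000.0)                   # idealised air density profile (kg m-3)
tracer = np.random.rand(nz, ny, nx) * np.exp(-z / 2000.0)[:, None, None]  # mixing ratio (kg/kg)
blh = 500.0 + 1500.0 * np.random.rand(ny, nx)     # boundary-layer height (m)
cell_area = 50.0e3 * 50.0e3                       # horizontal grid-cell area (m2), assumed

above_bl = z[:, None, None] > blh[None, :, :]     # True where a level lies above the BL top
column_mass = np.sum(rho[:, None, None] * tracer * dz[:, None, None] * above_bl, axis=0)
ventilated_mass = np.sum(column_mass) * cell_area # kg of tracer above the boundary layer
print(f"Tracer mass above the boundary layer: {ventilated_mass:.3e} kg")
```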
Abstract:
Recent analysis of the Arctic Oscillation (AO) in the stratosphere and troposphere has suggested that predictability of the state of the tropospheric AO may be obtained from the state of the stratospheric AO. However, much of this research has been of a purely qualitative nature. We present a more thorough statistical analysis of a long AO amplitude dataset, seeking to establish the magnitude of such a link. A relationship between the AO in the lower stratosphere and on the 1000 hPa surface on a 10-45 day time-scale is revealed. The relationship accounts for 5% of the variance of the 1000 hPa time series at its peak value and is significant at the 5% level. Over a similar time-scale the 1000 hPa time series accounts for only 1% of its own variance and is not significant at the 5% level. Further investigation of the relationship reveals that it is only present during the winter season, and in particular during February and March. It is also demonstrated that using stratospheric AO amplitude data as a predictor in a simple statistical model results in a gain of skill of 5% over a troposphere-only statistical model. This gain in skill is not repeated if an unrelated time series is included as a predictor in the model. Copyright © 2003 Royal Meteorological Society
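The lagged-regression diagnostic behind the quoted variance fractions can be sketched as follows: correlate the stratospheric AO index at time t minus lag with the 1000 hPa index at time t, then square the correlation. The two index series below are synthetic stand-ins for the real AO data, so the printed percentages are purely illustrative.

```python
import numpy as np

# Sketch of a lagged explained-variance diagnostic between a stratospheric AO index
# and a 1000 hPa AO index. Both series are synthetic stand-ins for the real data.

rng = np.random.default_rng(0)
n_days = 5000
strat_ao = rng.standard_normal(n_days)
# Give the surface index a weak dependence on the stratospheric index ~20 days earlier.
surf_ao = 0.2 * np.roll(strat_ao, 20) + rng.standard_normal(n_days)

def explained_variance(predictor, target, lag):
    """Squared correlation between predictor at time t - lag and target at time t."""
    r = np.corrcoef(predictor[:-lag], target[lag:])[0, 1]
    return r ** 2

for lag in (10, 20, 30, 45):
    pct = 100.0 * explained_variance(strat_ao, surf_ao, lag)
    print(f"lag {lag:>2} days: {pct:.1f}% of 1000 hPa AO variance explained")
```

In practice, significance testing of such lagged relationships would also need to account for the serial correlation of the indices.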
Abstract:
Recent numerical experiments have demonstrated that the state of the stratosphere has a dynamical impact on the state of the troposphere. To account for such an effect, a number of mechanisms have been proposed in the literature, all of which amount to a large-scale adjustment of the troposphere to potential vorticity (PV) anomalies in the stratosphere. This paper analyses whether a simple PV adjustment suffices to explain the actual dynamical response of the troposphere to the state of the stratosphere, the actual response being determined by ensembles of numerical experiments run with an atmospheric general-circulation model. For this purpose, a new PV inverter is developed. It is shown that the simple PV adjustment hypothesis is inadequate. PV anomalies in the stratosphere induce, by inversion, flow anomalies in the troposphere that do not coincide spatially with the tropospheric changes determined by the numerical experiments. Moreover, the tropospheric anomalies induced by PV inversion are on a larger scale than the changes found in the numerical experiments, which are linked to the Atlantic and Pacific storm-tracks. These findings imply that the impact of the stratospheric state on the troposphere is manifested through the impact on individual synoptic-scale systems and their self-organization in the storm-tracks. Changes in these weather systems in the troposphere are not merely synoptic-scale noise on a larger scale tropospheric response, but an integral part of the mechanism by which the state of the stratosphere impacts that of the troposphere.
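The basic operation of PV inversion can be illustrated with a toy quasi-geostrophic example: prescribe a PV anomaly q′ and solve the elliptic relation ∇²ψ = q′ for the induced streamfunction and winds. The spectral sketch below, on a doubly periodic plane, is a simplified stand-in for the idea and is not the new inverter developed in the paper.

```python
import numpy as np

# Toy quasi-geostrophic PV inversion on a doubly periodic plane: solve
# laplacian(psi) = q_anom spectrally for the streamfunction induced by a
# prescribed PV anomaly. Illustrative only; not the paper's inverter.

n, L = 128, 2.0e6                        # grid points and domain size (m)
x = np.linspace(0.0, L, n, endpoint=False)
X, Y = np.meshgrid(x, x)

# Idealised circular "stratospheric" PV anomaly (s^-1), arbitrary amplitude and width.
r2 = (X - L / 2) ** 2 + (Y - L / 2) ** 2
q_anom = 1.0e-5 * np.exp(-r2 / (2.0 * (2.0e5) ** 2))
q_anom -= q_anom.mean()                  # remove the domain mean so the inversion is well posed

k = 2.0 * np.pi * np.fft.fftfreq(n, d=L / n)
KX, KY = np.meshgrid(k, k)
K2 = KX ** 2 + KY ** 2
K2[0, 0] = 1.0                           # avoid dividing by zero for the mean mode

psi_hat = -np.fft.fft2(q_anom) / K2      # spectral inversion of the Laplacian
psi_hat[0, 0] = 0.0
psi = np.real(np.fft.ifft2(psi_hat))

# Geostrophic wind anomalies induced by the inverted streamfunction.
dx = L / n
u = -np.gradient(psi, dx, axis=0)        # u = -dpsi/dy (y varies along axis 0)
v = np.gradient(psi, dx, axis=1)         # v =  dpsi/dx
print(f"max induced wind anomaly: {np.hypot(u, v).max():.2f} m s^-1")
```

Comparing such an induced flow anomaly with the tropospheric changes simulated by a general-circulation model is the kind of test the paper performs; its conclusion is that the two do not coincide.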
Abstract:
A new field of study, “decadal prediction,” is emerging in climate science. Decadal prediction lies between seasonal/interannual forecasting and longer-term climate change projections, and focuses on time-evolving regional climate conditions over the next 10–30 yr. Numerous assessments of climate information user needs have identified this time scale as being important to infrastructure planners, water resource managers, and many others. It is central to the information portfolio required to adapt effectively to and through climatic changes. At least three factors influence time-evolving regional climate at the decadal time scale: 1) climate change commitment (further warming as the coupled climate system comes into adjustment with increases of greenhouse gases that have already occurred), 2) external forcing, particularly from future increases of greenhouse gases and recovery of the ozone hole, and 3) internally generated variability. Some decadal prediction skill has been demonstrated to arise from the first two of these factors, and there is evidence that initialized coupled climate models can capture mechanisms of internally generated decadal climate variations, thus increasing predictive skill globally and particularly regionally. Several methods have been proposed for initializing global coupled climate models for decadal predictions, all of which involve global time-evolving three-dimensional ocean data, including temperature and salinity. An experimental framework to address decadal predictability/prediction is described in this paper and has been incorporated into the coordinated Coupled Model Intercomparison Project, phase 5 (CMIP5) experiments, some of which will be assessed for the IPCC Fifth Assessment Report (AR5). These experiments will likely guide work in this emerging field over the next 5 yr.
Abstract:
Measurements of the top-of-the-atmosphere outgoing longwave radiation (OLR) for July 2003 from Meteosat-7 are used to assess the performance of the numerical weather prediction version of the Met Office Unified Model. A significant difference is found over desert regions of northern Africa, where the model emits too much OLR by up to 35 Wm−2 in the monthly mean. By cloud-screening the data we find an error of up to 50 Wm−2 associated with cloud-free areas, which suggests an error in the model surface temperature, surface emissivity, or atmospheric transmission. By building up a physical model of the radiative properties of mineral dust based on in situ, surface-based, and satellite remote sensing observations, we show that the most plausible explanation for the discrepancy in OLR is the neglect of mineral dust in the model. The calculations suggest that mineral dust can exert a longwave radiative forcing of as much as 50 Wm−2 in the monthly mean for 1200 UTC in cloud-free regions, which accounts for the discrepancy between the model and the Meteosat-7 observations. This suggests that inclusion of the radiative effects of mineral dust will lead to a significant improvement in the radiation balance of numerical weather prediction models, with subsequent improvements in performance.
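The comparison method can be sketched as follows: form the monthly-mean model-minus-observed OLR over cloud-free desert points, and diagnose the dust longwave forcing as the clear-sky OLR difference between a calculation that neglects dust and one that includes it. Every array below is a synthetic placeholder for the Meteosat-7 and Unified Model fields, masks included.

```python
import numpy as np

# Sketch of a cloud-screened OLR bias and dust longwave forcing diagnostic.
# All fields are synthetic placeholders on a nominal 1-degree grid.

ny, nx = 180, 360
rng = np.random.default_rng(1)
olr_obs = 280.0 + 10.0 * rng.standard_normal((ny, nx))   # "observed" OLR (W m-2)
olr_model = olr_obs + 20.0 * rng.random((ny, nx))        # model OLR, biased high
cloud_free = rng.random((ny, nx)) > 0.4                  # True where cloud screening passes
desert = np.zeros((ny, nx), dtype=bool)
desert[85:110, 150:220] = True                           # crude stand-in for northern Africa

mask = cloud_free & desert
bias = olr_model[mask] - olr_obs[mask]
print(f"cloud-free desert OLR bias: mean {bias.mean():.1f}, max {bias.max():.1f} W m-2")

# Dust longwave forcing diagnosed as clear-sky OLR without dust minus clear-sky OLR with dust.
olr_clear_nodust = olr_model                             # stand-in: calculation neglecting dust
olr_clear_dust = olr_model - 30.0 * rng.random((ny, nx)) # stand-in: calculation including dust
dust_lw_forcing = olr_clear_nodust - olr_clear_dust
print(f"dust LW forcing over the desert box: {dust_lw_forcing[desert].mean():.1f} W m-2")
```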
Abstract:
The frequency of persistent atmospheric blocking events in the 40-yr ECMWF Re-Analysis (ERA-40) is compared with the blocking frequency produced by a simple first-order Markov model designed to predict the time evolution of a blocking index [defined by the meridional contrast of potential temperature on the 2-PVU surface (1 PVU ≡ 1 × 10−6 K m2 kg−1 s−1)]. With the observed spatial coherence built into the model, it is able to reproduce the main regions of blocking occurrence and the frequencies of sector blocking very well. This underlines the importance of the climatological background flow in determining the locations of high blocking occurrence as being the regions where the mean midlatitude meridional potential vorticity (PV) gradient is weak. However, when only persistent blocking episodes are considered, the model is unable to simulate the observed frequencies. It is proposed that this persistence beyond that given by a red noise model is due to the self-sustaining nature of the blocking phenomenon.
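A first-order Markov (red-noise) comparison of the kind described can be sketched as follows: fit an AR(1) model to a blocking index, generate a surrogate series, and compare the frequency of blocked days and of persistent blocked episodes in the two. The "observed" index below is synthetic and deliberately given a weakly self-sustaining character, and the threshold and five-day persistence criterion are illustrative choices, not the paper's definitions.

```python
import numpy as np

# Sketch of a red-noise comparison for a blocking index: fit an AR(1) (first-order
# Markov) model to the index, simulate a surrogate, and compare blocked-day and
# persistent-episode frequencies. The "observed" index is synthetic, with reduced
# noise while blocked so that blocked states are weakly self-sustaining.

rng = np.random.default_rng(2)
n_days, threshold, min_duration = 20000, -1.0, 5

obs = np.zeros(n_days)
for t in range(1, n_days):
    noise_amp = 0.2 if obs[t - 1] < threshold else 0.5   # blocked states decay more slowly
    obs[t] = 0.9 * obs[t - 1] + noise_amp * rng.standard_normal()

# Fit AR(1) parameters from the lag-1 autocorrelation and total variance, then simulate.
alpha = np.corrcoef(obs[:-1], obs[1:])[0, 1]
sigma = np.std(obs) * np.sqrt(1.0 - alpha ** 2)
sim = np.zeros(n_days)
for t in range(1, n_days):
    sim[t] = alpha * sim[t - 1] + sigma * rng.standard_normal()

def persistent_fraction(index):
    """Fraction of days belonging to runs of at least `min_duration` blocked days."""
    count, run = 0, 0
    for blocked in index < threshold:
        run = run + 1 if blocked else 0
        if run == min_duration:
            count += min_duration
        elif run > min_duration:
            count += 1
    return count / len(index)

for name, series in (("observed index", obs), ("AR(1) surrogate", sim)):
    print(f"{name:>15}: blocked-day fraction {np.mean(series < threshold):.3f}, "
          f"persistent-episode fraction {persistent_fraction(series):.3f}")
```

The comparison mirrors the abstract's finding: a red-noise model with the right short-term behaviour can match overall blocking frequency yet underestimate the frequency of persistent episodes.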
Abstract:
Can infants below age 1 year learn words in one context and understand them in another? To investigate this question, two groups of parents trained infants from age 9 months on 8 categories of common objects. A control group received no training. At 12 months, infants in the experimental groups, but not in the control group, showed comprehension of the words in a new context. It appears that infants under 1 year old can learn words in a decontextualized, as distinct from a context-bound, fashion. Perceptual variability within the to-be-learned categories, and the perceptual similarity between training sets and the novel test items, did not appear to affect this learning.
Abstract:
The Integrated Catchment Model of Nitrogen (INCA-N) was applied to the River Lambourn, a Chalk river-system in southern England. The model's abilities to simulate the long-term trend and seasonal patterns in observed stream water nitrate concentrations from 1920 to 2003 were tested. This is the first time a semi-distributed, daily time-step model has been applied to simulate such a long time period and then used to calculate detailed catchment nutrient budgets which span the conversion of pasture to arable during the late 1930s and 1940s. Thus, this work goes beyond source apportionment and looks to demonstrate how such simulations can be used to assess the state of the catchment and develop an understanding of system behaviour. The mass-balance results from 1921, 1922, 1991, 2001 and 2002 are presented and those for 1991 are compared to other modelled and literature values of loads associated with nitrogen soil processes and export. The variations highlighted the problem of comparing modelled fluxes with point measurements but proved useful for identifying the most poorly understood inputs and processes, thereby providing an assessment of input data and model structural uncertainty. The modelled terrestrial and instream mass-balances also highlight the importance of the hydrological conditions in pollutant transport. Between 1922 and 2002, increased inputs of nitrogen from fertiliser, livestock and deposition have altered the nitrogen balance, with a shift from a possible reduction in soil fertility but little environmental impact in 1922 to a situation of nitrogen accumulation in the soil, groundwater and instream biota in 2002. In 1922 and 2002 it was estimated that approximately 2 and 18 kg N ha−1 yr−1, respectively, were exported from the land to the stream. The utility of the approach and further considerations for the best use of models are discussed. © 2008 Elsevier B.V. All rights reserved.
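The catchment-scale mass balance referred to above amounts to summing nitrogen inputs and outputs and attributing the residual to storage in soil, groundwater and biota. The sketch below uses placeholder figures in kg N ha−1 yr−1; they are not the INCA-N results for the Lambourn.

```python
# Illustrative nitrogen mass-balance bookkeeping for a catchment (kg N ha^-1 yr^-1).
# All figures are placeholders, not INCA-N output for the River Lambourn.

inputs = {"fertiliser": 100.0, "livestock": 25.0, "atmospheric deposition": 20.0}
outputs = {"plant uptake and harvest": 90.0, "denitrification": 10.0, "export to stream": 18.0}

total_in = sum(inputs.values())
total_out = sum(outputs.values())
net_storage_change = total_in - total_out   # positive values imply accumulation in the catchment

print(f"total inputs : {total_in:6.1f} kg N ha^-1 yr^-1")
print(f"total outputs: {total_out:6.1f} kg N ha^-1 yr^-1")
print(f"net accumulation in soil, groundwater and biota: {net_storage_change:6.1f} kg N ha^-1 yr^-1")
```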
Abstract:
Genetically modified (GM) crops and sustainable development remain the foci of much media attention, especially given current concerns about a global food crisis. However, whilst the latter is embraced with enthusiasm by almost all groups, GM crops generate very mixed views. Some countries have welcomed GM, but others, notably those in Europe, adopt a cautious stance. This article aims to review the contribution that GM crops can make to agricultural sustainability in the developing world. Following brief reviews of both issues and their linkages, notably the pros and cons of GM cotton as a contributory factor in sustainability, a number of case studies of resource-poor cotton farmers in Makhathini Flats, South Africa, are presented for a six-year period. Data on expenditure, productivity and income indicate that Bacillus thuringiensis (Bt) cotton is advantageous because it reduces costs, for example of pesticides, and increases income, and the indications are that those benefits continued over at least the six years covered by the studies. The additional income has repercussions in the households: debts are reduced and money is invested in children's education and in the farms. However, in the general GM debate, the results show that GM crops are not miracle products that alleviate poverty at a stroke, but neither is there evidence that they will cause the scale of environmental damage associated with indiscriminate pesticide use. Indeed, for some GM antagonists, perhaps even the majority, such debates are irrelevant: the transfer of genes between species is unnatural and unethical. For them, GM crops will never be acceptable despite the evidence and pressure to increase world food production.