997 results for Simulated experiment


Relevance: 20.00%

Abstract:

We analyze here the polar stratospheric temperatures in an ensemble of three 150-year integrations of the Canadian Middle Atmosphere Model (CMAM), an interactive chemistry-climate model which simulates ozone depletion and recovery, as well as climate change. A key motivation is to understand possible mechanisms for the observed trend in the extent of conditions favourable for polar stratospheric cloud (PSC) formation in the Arctic winter lower stratosphere. We find that in the Antarctic winter lower stratosphere, the low temperature extremes required for PSC formation increase in the model as ozone is depleted, but remain steady through the twenty-first century as the warming from ozone recovery roughly balances the cooling from climate change. Thus, ozone depletion itself plays a major role in the Antarctic trends in low temperature extremes. The model trend in low temperature extremes in the Arctic through the latter half of the twentieth century is weaker and less statistically robust than the observed trend. It is not projected to continue into the future. Ozone depletion in the Arctic is weaker in the CMAM than in observations, which may account for the weak past trend in low temperature extremes. In the future, radiative cooling in the Arctic winter due to climate change is more than compensated by an increase in dynamically driven downwelling over the pole.
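
A minimal sketch of one way to quantify "low temperature extremes" favourable for PSC formation, as the fraction of winter days below a nominal threshold, together with its trend. The 195 K threshold and the synthetic temperature series are illustrative assumptions, not values from the CMAM study.

```python
# Hedged sketch: fraction of winter days with lower-stratospheric temperature
# below a nominal PSC formation threshold, and a simple trend in that fraction.
import numpy as np

rng = np.random.default_rng(0)

T_NAT = 195.0  # K, assumed NAT-PSC formation threshold (illustrative)

# Synthetic polar-cap winter temperatures: 30 winters x 90 days (K)
temps = 200.0 + 8.0 * rng.standard_normal((30, 90))

# Fraction of winter days colder than the threshold, per winter
psc_fraction = (temps < T_NAT).mean(axis=1)

# Linear trend in that fraction across the 30 winters
winters = np.arange(temps.shape[0])
slope, intercept = np.polyfit(winters, psc_fraction, 1)
print(f"mean PSC-favourable fraction: {psc_fraction.mean():.3f}")
print(f"trend: {slope:+.4f} per winter")
```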

Relevance: 20.00%

Abstract:

Our knowledge of stratospheric O3-N2O correlations is extended, and their potential for model-measurement comparison assessed, using data from the Atmospheric Chemistry Experiment (ACE) satellite and the Canadian Middle Atmosphere Model (CMAM). ACE provides the first comprehensive data set for the investigation of interhemispheric, interseasonal, and height-resolved differences in the O3-N2O correlation structure. By subsampling the CMAM data, the representativeness of the ACE data is evaluated. In the middle stratosphere, where the correlations are not compact and therefore mainly reflect the data sampling, joint probability density functions provide a detailed picture of key aspects of transport and mixing, but also trace polar ozone loss. CMAM captures these important features, but exhibits a displacement of the tropical pipe into the Southern Hemisphere (SH). Below about 21 km, the ACE data generally confirm the compactness of the correlations, although chemical ozone loss tends to destroy the compactness during late winter/spring, especially in the SH. This allows a quantitative comparison of the correlation slopes in the lower stratosphere and lowermost stratosphere (LMS), which exhibit distinct seasonal cycles: in the Northern Hemisphere (NH) these cycles reveal the different balances between diabatic descent and horizontal mixing in the two regions, reconciling differences found in aircraft measurements, while in the SH they reveal the strong role of chemical ozone loss. The seasonal cycles are qualitatively well reproduced by CMAM, although their amplitude is too weak in the NH LMS. The correlation slopes allow a "chemical" definition of the LMS, which is found to vary substantially in vertical extent with season.
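
A minimal sketch of the tracer-tracer analysis described above: building a joint probability density function and a linear correlation slope from paired O3 and N2O samples. The synthetic data and the compact linear relation assumed below are illustrative only.

```python
# Hedged sketch: joint PDF and correlation slope for paired O3/N2O samples.
import numpy as np

rng = np.random.default_rng(1)

n2o = rng.uniform(50.0, 320.0, 5000)                           # ppbv, synthetic N2O
o3 = 4.0 - 0.01 * n2o + 0.2 * rng.standard_normal(n2o.size)    # ppmv, assumed compact relation

# Joint probability density function on a regular grid (normalised)
pdf, n2o_edges, o3_edges = np.histogram2d(n2o, o3, bins=40, density=True)

# Correlation slope (dO3/dN2O) from a least-squares fit, as used where the
# correlation is compact (lower and lowermost stratosphere)
slope, offset = np.polyfit(n2o, o3, 1)
print(f"correlation slope: {slope:.4f} ppmv O3 per ppbv N2O")
```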

Relevance: 20.00%

Abstract:

A high-resolution general circulation model has been used to study intense tropical storms. A five-year-long global integration with a spatial resolution of 125 km has been analysed. The geographical and seasonal distribution of tropical storms agrees remarkably well with observations. The structure of individual storms also agrees with observations, but the simulated storms are generally more extensive in coverage and less extreme than the observed ones. A few additional calculations have also been performed with a very-high-resolution limited-area version of the same model, whose boundary conditions have been successively interpolated from the global model. These results reproduce many details of storm structure very realistically, including simulated rain-bands and an eye structure. The global model has also been used in another five-year integration to study the influence of greenhouse warming. The sea surface temperatures have been taken from a transient climate change experiment carried out with a low-resolution coupled ocean-atmosphere model. The result is a significant reduction in the number of hurricanes, particularly in the Southern Hemisphere. The main reasons for this can be found in changes in the large-scale circulation, i.e. a weakening of the Hadley circulation and a more intense warming of the upper tropical troposphere. A similar effect can be seen during warm ENSO events, when fewer North Atlantic hurricanes have been reported.

Relevance: 20.00%

Abstract:

The Hamburg atmospheric general circulation model ECHAM3 at T106 resolution (1.125° lat./lon.) has considerable skill in reproducing the observed seasonal reversal of mean sea level pressure, the location of the summer heat low, and the position of the monsoon trough over the Indian subcontinent. The present-day climate and its seasonal cycle are realistically simulated by the model over this region. The model simulates the structure, intensity, frequency, movement and lifetime of monsoon depressions remarkably well. The number of monsoon depressions/storms simulated by the model in a year ranged from 5 to 12, with an average frequency of 8.4 yr⁻¹, not significantly different from the observed climatology. The model also simulates the interannual variability in the formation of depressions over the north Bay of Bengal during the summer monsoon season. In the warmer atmosphere under doubled CO2 conditions, the number of monsoon depressions/cyclonic storms forming in the Indian seas in a year ranged from 5 to 11, with an average frequency of 7.6 yr⁻¹, not significantly different from that inferred in the control run of the model. However, under doubled CO2 conditions, fewer depressions formed in the month of June. Neither the lowest central pressure nor the maximum wind speed changes appreciably in the monsoon depressions identified under simulated enhanced greenhouse conditions. The analysis suggests there will be no significant changes in the number and intensity of monsoon depressions in a warmer atmosphere.
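
A minimal sketch of the kind of significance test behind the statement that 8.4 yr⁻¹ and 7.6 yr⁻¹ are "not significantly different": comparing two sets of annual storm counts. The synthetic Poisson counts and the 20-year sample length are assumptions for illustration, not the model output, and the Welch t-test stands in for whatever test the authors actually applied.

```python
# Hedged sketch: comparing annual depression/storm counts between a control
# run and a doubled-CO2 run with a simple two-sample test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

control = rng.poisson(lam=8.4, size=20)       # assumed 20 years of control-run counts
doubled_co2 = rng.poisson(lam=7.6, size=20)   # assumed 20 years of 2xCO2 counts

t_stat, p_value = stats.ttest_ind(control, doubled_co2, equal_var=False)
print(f"mean control: {control.mean():.1f}/yr, mean 2xCO2: {doubled_co2.mean():.1f}/yr")
print(f"Welch t = {t_stat:.2f}, p = {p_value:.3f}")
```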

Relevance: 20.00%

Abstract:

The dynamics of Northern Hemisphere major midwinter stratospheric sudden warmings (SSWs) are examined using transient climate change simulations from the Canadian Middle Atmosphere Model (CMAM). The simulated SSWs show good overall agreement with reanalysis data in terms of composite structure, statistics, and frequency. Using observed or model sea surface temperatures (SSTs) is found to make no significant difference to the SSWs, indicating that the use of model SSTs in the simulations extending into the future is not an issue. When SSWs are defined by the standard (wind-based) definition, an absolute criterion, their frequency is found to increase by ~60% by the end of this century, in conjunction with a ~25% decrease in their temperature amplitude. However, when a relative criterion based on the northern annular mode index is used to define the SSWs, no future increase in frequency is found. The latter is consistent with the fact that the variance of 100-hPa daily heat flux anomalies is unaffected by climate change. The future increase in frequency of SSWs using the standard method is a result of the weakened climatological mean winds resulting from climate change, which make it easier for the SSW criterion to be met. A comparison of winters with and without SSWs reveals that the weakening of the climatological westerlies is not a result of SSWs. The Brewer–Dobson circulation is found to be stronger by ~10% during winters with SSWs, a value that does not change significantly in the future.
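
A minimal sketch of the standard, wind-based SSW criterion referred to above: a major warming is flagged when the zonal-mean zonal wind at 10 hPa, 60°N reverses from westerly to easterly in winter. Details of the full definition (minimum separation between events, exclusion of final warmings) are omitted, and the synthetic wind series is illustrative only.

```python
# Hedged sketch: detecting candidate SSW central dates as westerly-to-easterly
# reversals of the 10-hPa, 60N zonal-mean zonal wind.
import numpy as np

rng = np.random.default_rng(3)

days = np.arange(180)  # one extended winter, daily
# Synthetic 10-hPa, 60N zonal-mean zonal wind (m/s): westerly mean + variability
u_60n_10hpa = 25.0 + 15.0 * rng.standard_normal(days.size)

# Central dates: first day of each excursion from westerly to easterly
reversal = (u_60n_10hpa < 0.0) & (np.roll(u_60n_10hpa, 1) >= 0.0)
reversal[0] = u_60n_10hpa[0] < 0.0
central_dates = days[reversal]
print("candidate SSW central dates (day index):", central_dates)
```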

Relevance: 20.00%

Abstract:

Simulations of ozone loss rates using a three-dimensional chemical transport model and a box model during recent Antarctic and Arctic winters are compared with experimental loss rates. The study focused on the Antarctic winter 2003, during which the first Antarctic Match campaign was organized, and on the Arctic winters 1999/2000 and 2002/2003. The maximum ozone loss rates retrieved by the Match technique for the winters and levels studied reached 6 ppbv/sunlit hour, and both types of simulations could generally reproduce the observations at the 2-sigma error bar level. In some cases, for example for the Arctic winter 2002/2003 at the 475 K level, an excellent agreement within the 1-sigma standard deviation level was obtained. An overestimation was also found with the box model simulation at some isentropic levels for the Antarctic winter and the Arctic winter 1999/2000, indicating an overestimation of chlorine activation in the model. Loss rates in the Antarctic show signs of saturation in September, which has to be considered in the comparison. Sensitivity tests were performed with the box model in order to assess the impact of the kinetic parameters of the ClO-Cl2O2 catalytic cycle and of the total bromine content on the ozone loss rate. These tests resulted in a maximum change in ozone loss rates of 1.2 ppbv/sunlit hour, generally in high solar zenith angle conditions. In some cases, a better agreement was achieved with faster photolysis of Cl2O2 and an additional source of total inorganic bromine, but at the expense of overestimating the smaller ozone loss rates derived later in the winter.
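
A minimal sketch of the basic arithmetic behind a Match-style loss rate: the ozone change between two soundings of the same air parcel divided by the sunlit time accumulated in between, expressed in ppbv per sunlit hour. The mixing ratios and sunlit hours below are illustrative placeholders, not values from the campaigns discussed above.

```python
# Hedged sketch: ozone loss rate per sunlit hour from matched soundings.
import numpy as np

# Assumed ozone mixing ratios of matched parcels at first and second sounding (ppbv)
o3_first = np.array([2800.0, 2750.0, 2900.0])
o3_second = np.array([2720.0, 2660.0, 2835.0])

# Assumed sunlit hours accumulated between the two soundings of each match
sunlit_hours = np.array([18.0, 22.0, 15.0])

loss_rates = (o3_first - o3_second) / sunlit_hours  # ppbv per sunlit hour
print("loss rates:", np.round(loss_rates, 2), "ppbv/sunlit hour")
print(f"mean: {loss_rates.mean():.2f} ppbv/sunlit hour")
```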

Relevance: 20.00%

Abstract:

Despite many decades investigating scalp-recordable 8–13-Hz (alpha) electroencephalographic activity, no consensus has yet emerged regarding its physiological origins or its functional role in cognition. Here we outline a detailed, physiologically meaningful theory for the genesis of this rhythm that may provide important clues to its functional role. In particular, we find that electroencephalographically plausible model dynamics, obtained with physiologically admissible parameterisations, reveal a cortex perched on the brink of stability, which when perturbed gives rise to a range of unanticipated complex dynamics that include 40-Hz (gamma) activity. Preliminary experimental evidence, involving the detection of weak nonlinearity in resting EEG using an extension of the well-known surrogate data method, suggests that nonlinear (deterministic) dynamics are more likely to be associated with weakly damped alpha activity. Thus, rather than the “alpha rhythm” being an idling rhythm, it may be more profitable to conceive of it as a readiness rhythm.
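
A minimal sketch of the phase-randomisation surrogate-data idea mentioned above (the abstract's "extension" of the method is not reproduced here): surrogates preserve the power spectrum of the original segment but destroy nonlinear structure, so a nonlinear statistic computed on the data can be ranked against its surrogate distribution. The synthetic signal and the simple third-order test statistic are illustrative choices.

```python
# Hedged sketch: phase-randomised surrogates for a nonlinearity test on an
# alpha-band-like signal.
import numpy as np

rng = np.random.default_rng(4)

# Synthetic signal: a weakly damped 10 Hz oscillation plus noise
fs, n = 250, 2048
t = np.arange(n) / fs
x = np.sin(2 * np.pi * 10 * t) * np.exp(-0.1 * t) + 0.5 * rng.standard_normal(n)

def phase_randomised_surrogate(signal, rng):
    """Return a surrogate with the same amplitude spectrum but random phases."""
    spectrum = np.fft.rfft(signal)
    phases = rng.uniform(0.0, 2 * np.pi, spectrum.size)
    phases[0] = 0.0  # keep the zero-frequency (mean) component real
    return np.fft.irfft(np.abs(spectrum) * np.exp(1j * phases), n=signal.size)

def nonlinear_statistic(signal):
    """A simple third-order statistic; near zero for linear Gaussian processes."""
    return np.mean(signal[:-1] ** 2 * signal[1:])

surrogates = [nonlinear_statistic(phase_randomised_surrogate(x, rng)) for _ in range(99)]
observed = nonlinear_statistic(x)
rank = np.sum(np.abs(observed) > np.abs(surrogates))
print(f"observed statistic exceeds {rank}/99 surrogates")
```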

Relevance: 20.00%

Abstract:

Purpose: To quantify to what extent the new registration method, DARTEL (Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra), may reduce the smoothing kernel width required, and to investigate the minimum group size necessary for voxel-based morphometry (VBM) studies. Materials and Methods: A simulated-atrophy approach was employed to explore the role of smoothing kernel, group size, and their interactions on VBM detection accuracy. Group sizes of 10, 15, 25, and 50 were compared for kernels between 0 and 12 mm. Results: A smoothing kernel of 6 mm achieved the highest atrophy detection accuracy for groups of 50 participants, and 8–10 mm for groups of 25, at P < 0.05 with familywise correction. The results further demonstrated that a group size of 25 was the lower limit when two different groups of participants were compared, whereas a group size of 15 was the minimum for longitudinal comparisons, although only at P < 0.05 with false discovery rate correction. Conclusion: Our data confirm that DARTEL-based VBM generally benefits from smaller kernels and that different kernels perform best for different group sizes, with a tendency toward smaller kernels for larger groups. Importantly, the kernel selection was also affected by the threshold applied. This highlights that the choice of kernel in relation to group size should be considered with care.
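
A minimal sketch of the smoothing step the kernel widths above refer to: converting an FWHM specified in millimetres into a Gaussian sigma in voxels and applying it to a volume. The random volume and the 1.5 mm voxel size are placeholders; this is only the generic smoothing operation, not the DARTEL pipeline itself.

```python
# Hedged sketch: isotropic Gaussian smoothing with a kernel given as FWHM (mm).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(5)

voxel_size_mm = 1.5                 # assumed isotropic voxel size
fwhm_mm = 6.0                       # e.g. the 6 mm kernel discussed above
sigma_vox = fwhm_mm / (voxel_size_mm * np.sqrt(8.0 * np.log(2.0)))  # FWHM -> sigma

gm_map = rng.random((64, 64, 64))   # placeholder grey-matter probability map
smoothed = gaussian_filter(gm_map, sigma=sigma_vox)
print(f"sigma = {sigma_vox:.2f} voxels; smoothed range: "
      f"[{smoothed.min():.3f}, {smoothed.max():.3f}]")
```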

Relevance: 20.00%

Abstract:

Global warming is expected to enhance fluxes of fresh water between the surface and atmosphere, causing wet regions to become wetter and dry regions drier, with serious implications for water resource management. Defining the wet and dry regions as the upper 30% and lower 70% of the precipitation totals across the tropics (30° S–30° N) each month, we combine observations and climate model simulations to understand changes in the wet and dry regions over the period 1850–2100. Observed decreases in precipitation over dry tropical land (1950–2010) are also simulated by coupled atmosphere–ocean climate models (−0.3%/decade), with trends projected to continue into the 21st century. Discrepancies between observations and simulations over wet land regions since 1950 exist, relating to decadal fluctuations in the El Niño–Southern Oscillation, the timing of which is not represented by the coupled simulations. When atmosphere-only simulations are instead driven by observed sea surface temperatures, they are able to adequately represent this variability over land. Global distributions of precipitation trends are dominated by spatial changes in atmospheric circulation. However, the tendency for already wet regions to become wetter (precipitation increases with warming by 3% K−1 over wet tropical oceans) and the driest regions drier (precipitation decreases of −2% K−1 over dry tropical land regions) emerges over the 21st century in response to the substantial surface warming.
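
A minimal sketch of the wet/dry partition defined above: for each month, tropical grid points whose precipitation totals fall in the upper 30% are labelled "wet" and the remaining lower 70% "dry". The random precipitation field stands in for an observed or simulated dataset, and area weighting is ignored for simplicity.

```python
# Hedged sketch: monthly percentile-based wet/dry masks over the tropics.
import numpy as np

rng = np.random.default_rng(6)

# Placeholder monthly precipitation totals, shape (month, lat, lon) over 30S-30N
precip = rng.gamma(shape=2.0, scale=50.0, size=(12, 24, 144))

# 70th-percentile threshold computed separately for each month
threshold = np.percentile(precip, 70.0, axis=(1, 2), keepdims=True)

wet_mask = precip >= threshold   # upper 30% of totals each month
dry_mask = ~wet_mask             # lower 70%

print(f"mean precipitation, wet regions: {precip[wet_mask].mean():.1f}; "
      f"dry regions: {precip[dry_mask].mean():.1f}")
```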

Relevance: 20.00%

Abstract:

In this paper we have proposed and analyzed a simple mathematical model consisting of four variables, viz., nutrient concentration, toxin-producing phytoplankton (TPP), non-toxic phytoplankton (NTP), and toxin concentration. Limitation in the concentration of the extracellular nutrient has been incorporated as an environmental stress condition for the plankton population, and the liberation of toxic chemicals has been described by a monotonic function of the extracellular nutrient. The model is analyzed and simulated to reproduce the experimental findings of Graneli and Johansson [Graneli, E., Johansson, N., 2003. Increase in the production of allelopathic Prymnesium parvum cells grown under N- or P-deficient conditions. Harmful Algae 2, 135–145]. The robustness of the numerical experiments is tested by a formal parameter sensitivity analysis. As the first theoretical model consistent with the experiment of Graneli and Johansson (2003), it demonstrates that, when nutrient-deficient conditions are favorable for the TPP population to release toxic chemicals, the TPP species controls the bloom of the other, non-toxic phytoplankton species. Consistent with the observations made by Graneli and Johansson (2003), our model overcomes a limitation of several other plankton-dynamics models, which do not incorporate the effect of nutrient-limited toxin production.
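
A minimal sketch of a generic four-variable nutrient-TPP-NTP-toxin system of the kind outlined above, integrated numerically. The functional forms, parameter values, and initial conditions are illustrative assumptions only; they are not the equations or parameters of the published model.

```python
# Hedged sketch: an illustrative nutrient / toxic phytoplankton / non-toxic
# phytoplankton / toxin ODE system, with toxin release increasing as the
# extracellular nutrient declines (a monotone function of nutrient).
import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, y, n_in=1.0, uptake=0.8, k=0.3, g_tpp=0.5, g_ntp=0.6,
        mort=0.1, tox_max=0.2, tox_decay=0.05, kill=0.4):
    n, tpp, ntp, tox = y
    growth = n / (k + n)                    # Monod-type nutrient limitation
    toxin_release = tox_max * k / (k + n)   # increases monotonically as nutrient falls
    dn = n_in - uptake * growth * (tpp + ntp) - 0.1 * n
    dtpp = g_tpp * growth * tpp - mort * tpp
    dntp = g_ntp * growth * ntp - mort * ntp - kill * tox * ntp
    dtox = toxin_release * tpp - tox_decay * tox
    return [dn, dtpp, dntp, dtox]

sol = solve_ivp(rhs, (0.0, 200.0), [1.0, 0.1, 0.1, 0.0])
n, tpp, ntp, tox = sol.y[:, -1]
print(f"final state: N={n:.2f}, TPP={tpp:.2f}, NTP={ntp:.2f}, toxin={tox:.2f}")
```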

Relevance: 20.00%

Abstract:

Forgetting immediate physical reality and having awareness of one's location in the simulated world are critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" at two levels, namely whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial coordinates of that locus, hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy to spatial comprehension, since most studies treat the technology as a monolith (box-centered). Using as its theoretical basis the variable-centered approach put forth by Nass and Mason (1990), which breaks the technology down into its component variables and their corresponding values, this paper examines the contributions of five variables common to most virtual environments (stereoscopy, screen size, field of view, level of realism, and level of detail) to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables can exponentially increase the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. The study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to be complete by February 2007. Theoretical and practical implications of the study are discussed.
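
A minimal sketch of how a fractional factorial design keeps the number of conditions manageable: a two-level 2^(5-1) design for the five display variables listed above, using the generator E = A*B*C*D so that 16 conditions cover five factors instead of the full 32. The specific generator and the two-level coding are illustrative assumptions; the study's actual design may differ.

```python
# Hedged sketch: generating a 2^(5-1) fractional factorial design, with the
# fifth factor aliased with the four-way interaction of the first four.
from itertools import product

factors = ["stereoscopy", "screen size", "field of view",
           "level of realism", "level of detail"]

runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d        # generator E = ABCD
    runs.append((a, b, c, d, e))

print(f"{len(runs)} conditions instead of {2 ** 5}")
for run in runs:
    print(", ".join(f"{name}={'high' if level == 1 else 'low'}"
                    for name, level in zip(factors, run)))
```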