Abstract:
Composites of wind speeds, equivalent potential temperature, mean sea level pressure, vertical velocity, and relative humidity have been produced for the 100 most intense extratropical cyclones in the Northern Hemisphere winter for the 40-yr ECMWF Re-Analysis (ERA-40) and the High-Resolution Global Environmental Model (HiGEM). Features of conceptual models of cyclone structure (the warm conveyor belt, cold conveyor belt, and dry intrusion) have been identified in the composites from ERA-40 and compared to HiGEM. Such features can be identified in the composite fields despite the smoothing that occurs in the compositing process. The surface features and the three-dimensional structure of the cyclones in HiGEM compare very well with those from ERA-40. The warm conveyor belt is identified in the temperature and wind fields as a mass of warm air undergoing moist isentropic uplift and is very similar in ERA-40 and HiGEM. The rate of ascent is lower in HiGEM, associated with a shallower slope of the moist isentropes in the warm sector. There are also differences in the relative humidity fields in the warm conveyor belt. In ERA-40, the high values of relative humidity are strongly associated with the moist isentropic uplift, whereas in HiGEM this association is weaker. The cold conveyor belt is identified as rearward flowing air that undercuts the warm conveyor belt and produces a low-level jet, and is very similar in HiGEM and ERA-40. The dry intrusion is identified in the 500-hPa vertical velocity and relative humidity. The structure of the dry intrusion compares well between HiGEM and ERA-40 but the descent is weaker in HiGEM because of weaker along-isentrope flow behind the composite cyclone. HiGEM’s ability to represent the key features of extratropical cyclone structure can give confidence in future predictions from this model.
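A minimal sketch of cyclone-centred compositing of the kind described above (illustrative only; the study composites full three-dimensional fields around objectively tracked cyclones): extract a fixed-size box of grid points around each cyclone centre and average the boxes, which is also where the smoothing mentioned in the abstract comes from. The names fields, centres, and half_width are hypothetical.

# Illustrative cyclone-centred compositing on a regular lat/lon grid.
# `fields` holds one 2-D array per cyclone (e.g. MSLP at the time of maximum intensity);
# `centres` holds the matching (lat_index, lon_index) of each cyclone centre.
import numpy as np

def composite(fields, centres, half_width=20):
    """Average fixed-size boxes centred on each cyclone; sharp features are smoothed out."""
    boxes = []
    for field, (j, i) in zip(fields, centres):
        rows = slice(j - half_width, j + half_width + 1)
        cols = np.arange(i - half_width, i + half_width + 1) % field.shape[1]  # wrap in longitude
        if rows.start < 0 or rows.stop > field.shape[0]:
            continue  # skip cyclones too close to the grid edge in latitude
        boxes.append(field[rows][:, cols])
    return np.mean(boxes, axis=0)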
Abstract:
A large ensemble of general circulation model (GCM) integrations coupled to a fully interactive sulfur cycle scheme was run on the climateprediction.net platform to investigate the uncertainty in the climate response to sulfate aerosol and carbon dioxide (CO2) forcing. The sulfate burden within the model (and the atmosphere) depends on the balance between formation processes and deposition (wet and dry). The wet removal processes for sulfate aerosol are much faster than dry removal, and so any changes in atmospheric circulation, cloud cover, and precipitation will feed back on the sulfate burden. When CO2 was doubled in the Hadley Centre Slab Ocean Model (HadSM3), global mean precipitation increased by 5%; however, the global mean sulfate burden increased by 10%. Despite the global mean increase in precipitation, there were large areas of the model showing decreases in precipitation (and cloud cover) in the Northern Hemisphere during June–August, which reduced wet deposition and allowed the sulfate burden to increase. Further experiments were also undertaken with and without doubled CO2 while including a future anthropogenic sulfur emissions scenario. Doubling CO2 further enhanced the increase in sulfate burden associated with the higher anthropogenic sulfur emissions, consistent with the behavior seen in the doubled-CO2-only experiment. The implication is that the climate response to doubling CO2 can influence the amount of sulfate within the atmosphere and, despite increases in global mean precipitation, may act to increase the sulfate burden.
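As a rough, illustrative one-box picture of the balance described above (not the model's actual sulfur-cycle scheme), the sulfate burden B responds to a source E (emission plus chemical production) and to wet and dry removal with timescales \tau_{\mathrm{wet}} \ll \tau_{\mathrm{dry}}:

\[
\frac{dB}{dt} = E - \frac{B}{\tau_{\mathrm{wet}}} - \frac{B}{\tau_{\mathrm{dry}}}
\quad\Longrightarrow\quad
B_{\mathrm{eq}} = \frac{E}{\tau_{\mathrm{wet}}^{-1} + \tau_{\mathrm{dry}}^{-1}} \approx E\,\tau_{\mathrm{wet}}.
\]

Regional reductions in precipitation and cloud cover lengthen the effective wet-removal timescale, so the burden can rise even where the source is unchanged, which is the feedback described above.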
Abstract:
A low-resolution coupled ocean-atmosphere general circulation model (OAGCM) is used to study the characteristics of the large-scale ocean circulation and its climatic impacts in a series of global coupled aquaplanet experiments. Three configurations, designed to produce fundamentally different ocean circulation regimes, are considered. The first has no obstruction to zonal flow, the second contains a low barrier that blocks zonal flow in the ocean at all latitudes, creating a single enclosed basin, whilst the third contains a gap in the barrier to allow circumglobal flow at high southern latitudes. Warm greenhouse climates with a global average surface air temperature of around 27°C result in all cases. Equator-to-pole temperature gradients are weaker than in a current climate simulation. Whilst changes in the land configuration cause regional changes in temperature, winds, and rainfall, heat transports within the system are little affected. Inhibition of all ocean transport on the aquaplanet leads to a reduction in global mean surface temperature of 8°C, along with a sharpening of the meridional temperature gradient. This results from a reduction in global atmospheric water vapour content and an increase in tropical albedo, both of which act to reduce global surface temperatures. Fitting a simple radiative model to the atmospheric characteristics of the OAGCM solutions suggests that a simpler atmosphere model, with radiative parameters chosen a priori based on the changing surface configuration, would have produced qualitatively different results. This implies that studies with reduced-complexity atmospheres need to be guided by more complex OAGCM results on a case-by-case basis.
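A minimal example of the kind of simple radiative model referred to above (an assumed form; the model actually fitted in the study may differ) is a single-layer grey atmosphere with planetary albedo \alpha and longwave emissivity \varepsilon:

\[
\sigma T_e^4 = \frac{(1-\alpha)\,S_0}{4},
\qquad
T_s^4 = \frac{T_e^4}{1 - \varepsilon/2}.
\]

Here \alpha and \varepsilon stand for the radiative parameters that would be chosen a priori from the surface configuration; the point made above is that fixing them in advance, rather than fitting them to the OAGCM output, yields qualitatively different surface temperatures.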
Abstract:
A systematic modular approach to investigate the respective roles of the ocean and atmosphere in setting El Niño characteristics in coupled general circulation models is presented. Several state-of-the-art coupled models sharing either the same atmosphere or the same ocean are compared. Major results include 1) the dominant role of the atmosphere model in setting El Niño characteristics (periodicity and base amplitude) and errors (regularity) and 2) the considerable improvement of simulated El Niño power spectra—toward lower frequency—when the atmosphere resolution is significantly increased. Likely reasons for such behavior are briefly discussed. It is argued that this new modular strategy represents a generic approach to identifying the source of both coupled mechanisms and model error and will provide a methodology for guiding model improvement.
Abstract:
In this study, the processes affecting sea surface temperature variability over the 1992–98 period, encompassing the very strong 1997–98 El Niño event, are analyzed. A tropical Pacific Ocean general circulation model, forced by a combination of weekly ERS1–2 and TAO wind stresses, and climatological heat and freshwater fluxes, is first validated against observations. The model reproduces the main features of the tropical Pacific mean state, despite a weaker than observed thermal stratification, a 0.1 m s−1 too strong (weak) South Equatorial Current (North Equatorial Countercurrent), and a slight underestimate of the Equatorial Undercurrent. Good agreement is found between the model dynamic height and TOPEX/Poseidon sea level variability, with correlation/rms differences of 0.80/4.7 cm on average in the 10°N–10°S band. The model sea surface temperature variability is a bit weak, but reproduces the main features of interannual variability during the 1992–98 period. The model compares well with the TAO current variability at the equator, with correlation/rms differences of 0.81/0.23 m s−1 for surface currents. The model therefore reproduces well the observed interannual variability, with wind stress as the only interannually varying forcing. This good agreement with observations provides confidence in the comprehensive three-dimensional circulation and thermal structure of the model. A close examination of mixed layer heat balance is thus undertaken, contrasting the mean seasonal cycle of the 1993–96 period and the 1997–98 El Niño. In the eastern Pacific, cooling by exchanges with the subsurface (vertical advection, mixing, and entrainment), the atmospheric forcing, and the eddies (mainly the tropical instability waves) are the three main contributors to the heat budget. In the central–western Pacific, the zonal advection by low-frequency currents becomes the main contributor. Westerly wind bursts (in December 1996 and March and June 1997) were found to play a decisive role in the onset of the 1997–98 El Niño. They contributed to the early warming in the eastern Pacific because the downwelling Kelvin waves that they excited diminished subsurface cooling there. But it is mainly through eastward advection of the warm pool that they generated temperature anomalies in the central Pacific. The end of El Niño can be linked to the large-scale easterly anomalies that developed in the western Pacific and spread eastward, from the end of 1997 onward. In the far-western Pacific, because of the shallower than normal thermocline, these easterlies cooled the SST by vertical processes. In the central Pacific, easterlies pushed the warm pool back to the west. In the east, they led to a shallower thermocline, which ultimately allowed subsurface cooling to resume and to quickly cool the surface layer.
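The mixed layer heat balance examined above is conventionally written in a form like the following (schematic; the study's exact decomposition may group terms differently):

\[
\frac{\partial T}{\partial t}
= -\,\mathbf{u}\cdot\nabla T
\;-\; \frac{w_e\,\Delta T}{h}
\;+\; \frac{Q_{\mathrm{net}}}{\rho_0\, c_p\, h},
\]

where T and h are the mixed layer temperature and depth. The advection term contains both the low-frequency currents and the eddy (tropical instability wave) contribution, the second term represents exchanges with the subsurface through entrainment and mixing at the mixed layer base (w_e is the entrainment velocity and \Delta T the temperature jump across the base), and the last term is the atmospheric forcing by the net surface heat flux Q_net.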
Abstract:
Canopy interception of incident precipitation is a critical component of the forest water balance during each of the four seasons. Models have been developed to predict precipitation interception from standard meteorological variables because of acknowledged difficulty in extrapolating direct measurements of interception loss from forest to forest. No known study has compared and validated canopy interception models for a leafless deciduous forest stand in the eastern United States. Interception measurements from an experimental plot in a leafless deciduous forest in northeastern Maryland (39°42'N, 75°5'W) for 11 rainstorms in winter and early spring 2004/05 were compared to predictions from three models. The Mulder model maintains a moist canopy between storms. The Gash model requires few input variables and is formulated for a sparse canopy. The WiMo model optimizes the canopy storage capacity for the maximum wind speed during each storm. All models showed marked underestimates and overestimates for individual storms when the measured ratio of interception to gross precipitation was far more or less, respectively, than the specified fraction of canopy cover. The models predicted the percentage of total gross precipitation (P_G) intercepted to within the probable standard error (8.1%) of the measured value: the Mulder model overestimated the measured value by 0.1% of P_G; the WiMo model underestimated by 0.6% of P_G; and the Gash model underestimated by 1.1% of P_G. The WiMo model’s advantage over the Gash model indicates that the canopy storage capacity increases logarithmically with the maximum wind speed. This study has demonstrated that dormant-season precipitation interception in a leafless deciduous forest may be satisfactorily predicted by existing canopy interception models.
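A small sketch of the bookkeeping implied above (hypothetical helper functions, not the Gash, Mulder, or WiMo models themselves): per-storm interception loss is the residual of gross precipitation minus throughfall and stemflow, and model skill is summarized as the bias in the percentage of total gross precipitation intercepted.

# Illustrative interception bookkeeping over a set of storms (all inputs in mm).
import numpy as np

def interception_loss(p_gross, throughfall, stemflow):
    """Per-storm interception loss: gross precipitation minus what reaches the ground."""
    return np.asarray(p_gross) - np.asarray(throughfall) - np.asarray(stemflow)

def percent_of_pg(intercepted, p_gross):
    """Interception expressed as a percentage of total gross precipitation P_G."""
    return 100.0 * np.sum(intercepted) / np.sum(p_gross)

def model_bias(i_model, i_measured, p_gross):
    """Model bias in % of P_G, the metric quoted in the abstract."""
    return percent_of_pg(i_model, p_gross) - percent_of_pg(i_measured, p_gross)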
Abstract:
Atmospheric Factors Governing Banded Orographic Convection
The three-dimensional structure of shallow orographic convection is investigated through simulations performed with a cloud-resolving numerical model. In moist flows that overcome a given topographic barrier to form statically unstable cap clouds, the organization of the convection depends on both the atmospheric structure and the mechanism by which the convection is initiated. Convection initiated by background thermal fluctuations embedded in the flow over a smooth mountain (without any small-scale topographic features) tends to be cellular and disorganized, except that shear-parallel bands may form in flows with strong unidirectional vertical shear. The development of well-organized bands is favored when there is weak static instability inside the cloud and when the dry air surrounding the cloud is strongly stable. These bands move with the flow and distribute their cumulative precipitation evenly over the mountain upslope. Similar shear-parallel bands also develop in flows where convection is initiated by small-scale topographic noise superimposed onto the main mountain profile, but in this case stronger circulations are also triggered that create stationary rainbands parallel to the low-level flow. This second dominant mode, which is less sensitive to the atmospheric structure and the strength of forcing, is triggered by lee waves that form over small-scale topographic bumps near the upstream edge of the main orographic cloud. Due to their stationarity, these flow-parallel bands can produce locally heavy precipitation amounts.
Abstract:
Radar images and numerical simulations of three shallow convective precipitation events over the Coastal Range in western Oregon are presented. In one of these events, unusually well-defined quasi-stationary banded formations produced large precipitation enhancements in favored locations, while varying degrees of band organization and lighter precipitation accumulations occurred in the other two cases. The difference between the more banded and cellular cases appeared to depend on the vertical shear within the orographic cap cloud and the susceptibility of the flow to convection upstream of the mountain. Numerical simulations showed that the rainbands, which appeared to be shear-parallel convective roll circulations that formed within the unstable orographic cap cloud, developed even over smooth mountains. However, these banded structures were better organized, more stationary, and produced greater precipitation enhancement over mountains with small-scale topographic obstacles. Low-amplitude random topographic roughness elements were found to be just as effective as more prominent subrange-scale peaks at organizing and fixing the location of the orographic rainbands.
Abstract:
The development of shallow cellular convection in warm orographic clouds is investigated through idealized numerical simulations of moist flow over topography using a cloud-resolving numerical model. Buoyant instability, a necessary element for moist convection, is found to be diagnosed most accurately through analysis of the moist Brunt–Väisälä frequency (N_m) rather than the vertical profile of θ_e. In statically unstable orographic clouds (N_m^2 < 0), additional environmental and terrain-related factors are shown to have major effects on the amount of cellularity that occurs in 2D simulations. One of these factors, the basic-state wind shear, may suppress convection in 2D yet allow for longitudinal convective roll circulations in 3D. The presence of convective structures within an orographic cloud substantially enhanced the maximum rainfall rates, precipitation efficiencies, and precipitation accumulations in all simulations.
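For reference, the saturated (moist) Brunt–Väisälä frequency used for this diagnosis is commonly written in a form following Durran and Klemp (1982), quoted here as a sketch rather than the paper's exact expression:

\[
N_m^2 = \frac{g}{T}\left(\frac{dT}{dz} + \Gamma_m\right)\left(1 + \frac{L\,q_s}{R\,T}\right)
\;-\; \frac{g}{1+q_w}\,\frac{dq_w}{dz},
\]

where \Gamma_m is the moist-adiabatic lapse rate, L the latent heat of vaporization, q_s the saturation mixing ratio, q_w the total water mixing ratio, and R the gas constant for dry air. The orographic cap cloud is buoyantly unstable where N_m^2 < 0, which is the criterion used above.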
Abstract:
This paper presents a new analysis of ocean heat content changes over the last 50 yr using isotherms by calculating the mean temperature above the 14°C isotherm and the depth of the 14°C isotherm as separate variables. A new quantity called the "relative heat content" (RHC) is introduced, which represents the minimum local heat content change over time, relative to a fixed isotherm. It is shown how mean temperature and isotherm depth changes make separable and additive contributions to changes in RHC. Maps of RHC change between 1970 and 2000 show similar spatial patterns to a traditional fixed-depth ocean heat content change to 220 m. However, the separate contributions to RHC suggest a more spatially uniform contribution from warming above the isotherm, while isotherm depth changes show wind-driven signals, of which some are identifiable as being related to the North Atlantic Oscillation. The time series show that the warming contribution to RHC dominates the global trend, while the depth contribution only dominates on the basin scale in the North Atlantic. The RHC shows minima associated with the major volcanic eruptions (particularly in the Indian Ocean), and these are entirely contributed by mean temperature changes rather than isotherm depth changes. The depth change contributions to RHC are strongly affected by the recently reported XBT fall-rate bias, whereas the mean temperature contributions are not. Therefore, only the isotherm depth change contributions to RHC will need to be reassessed as fall-rate-corrected data become available.
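A schematic of the decomposition described above (an illustrative form; the paper's exact definition of RHC may differ): writing D for the depth of the 14°C isotherm and \bar{T} for the mean temperature above it, the relative heat content per unit area behaves like

\[
\mathrm{RHC} \approx \rho_0\, c_p\, D \left(\bar{T} - 14^{\circ}\mathrm{C}\right),
\qquad
\Delta\mathrm{RHC} \approx \rho_0\, c_p \left[\, D\,\Delta\bar{T} + \left(\bar{T} - 14^{\circ}\mathrm{C}\right)\Delta D \,\right],
\]

so warming above the isotherm and changes in isotherm depth enter as separable, additive contributions, which is how the maps and time series discussed above split the signal.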
Abstract:
The application of particle filters in geophysical systems is reviewed. Some background on Bayesian filtering is provided, and the existing methods are discussed. The emphasis is on the methodology, and not so much on the applications themselves. It is shown that direct application of the basic particle filter (i.e., importance sampling using the prior as the importance density) does not work in high-dimensional systems, but several variants are shown to have potential. Approximations to the full problem that try to keep some aspects of the particle filter beyond the Gaussian approximation are also presented and discussed.
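As a concrete reference for "importance sampling using the prior as the importance density," a single analysis cycle of the basic (bootstrap) particle filter can be sketched as follows; the model propagator and observation operator are placeholders, and Gaussian observation errors are assumed for the likelihood.

# Minimal bootstrap particle filter cycle (illustrative sketch, not from the review).
import numpy as np

def bootstrap_step(particles, y_obs, propagate, obs_operator, obs_err_var, rng):
    """One forecast-analysis cycle using the prior as the importance density."""
    # Forecast: sample the prior by running the (stochastic) model forward.
    particles = np.array([propagate(x, rng) for x in particles])

    # Importance weights proportional to the observation likelihood p(y | x_i).
    innov = np.array([np.atleast_1d(y_obs - obs_operator(x)) for x in particles])
    logw = -0.5 * np.sum(innov**2, axis=1) / obs_err_var
    w = np.exp(logw - logw.max())
    w /= w.sum()

    # In high-dimensional systems one weight tends to dominate (filter degeneracy),
    # which is why the basic scheme fails there, as noted above.
    n_eff = 1.0 / np.sum(w**2)

    # Multinomial resampling returns an equally weighted ensemble.
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx], n_eff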
Abstract:
The existence of inertial steady currents that separate from a coast and meander afterward is investigated. By integrating the zonal momentum equation over a suitable area, it is shown that retroflecting currents cannot be steady in a reduced gravity or in a barotropic model of the ocean. Even friction cannot negate this conclusion. Previous literature on this subject, notably the discrepancy between several articles by Nof and Pichevin on the unsteadiness of retroflecting currents and steady solutions presented in other papers, is critically discussed. For more general separating current systems, a local analysis of the zonal momentum balance shows that given a coastal current with a specific zonal momentum structure, an inertial, steady, separating current is unlikely, and the only analytical solution provided in the literature is shown to be inconsistent. In a basin-wide view of these separating current systems, a scaling analysis reveals that steady separation is impossible when the interior flow is nondissipative (e.g., linear Sverdrup-like). These findings point to the possibility that a large part of the variability in the world’s oceans is due to the separation process rather than to instability of a free jet.
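Schematically, the integrated argument for the reduced-gravity case rests on the zonal flow force through a meridional section crossing both the incoming and the retroflected branches,

\[
F = \int \left( h\,u^2 + \tfrac{1}{2}\, g'\, h^2 \right) dy,
\]

where h is the active-layer thickness, u the zonal velocity, and g' the reduced gravity. Both terms are nonnegative for either branch, so F cannot vanish, and in a steady reduced-gravity model there is no compensating boundary force; this is the imbalance behind the conclusion that retroflecting currents cannot be steady. (This is a sketch of the Nof–Pichevin style of argument discussed above, not a reproduction of the paper's derivation.)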
Abstract:
This paper presents a first attempt to estimate mixing parameters from sea level observations using a particle method based on importance sampling. The method is applied to an ensemble of 128 members of model simulations with a global ocean general circulation model of high complexity. Idealized twin experiments demonstrate that the method is able to accurately reconstruct mixing parameters from an observed mean sea level field when mixing is assumed to be spatially homogeneous. An experiment with inhomogeneous eddy coefficients fails because of the limited ensemble size. This is overcome by the introduction of local weighting, which is able to capture spatial variations in mixing qualitatively. As the sensitivity of sea level for variations in mixing is higher for low values of mixing coefficients, the method works relatively well in regions of low eddy activity.
Abstract:
The propagation velocity and propagation mechanism for vortices on a β plane are determined for a reduced-gravity model by integrating the momentum equations over the β plane. Isolated vortices, vortices in a background current, and initial vortex propagation from rest are studied. The propagation mechanism for isolated anticyclones as well as cyclones, which has been lacking up to now, is presented. It is shown that, to first order, the vortex moves to generate a Coriolis force on the mass anomaly of the vortex to compensate for the force on the vortex due to the variation of the Coriolis parameter. Only the mass anomaly of the vortex is of importance, because the Coriolis force due to the motion of the bulk of the layer moving with the vortex is almost fully compensated by the Coriolis force on the motion of the exterior flow. Because the mass anomaly of a cyclone is negative the force and acceleration have opposite sign. The role of dipolar structures in steadily moving vortices is discussed, and it is shown that their overall structure is fixed by the steady westward motion of the mass anomaly. Furthermore, it is shown that reduced-gravity vortices are not advected with a background flow. The reason for this behavior is that the background flow changes the ambient vorticity gradient such that the vortex obtains an extra self-propagation term that exactly cancels the advection by the background flow. Last, it is shown that a vortex initially at rest will accelerate equatorward first, after which a westward motion is generated. This result is independent of the sign of the vortex.
Abstract:
The Observing System Research and Predictability Experiment (THORPEX) Interactive Grand Global Ensemble (TIGGE) is a World Weather Research Programme project. One of its main objectives is to enhance collaboration on the development of ensemble prediction between operational centers and universities by increasing the availability of ensemble prediction system (EPS) data for research. This study analyzes the prediction of Northern Hemisphere extratropical cyclones by nine different EPSs archived as part of the TIGGE project for the 6-month time period of 1 February 2008–31 July 2008, which included a sample of 774 cyclones. An objective feature tracking method has been used to identify and track the cyclones along the forecast trajectories. Forecast verification statistics have then been produced [using the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis as the truth] for cyclone position, intensity, and propagation speed, showing large differences between the different EPSs. The results show that the ECMWF ensemble mean and control have the highest level of skill for all cyclone properties. The Japan Meteorological Agency (JMA), the National Centers for Environmental Prediction (NCEP), the Met Office (UKMO), and the Canadian Meteorological Centre (CMC) have 1 day less skill for the position of cyclones throughout the forecast range. The relative performance of the different EPSs remains the same for cyclone intensity except for NCEP, which has larger errors than for position. NCEP, the Centro de Previsão de Tempo e Estudos Climáticos (CPTEC), and the Australian Bureau of Meteorology (BoM) all have faster intensity error growth in the earlier part of the forecast. They are also very underdispersive and significantly underpredict intensities, perhaps due to the comparatively low spatial resolutions of these EPSs not being able to accurately model the tilted structure essential to cyclone growth and decay. There is very little difference between the levels of skill of the ensemble mean and control for cyclone position, but the ensemble mean provides an advantage over the control for all EPSs except CPTEC in cyclone intensity and there is an advantage for propagation speed for all EPSs. ECMWF and JMA have an excellent spread–skill relationship for cyclone position. The EPSs are all much more underdispersive for cyclone intensity and propagation speed than for position, with ECMWF and CMC performing best for intensity and CMC performing best for propagation speed. ECMWF is the only EPS to consistently overpredict cyclone intensity, although the bias is small. BoM, NCEP, UKMO, and CPTEC significantly underpredict intensity and, interestingly, all the EPSs underpredict the propagation speed, that is, the cyclones move too slowly on average in all EPSs.
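A minimal sketch of the position verification used in this kind of study (hypothetical inputs; the actual work relies on an objective feature tracker with the ECMWF analysis as truth): the great-circle separation between matched forecast and analysis cyclone centres, averaged over cyclones at each lead time.

# Illustrative cyclone position-error verification (not the study's tracking code).
import numpy as np

EARTH_RADIUS_KM = 6371.0

def great_circle_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between points given in degrees (haversine)."""
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dlam = np.radians(np.asarray(lon2) - np.asarray(lon1))
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))

def mean_position_error(forecast_tracks, analysis_tracks):
    """Mean forecast-analysis separation per lead time over all matched cyclones.

    Each track is an array of (lat, lon) rows, one row per lead time, and the
    forecast and analysis tracks are assumed to be already matched and equal length.
    """
    errors = [great_circle_km(f[:, 0], f[:, 1], a[:, 0], a[:, 1])
              for f, a in zip(forecast_tracks, analysis_tracks)]
    return np.nanmean(np.array(errors), axis=0)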