119 results for horizontal-vertical
Abstract:
There is currently increased interest from Government and Industry in the UK, as well as at the European Community level and among international agencies (e.g. the American Department of Energy and the International Energy Agency), in improving the performance and uptake of Ground Coupled Heat Pumps (GCHP), in order to meet the 2020 renewable energy target. A sound knowledge base is required to help inform Government Agencies and advisory bodies; detailed site studies providing reliable data for model verification have an important role to play in this. In this study we summarise the effect of heat extraction by a horizontal ground heat exchanger (installed at 1 m depth) on the soil physical environment (between 0 and 1 m depth) for a site in the south of the UK. Our results show that the slinky heat exchanger significantly decreases the temperature of the surrounding soil. Furthermore, soil moisture contents were lower for the GCHP soil profile, most likely due to temperature-gradient-related soil moisture migration and a decreased hydraulic conductivity, the latter a result of the increased viscosity caused by the lower temperatures in the GCHP soil profile. These effects also caused considerable differences in soil thermal properties. This is the first detailed mechanistic study conducted in the UK with the aim of understanding the interactions between the soil, horizontal heat exchangers and the aboveground environment. An increased understanding of these interactions will help to achieve an optimum and sustainable use of soil heat resources in the future. The results of this study will help to calibrate and verify a simulation model that will provide UK-wide recommendations to improve future GCHP uptake and performance, while safeguarding the soil physical resources.
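To illustrate the viscosity mechanism invoked above, here is a minimal sketch (our own construction, not from the study) of how soil cooling reduces saturated hydraulic conductivity. The temperatures are illustrative, and the Vogel-type viscosity correlation is a standard empirical fit, not the paper's model:

```python
# Back-of-envelope sketch: hydraulic conductivity K is inversely proportional
# to the dynamic viscosity mu of water (K = k*rho*g/mu), and mu follows the
# empirical correlation mu(T) = A * 10**(B/(T - C)) (T in kelvin).

def water_viscosity(T_celsius):
    """Dynamic viscosity of water in Pa*s (empirical fit, valid ~0-100 C)."""
    T = T_celsius + 273.15
    return 2.414e-5 * 10 ** (247.8 / (T - 140.0))

def conductivity_ratio(T_cool, T_ref):
    """K(T_cool)/K(T_ref), assuming density and intrinsic permeability fixed."""
    return water_viscosity(T_ref) / water_viscosity(T_cool)

# Illustrative temperatures only: a soil profile cooled from 15 C to 5 C
# by heat extraction.
print(f"{conductivity_ratio(5.0, 15.0):.2f}")  # ~0.76, i.e. ~24% lower K
```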
Abstract:
The interactions between shear-free turbulence in two regions (denoted as + and −) on either side of a nearly flat horizontal interface are shown here to be controlled by several mechanisms, which depend on the magnitudes of the ratios of the densities, ρ+/ρ−, the kinematic viscosities of the fluids, ν+/ν−, and the root mean square (r.m.s.) velocities of the turbulence, u0+/u0−, above and below the interface. This study focuses on gas–liquid interfaces, so that ρ+/ρ− ≪ 1, and on cases where turbulence is generated either above or below the interface, so that u0+/u0− is either very large or very small. It is assumed that vertical buoyancy forces across the interface are much larger than internal forces, so that the interface is nearly flat, and coupling between turbulence on either side of the interface is determined by viscous stresses. A formal linearized rapid-distortion analysis with viscous effects is developed by extending the previous study by Hunt & Graham (J. Fluid Mech., vol. 84, 1978, pp. 209–235) of shear-free turbulence near rigid plane boundaries. The physical processes accounted for in our model include both the blocking effect of the interface on normal components of the turbulence and the viscous coupling of the horizontal field across thin interfacial viscous boundary layers. The horizontal divergence in the perturbation velocity field in the viscous layer drives weak inviscid irrotational velocity fluctuations outside the viscous boundary layers, in a mechanism analogous to Ekman pumping. The analysis shows the following. (i) The blocking effects are similar to those near rigid boundaries on each side of the interface, but through the action of the thin viscous layers above and below the interface, the horizontal and vertical velocity components differ from those near a rigid surface and are correlated or anti-correlated respectively. (ii) Because of the growth of the viscous layers on either side of the interface, the ratio uI/u0, where uI is the r.m.s. of the interfacial velocity fluctuations and u0 the r.m.s. of the homogeneous turbulence far from the interface, does not vary with time. If the turbulence is driven in the lower layer, with ρ+/ρ− ≪ 1 and u0+/u0− ≪ 1, then uI/u0− ~ 1 when Re (= u0−L−/ν−) ≫ 1 and R = (ρ−/ρ+)(ν−/ν+)^(1/2) ≫ 1. If the turbulence is driven in the upper layer, with ρ+/ρ− ≪ 1 and u0+/u0− ≫ 1, then uI/u0+ ~ 1/(1 + R). (iii) Nonlinear effects become significant over periods greater than Lagrangian time scales. When turbulence is generated in the lower layer, and the Reynolds number is high enough, motions in the upper viscous layer are turbulent. The horizontal vorticity tends to decrease, and the vertical vorticity of the eddies dominates their asymptotic structure. When turbulence is generated in the upper layer, and the Reynolds number is less than about 10^6–10^7, the fluctuations in the viscous layer do not become turbulent. Nonlinear processes at the interface increase the ratio uI/u0+ for sheared or shear-free turbulence in the gas above its linear value of uI/u0+ ~ 1/(1 + R) to (ρ+/ρ−)^(1/2) ~ 1/30 for air–water interfaces. This estimate agrees with the direct numerical simulation results of Lombardi, De Angelis & Banerjee (Phys. Fluids, vol. 8, no. 6, 1996, pp. 1643–1665). Because the linear viscous–inertial coupling mechanism is still significant, the eddy motions on either side of the interface have a similar horizontal structure, although their vertical structure differs.
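As a hedged numerical check of the ratios quoted above, the following sketch evaluates the coupling parameter R and the interfacial velocity ratios using textbook air/water properties (our assumed values, not numbers from the paper):

```python
# Evaluate R = (rho_-/rho_+)*(nu_-/nu_+)**0.5 and the predicted interfacial
# velocity ratios for turbulence driven in the gas above a water surface.
import math

rho_air, rho_water = 1.2, 1000.0    # densities, kg m^-3 (assumed)
nu_air, nu_water = 1.5e-5, 1.0e-6   # kinematic viscosities, m^2 s^-1 (assumed)

R = (rho_water / rho_air) * math.sqrt(nu_water / nu_air)
linear_ratio = 1.0 / (1.0 + R)                    # linear theory: uI/u0+ ~ 1/(1+R)
nonlinear_ratio = math.sqrt(rho_air / rho_water)  # nonlinear estimate: (rho+/rho-)^(1/2)

print(f"R ~ {R:.0f}")                               # ~215
print(f"linear uI/u0+ ~ {linear_ratio:.4f}")        # ~0.005
print(f"nonlinear uI/u0+ ~ {nonlinear_ratio:.3f}")  # ~0.035, i.e. ~1/30
```

The nonlinear estimate reproduces the ~1/30 quoted for air–water interfaces.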
Abstract:
The effect of multiple haptic distractors on target selection performance was examined in terms of target selection times and the associated cursor movement patterns. Two experiments examined (a) the effect of multiple haptic distractors around a single target and (b) the effect of inter-item spacing in a linear selection task. It was found that certain target–distractor arrangements hindered performance and that this could be associated with specific, explanatory cursor patterns. In particular, the presence of distractors along the task axis in front of the target was detrimental to performance, and there was evidence that this was sometimes associated with cursor oscillation between the distractors adjacent to the desired target. A further experiment examined the effect of target–distractor spacing in two orientations on a user's ability to select a target when caught in the gravity well of a distractor. Movement times in the vertical direction were found to be faster than those in the horizontal direction. In addition, although times for the vertical direction appeared equivalent across five target–distractor distances, times for the horizontal direction exhibited peaks at certain distances. The implications of these results for the design and implementation of haptically enhanced interfaces using the force-feedback mouse are discussed.
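For readers unfamiliar with the "gravity well" idiom, here is an illustrative sketch (our own, not the authors' implementation) of a simple attractive-force model for a force-feedback mouse; the function name, well radius and force constants are assumptions:

```python
# A cursor entering a target's "gravity well" is pulled toward its centre;
# with several wells, the summed forces can trap the cursor between adjacent
# distractors, producing the oscillation described above.
import math

def gravity_well_force(cursor, target, well_radius=30.0, max_force=1.0):
    """Return an (fx, fy) force vector attracting the cursor to the target."""
    dx, dy = target[0] - cursor[0], target[1] - cursor[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0 or dist > well_radius:
        return (0.0, 0.0)  # outside the well: no haptic attraction
    # Force ramps up linearly as the cursor approaches the well centre.
    magnitude = max_force * (1.0 - dist / well_radius)
    return (magnitude * dx / dist, magnitude * dy / dist)

print(gravity_well_force((100.0, 100.0), (110.0, 100.0)))  # (~0.67, 0.0)
```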
Abstract:
In this paper the authors exploit two equivalent formulations of the average rate of material entropy production in the climate system to propose an approximate splitting between contributions due to vertical and eminently horizontal processes. This approach is based only on 2D radiative fields at the surface and at the top of the atmosphere. Using 2D fields at the top of the atmosphere alone, lower bounds to the rate of material entropy production and to the intensity of the Lorenz energy cycle are derived. By introducing a measure of the efficiency of the planetary system with respect to horizontal thermodynamic processes, it is possible to gain insight into a previous intuition on the possibility of defining a baroclinic heat engine extracting work from the meridional heat flux. The approximate formula of the material entropy production is verified and used for studying the global thermodynamic properties of climate models (CMs) included in the Program for Climate Model Diagnosis and Intercomparison (PCMDI)/phase 3 of the Coupled Model Intercomparison Project (CMIP3) dataset in preindustrial climate conditions. It is found that about 90% of the material entropy production is due to vertical processes such as convection, whereas the large-scale meridional heat transport contributes only about 10% of the total. This suggests that the traditional two-box models used for providing a minimal representation of entropy production in planetary systems are not appropriate, whereas a basic, but conceptually correct, description can be framed in terms of a four-box model. The total material entropy production is typically 55 mW m⁻² K⁻¹, with discrepancies on the order of 5%, and the CMs' baroclinic efficiencies are clustered around 0.055. The lower bounds on the intensity of the Lorenz energy cycle featured by the CMs are found to be around 1.0–1.5 W m⁻², which implies that the derived inequality is rather stringent. When looking at the variability and covariability of the considered thermodynamic quantities, the agreement among CMs is worse, suggesting that the description of feedbacks is more uncertain. The contributions to material entropy production from vertical and horizontal processes are positively correlated, so that no compensation mechanism seems to be in place. Quite consistently among CMs, the variability of the efficiency of the system is a better proxy for the variability of the entropy production due to horizontal processes than that of the large-scale heat flux. The possibility of providing constraints on the 3D dynamics of the fluid envelope based only on 2D observations of radiative fluxes seems promising for the observational study of planets and for testing numerical models.
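The quoted numbers can be combined in a back-of-envelope sketch (our reconstruction, not the paper's exact formulas); the split fractions come from the abstract, while the 250 K reference temperature is an assumption used only to show the orders of magnitude are mutually consistent:

```python
# Split the total material entropy production into vertical and horizontal
# contributions, then form a rough work estimate from the horizontal part.
total_sdot = 55e-3      # total material entropy production, W m^-2 K^-1
frac_vertical = 0.90    # share from vertical processes (convection etc.)

sdot_vert = frac_vertical * total_sdot           # ~49.5 mW m^-2 K^-1
sdot_horiz = (1.0 - frac_vertical) * total_sdot  # ~5.5 mW m^-2 K^-1
print(f"vertical: {sdot_vert*1e3:.1f}, horizontal: {sdot_horiz*1e3:.1f} mW m-2 K-1")

# Assumed representative atmospheric temperature; multiplying the horizontal
# entropy production by it gives a W m^-2 figure in the quoted 1.0-1.5 range.
T_ref = 250.0
print(f"Lorenz cycle scale ~ {sdot_horiz * T_ref:.1f} W m-2")  # ~1.4
```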
Abstract:
For data assimilation in numerical weather prediction, the initial forecast-error covariance matrix P^f is required. For variational assimilation it is particularly important to prescribe an accurate initial matrix P^f, since P^f is either static (in the 3D-Var case) or constant at the beginning of each assimilation window (in the 4D-Var case). At large scales the atmospheric flow is well approximated by hydrostatic balance, and this balance is strongly enforced in the initial matrix P^f used in operational variational assimilation systems such as that of the Met Office. However, at convective scales this balance does not necessarily hold any more. Here we examine the extent to which hydrostatic balance is valid in the vertical forecast-error covariances of high-resolution models, in order to determine whether there is a need to relax this balance constraint in convective-scale data assimilation. We use the Met Office Global and Regional Ensemble Prediction System (MOGREPS) and a 1.5 km resolution version of the Unified Model for a case study characterized by the presence of convective activity. An ensemble of high-resolution forecasts valid up to three hours after the onset of convection is produced. We show that at 1.5 km resolution hydrostatic balance does not hold for forecast errors in regions of convection. This indicates that in the presence of convection hydrostatic balance should not be enforced in the covariance matrix used for variational data assimilation at this scale. The results show the need to investigate covariance models that may be better suited for convective-scale data assimilation. Finally, we give a measure of the balance present in the forecast perturbations as a function of horizontal scale (from 3 to 90 km) using a set of diagnostics. Copyright © 2012 Royal Meteorological Society and British Crown Copyright, the Met Office
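A minimal sketch of the kind of hydrostatic-balance diagnostic described above (our illustration, not the Met Office diagnostics); the perturbation profiles are synthetic, constructed to be nearly hydrostatic so the residual is small:

```python
# Measure how far a forecast perturbation departs from hydrostatic balance,
# dp'/dz = -rho'*g, as a relative residual.
import numpy as np

g = 9.81  # m s^-2

def hydrostatic_imbalance(p_pert, rho_pert, z):
    """Relative residual of the hydrostatic relation for one profile."""
    dpdz = np.gradient(p_pert, z)   # vertical pressure-perturbation gradient
    residual = dpdz + rho_pert * g  # zero if perfectly hydrostatic
    return np.linalg.norm(residual) / np.linalg.norm(dpdz)

# Synthetic example profiles (illustrative values only)
z = np.linspace(0.0, 10e3, 101)             # height, m
rho_pert = 1e-3 * np.exp(-z / 8e3)          # density perturbation, kg m^-3
p_pert = 1e-3 * g * 8e3 * np.exp(-z / 8e3)  # its near-hydrostatic partner, Pa
print(f"relative imbalance: {hydrostatic_imbalance(p_pert, rho_pert, z):.3f}")
```

Applied to ensemble perturbations in convective regions, a diagnostic of this form would return values far from zero, consistent with the paper's conclusion.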
Abstract:
This paper argues that the direct, vertical toleration of certain types of citizen by the Rawlsian liberal state is appropriate and required in circumstances in which these types of citizen pose a threat to the stability of the state. By countering the claim that vertical toleration is redundant given a commitment to the Rawlsian version of the liberal democratic ideal, and by articulating a version of that ideal that shows this claim to be false, the paper reaffirms the centrality of vertical toleration in the Rawlsian liberal account of state-citizen relations.
Abstract:
A multimodel assessment of the performance of chemistry-climate models (CCMs) in the extratropical upper troposphere/lower stratosphere (UTLS) is conducted for the first time. Process-oriented diagnostics are used to validate dynamical and transport characteristics of 18 CCMs using meteorological analyses and aircraft and satellite observations. The main dynamical and chemical climatological characteristics of the extratropical UTLS are generally well represented by the models, despite the limited horizontal and vertical resolution. The seasonal cycle of lowermost stratospheric mass is realistic, albeit with a wide spread in its mean value. A tropopause inversion layer is present in most models, although the maximum in static stability is located too high above the tropopause and is somewhat too weak, as expected from limited model resolution. Similar comments apply to the extratropical tropopause transition layer. The seasonality in lower stratospheric chemical tracers is consistent with the seasonality in the Brewer-Dobson circulation. Both vertical and meridional tracer gradients are of similar strength to those found in observations. Models that perform less well tend to use a semi-Lagrangian transport scheme and/or have a very low resolution. Two models, and the multimodel mean, score consistently well on all diagnostics, while seven other models score well on all diagnostics except the seasonal cycle of water vapor. Only four of the models are consistently below average. The lack of tropospheric chemistry in most models limits their evaluation in the upper troposphere. Finally, the UTLS is relatively sparsely sampled by observations, limiting our ability to quantitatively evaluate many aspects of model performance.
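For reference, here is a short sketch of the standard static-stability diagnostic underlying the tropopause-inversion-layer discussion above; the formula N² = (g/θ) dθ/dz is textbook material rather than the assessment's code, and the profile is synthetic:

```python
# Locate the tropopause inversion layer as a maximum in static stability N^2
# computed from a potential-temperature profile.
import numpy as np

g = 9.81

def static_stability(theta, z):
    """Buoyancy frequency squared (s^-2) from potential temperature theta(z)."""
    return (g / theta) * np.gradient(theta, z)

z = np.linspace(0.0, 20e3, 201)  # height, m
# Synthetic profile with an enhanced-stability layer near an ~11 km tropopause
theta = 300.0 + 0.004 * z + 25.0 * np.tanh((z - 11e3) / 800.0)
N2 = static_stability(theta, z)
print(f"N^2 maximum near z = {z[np.argmax(N2)]/1e3:.1f} km")
```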
Abstract:
The mesospheric response to the 2002 Antarctic Stratospheric Sudden Warming (SSW) is analysed using the Canadian Middle Atmosphere Model Data Assimilation System (CMAM-DAS), in which the response represents a vertical propagation of information from the observations into the data-free mesosphere. The CMAM-DAS simulates a cooling in the lowest part of the mesosphere which is accomplished by resolved motions, but which is extended to the mid- to upper mesosphere by the response of the model's non-orographic gravity-wave drag parameterization to the change in zonal winds. The basic mechanism is that elucidated by Holton, consisting of a net eastward wave-drag anomaly in the mesosphere during the SSW, although in this case there is a net upwelling in the polar mesosphere. Since the zonal-mean mesospheric response is shown to be predictable, this demonstrates that variations in the mesospheric state can be slaved to the lower atmosphere through gravity-wave drag.
Abstract:
The ability to run General Circulation Models (GCMs) at ever-higher horizontal resolutions has meant that tropical cyclone simulations are increasingly credible. A hierarchy of atmosphere-only GCMs, based on the Hadley Centre Global Environmental Model (HadGEM1), with horizontal resolution increasing from approximately 270 km to 60 km (at 50°N), is used to systematically investigate the impact of spatial resolution on the simulation of global tropical cyclone activity, independent of model formulation. Tropical cyclones are extracted from ensemble simulations and reanalyses of comparable resolutions using a feature-tracking algorithm. Resolution is critical for simulating storm intensity, and convergence to observed storm intensities is not achieved with the model hierarchy. Resolution is less critical for simulating the annual number of tropical cyclones and their geographical distribution, which are well captured at resolutions of 135 km or higher, particularly for Northern Hemisphere basins. Simulating the interannual variability of storm occurrence requires resolutions of 100 km or higher; however, the level of skill is basin dependent. Higher-resolution GCMs are increasingly able to capture the interannual variability of the large-scale environmental conditions that contribute to tropical cyclogenesis. Different environmental factors contribute to the interannual variability of tropical cyclones in the different basins: in the North Atlantic basin the vertical wind shear, potential intensity and low-level absolute vorticity are dominant, while in the North Pacific basins mid-level relative humidity and low-level absolute vorticity are dominant. Model resolution is crucial for a realistic simulation of tropical cyclone behaviour, and high-resolution GCMs are found to be valuable tools for investigating the global location and frequency of tropical cyclones.
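A highly simplified sketch of what a feature-tracking algorithm of this kind does (an illustration under our own assumptions, not the algorithm used in the study); thresholds, grids and function names are invented for demonstration:

```python
# Detect candidate cyclone centres as relative-vorticity maxima above a
# threshold, then link centres across timesteps by nearest neighbour.
import numpy as np

def find_centres(vort, lats, lons, threshold=5e-5):
    """(lat, lon) of grid points above the threshold that are 3x3 local maxima."""
    centres = []
    for j in range(1, vort.shape[0] - 1):
        for i in range(1, vort.shape[1] - 1):
            patch = vort[j-1:j+2, i-1:i+2]
            if vort[j, i] >= threshold and vort[j, i] == patch.max():
                centres.append((lats[j], lons[i]))
    return centres

def link_tracks(prev_centres, new_centres, max_dist_deg=5.0):
    """Greedy nearest-neighbour association of centres between two timesteps."""
    remaining = list(new_centres)  # avoid mutating the caller's list
    pairs = []
    for p in prev_centres:
        if not remaining:
            break
        dists = [np.hypot(p[0] - c[0], p[1] - c[1]) for c in remaining]
        k = int(np.argmin(dists))
        if dists[k] <= max_dist_deg:
            pairs.append((p, remaining.pop(k)))
    return pairs
```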
Abstract:
A rapid-distortion model is developed to investigate the interaction of weak turbulence with a monochromatic irrotational surface water wave. The model is applicable when the orbital velocity of the wave is larger than the turbulence intensity, and when the slope of the wave is sufficiently high that the straining of the turbulence by the wave dominates over the straining of the turbulence by itself. The turbulence suffers two distortions. Firstly, vorticity in the turbulence is modulated by the wave orbital motions, which leads to the streamwise Reynolds stress attaining maxima at the wave crests and minima at the wave troughs; the Reynolds stress normal to the free surface develops minima at the wave crests and maxima at the troughs. Secondly, over several wave cycles the Stokes drift associated with the wave tilts vertical vorticity into the horizontal direction, subsequently stretching it into elongated streamwise vortices, which come to dominate the flow. These results are shown to be strikingly different from turbulence distorted by a mean shear flow, when 'streaky structures' of high and low streamwise velocity fluctuations develop. It is shown that, in the case of distortion by a mean shear flow, the tendency for the mean shear to produce streamwise vortices by distortion of the turbulent vorticity is largely cancelled by a distortion of the mean vorticity by the turbulent fluctuations. This latter process is absent in distortion by Stokes drift, since there is then no mean vorticity. The components of the Reynolds stress and the integral length scales computed from turbulence distorted by Stokes drift show the same behaviour as in the simulations of Langmuir turbulence reported by McWilliams, Sullivan & Moeng (1997). Hence we suggest that turbulent vorticity in the upper ocean, such as produced by breaking waves, may help to provide the initial seeds for Langmuir circulations, thereby complementing the shear-flow instability mechanism developed by Craik & Leibovich (1976). The tilting of the vertical vorticity into the horizontal by the Stokes drift tends also to produce a shear stress that does work against the mean straining associated with the wave orbital motions. The turbulent kinetic energy then increases at the expense of energy in the wave. Hence the wave decays. An expression for the wave attenuation rate is obtained by scaling the equation for the wave energy, and is found to be broadly consistent with available laboratory data.
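Since the tilting mechanism above is set by the Stokes drift profile, here is a hedged sketch evaluating the standard deep-water expression u_s(z) = (ak)² c e^(2kz); the wave parameters are illustrative assumptions, not values from the paper:

```python
# Stokes drift of a monochromatic deep-water wave: surface-concentrated
# drift that decays over a depth of order 1/(2k).
import numpy as np

g = 9.81
a, k = 1.0, 0.05                 # amplitude (m) and wavenumber (rad/m), assumed
c = np.sqrt(g / k)               # deep-water phase speed, m/s
z = np.linspace(-60.0, 0.0, 7)   # depth, m (z = 0 at the surface)

u_s = (a * k) ** 2 * c * np.exp(2.0 * k * z)  # Stokes drift profile, m/s
for zi, ui in zip(z, u_s):
    print(f"z = {zi:6.1f} m  u_s = {ui:.4f} m/s")  # ~0.035 m/s at the surface
```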
Abstract:
Extratropical cyclone lifecycles have been studied extensively with the aim of understanding the dynamical mechanisms involved in their development. Previous work has often been based on subjective analysis of individual case studies. Such case studies have contributed heavily to the generation of conceptual models of extratropical cyclones that provide a framework for understanding the dynamical evolution of cyclones. These conceptual models are widely used in educational meteorology courses throughout the world to illustrate the basic structure and evolution of extratropical cyclones. This article presents a database of extratropical cyclone composites which highlight the average structure and evolution of 20 years of extratropical cyclones, as opposed to individual case studies. The composite fields are constructed by combining a database containing cyclone tracks from the ERA-Interim reanalysis (1989–2009, 6-hourly) with the full 3D ERA-Interim reanalysis fields. Vertical and horizontal composites of cyclone structure for cyclones generated in the Atlantic and Pacific regions are shown, identifying features such as the relative positions of cold, warm and occluded fronts and their associated wind and cloud patterns. In addition, the evolution of cyclonic flows such as the warm and cold conveyor belts and the dry intrusion is illustrated. A webpage containing an archive of the composited data is freely available for educational purposes.
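A hedged sketch of cyclone-centred compositing of the kind described above (our illustration, not the authors' pipeline); array shapes and the box size are assumptions:

```python
# Average a reanalysis field over fixed-size boxes centred on tracked
# cyclone positions to build a composite map.
import numpy as np

def composite(field, centre_indices, half_width=20):
    """Mean of (2*half_width+1)^2 boxes cut from 2-D slices at cyclone centres.

    field: array (time, lat, lon); centre_indices: list of (t, j, i) tuples.
    Centres too close to the grid edge are skipped for simplicity.
    """
    boxes = []
    for t, j, i in centre_indices:
        if (half_width <= j < field.shape[1] - half_width
                and half_width <= i < field.shape[2] - half_width):
            boxes.append(field[t, j - half_width:j + half_width + 1,
                                  i - half_width:i + half_width + 1])
    return np.mean(boxes, axis=0)  # cyclone-centred composite map
```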
Abstract:
The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to establish what horizontal and vertical resolution in atmospheric and ocean models is necessary for more confident predictions at the regional and local level. Current limitations in computing power have severely constrained such investigations, which are now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.
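A back-of-envelope sketch (our assumptions, not from the paper) of why 1-km models motivate exaflop-class machines: computational cost grows roughly as the cube of the refinement factor, since grid points scale with (1/Δx)² and the CFL condition shortens the timestep in proportion to Δx, with vertical levels held fixed:

```python
# Rough cost scaling when refining a global model's horizontal grid.
def cost_factor(dx_old_km, dx_new_km):
    """Approximate cost multiplier: (dx_old/dx_new)^3 (grid points x timestep)."""
    return (dx_old_km / dx_new_km) ** 3

# Refining from a ~100 km climate grid to the 1 km resolution mentioned above:
print(f"~{cost_factor(100.0, 1.0):.0e}x more computation")  # ~1e6x
```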
Abstract:
Atlantic Multidecadal Variability (AMV) is investigated in a millennial control simulation with the Kiel Climate Model (KCM), a coupled atmosphere–ocean–sea ice model. An oscillatory mode with a period of approximately 60 years and characteristics similar to observations is identified with the aid of three-dimensional temperature and salinity joint empirical orthogonal function analysis. The mode explains 30% of the variability on centennial and shorter timescales in the upper 2,000 m of the North Atlantic. It is associated with changes in the Atlantic Meridional Overturning Circulation (AMOC) of ±1–2 Sv and Atlantic Sea Surface Temperature (SST) of ±0.2 °C. AMV in the KCM results from an out-of-phase interaction between horizontal and vertical ocean circulation, coupled through Irminger Sea convection. Wintertime convection in this region is mainly controlled by salinity anomalies transported by the Subpolar Gyre (SPG). Increased (decreased) dense water formation in this region leads to a stronger (weaker) AMOC after 15 years, and this in turn leads to a weaker (stronger) SPG after another 15 years. The key role of salinity variations in the subpolar North Atlantic for AMV is confirmed in a 1,000-year-long simulation with salinity restored to the model climatology: no low-frequency variations in convection are simulated, and the 60-year mode of variability is absent.
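The timing argument can be caricatured as a delayed feedback loop (our conceptual sketch, not the paper's model): the 15-year convection-to-AMOC and AMOC-to-SPG lags, with a sign flip around the loop, yield a period of roughly four lags, i.e. ~60 years:

```python
# Toy delayed-feedback loop: convection -> stronger AMOC after ~15 yr ->
# weaker SPG after another ~15 yr -> reduced salinity supply -> weaker
# convection. All couplings are unit-strength assumptions.
import numpy as np

years, lag = 600, 15
conv = np.zeros(years)
conv[:2 * lag] = 1.0  # seed: anomalously strong Irminger Sea convection
amoc = np.zeros(years)
spg = np.zeros(years)

for t in range(2 * lag, years):
    amoc[t] = conv[t - lag]      # stronger AMOC ~15 yr after convection
    spg[t] = -conv[t - 2 * lag]  # weaker SPG a further ~15 yr later
    conv[t] = spg[t]             # convection tracks SPG salinity supply

# conv flips sign every 30 years: a square-wave caricature of the ~60-yr mode
print([int(conv[t]) for t in range(0, 180, 15)])  # 1, 1, -1, -1, 1, 1, ...
```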
Abstract:
High-resolution simulations over a large tropical domain (∼20°S–20°N and 42°E–180°E) using both explicit and parameterized convection are analyzed and compared to observations during a 10-day case study of an active Madden-Julian Oscillation (MJO) event. The parameterized-convection model simulations at both 40 km and 12 km grid spacing have a very weak MJO signal and little eastward propagation. A 4 km explicit-convection simulation using Smagorinsky subgrid mixing in the vertical and horizontal dimensions exhibits the best MJO strength and propagation speed. The 12 km explicit-convection simulations also perform much better than the 12 km parameterized-convection run, suggesting that the convection scheme, rather than horizontal resolution, is key for these MJO simulations. Interestingly, a 4 km explicit-convection simulation using the conventional boundary-layer scheme for vertical subgrid mixing (but still using Smagorinsky horizontal mixing) completely loses the large-scale MJO organization, showing that relatively high resolution with explicit convection does not guarantee a good MJO simulation. Models with a good MJO representation have a more realistic relationship between lower-free-tropospheric moisture and precipitation, supporting the idea that moisture-convection feedback is a key process for MJO propagation. There is also increased generation of available potential energy and conversion of that energy into kinetic energy in models with a more realistic MJO, which is related to larger zonal variance in convective heating and vertical velocity, larger zonal temperature variance around 200 hPa, and larger correlations between temperature and ascent (and between temperature and diabatic heating) between 500 and 400 hPa.
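A hedged diagnostic sketch of the moisture-precipitation relationship highlighted above (our illustration, not the authors' code); the inputs are synthetic samples, and the binning choices are assumptions:

```python
# Bin precipitation by lower-free-tropospheric relative humidity: a model with
# a realistic MJO should show precipitation picking up sharply in the moistest
# bins, while a flat curve suggests weak moisture-convection feedback.
import numpy as np

def precip_vs_humidity(rh_lft, precip, bins=np.linspace(0.0, 1.0, 11)):
    """Mean precipitation in each lower-free-tropospheric RH bin."""
    idx = np.digitize(rh_lft, bins) - 1
    return np.array([precip[idx == b].mean() if np.any(idx == b) else np.nan
                     for b in range(len(bins) - 1)])

# Synthetic co-located samples (illustrative only)
rng = np.random.default_rng(0)
rh = rng.uniform(0.2, 1.0, 10000)
rain = 50.0 * np.exp(8.0 * (rh - 1.0)) + rng.gamma(1.0, 0.5, rh.size)
print(np.round(precip_vs_humidity(rh, rain), 2))  # sharp pickup in moist bins
```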