922 results for large-eddy simulation
Abstract:
A millimetre-wave scintillometer was paired with an infrared scintillometer, enabling estimation of large-area evapotranspiration across northern Swindon, a suburban area in the UK. Both sensible and latent heat fluxes can be obtained using this "two-wavelength" technique, as it is able to provide both temperature and humidity structure parameters, offering a major advantage over conventional single-wavelength scintillometry. The first paper of this two-part series presented the measurement theory and structure parameters. In this second paper, heat fluxes are obtained and analysed. These fluxes, estimated using two-wavelength scintillometry over an urban area, are the first of their kind. Source area modelling suggests the scintillometric fluxes are representative of 5–10 km². For comparison, local-scale (0.05–0.5 km²) fluxes were measured by an eddy covariance station. Similar responses to seasonal changes are evident at the different scales but the energy partitioning varies between source areas. The response to moisture availability is explored using data from 2 consecutive years with contrasting rainfall patterns (2011–2012). This extensive data set offers insight into urban surface-atmosphere interactions and demonstrates the potential for two-wavelength scintillometry to deliver fluxes over mixed land cover, typically representative of an area 1–2 orders of magnitude greater than for eddy covariance measurements. Fluxes at this scale are extremely valuable for hydro-meteorological model evaluation and assessment of satellite data products.
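As a minimal illustration of how scintillometry turns a structure parameter into a heat flux, the sketch below uses the single-wavelength free-convection scaling H = ρ·cp·b·z·(g/T)^(1/2)·(C_T²)^(3/4), with a commonly quoted coefficient b ≈ 0.48. This is a deliberate simplification, not the full two-wavelength retrieval of the paper, and all input values are illustrative assumptions.

```python
# Toy free-convection estimate of sensible heat flux from the temperature
# structure parameter C_T^2, as in single-path scintillometry.
# All numbers below are illustrative assumptions, not values from the paper.

def sensible_heat_flux_free_convection(ct2, z, T=288.0, rho=1.2, cp=1004.0,
                                       b=0.48, g=9.81):
    """H = rho * cp * b * z * sqrt(g / T) * C_T^2 ** (3/4)  [W m-2]."""
    return rho * cp * b * z * (g / T) ** 0.5 * ct2 ** 0.75

# Example: C_T^2 = 0.05 K^2 m^(-2/3) at an assumed effective beam height of 25 m.
H = sensible_heat_flux_free_convection(ct2=0.05, z=25.0)
print(round(H, 1))  # a few hundred W m-2, a plausible daytime magnitude
```

The two-wavelength approach of the paper additionally uses the humidity structure parameter to obtain the latent heat flux, which this single-parameter sketch cannot provide.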
Abstract:
Instrumental observations, palaeo-proxies, and climate models suggest significant decadal variability within the North Atlantic subpolar gyre (NASPG). However, a poorly sampled observational record and a diversity of model behaviours mean that the precise nature and mechanisms of this variability are unclear. Here, we analyse an exceptionally large multi-model ensemble of 42 present-generation climate models to test whether NASPG mean state biases systematically affect the representation of decadal variability. Temperature and salinity biases in the Labrador Sea co-vary and influence whether density variability is controlled by temperature or salinity variations. Ocean horizontal resolution is a good predictor of the biases and the location of the dominant dynamical feedbacks within the NASPG. However, we find no link to the spectral characteristics of the variability. Our results suggest that the mean state and mechanisms of variability within the NASPG are not independent. This represents an important caveat for decadal predictions using anomaly-assimilation methods.
Abstract:
How tropical cyclone (TC) activity in the northwestern Pacific might change in a future climate is assessed using multidecadal Atmospheric Model Intercomparison Project (AMIP)-style and time-slice simulations with the ECMWF Integrated Forecast System (IFS) at 16-km and 125-km global resolution. Both models reproduce many aspects of the present-day TC climatology and variability well, although the 16-km IFS is far more skillful in simulating the full intensity distribution and genesis locations, including their changes in response to El Niño–Southern Oscillation. Both IFS models project a small change in TC frequency at the end of the twenty-first century related to distinct shifts in genesis locations. In the 16-km IFS, this shift is southward and is likely driven by the southeastward penetration of the monsoon trough/subtropical high circulation system and the southward shift in activity of the synoptic-scale tropical disturbances in response to the strengthening of deep convective activity over the central equatorial Pacific in a future climate. The 16-km IFS also projects about a 50% increase in the power dissipation index, mainly due to significant increases in the frequency of the more intense storms, which is comparable to the natural variability in the model. Based on composite analysis of large samples of supertyphoons, both the development rate and the peak intensities of these storms increase in a future climate, which is consistent with their tendency to develop more to the south, within an environment that is thermodynamically more favorable for faster development and higher intensities. Coherent changes in the vertical structure of supertyphoon composites show system-scale amplification of the primary and secondary circulations with signs of contraction, a deeper warm core, and an upward shift in the outflow layer and the frequency of the most intense updrafts. 
Considering the large differences in the projections of TC intensity change between the 16-km and 125-km IFS, this study further emphasizes the need for high-resolution modeling in assessing potential changes in TC activity.
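The power dissipation index (PDI) discussed in the abstract above is conventionally defined as the time integral of the cube of the maximum surface wind speed, accumulated over all storms (Emanuel 2005). A minimal sketch with made-up 6-hourly tracks (illustrative, not model output) shows why more intense storms dominate the index:

```python
import numpy as np

def power_dissipation_index(vmax_series, dt_seconds=6 * 3600):
    """PDI = sum over storms of the integral of vmax^3 dt.
    vmax_series: list of per-storm arrays of maximum wind speed [m s-1],
    sampled every dt_seconds (6-hourly here)."""
    return sum(float(np.sum(np.asarray(v) ** 3) * dt_seconds)
               for v in vmax_series)

# Illustrative tracks (not real data): adding one intense storm raises the
# PDI disproportionately because of the cubic weighting of wind speed.
weak = [np.full(8, 25.0)]                       # one weak storm
strong = [np.full(8, 25.0), np.full(8, 60.0)]   # plus one intense storm
print(power_dissipation_index(strong) / power_dissipation_index(weak))
```

The cubic weighting is why a modest increase in the frequency of the most intense storms can produce the roughly 50% PDI increase projected by the 16-km IFS.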
Abstract:
As part of an international intercomparison project, a set of single-column models (SCMs) and cloud-resolving models (CRMs) are run under the weak temperature gradient (WTG) method and the damped gravity wave (DGW) method. For each model, the implementation of the WTG or DGW method involves a simulated column coupled to a reference state defined with profiles obtained from the same model in radiative-convective equilibrium. The simulated column has the same surface conditions as the reference state and is initialized with profiles from the reference state. We systematically compare the behavior of the different models under a consistent implementation of the WTG and DGW methods, and compare the two methods across models with different physics and numerics. CRMs and SCMs produce a variety of behaviors under both methods. Some of the models reproduce the reference state, while others sustain a large-scale circulation that results in either substantially lower or higher precipitation than in the reference state. CRMs show a fairly linear relationship between precipitation and circulation strength; SCMs display a wider range of behaviors, and some SCMs under the WTG method produce zero precipitation. Within an individual SCM, a DGW simulation and the corresponding WTG simulation can produce circulations of opposite sign. When initialized with a dry troposphere, DGW simulations always reach a precipitating equilibrium state. The greatest sensitivity to the initial moisture conditions occurs in those WTG simulations with multiple stable equilibria, which reach a dry equilibrium state when initialized dry and a precipitating equilibrium state when initialized moist. Multiple equilibria are seen in more WTG simulations at higher SST, and in some models their existence is sensitive to parameters in the WTG calculation.
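The core of the WTG method described above is a diagnostic large-scale vertical velocity that relaxes the column temperature toward the reference profile: w · ∂θ_ref/∂z = (θ − θ_ref)/τ. A minimal sketch of that diagnosis (the relaxation timescale and profiles are illustrative assumptions, not any participating model's configuration):

```python
import numpy as np

def wtg_vertical_velocity(theta, theta_ref, z, tau=3 * 3600.0):
    """Diagnose the large-scale vertical velocity from the weak
    temperature gradient approximation:
        w * dtheta_ref/dz = (theta - theta_ref) / tau
    so columns warmer than the reference ascend and export heat."""
    dtheta_dz = np.gradient(theta_ref, z)
    return (theta - theta_ref) / (tau * dtheta_dz)

z = np.linspace(500.0, 15000.0, 30)   # free-tropospheric levels [m]
theta_ref = 300.0 + 4e-3 * z          # reference profile, ~4 K/km
theta = theta_ref + 1.0               # column uniformly 1 K warm
w = wtg_vertical_velocity(theta, theta_ref, z)
print(w.max())  # upward motion of order cm/s
```

The DGW method instead obtains w from a damped gravity-wave momentum balance, which is why the two methods can yield circulations of opposite sign in the same SCM.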
Abstract:
The atmospheric response to an idealized decline in Arctic sea ice is investigated in a novel fully coupled climate model experiment. In this experiment two ensembles of single-year model integrations are performed starting on 1 April, the approximate start of the ice melt season. By perturbing the initial conditions of sea ice thickness (SIT), declines in both sea ice concentration and SIT, which result in sea ice distributions that are similar to the recent sea ice minima of 2007 and 2012, are induced. In the ice loss regions there are strong (~3 K) local increases in sea surface temperature (SST); additionally, there are remote increases in SST in the central North Pacific and subpolar gyre in the North Atlantic. Over the central Arctic there are increases in surface air temperature (SAT) of ~8 K due to increases in ocean–atmosphere heat fluxes. There are increases in SAT over continental North America that are in good agreement with recent changes as seen by reanalysis data. It is estimated that up to two-thirds of the observed increase in SAT in this region could be related to Arctic sea ice loss. In early summer there is a significant but weak atmospheric circulation response that projects onto the summer North Atlantic Oscillation (NAO). In early summer and early autumn there is an equatorward shift of the eddy-driven jet over the North Atlantic as a result of a reduction in the meridional temperature gradients. In winter there is no projection onto a particular phase of the NAO.
Abstract:
The Madden-Julian Oscillation (MJO) is the dominant mode of intraseasonal variability in the Tropics. It can be characterised as a planetary-scale coupling between the atmospheric circulation and organised deep convection that propagates east through the equatorial Indo-Pacific region. The MJO interacts with weather and climate systems on a near-global scale and is a crucial source of predictability for weather forecasts on medium to seasonal timescales. Despite its global significance, accurately representing the MJO in numerical weather prediction (NWP) and climate models remains a challenge. This thesis focuses on the representation of the MJO in the Integrated Forecasting System (IFS) at the European Centre for Medium-Range Weather Forecasts (ECMWF), a state-of-the-art NWP model. Recent modifications to the model physics in Cycle 32r3 (Cy32r3) of the IFS led to advances in the simulation of the MJO; for the first time, the observed amplitude of the MJO was maintained throughout the integration period. A set of hindcast experiments, which differ only in their formulation of convection, have been performed between May 2008 and April 2009 to assess the sensitivity of MJO simulation in the IFS to the Cy32r3 convective parameterization. Unique to this thesis is the attribution of the advances in MJO simulation in Cy32r3 to the modified convective parameterization, specifically the relative-humidity-dependent formulation for organised deep entrainment. Increasing the sensitivity of the deep convection scheme to environmental moisture is shown to modify the relationship between precipitation and moisture in the model. Through dry-air entrainment, convective plumes ascending in low-humidity environments terminate lower in the atmosphere. As a result, there is an increase in the occurrence of cumulus congestus, which acts to moisten the mid-troposphere. Due to the modified precipitation-moisture relationship, more moisture is able to build up, effectively preconditioning the tropical atmosphere for the transition to deep convection. Results from this thesis suggest that a tropospheric moisture control on convection is key to simulating the interaction between the physics and large-scale circulation associated with the MJO.
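The mechanism described above, where stronger entrainment of dry environmental air terminates plumes lower, can be caricatured with a toy entraining plume in which buoyancy is eroded by dilution plus evaporative cooling of entrained dry air. The closure and every coefficient below are illustrative assumptions, not the IFS Cy32r3 scheme:

```python
def plume_top(rh_env, eps0=2e-4, dry_sensitivity=1.5e-3, chi=0.02,
              b0=1.0, dz=50.0, z_max=16000.0):
    """Toy entraining plume: buoyancy b is eroded by dilution and by
    evaporative cooling of entrained dry air. The entrainment rate grows
    as the environment dries (an RH-dependent formulation in the spirit
    of the one discussed above). Returns the height [m] at which the
    plume loses buoyancy, capped at z_max."""
    eps = eps0 + dry_sensitivity * (1.0 - rh_env)    # entrainment [m-1]
    b, z = b0, 0.0
    while b > 0.0 and z < z_max:
        b -= eps * (b + chi * (1.0 - rh_env)) * dz   # dilution + evaporation
        z += dz
    return z

print(plume_top(rh_env=0.3))   # dry troposphere: congestus-like top
print(plume_top(rh_env=0.9))   # moist troposphere: deep convection
```

With these assumed numbers the dry-environment plume terminates in the lower-to-mid troposphere while the moist-environment plume reaches the imposed cap, mirroring the congestus-to-deep transition the thesis describes.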
Abstract:
Despite the importance of dust aerosol in the Earth system, state-of-the-art models show a large variety in North African dust emission. This study presents a systematic evaluation of dust-emitting winds in 30 years of the historical simulation with the UK Met Office Earth-system model HadGEM2-ES for the Coupled Model Intercomparison Project Phase 5. Isolating the effect of winds on dust emission and using an automated detection for nocturnal low-level jets (NLLJs) allow an in-depth evaluation of the model performance for dust emission from a meteorological perspective. The findings highlight that NLLJs are a key driver for dust emission in HadGEM2-ES in terms of occurrence frequency and strength. The annually and spatially averaged occurrence frequency of NLLJs is similar in HadGEM2-ES and ERA-Interim from the European Centre for Medium-Range Weather Forecasts. Compared to ERA-Interim, a stronger pressure ridge over northern Africa in winter and a southward-displaced heat low in summer result in differences in the location and strength of NLLJs. In particular, the larger geostrophic winds associated with the stronger ridge strengthen NLLJs over parts of West Africa in winter. Stronger NLLJs in summer may instead result from the artificially increased mixing coefficient under stable stratification, which is weaker in HadGEM2-ES than in ERA-Interim. NLLJs in the Bodélé Depression are affected by stronger synoptic-scale pressure gradients in HadGEM2-ES. Wintertime geostrophic winds can even be so strong that the associated vertical wind shear prevents the formation of NLLJs. These results call for further model improvements in the synoptic-scale dynamics and the physical parametrization of the nocturnal stable boundary layer to better represent dust-emitting processes in the atmospheric model. The new approach could be used to identify systematic behavior in other models with respect to the meteorological processes for dust emission. This would help to improve dust emission simulations and contribute to decreasing the currently large uncertainty in climate change projections with respect to dust aerosol.
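Automated NLLJ detection of the kind used above typically searches each wind profile for a low-level maximum that exceeds the minimum aloft by both an absolute and a relative margin. The thresholds and profiles below are illustrative assumptions, not necessarily those of the study:

```python
import numpy as np

def is_nllj(wind_speed, height, z_max=1500.0,
            min_excess=2.0, min_ratio=1.25):
    """Detect a nocturnal low-level jet in a single wind profile:
    a wind maximum below z_max that exceeds the minimum above it
    (up to the profile top) by an absolute margin [m s-1] and a
    relative factor. Thresholds are illustrative."""
    wind_speed = np.asarray(wind_speed, dtype=float)
    height = np.asarray(height, dtype=float)
    low = height <= z_max
    if not low.any() or low.all():
        return False
    i_max = int(np.argmax(np.where(low, wind_speed, -np.inf)))
    above_min = wind_speed[i_max + 1:].min()
    v_jet = wind_speed[i_max]
    return bool(v_jet - above_min >= min_excess
                and v_jet >= min_ratio * above_min)

z = np.array([100.0, 300.0, 600.0, 1000.0, 2000.0, 3000.0])
jet = np.array([6.0, 12.0, 9.0, 7.0, 6.0, 8.0])    # maximum near 300 m
nojet = np.array([4.0, 5.0, 6.0, 7.0, 8.0, 9.0])   # monotonic profile
print(is_nllj(jet, z), is_nllj(nojet, z))
```

Applying such a criterion to every model column and night yields the occurrence-frequency maps that the study compares between HadGEM2-ES and ERA-Interim.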
Abstract:
The Southern Ocean is a critical region for global climate, yet large cloud and solar radiation biases over the Southern Ocean are a long-standing problem in climate models and are poorly understood, leading to biases in simulated sea surface temperatures. This study shows that supercooled liquid clouds are central to understanding and simulating the Southern Ocean environment. A combination of satellite observational data and detailed radiative transfer calculations is used to quantify the impact of cloud phase and cloud vertical structure on the reflected solar radiation in the Southern Hemisphere summer. It is found that clouds with supercooled liquid tops dominate the population of liquid clouds. The observations show that clouds with supercooled liquid tops contribute between 27% and 38% to the total reflected solar radiation between 40° and 70°S, and climate models are found to poorly simulate these clouds. The results quantify the importance of supercooled liquid clouds in the Southern Ocean environment and highlight the need to improve understanding of the physical processes that control these clouds in order to improve their simulation in numerical models. This is not only important for improving the simulation of present-day climate and climate variability, but also relevant for increasing confidence in climate feedback processes and future climate projections.
Abstract:
The stratospheric mean-meridional circulation (MMC) and eddy mixing are compared among six meteorological reanalysis data sets (NCEP-NCAR, NCEP-CFSR, ERA-40, ERA-Interim, JRA-25, and JRA-55) for the period 1979–2012. The reanalysis data sets produced using advanced systems (i.e., NCEP-CFSR, ERA-Interim, and JRA-55) generally reveal a weaker MMC in the Northern Hemisphere (NH) compared with those produced using older systems (i.e., NCEP-NCAR, ERA-40, and JRA-25). The mean mixing strength differs considerably among the data products. In the NH lower stratosphere, the contribution of planetary-scale mixing is larger in the new data sets than in the old data sets, whereas that of small-scale mixing is weaker in the new data sets. Conventional data assimilation techniques introduce analysis increments without maintaining physical balance, which may have caused an overly strong MMC and spurious small-scale eddies in the old data sets. At the NH mid-latitudes, only ERA-Interim reveals a weakening MMC trend in the deep branch of the Brewer–Dobson circulation (BDC). The relative importance of eddy mixing compared with mean-meridional transport in the subtropical lower stratosphere shows increasing trends in ERA-Interim and JRA-55; this, together with the weakened MMC in the deep branch, may imply an increasing age of air (AoA) in the NH middle stratosphere in ERA-Interim. Overall, discrepancies between the different variables, and the trends therein, as derived from the different reanalyses are still relatively large, suggesting that more investment in these products is needed in order to obtain a consolidated picture of observed changes in the BDC and the mechanisms that drive them.
Abstract:
Electron transport parameters are important in several areas ranging from particle detectors to plasma-assisted processing reactors. Nevertheless, especially at high field strengths and for complex gases, relatively few data are published. A dedicated setup has been developed to measure the electron drift velocity and the first Townsend coefficient in parallel-plate geometry. An RPC-like cell has been adopted to reach high field strengths without the risk of destructive sparks. The validation data obtained with pure nitrogen will be presented and compared to a selection of the available literature and to calculations performed with Magboltz 2 version 8.6. The new data collected in pure isobutane will then be discussed. This is the first time the electron drift velocity in pure isobutane has been measured well into the saturation region. Good agreement is found with expectations from Magboltz.
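In a parallel-plate (Townsend) measurement, the first Townsend coefficient α is conventionally extracted from the exponential growth of the avalanche current with gap distance, I(d) = I₀·exp(α·d). A minimal fit sketch with synthetic data (not the authors' analysis; all values are assumed):

```python
import numpy as np

# Synthetic avalanche currents at several plate gaps. In the Townsend
# regime I(d) = I0 * exp(alpha * d), so a straight-line fit to
# ln(I) versus d yields the first Townsend coefficient alpha.
d = np.array([0.1, 0.2, 0.3, 0.4, 0.5])   # gap [cm], assumed
alpha_true, I0 = 12.0, 1e-9               # assumed [1/cm] and [A]
I = I0 * np.exp(alpha_true * d)

alpha_fit, ln_I0_fit = np.polyfit(d, np.log(I), 1)
print(round(alpha_fit, 3))  # recovers the assumed 12.0 ionizations per cm
```

Real data would add noise and, at high fields, attachment and space-charge corrections, which is part of why an RPC-like cell is needed to reach that regime safely.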
Abstract:
Large-scale simulations of parts of the brain using detailed neuronal models to improve our understanding of brain function are becoming a reality with the use of supercomputers and large clusters. However, the high acquisition and maintenance cost of these computers, including the physical space, air conditioning, and electrical power, limits the number of simulations of this kind that scientists can perform. Modern commodity graphics cards, based on the CUDA platform, contain graphics processing units (GPUs) composed of hundreds of processors that can simultaneously execute thousands of threads and thus constitute a low-cost solution for many high-performance computing applications. In this work, we present a CUDA algorithm that enables the execution, on multiple GPUs, of simulations of large-scale networks composed of biologically realistic Hodgkin-Huxley neurons. The algorithm represents each neuron as a CUDA thread, which solves the set of coupled differential equations that model the neuron. Communication among neurons located on different GPUs is coordinated by the CPU. We obtained speedups of 40 for the simulation of 200k neurons receiving random external input, and speedups of 9 for a network with 200k neurons and 20M neuronal connections, on a single computer with two graphics boards with two GPUs each, compared with a modern quad-core CPU.
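The per-thread workload in the one-neuron-per-thread scheme described above is the integration of the Hodgkin-Huxley equations. The sketch below shows that workload in plain Python for a single neuron, using the classic 1952 squid-axon parameters and forward Euler (the paper's CUDA kernel and integration scheme may differ):

```python
import math

# Classic Hodgkin-Huxley point neuron (1952 squid-axon parameters).
C, g_na, g_k, g_l = 1.0, 120.0, 36.0, 0.3    # uF/cm2, mS/cm2
E_na, E_k, E_l = 50.0, -77.0, -54.387        # mV

def rates(v):
    """Voltage-dependent opening/closing rates of the gating variables."""
    a_m = 0.1 * (v + 40.0) / (1.0 - math.exp(-(v + 40.0) / 10.0))
    b_m = 4.0 * math.exp(-(v + 65.0) / 18.0)
    a_h = 0.07 * math.exp(-(v + 65.0) / 20.0)
    b_h = 1.0 / (1.0 + math.exp(-(v + 35.0) / 10.0))
    a_n = 0.01 * (v + 55.0) / (1.0 - math.exp(-(v + 55.0) / 10.0))
    b_n = 0.125 * math.exp(-(v + 65.0) / 80.0)
    return a_m, b_m, a_h, b_h, a_n, b_n

def simulate(i_ext=10.0, t_ms=50.0, dt=0.01):
    """Forward-Euler integration under a constant current [uA/cm2]."""
    v = -65.0
    a_m, b_m, a_h, b_h, a_n, b_n = rates(v)   # start at rest steady state
    m, h, n = a_m / (a_m + b_m), a_h / (a_h + b_h), a_n / (a_n + b_n)
    v_trace = []
    for _ in range(int(t_ms / dt)):
        a_m, b_m, a_h, b_h, a_n, b_n = rates(v)
        i_ion = (g_na * m**3 * h * (v - E_na)
                 + g_k * n**4 * (v - E_k)
                 + g_l * (v - E_l))
        v += dt * (i_ext - i_ion) / C
        m += dt * (a_m * (1.0 - m) - b_m * m)
        h += dt * (a_h * (1.0 - h) - b_h * h)
        n += dt * (a_n * (1.0 - n) - b_n * n)
        v_trace.append(v)
    return v_trace

v = simulate()
print(max(v) > 0.0)  # the depolarizing current drives action potentials
```

Because each neuron's state update depends only on its own variables and incoming synaptic input, thousands of such loops map naturally onto independent GPU threads, with cross-GPU spike exchange handled by the CPU as in the paper.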
Abstract:
This paper reports the findings of using a multi-agent-based simulation model to evaluate the sawmill yard operations within a large privately owned sawmill in Sweden, Bergkvist Insjön AB in the current case. Conventional working routines within the sawmill yard threaten the overall efficiency and thereby limit the profit margin of the sawmill. Deploying dynamic work routines within the sawmill yard is not readily feasible in real time, so a discrete-event simulation model has been investigated to report the optimal work order depending on the situation. Preliminary investigations indicate that the results achieved by the simulation model are promising. It is expected that these results will support Bergkvist Insjön AB in making optimal decisions by deploying an efficient work order in the sawmill yard.
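The discrete-event core of such a yard model processes events in time order and lets resource contention emerge from the schedule. The sketch below is a generic illustration of that idea; the unloaders, service time, and arrival times are hypothetical, not Bergkvist Insjön AB data:

```python
import heapq

# Minimal discrete-event scheduling of the kind used to evaluate yard
# work orders: trucks are served in arrival order by the next free
# unloader. All machines, counts, and times are hypothetical.
def simulate_yard(trucks, unload_minutes=15.0, n_unloaders=2):
    """Return the makespan [min] of unloading trucks with the given
    arrival times [min] on a fixed number of unloader machines."""
    free_at = [0.0] * n_unloaders           # next-free time per unloader
    heapq.heapify(free_at)
    finish = 0.0
    for arrival in sorted(trucks):
        t = max(arrival, heapq.heappop(free_at))  # wait for a machine
        finish = t + unload_minutes
        heapq.heappush(free_at, finish)
    return finish

arrivals = [0, 5, 10, 12, 30]               # truck arrival times [min]
print(simulate_yard(arrivals))              # makespan with two unloaders
```

Comparing the makespan of alternative work orders or machine assignments is how such a model can recommend the most efficient routine without disrupting real yard operations.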
Abstract:
Running hydrodynamic models interactively allows both visual exploration and change of model state during simulation. One of the main characteristics of an interactive model is that it should provide immediate feedback to the user, for example responding to changes in model state or view settings. For this reason, such features are usually only available for models with a relatively small number of computational cells, which are used mainly for demonstration and educational purposes. It would be useful if interactive modelling also worked for the models typically used in consultancy projects involving large-scale simulations. This introduces a number of technical challenges related to the combination of the model itself and the visualisation tools (scalability, and the implementation of an appropriate API for control and access to the internal state). While model parallelisation is increasingly addressed by the environmental modelling community, little effort has been spent on developing a high-performance interactive environment. What can we learn from other high-end visualisation domains, such as 3D animation, gaming, and virtual globes (Autodesk 3ds Max, Second Life, Google Earth), that also focus on efficient interaction with 3D environments? In these domains high efficiency is usually achieved by the use of computer graphics algorithms such as surface simplification depending on the current view and distance to objects, and efficient caching of aggregated representations of object meshes. We investigate how these algorithms can be re-used in the context of interactive hydrodynamic modelling without significant changes to the model code, allowing model operation on both multi-core CPU personal computers and high-performance computer clusters.
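The surface-simplification idea borrowed from games and virtual globes usually reduces to a level-of-detail (LOD) choice per mesh tile based on its distance from the camera. A minimal sketch of that selection logic (the thresholds and level count are illustrative assumptions):

```python
# Distance-based level-of-detail choice, as in games and virtual globes:
# far-away mesh tiles are swapped for coarser, cached simplifications so
# the interactive view stays responsive. Thresholds [m] are illustrative.
def lod_level(distance, thresholds=(200.0, 1000.0, 5000.0)):
    """Return 0 for the full-resolution mesh up to len(thresholds)
    for the coarsest aggregated representation."""
    for level, limit in enumerate(thresholds):
        if distance < limit:
            return level
    return len(thresholds)

print([lod_level(d) for d in (50, 500, 2500, 10000)])
```

Applied to a hydrodynamic grid, only nearby tiles would be rendered at full resolution while distant ones use cached, simplified meshes, keeping the frame rate interactive without touching the model code.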
Abstract:
The dispersion of pollutants in the environment is an issue of great interest, as it directly affects air quality, mainly in large cities. Experimental and numerical tools have been used to predict the behavior of pollutant dispersion in the atmosphere. Software has been developed based on the control-volume finite element method to obtain two-dimensional simulations of the Navier-Stokes equations and heat or mass transport in regions with obstacles, varying the position of the pollutant source. Numerical results for several applications were obtained and, whenever possible, compared with literature results, showing satisfactory agreement.
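The scalar transport solved by such codes obeys the advection-diffusion equation ∂c/∂t + u·∇c = D·∇²c. As a minimal illustration of the physics (a deliberately simpler explicit finite-difference discretization on a uniform grid, not the paper's control-volume finite element method; wind, diffusivity, and grid are assumed):

```python
import numpy as np

# Explicit upwind/central scheme for 2D advection-diffusion of a scalar c:
#   dc/dt + u * dc/dx = D * (d2c/dx2 + d2c/dy2)
# Uniform wind in +x; a continuous point source mimics a pollutant emitter.
nx, ny, h = 60, 40, 10.0        # grid cells and spacing [m], assumed
u, D, dt = 2.0, 5.0, 1.0        # wind [m/s], diffusivity [m2/s], step [s]
src = (10, 20)                  # source cell (i, j)

c = np.zeros((nx, ny))
for _ in range(200):
    c[src] += 1.0                                  # emission each step
    lap = (np.roll(c, 1, 0) + np.roll(c, -1, 0) +
           np.roll(c, 1, 1) + np.roll(c, -1, 1) - 4 * c) / h**2
    adv = u * (c - np.roll(c, 1, 0)) / h           # upwind in x (u > 0)
    c = c + dt * (D * lap - adv)
    c[0, :] = c[-1, :] = 0.0                       # crude open boundaries
    c[:, 0] = c[:, -1] = 0.0

# The plume drifts downwind: concentration 5 cells downwind of the
# source exceeds that 5 cells upwind.
print(c[src[0] + 5, src[1]] > c[src[0] - 5, src[1]])
```

The chosen steps satisfy the explicit stability limits (u·dt/h = 0.2 and 4·D·dt/h² = 0.2), which keeps the scheme positive; obstacles, as in the paper, would enter as internal boundary conditions.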