939 results for Numeric simulations


Relevance: 20.00%

Abstract:

The development of northern high-latitude peatlands has played an important role in the carbon (C) balance of the land biosphere since the Last Glacial Maximum (LGM). At present, carbon storage in northern peatlands is substantial, estimated at 500 ± 100 Pg C (1 Pg C = 10¹⁵ g C). Here, we develop and apply a peatland module embedded in a dynamic global vegetation and land surface process model (LPX-Bern 1.0). The peatland module features a dynamic nitrogen cycle, dynamic C transfer between the peatland acrotelm (upper oxic layer) and catotelm (deeper anoxic layer), hydrology- and temperature-dependent respiration rates, and peatland-specific plant functional types. Nitrogen limitation down-regulates average modern net primary productivity over peatlands by about half. Decadal acrotelm-to-catotelm C fluxes vary between −20 and +50 g C m⁻² yr⁻¹ over the Holocene. Key model parameters are calibrated against reconstructed peat accumulation rates from peat-core data. The model reproduces the major features of the peat-core data and of the observation-based modern circumpolar soil carbon distribution. Results from a set of simulations for possible evolutions of northern peat development and areal extent show that soil C stocks in modern peatlands increased by 365–550 Pg C since the LGM, of which 175–272 Pg C accumulated between 11 and 5 kyr BP. Furthermore, our simulations suggest a persistent C sequestration rate of 35–50 Pg C per 1000 yr in present-day peatlands under current climate conditions, and that this C sink could either persist or turn into a source by 2100 AD, depending on climate trajectories projected for the different representative greenhouse gas concentration pathways.
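The abstract mentions hydrology- and temperature-dependent respiration rates but does not give LPX-Bern 1.0's actual equations. A common formulation combines a Q10 temperature factor with a water-table-dependent oxic fraction; the sketch below is purely illustrative, and all parameter names and values are hypothetical, not taken from the model.

```python
# Illustrative sketch only: LPX-Bern 1.0's real formulation is not given in
# the abstract. A generic Q10-style decomposition rate scaled by a
# water-table-dependent oxic factor (all parameters hypothetical).

def catotelm_respiration(c_stock_kg_m2, temp_c, water_table_m,
                         k_base=1e-4, q10=2.0, t_ref_c=10.0):
    """Hypothetical daily respiration flux (kg C m-2 day-1)."""
    # Q10 scaling: the rate doubles for every 10 degC above the reference.
    temp_factor = q10 ** ((temp_c - t_ref_c) / 10.0)
    # Deeper water table -> larger oxic fraction of the peat -> faster decay.
    oxic_factor = min(1.0, max(0.1, water_table_m / 0.3))
    return c_stock_kg_m2 * k_base * temp_factor * oxic_factor

flux = catotelm_respiration(50.0, temp_c=5.0, water_table_m=0.15)
```

Warmer or drier conditions both raise the simulated flux, which is the qualitative behavior the abstract describes for peat decomposition.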

Relevance: 20.00%

Abstract:

Changes in Greenland accumulation and the stability in the relationship between accumulation variability and large-scale circulation are assessed by performing time-slice simulations for the present day, the preindustrial era, the early Holocene, and the Last Glacial Maximum (LGM) with a comprehensive climate model. The stability issue is an important prerequisite for reconstructions of Northern Hemisphere atmospheric circulation variability based on accumulation or precipitation proxy records from Greenland ice cores. The analysis reveals that the relationship between accumulation variability and large-scale circulation undergoes a significant seasonal cycle. As the contributions of the individual seasons to the annual signal change, annual mean accumulation variability is not necessarily related to the same atmospheric circulation patterns during the different climate states. Interestingly, within a season, local Greenland accumulation variability is indeed linked to a consistent circulation pattern, which is observed for all studied climate periods, even for the LGM. Hence, it would be possible to deduce a reliable reconstruction of seasonal atmospheric variability (e.g., for North Atlantic winters) if an accumulation or precipitation proxy were available that resolves single seasons. We further show that the simulated impacts of orbital forcing and changes in the ice sheet topography on Greenland accumulation exhibit strong spatial differences, emphasizing that accumulation records from different ice core sites regarding both interannual and long-term (centennial to millennial) variability cannot be expected to look alike since they include a distinct local signature. The only uniform signal to external forcing is the strong decrease in Greenland accumulation during glacial (LGM) conditions and an increase associated with the recent rise in greenhouse gas concentrations.

Relevance: 20.00%

Abstract:

With the observation that stochasticity is important in biological systems, chemical kinetics has begun to receive wider interest. While Monte Carlo discrete-event simulations most accurately capture the variability of molecular species, they become computationally costly for complex reaction-diffusion systems with large populations of molecules. Continuous-time models, on the other hand, are computationally efficient but fail to capture any variability in the molecular species. In this study a hybrid stochastic approach is introduced for simulating reaction-diffusion systems. We developed an adaptive partitioning strategy in which processes with high frequency are simulated with deterministic rate-based equations, and those with low frequency with the exact stochastic algorithm of Gillespie. The stochastic behavior of cellular pathways is thereby preserved while the method remains applicable to large populations of molecules. We describe our method and demonstrate its accuracy and efficiency compared with the Gillespie algorithm for two different systems: first, a model of intracellular viral kinetics with two steady states, and second, a compartmental model of the postsynaptic spine head for studying the dynamics of Ca²⁺ and NMDA receptors.
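The exact stochastic algorithm referenced here is Gillespie's direct method: the time to the next reaction event is exponentially distributed with rate equal to the total propensity, and the reaction that fires is chosen with probability proportional to its propensity. A minimal sketch on a toy birth-death process (not the viral-kinetics or spine-head models of the paper):

```python
import random

def gillespie_birth_death(x0, k_birth, k_death, t_end, seed=42):
    """Gillespie direct method for a toy system:
    X -> X+1 at constant rate k_birth, X -> X-1 at rate k_death * X."""
    rng = random.Random(seed)
    t, x = 0.0, x0
    history = [(t, x)]
    while t < t_end:
        a1 = k_birth          # propensity of the birth reaction
        a2 = k_death * x      # propensity of the death reaction
        a0 = a1 + a2
        if a0 == 0.0:
            break
        # Waiting time to the next event is Exp(a0)-distributed.
        t += rng.expovariate(a0)
        # Pick the firing reaction in proportion to its propensity.
        x += 1 if rng.random() * a0 < a1 else -1
        history.append((t, x))
    return history

traj = gillespie_birth_death(x0=10, k_birth=5.0, k_death=0.5, t_end=10.0)
```

In a hybrid scheme of the kind the abstract describes, only the low-frequency reactions would be stepped this way, while high-propensity channels are integrated as deterministic rate equations.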

Relevance: 20.00%

Abstract:

Monte Carlo simulations arrive at their results by introducing randomness, sometimes derived from a physical randomizing device. Nonetheless, we argue, they open no new epistemic channels beyond that already employed by traditional simulations: the inference by ordinary argumentation of conclusions from assumptions built into the simulations. We show that Monte Carlo simulations cannot produce knowledge other than by inference, and that they resemble other computer simulations in the manner in which they derive their conclusions. Simple examples of Monte Carlo simulations are analysed to identify the underlying inferences.
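The claim that a Monte Carlo result is reached by ordinary inference from built-in assumptions can be seen in the simplest textbook case, estimating π: the estimate follows, by standard statistical reasoning, from the assumption that sample points are uniformly distributed on the unit square. A minimal illustration (not one of the examples analysed in the paper):

```python
import random

def estimate_pi(n_samples, seed=0):
    """Monte Carlo estimate of pi: the fraction of uniform points in the
    unit square falling inside the quarter circle, multiplied by 4."""
    rng = random.Random(seed)
    inside = sum(1 for _ in range(n_samples)
                 if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * inside / n_samples

print(estimate_pi(100_000))  # approaches 3.14159... as n grows
```

The randomness here is only a sampling device; the conclusion (the area ratio is π/4) is fixed by the geometric assumptions, which is the point the authors press.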

Relevance: 20.00%

Abstract:

Monte Carlo simulation is a powerful method in many natural and social sciences. But what sort of method is it? And where does its power come from? Are Monte Carlo simulations experiments, theories or something else? The aim of this talk is to answer these questions and to explain the power of Monte Carlo simulations. I provide a classification of Monte Carlo techniques and defend the claim that Monte Carlo simulation is a sort of inference.

Relevance: 20.00%

Abstract:

Optimized regional climate simulations are conducted using the Polar MM5, a version of the fifth-generation Pennsylvania State University-NCAR Mesoscale Model (MM5), with a 60-km horizontal resolution domain over North America during the Last Glacial Maximum (LGM, 21 000 calendar years ago), when much of the continent was covered by the Laurentide Ice Sheet (LIS). The objective is to describe the LGM annual cycle at high spatial resolution with an emphasis on the winter atmospheric circulation. Output from a tailored NCAR Community Climate Model version 3 (CCM3) simulation of the LGM climate is used to provide the initial and lateral boundary conditions for Polar MM5. LGM boundary conditions include continental ice sheets, appropriate orbital forcing, reduced CO2 concentration, paleovegetation, modified sea surface temperatures, and lowered sea level. Polar MM5 produces a substantially different atmospheric response to the LGM boundary conditions than CCM3 and other recent GCM simulations. In particular, from November to April the upper-level flow is split around a blocking anticyclone over the LIS, with a northern branch over the Canadian Arctic and a southern branch impacting southern North America. The split flow pattern is most pronounced in January and transitions into a single, consolidated jet stream that migrates northward over the LIS during summer. Sensitivity experiments indicate that the winter split flow in Polar MM5 is primarily due to mechanical forcing by LIS, although model physics and resolution also contribute to the simulated flow configuration. Polar MM5 LGM results are generally consistent with proxy climate estimates in the western United States, Alaska, and the Canadian Arctic and may help resolve some long-standing discrepancies between proxy data and previous simulations of the LGM climate.

Relevance: 20.00%

Abstract:

The Princeton Ocean Model is used to study the circulation features in the Pearl River Estuary and their responses to tide, river discharge, wind, and heat flux during the dry winter and wet summer seasons. The model has an orthogonal curvilinear grid in the horizontal plane, with variable spacing from 0.5 km in the estuary to 1 km on the shelf, and 15 sigma levels in the vertical direction. The initial conditions and the subtidal open boundary forcing are obtained from an associated larger-scale model of the northern South China Sea. Buoyancy forcing uses the climatological monthly heat fluxes and river discharges, and both the climatological monthly wind and the realistic wind are used in the sensitivity experiments. The tidal forcing is represented by sinusoidal functions with the observed amplitudes and phases. In this paper, the simulated tide is first examined. The simulated seasonal distributions of salinity, as well as the temporal variations of salinity and velocity over a tidal cycle, are described and then compared with the in situ survey data from July 1999 and January 2000. The model successfully reproduces the main hydrodynamic processes, such as stratification, mixing, frontal dynamics, summer upwelling, and two-layer gravitational circulation, as well as the distributions of hydrodynamic parameters in the Pearl River Estuary and coastal waters for both the winter and the summer season.
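Representing tidal forcing by sinusoidal functions amounts to prescribing the boundary elevation as a sum of constituents, η(t) = Σᵢ Aᵢ cos(2πt/Tᵢ − φᵢ), with observed amplitudes Aᵢ and phases φᵢ. A generic sketch follows; the constituent amplitudes and phases below are hypothetical placeholders, not the observed Pearl River Estuary values used in the paper:

```python
import math

# Hypothetical constituents: (name, amplitude m, period h, phase rad).
# The paper prescribes observed values; these are placeholders.
CONSTITUENTS = [
    ("M2", 1.00, 12.42, 0.0),
    ("S2", 0.40, 12.00, 0.5),
    ("K1", 0.30, 23.93, 1.0),
    ("O1", 0.25, 25.82, 1.5),
]

def tidal_elevation(t_hours):
    """Boundary sea-surface elevation (m) as a sum of sinusoids:
    eta(t) = sum_i A_i * cos(2*pi*t/T_i - phi_i)."""
    return sum(a * math.cos(2.0 * math.pi * t_hours / period - phase)
               for _, a, period, phase in CONSTITUENTS)

eta_noon = tidal_elevation(12.0)
```

Summing semidiurnal (M2, S2) and diurnal (K1, O1) constituents like this reproduces the mixed tidal character typical of South China Sea coastal waters.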

Relevance: 20.00%

Abstract:

We present a conceptual prototype model of a focal plane array unit for the STEAMR instrument, highlighting the challenges presented by the instrument's required high relative beam proximity and focusing on how edge-diffraction effects contribute to the array's performance. The analysis was carried out as a comparative process using both PO & PTD and MoM techniques. We first highlight general differences between these computational techniques, with the discussion focusing on diffractive edge effects for near-field imaging reflectors with high truncation. We then present the results of in-depth modeling analyses of the STEAMR focal plane array, followed by near-field antenna measurements of a breadboard model of the array. The results of these near-field measurements agree well with both simulation techniques, although MoM shows slightly higher complex beam coupling to the measurements than PO & PTD.

Relevance: 20.00%

Abstract:

Peptide dendrimers are synthetic tree-like molecules composed of amino acids. There are at least two kinds of preferential structural behaviors exhibited by these molecules, which acquire either compact or noncompact shapes. However, the key structural determinants of such behaviors remained, until now, unstudied. Herein, we conduct a comprehensive investigation of the structural determinants of peptide dendrimers by employing long molecular dynamics simulations to characterize an extended set of third-generation dendrimers. Our results clearly show that a trade-off between electrostatic effects and hydrogen bond formation controls structure acquisition in these systems. Moreover, by selectively changing the dendrimers' charge we are able to manipulate the exhibited compactness. In contrast, the length of branching residues does not seem to be a major structural determinant. Our results are in accordance with the most recent experimental evidence and shed some light on the key molecular-level interactions controlling structure acquisition in these systems. Thus, the results presented constitute valuable insights that can contribute to the development of truly tailor-made dendritic systems.

Relevance: 20.00%

Abstract:

This study aims to evaluate the direct effects of anthropogenic deforestation on simulated climate at two contrasting periods in the Holocene, ~6 and ~0.2 k BP, in Europe. We apply the Rossby Centre regional climate model RCA3, with 50 km spatial resolution, for both time periods, considering three alternative descriptions of the past vegetation: (i) potential natural vegetation (V) simulated by the dynamic vegetation model LPJ-GUESS, (ii) potential vegetation with anthropogenic land use (deforestation) from the HYDE3.1 (History Database of the Global Environment) scenario (V + H3.1), and (iii) potential vegetation with anthropogenic land use from the KK10 scenario (V + KK10). The climate model results show that the simulated effects of deforestation depend on both local/regional climate and vegetation characteristics. At ~6 k BP the extent of simulated deforestation in Europe is generally small, but there are areas where deforestation is large enough to produce significant differences in summer temperatures of 0.5–1 °C. At ~0.2 k BP, extensive deforestation, particularly according to the KK10 scenario, leads to significant temperature differences in large parts of Europe in both winter and summer. In winter, deforestation leads to lower temperatures because of the differences in albedo between forested and unforested areas, particularly in snow-covered regions. In summer, deforestation leads to higher temperatures in central and eastern Europe because evapotranspiration from unforested areas is lower than from forests. Summer evaporation is already limited in the southernmost parts of Europe under potential vegetation conditions and, therefore, cannot become much lower. Accordingly, the albedo effect dominates in southern Europe in summer as well, which implies that deforestation causes a decrease in temperatures.
Differences in summer temperature due to deforestation range from −1 °C in south-western Europe to +1 °C in eastern Europe. The choice of anthropogenic land-cover scenario has a significant influence on the simulated climate, but uncertainties in palaeoclimate proxy data for the two time periods do not allow for a definitive discrimination among climate model results.

Relevance: 20.00%

Abstract:

Is numerical mimicry a third way of establishing truth? Kevin Heng received his M.S. and Ph.D. in astrophysics from the Joint Institute for Laboratory Astrophysics (JILA) and the University of Colorado at Boulder. He was at the Institute for Advanced Study in Princeton from 2007 to 2010, first as a Member and later as the Frank & Peggy Taplin Member. From 2010 to 2012 he was a Zwicky Prize Fellow at ETH Zürich (the Swiss Federal Institute of Technology). In 2013, he joined the Center for Space and Habitability (CSH) at the University of Bern, Switzerland, as a tenure-track assistant professor, where he leads the Exoplanets and Exoclimes Group. He has worked on, and maintains, a broad range of interests in astrophysics: shocks, extrasolar asteroid belts, planet formation, fluid dynamics, brown dwarfs and exoplanets. He coordinates the Exoclimes Simulation Platform (ESP), an open-source set of theoretical tools designed for studying the basic physics and chemistry of exoplanetary atmospheres and climates (www.exoclime.org). He is involved in the CHEOPS (Characterizing Exoplanet Satellite) space telescope, a mission approved by the European Space Agency (ESA) and led by Switzerland. He spends a fair amount of time humbly learning the lessons gleaned from studying the Earth and Solar System planets, as related to him by atmospheric, climate and planetary scientists. He received a Sigma Xi Grant-in-Aid of Research in 2006.

Relevance: 20.00%

Abstract:

We review our recent work on protein-ligand interactions in vitamin transporters of the Sec-14-like protein family. Our studies focused on the cellular retinaldehyde-binding protein (CRALBP) and the alpha-tocopherol transfer protein (alpha-TTP). CRALBP is responsible for the mobilisation and photo-protection of short-chain cis-retinoids in the dim-light visual cycle of rod photoreceptors. alpha-TTP is a key protein responsible for the selection and retention of RRR-alpha-tocopherol, the most active isoform of vitamin E in higher animals. Our simulation studies show how subtle chemical variations in the substrate can lead to significant distortion in the structure of the complex, and how these changes can either lead to new protein function or be used to model engineered protein variants with tailored properties. Finally, we show how the integration of computational and experimental results can contribute synergistically to the understanding of fundamental processes at the biomolecular scale.