146 results for Models and Principles


Relevance:

100.00%

Publisher:

Abstract:

We investigate the spatial characteristics of urban-like canopy flow by applying particle image velocimetry (PIV) to atmospheric turbulence. The study site was the Comprehensive Outdoor Scale MOdel (COSMO) experiment for urban climate in Japan. The PIV system captured the two-dimensional flow field within the canopy layer continuously for an hour at a sampling frequency of 30 Hz, thereby providing reliable outdoor turbulence statistics. PIV measurements in a wind-tunnel facility using similar roughness geometry, but with a lower sampling frequency of 4 Hz, were also performed for comparison. The turbulent momentum flux from COSMO and the wind tunnel showed similar values and distributions when scaled using friction velocity. The main differences between the outdoor and indoor flow fields were caused by the larger fluctuations in wind direction in the atmospheric turbulence. The analysis focuses on a variety of instantaneous turbulent flow structures. One remarkable flow structure is termed 'flushing': a large-scale upward motion prevailing across the whole vertical cross-section of a building gap. It is observed intermittently, whereby tracer particles are flushed vertically out of the canopy layer. Flushing phenomena are also observed in the wind tunnel, where there is neither thermal stratification nor outer-layer turbulence. It is suggested that flushing phenomena are correlated with the passage of large-scale low-momentum regions above the canopy.
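The friction-velocity scaling mentioned above is straightforward to reproduce. Below is a minimal numpy sketch (not the study's actual PIV processing) of how the kinematic momentum flux and friction velocity can be estimated from a velocity time series; the synthetic series and all its parameters are purely illustrative.

```python
import numpy as np

def friction_velocity(u, w):
    """Estimate friction velocity u* (m/s) from streamwise (u) and vertical
    (w) velocity time series via the kinematic momentum flux -<u'w'>."""
    up = u - u.mean()                    # fluctuations about the time mean
    wp = w - w.mean()
    flux = np.mean(up * wp)              # <u'w'>, negative in a sheared flow
    return float(np.sqrt(max(-flux, 0.0)))

# Illustrative series: anticorrelated u', w' as in a sheared canopy-top flow
rng = np.random.default_rng(0)
up = rng.normal(0.0, 0.5, 108000)        # ~1 h of samples at 30 Hz
w = -0.3 * up + rng.normal(0.0, 0.3, up.size)
u_star = friction_velocity(3.0 + up, w)
```

Scaling fluxes by u*² in both the outdoor and wind-tunnel datasets is what allows the direct comparison described in the abstract.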

Relevance:

100.00%

Publisher:

Abstract:

Using the plausible model of activated carbon proposed by Harris and co-workers and grand canonical Monte Carlo simulations, we study the applicability of standard methods, widely used in adsorption science, for describing adsorption data on microporous carbons. Two carbon structures are studied: one with a narrow distribution of micropores in the range up to 1 nm, and the other with micropores covering a wide range of porosity. For both structures, adsorption isotherms of noble gases (from Ne to Xe), carbon tetrachloride and benzene are simulated. The data obtained are analysed in terms of Dubinin-Radushkevich plots. Moreover, for benzene and carbon tetrachloride the temperature invariance of the characteristic curve is also studied. We show that some empirical relationships obtained from experiment can be successfully recovered using the simulated data. We then test the applicability of models related to Dubinin's theory, including the Dubinin-Izotova, Dubinin-Radushkevich-Stoeckli, and Jaroniec-Choma equations. The results obtained demonstrate the limits and applications of the studied models in the field of carbon porosity characterization.
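The Dubinin-Radushkevich analysis referred to above rests on a simple linearization: with adsorption potential A = RT ln(p0/p), the DR equation W = W0 exp[-(A/E)²] becomes a straight line of ln W against A². A hedged sketch of this fit (synthetic data and invented parameter values, not the paper's simulations):

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def dr_transform(p_rel, W, T):
    """Return Dubinin-Radushkevich coordinates (A^2, ln W), where
    A = RT ln(p0/p) is the adsorption potential."""
    A = R * T * np.log(1.0 / p_rel)
    return A**2, np.log(W)

# Synthetic isotherm obeying DR with W0 = 0.4 cm^3/g and E = 20 kJ/mol
T, W0, E = 298.15, 0.4, 20e3
p_rel = np.linspace(1e-4, 0.3, 50)           # relative pressures p/p0
A = R * T * np.log(1.0 / p_rel)
W = W0 * np.exp(-(A / E) ** 2)

x, y = dr_transform(p_rel, W, T)
slope, intercept = np.polyfit(x, y, 1)       # ln W = ln W0 - A^2 / E^2
W0_fit = np.exp(intercept)                   # recovered micropore volume
E_fit = np.sqrt(-1.0 / slope)                # recovered characteristic energy
```

On ideal DR data the fit recovers W0 and E exactly; deviations of real or simulated isotherms from this straight line are what the models discussed above (Dubinin-Izotova, Dubinin-Radushkevich-Stoeckli, Jaroniec-Choma) attempt to capture.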

Relevance:

100.00%

Publisher:

Abstract:

This study evaluated the effects of fat and sugar levels on the surface properties of Lactobacillus rhamnosus GG during storage in food model systems, simulating yogurt and ice cream, and related them with the ability of the bacterial cells to adhere to Caco-2 cells. Freeze-dried L. rhamnosus GG cells were added to the model food systems and stored for 7 days. The bacterial cells were analyzed for cell viability, hydrophobicity, ζ potential, and their ability to adhere to Caco-2 cells. The results indicated that the food type and its composition affected the surface and adhesion properties of the bacterial cells during storage, with yogurt being a better delivery vehicle than ice cream in terms of bacterial adhesion to Caco-2 cells. The most important factor influencing bacterial adhesion was the storage time rather than the levels of fats and sugars, indicating that conformational changes were taking place on the surface of the bacterial cells during storage.

Relevance:

100.00%

Publisher:

Abstract:

Government targets for CO2 reductions are being progressively tightened; the Climate Change Act set the UK target as an 80% reduction by 2050 on 1990 figures. The residential sector accounts for about 30% of emissions. This paper discusses current modelling techniques in the residential sector, principally top-down and bottom-up. Top-down models work on a macro-economic basis and can be used to consider large-scale economic changes; bottom-up models are detail-rich and can represent technological change. Bottom-up models demonstrate what is technically possible. However, there are differences between this technical potential and what is likely, given the limited economic rationality of the typical householder. This paper recommends research to better understand individuals' behaviour. Such research needs to include actual choices, stated preferences and opinion research to allow a detailed understanding of the individual end user. This increased understanding can then be used in an agent-based model (ABM). In an ABM, agents represent real-world actors and can be given a rule set intended to emulate the actions and behaviours of real people. This can help in understanding how new technologies diffuse. In this way a degree of micro-economic realism can be added to domestic carbon modelling. Such a model should then be of use both for forward projections of CO2 emissions and for analysing the cost-effectiveness of various policy measures.
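To make the ABM idea concrete, here is a toy sketch of technology diffusion among householder agents. The network, adoption thresholds, willingness-to-pay values and cost trajectory are all invented for illustration and are not drawn from the paper:

```python
import random

class Householder:
    """Toy agent: adopts a low-carbon technology once enough neighbours
    have adopted (a simple social-influence rule) and the technology's
    cost falls below the agent's willingness to pay."""
    def __init__(self, rng):
        self.adopted = False
        self.threshold = rng.uniform(0.1, 0.6)    # fraction of neighbours needed
        self.max_cost = rng.uniform(2000, 8000)   # willingness to pay (GBP)

def step(agents, neighbours, cost):
    for i, a in enumerate(agents):
        if a.adopted:
            continue
        frac = sum(agents[j].adopted for j in neighbours[i]) / len(neighbours[i])
        if frac >= a.threshold and cost <= a.max_cost:
            a.adopted = True

rng = random.Random(1)
agents = [Householder(rng) for _ in range(200)]
# ring network: each agent observes its 4 nearest neighbours
neighbours = [[(i - 2) % 200, (i - 1) % 200, (i + 1) % 200, (i + 2) % 200]
              for i in range(200)]
for a in rng.sample(agents, 10):
    a.adopted = True                              # seed early adopters
for year in range(30):
    step(agents, neighbours, cost=6000 - 100 * year)  # falling technology cost
adoption = sum(a.adopted for a in agents) / len(agents)
```

Even this toy version shows the paper's point: the diffusion outcome depends on heterogeneous individual rules (thresholds, budgets) rather than on a single representative, economically rational householder.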

Relevance:

100.00%

Publisher:

Abstract:

Current variability of precipitation (P) and its response to surface temperature (T) are analysed using coupled (CMIP5) and atmosphere-only (AMIP5) climate model simulations and compared with observational estimates. There is striking agreement between Global Precipitation Climatology Project (GPCP) observed and AMIP5 simulated P anomalies over land, both globally and in the tropics, suggesting that prescribed sea surface temperatures and realistic radiative forcings are sufficient for simulating the interannual variability in continental P. Differences between the observed and simulated P variability over the ocean originate primarily from the wet tropical regions, in particular the western Pacific, but are reduced slightly after 1995. All datasets show positive responses of P to T globally, of around 2%/K for simulations and 3-4%/K in GPCP observations, but model responses over the tropical oceans are around three times smaller than GPCP over the period 1988-2005. The observed anticorrelation between land and ocean P, linked with the El Niño-Southern Oscillation, is captured by the simulations. All datasets over the tropical ocean show a tendency for wet regions to become wetter and dry regions drier with warming. Over the wet region (above the 75th precipitation percentile), the precipitation response is ~13-15%/K for GPCP and ~5%/K for the models, while trends in P are 2.4%/decade for GPCP, 0.6%/decade for CMIP5 and 0.9%/decade for AMIP5, suggesting that either the models are underestimating the precipitation responses or a deficiency exists in the satellite datasets.
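The %/K responses quoted above are conventionally obtained by regressing interannual precipitation anomalies (expressed as a percentage of climatology) on temperature anomalies. A minimal sketch with synthetic anomalies; the 2.5 %/K slope and noise level are built in purely for illustration and are not taken from the datasets:

```python
import numpy as np

def response_percent_per_K(P_anom_pct, T_anom):
    """Least-squares slope of precipitation anomalies (% of climatology)
    regressed on temperature anomalies (K): the response in %/K."""
    slope, _ = np.polyfit(T_anom, P_anom_pct, 1)
    return float(slope)

rng = np.random.default_rng(42)
T_anom = rng.normal(0.0, 0.2, 18)                  # 18 years of T anomalies (K)
P_anom = 2.5 * T_anom + rng.normal(0.0, 0.3, 18)   # built-in 2.5 %/K response
est = response_percent_per_K(P_anom, T_anom)
```

With only ~18 annual samples the estimate carries substantial sampling noise, which is one reason observed and simulated responses can legitimately differ.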

Relevance:

100.00%

Publisher:

Abstract:

Climate modelling is a complex process, requiring accurate and complete metadata in order to identify, assess and use climate data stored in digital repositories. The preservation of such data is increasingly important given the development of increasingly complex models to predict the effects of global climate change. The EU METAFOR project has developed a Common Information Model (CIM) to describe climate data and the models and modelling environments that produce it. There is a wide degree of variability between different climate models and modelling groups. To accommodate this, the CIM has been designed to be highly generic and flexible, with extensibility built in. METAFOR describes the climate modelling process simply as "an activity undertaken using software on computers to produce data." This process has been described as separate UML packages (and, ultimately, XML schemas). This fairly generic structure can be paired with more specific "controlled vocabularies" in order to restrict the range of valid CIM instances. The CIM will aid the digital preservation of climate models by providing an accepted standard structure for the model metadata. Tools to write and manage CIM instances, and to allow convenient and powerful searches of CIM databases, are also under development. Community buy-in of the CIM has been achieved through a continual process of consultation with the climate modelling community, and through the METAFOR team's development of a questionnaire that will be used to collect the metadata for the Intergovernmental Panel on Climate Change's (IPCC) Coupled Model Intercomparison Project Phase 5 (CMIP5) model runs.

Relevance:

100.00%

Publisher:

Abstract:

Robust and physically understandable responses of the global atmospheric water cycle to a warming climate are presented. By considering interannual responses to changes in surface temperature (T), observations and AMIP5 simulations agree on an increase in column-integrated water vapor at the rate of 7%/K (in line with the Clausius-Clapeyron equation) and of precipitation at 2-3%/K (in line with energetic constraints). Using simple and complex climate models, we demonstrate that radiative forcing by greenhouse gases is currently suppressing global precipitation (P) at ~-0.15%/decade. Along with natural variability, this can explain why observed trends in global P over the period 1988-2008 are close to zero. Regional responses in the global water cycle are strongly constrained by changes in moisture fluxes. Model simulations show an increased moisture flux into the tropical wet region at 900 hPa and an enhanced outflow (of smaller magnitude) at around 600 hPa with warming. Moisture transport explains an increase in P in the wet tropical regions and small or negative changes in the dry regions of the subtropics in CMIP5 simulations of a warming climate. For AMIP5 simulations and satellite observations, the heaviest 5-day rainfall totals increase in intensity at ~15%/K over the ocean, with reductions at all percentiles over land. The climate change response in CMIP5 simulations shows consistent increases in P over ocean and land for the highest intensities, close to the Clausius-Clapeyron scaling of 7%/K, while P declines for the lowest percentiles, indicating that interannual variability over land may not be a good proxy for climate change. The local changes in precipitation and its extremes are highly dependent upon small shifts in the large-scale atmospheric circulation and regional feedbacks.
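The ~7%/K Clausius-Clapeyron figure follows directly from d(ln e_s)/dT = L / (R_v T²). A quick check at a typical near-surface temperature, using standard values of the constants (not numbers from the paper):

```python
L = 2.5e6      # latent heat of vaporization of water, J kg^-1
Rv = 461.5     # specific gas constant for water vapour, J kg^-1 K^-1

def cc_rate(T):
    """Fractional increase of saturation vapour pressure per kelvin,
    d(ln e_s)/dT = L / (Rv * T**2), from the Clausius-Clapeyron equation."""
    return L / (Rv * T**2)

rate_288 = cc_rate(288.0) * 100.0   # percent per kelvin near the surface
# ~6.5 %/K at 288 K, i.e. the often-quoted "about 7 %/K"
```

Because global precipitation is instead constrained energetically, it rises at only 2-3%/K, which is why the moisture and precipitation responses quoted above differ.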

Relevance:

100.00%

Publisher:

Abstract:

The goal of the Chemistry-Climate Model Validation (CCMVal) activity is to improve understanding of chemistry-climate models (CCMs) through process-oriented evaluation and to provide reliable projections of stratospheric ozone and its impact on climate. An appreciation of the details of model formulations is essential for understanding how models respond to the changing external forcings of greenhouse gases and ozone-depleting substances, and hence for understanding the ozone and climate forecasts produced by the models participating in this activity. Here we introduce and review the models used for the second round (CCMVal-2) of this intercomparison, regarding the implementation of chemical, transport, radiative, and dynamical processes in these models. In particular, we review the advantages and problems associated with approaches used to model processes of relevance to stratospheric dynamics and chemistry. Furthermore, we state the definitions of the reference simulations performed, and describe the forcing data used in these simulations. We identify some developments in chemistry-climate modeling that make models more physically based or more comprehensive, including the introduction of an interactive ocean, online photolysis, troposphere-stratosphere chemistry, and non-orographic gravity-wave deposition linked to tropospheric convection. These relatively new developments indicate that stratospheric CCM modeling is becoming more consistent with our physically based understanding of the atmosphere.

Relevance:

100.00%

Publisher:

Abstract:

Recent high-resolution radiosonde climatologies have revealed a tropopause inversion layer (TIL) in the extratropics: temperature strongly increases just above a sharp local cold point tropopause. Here, it is asked to what extent a TIL exists in current general circulation models (GCMs) and meteorological analyses. Only a weak hint of a TIL exists in NCEP/NCAR reanalysis data. In contrast, the Canadian Middle Atmosphere Model (CMAM), a comprehensive GCM, exhibits a TIL of realistic strength. However, in data assimilation mode CMAM exhibits a much weaker TIL, especially in the Southern Hemisphere where only coarse satellite data are available. The discrepancy between the analyses and the GCM is thus hypothesized to be mainly due to data assimilation acting to smooth the observed strong curvature in temperature around the tropopause. This is confirmed in the reanalysis where the stratification around the tropopause exhibits a strong discontinuity at the start of the satellite era.

Relevance:

100.00%

Publisher:

Abstract:

This paper presents single-column model (SCM) simulations of a tropical squall-line case observed during the Coupled Ocean-Atmosphere Response Experiment of the Tropical Ocean/Global Atmosphere Programme. This case study was part of an international model intercomparison project organized by Working Group 4 'Precipitating Convective Cloud Systems' of the GEWEX (Global Energy and Water-cycle Experiment) Cloud System Study. Eight SCM groups using different deep-convection parametrizations participated in this project. The SCMs were forced by temperature and moisture tendencies that had been computed from a reference cloud-resolving model (CRM) simulation using open boundary conditions. The comparison of the SCM results with the reference CRM simulation provided insight into the ability of current convection and cloud schemes to represent organized convection. The CRM results enabled a detailed evaluation of the SCMs in terms of the thermodynamic structure and the convective mass flux of the system, the latter being closely related to the surface convective precipitation. It is shown that the SCMs could reproduce reasonably well the time evolution of the surface convective and stratiform precipitation, the convective mass flux, and the thermodynamic structure of the squall-line system. The thermodynamic structure simulated by the SCMs depended on how the models partitioned the precipitation between convective and stratiform. However, structural differences persisted in the thermodynamic profiles simulated by the SCMs and the CRM. These differences could be attributed to the fact that the total mass flux used to compute the SCM forcing differed from the convective mass flux. The SCMs could not adequately represent these organized mesoscale circulations and the microphysical/radiative forcing associated with the stratiform region. This issue is generally known as the 'scale-interaction' problem, which can only be properly addressed in fully three-dimensional simulations.
Sensitivity simulations run by several groups showed that the time evolution of the surface convective precipitation was considerably smoothed when the convective closure was based on convective available potential energy instead of moisture convergence. Finally, additional SCM simulations without using a convection parametrization indicated that the impact of a convection parametrization in forced SCM runs was more visible in the moisture profiles than in the temperature profiles because convective transport was particularly important in the moisture budget.
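For reference, the convective available potential energy invoked in the CAPE-based closure above is the vertical integral of parcel buoyancy. A schematic numpy version with invented, idealized profiles (not one of the intercomparison's soundings):

```python
import numpy as np

g = 9.81  # gravitational acceleration, m s^-2

def cape(z, T_parcel, T_env):
    """Discrete CAPE (J/kg): vertical integral of parcel buoyancy
    g * (Tp - Te) / Te, counting only positively buoyant levels
    (trapezoidal rule on height z in metres)."""
    buoy = np.maximum(g * (T_parcel - T_env) / T_env, 0.0)
    return float(np.sum(0.5 * (buoy[1:] + buoy[:-1]) * np.diff(z)))

# Illustrative profiles: a parcel 2 K warmer than its environment
# through a 10 km deep layer with a 6.5 K/km environmental lapse rate
z = np.linspace(0.0, 10000.0, 101)
T_env = 300.0 - 6.5e-3 * z
T_parcel = T_env + 2.0
value = cape(z, T_parcel, T_env)
```

A CAPE-based closure relates the convective mass flux to the consumption of this integral over a relaxation time, whereas a moisture-convergence closure ties it directly to the large-scale moisture supply; the smoother precipitation evolution reported above reflects the slower variation of CAPE.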

Relevance:

100.00%

Publisher:

Abstract:

Much has been written about Wall Street and the global financial crisis (GFC). From a fraudulent derivatives market to a contestable culture of banking bonuses, culpability has been examined within the frames of American praxis, namely that of American exceptionalism. This study begins with an exploratory analysis of non-US voices concerning the nature of the causes of the GFC. The analysis provides glimpses of the globalized extent of assumptions shared, but not debated within the globalization convergence of financial markets as the neo-liberal project. Practical and paradigmatic tensions are revealed in the capture of a London-based set of views articulated by senior financial executives of financial service organizations, the outcomes of which are not overly optimistic for any significant change in praxis within the immediate future.

Relevance:

100.00%

Publisher:

Abstract:

The term neural population models (NPMs) is used here as a catchall for a wide range of approaches that have been variously called neural mass models, mean-field models, neural field models, bulk models, and so forth. All NPMs attempt to describe the collective action of neural assemblies directly. Some NPMs treat the densely populated tissue of the cortex as an excitable medium, leading to spatially continuous cortical field theories (CFTs). An indirect approach would start by modelling individual cells and then explain the collective action of a group of cells by coupling many individual models together. In contrast, NPMs employ collective state variables, typically defined as averages over the group of cells, in order to describe the population activity directly in a single model. The strength and the weakness of this approach are hence one and the same: simplification by bulk. Is this justified and indeed useful, or does it lead to an oversimplification which fails to capture the pheno ...
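A concrete example of such collective state variables is the classic Wilson-Cowan model, a standard neural mass model cited here for illustration rather than drawn from this text: two coupled ODEs describe the mean activity of an excitatory and an inhibitory population, with all cell-level detail absorbed into a handful of coupling parameters (the values below are invented):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def wilson_cowan(steps=5000, dt=0.01):
    """Euler integration of Wilson-Cowan-type mean-field equations:
    the state variables E and I stand for whole excitatory and
    inhibitory populations rather than individual neurons."""
    E, I = 0.1, 0.1
    # illustrative coupling weights, external drive, and time constants
    wEE, wEI, wIE, wII, P = 12.0, 10.0, 10.0, 2.0, 1.5
    tauE, tauI = 1.0, 2.0
    for _ in range(steps):
        dE = (-E + sigmoid(wEE * E - wEI * I + P)) / tauE
        dI = (-I + sigmoid(wIE * E - wII * I)) / tauI
        E += dt * dE
        I += dt * dI
    return E, I

E, I = wilson_cowan()
```

The "simplification by bulk" is visible here: two state variables replace thousands of cell models, at the price of discarding all within-population heterogeneity.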

Relevance:

100.00%

Publisher:

Abstract:

In this paper, I seek to undermine G.A. Cohen’s polemical use of a metaethical claim he makes in his article, ‘Facts and Principles’, by arguing that that use requires an unsustainable equivocation between epistemic and logical grounding. I begin by distinguishing three theses that Cohen has offered during the course of his critique of Rawls and contractualism more generally, the foundationalism about grounding thesis, the justice as non-regulative thesis, and the justice as all-encompassing thesis, and briefly argue that they are analytically independent of each other. I then offer an outline of the foundationalism about grounding thesis, characterising it, as Cohen does, as a demand of logic. That thesis claims that whenever a normative principle is dependent on a fact, it is so dependent in virtue of some other principle. I then argue that although this is true as a matter of logic, it, as Cohen admits, cannot be true of actual justifications, since logic cannot tell us anything about the truth as opposed to the validity of arguments. Facts about a justification cannot then be decisive for whether or not a given argument violates the foundationalism about grounding thesis. As long as, independently of actual justifications, theorists can point to plausible logically grounding principles, as I argue contractualists can, Cohen’s thesis lacks critical bite.

Relevance:

100.00%

Publisher:

Abstract:

Individual-based models (IBMs) can simulate the actions of individual animals as they interact with one another and the landscape in which they live. When used in spatially explicit landscapes, IBMs can show how populations change over time in response to management actions. For instance, IBMs are being used to design strategies for conservation and for the exploitation of fisheries, and to assess the effects on populations of major construction projects and of novel agricultural chemicals. In such real-world contexts, it becomes especially important to build IBMs in a principled fashion, and to approach calibration and evaluation systematically. We argue that insights from physiological and behavioural ecology offer a recipe for building realistic models, and that Approximate Bayesian Computation (ABC) is a promising technique for the calibration and evaluation of IBMs. IBMs are constructed primarily from knowledge about individuals. In ecological applications the relevant knowledge is found in physiological and behavioural ecology, and we approach these from an evolutionary perspective by taking into account how physiological and behavioural processes contribute to life histories, and how those life histories evolve. Evolutionary life history theory shows that, other things being equal, organisms should grow to sexual maturity as fast as possible, and then reproduce as fast as possible, while minimising per capita death rate. Physiological and behavioural ecology are largely built on these principles together with the laws of conservation of matter and energy. To complete the construction of an IBM, information is also needed on the effects of competitors, conspecifics and food scarcity; on the maximum rates of ingestion, growth and reproduction; and on life-history parameters. Using this knowledge about physiological and behavioural processes provides a principled way to build IBMs, but model parameters vary between species and are often difficult to measure.
A common solution is to manually compare model outputs with observations from real landscapes and so to obtain parameters which produce acceptable fits of model to data. However, this procedure can be convoluted and lead to over-calibrated and thus inflexible models. Many formal statistical techniques are unsuitable for use with IBMs, but we argue that ABC offers a potential way forward. It can be used to calibrate and compare complex stochastic models and to assess the uncertainty in their predictions. We describe methods used to implement ABC in an accessible way and illustrate them with examples and discussion of recent studies. Although much progress has been made, theoretical issues remain, and some of these are outlined and discussed.
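The ABC rejection idea described above can be sketched in a few lines: draw a parameter from the prior, run the stochastic model, and keep the draw if a summary statistic lands within a tolerance of the observation. The toy population model, prior, and tolerance below are invented for illustration; a real IBM would replace `simulate` and use several summary statistics:

```python
import random

def simulate(growth_rate, rng, n0=10.0, steps=20):
    """Toy stochastic population model standing in for a full IBM:
    each step the population grows by a noisy multiplicative factor."""
    n = n0
    for _ in range(steps):
        n *= growth_rate * rng.uniform(0.95, 1.05)
    return n

def abc_rejection(observed, n_samples=5000, eps=0.1, seed=0):
    """ABC rejection: draw a parameter from the prior, simulate, and keep
    the draw if the summary statistic (final population size) is within
    a relative distance eps of the observation."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_samples):
        theta = rng.uniform(1.0, 1.2)              # prior on the growth rate
        if abs(simulate(theta, rng) - observed) / observed < eps:
            accepted.append(theta)
    return accepted

observed = simulate(1.1, random.Random(42))        # pseudo-observation, true rate 1.1
posterior = abc_rejection(observed)
post_mean = sum(posterior) / len(posterior)        # posterior mean near 1.1
```

The accepted sample approximates the posterior without ever evaluating a likelihood, which is precisely what makes ABC attractive for IBMs, whose likelihoods are intractable; the spread of the accepted draws also quantifies the calibration uncertainty discussed above.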