116 results for Compactification and String Models


Relevance: 100.00%

Abstract:

We present a novel kinetic multi-layer model that explicitly resolves mass transport and chemical reaction at the surface and in the bulk of aerosol particles (KM-SUB). The model is based on the PRA framework of gas-particle interactions (Pöschl-Rudich-Ammann, 2007), and it includes reversible adsorption, surface reactions and surface-bulk exchange as well as bulk diffusion and reaction. Unlike earlier models, KM-SUB does not require simplifying assumptions about steady-state conditions and radial mixing. The temporal evolution and concentration profiles of volatile and non-volatile species at the gas-particle interface and in the particle bulk can be modeled along with surface concentrations and gas uptake coefficients. In this study we explore and exemplify the effects of bulk diffusion on the rate of reactive gas uptake for a simple reference system, the ozonolysis of oleic acid particles, in comparison to experimental data and earlier model studies. We demonstrate how KM-SUB can be used to interpret and analyze experimental data from laboratory studies, and how the results can be extrapolated to atmospheric conditions. In particular, we show how interfacial and bulk transport, i.e., surface accommodation, bulk accommodation and bulk diffusion, influence the kinetics of the chemical reaction. Sensitivity studies suggest that in fine air particulate matter oleic acid and compounds with similar reactivity against ozone (carbon-carbon double bonds) can reach chemical lifetimes of many hours only if they are embedded in a (semi-)solid matrix with very low diffusion coefficients (< 10⁻¹⁰ cm² s⁻¹). Depending on the complexity of the investigated system, unlimited numbers of volatile and non-volatile species and chemical reactions can be flexibly added and treated with KM-SUB. We propose and intend to pursue the application of KM-SUB as a basis for the development of a detailed master mechanism of aerosol chemistry as well as for the derivation of simplified but realistic parameterizations for large-scale atmospheric and climate models.
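
To make the layered transport-reaction idea concrete, here is a minimal sketch (not the published KM-SUB code) of ozone diffusing into, and reacting with oleic acid inside, a particle resolved as a stack of bulk layers. All parameter values and the planar-geometry simplification are illustrative assumptions.

    # Minimal layered diffusion-reaction sketch; parameter values are placeholders.
    import numpy as np
    from scipy.integrate import solve_ivp

    n_layers = 10            # number of bulk layers
    r_p = 2e-5               # particle radius [cm] (assumed)
    dr = r_p / n_layers      # layer thickness [cm]
    D_o3 = 1e-7              # O3 bulk diffusion coefficient [cm^2 s^-1] (assumed)
    D_ol = 1e-10             # oleic acid diffusion coefficient [cm^2 s^-1] (assumed)
    k2 = 1e-15               # second-order rate coefficient [cm^3 s^-1] (assumed)
    H_cc = 0.48              # dimensionless O3 solubility (assumed)
    c_gas = 2.5e13           # gas-phase O3 number concentration [cm^-3]
    ol0 = 1.2e21             # initial oleic acid concentration [cm^-3]

    def rhs(t, y):
        o3, ol = y[:n_layers], y[n_layers:]
        do3, dol = np.zeros(n_layers), np.zeros(n_layers)
        # surface layer relaxes toward solubility equilibrium with the gas phase
        do3[0] += (H_cc * c_gas - o3[0]) * D_o3 / dr**2
        # Fickian exchange between neighbouring layers (planar approximation)
        for i in range(n_layers - 1):
            f_o3 = D_o3 * (o3[i] - o3[i + 1]) / dr**2
            f_ol = D_ol * (ol[i] - ol[i + 1]) / dr**2
            do3[i] -= f_o3; do3[i + 1] += f_o3
            dol[i] -= f_ol; dol[i + 1] += f_ol
        # second-order chemical loss in every layer
        loss = k2 * o3 * ol
        return np.concatenate([do3 - loss, dol - loss])

    y0 = np.concatenate([np.zeros(n_layers), np.full(n_layers, ol0)])
    sol = solve_ivp(rhs, (0.0, 3600.0), y0, method="LSODA")
    print("oleic acid remaining after 1 h: %.1f%%"
          % (100 * sol.y[n_layers:, -1].mean() / ol0))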

Relevance: 100.00%

Abstract:

We present a novel kinetic multi-layer model that explicitly resolves mass transport and chemical reaction at the surface and in the bulk of aerosol particles (KM-SUB). The model is based on the PRA framework of gas–particle interactions (Pöschl et al., 2007), and it includes reversible adsorption, surface reactions and surface-bulk exchange as well as bulk diffusion and reaction. Unlike earlier models, KM-SUB does not require simplifying assumptions about steady-state conditions and radial mixing. The temporal evolution and concentration profiles of volatile and non-volatile species at the gas-particle interface and in the particle bulk can be modeled along with surface concentrations and gas uptake coefficients. In this study we explore and exemplify the effects of bulk diffusion on the rate of reactive gas uptake for a simple reference system, the ozonolysis of oleic acid particles, in comparison to experimental data and earlier model studies. We demonstrate how KM-SUB can be used to interpret and analyze experimental data from laboratory studies, and how the results can be extrapolated to atmospheric conditions. In particular, we show how interfacial transport and bulk transport, i.e., surface accommodation, bulk accommodation and bulk diffusion, influence the kinetics of the chemical reaction. Sensitivity studies suggest that in fine air particulate matter oleic acid and compounds with similar reactivity against ozone (C=C double bonds) can reach chemical lifetimes of multiple hours only if they are embedded in a (semi-)solid matrix with very low diffusion coefficients (10⁻¹⁰ cm² s⁻¹). Depending on the complexity of the investigated system, unlimited numbers of volatile and non-volatile species and chemical reactions can be flexibly added and treated with KM-SUB. We propose and intend to pursue the application of KM-SUB as a basis for the development of a detailed master mechanism of aerosol chemistry as well as for the derivation of simplified but realistic parameterizations for large-scale atmospheric and climate models.

Relevance: 100.00%

Abstract:

The scaling of metabolic rates to body size is widely considered to be of great biological and ecological importance, and much attention has been devoted to determining its theoretical and empirical value. Most debate centers on whether the underlying power law describing metabolic rates is 2/3 (as predicted by scaling of surface area/volume relationships) or 3/4 ("Kleiber's law"). Although recent evidence suggests that empirically derived exponents vary among clades with radically different metabolic strategies, such as ectotherms and endotherms, models, such as the metabolic theory of ecology, depend on the assumption that there is at least a predominant, if not universal, metabolic scaling exponent. Most analyses claimed to support the predictions of general models, however, failed to control for phylogeny. We used phylogenetic generalized least-squares models to estimate allometric slopes for both basal metabolic rate (BMR) and field metabolic rate (FMR) in mammals. Metabolic rate scaling conformed to no single theoretical prediction, but varied significantly among phylogenetic lineages. In some lineages we found a 3/4 exponent, in others a 2/3 exponent, and in yet others exponents differed significantly from both theoretical values. Analysis of the phylogenetic signal in the data indicated that the assumptions of neither species-level analysis nor independent contrasts were met. Analyses that assumed no phylogenetic signal in the data (species-level analysis) or a strong phylogenetic signal (independent contrasts), therefore, returned estimates of allometric slopes that were erroneous in 30% and 50% of cases, respectively. Hence, quantitative estimation of the phylogenetic signal is essential for determining scaling exponents. The lack of evidence for a predominant scaling exponent in these analyses suggests that general models of metabolic scaling, and macro-ecological theories that depend on them, have little explanatory power.
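
The core calculation behind a phylogenetic generalized least-squares (PGLS) estimate of an allometric slope can be sketched as a GLS fit of log(metabolic rate) against log(body mass) with a phylogenetic covariance matrix weighting the residuals. The data and covariance matrix below are synthetic stand-ins, not the study's dataset.

    # GLS estimator with a (toy) phylogenetic covariance matrix V.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20
    log_mass = rng.uniform(1, 6, n)                       # log body mass
    V = np.exp(-0.5 * np.abs(np.subtract.outer(np.arange(n), np.arange(n))))
    L = np.linalg.cholesky(V)
    log_bmr = 0.7 * log_mass + 1.0 + L @ rng.normal(scale=0.2, size=n)  # correlated noise

    X = np.column_stack([np.ones(n), log_mass])
    Vinv = np.linalg.inv(V)
    beta = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ log_bmr)   # GLS estimate
    resid = log_bmr - X @ beta
    s2 = resid @ Vinv @ resid / (n - 2)
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ Vinv @ X)))
    print(f"allometric slope b = {beta[1]:.3f} +/- {se[1]:.3f}")
    # A slope near 0.75 would match Kleiber's law; near 0.67 the surface-area law.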

Relevance: 100.00%

Abstract:

Although the use of climate scenarios for impact assessment has grown steadily since the 1990s, uptake of such information for adaptation is lagging by nearly a decade in terms of scientific output. Nonetheless, integration of climate risk information in development planning is now a priority for donor agencies because of the need to prepare for climate change impacts across different sectors and countries. This urgency stems from concerns that progress made against Millennium Development Goals (MDGs) could be threatened by anthropogenic climate change beyond 2015. Up to this time, the human signal, though detectable and growing, will be a relatively small component of climate variability and change. This implies the need for a twin-track approach: on the one hand, vulnerability assessments of social and economic strategies for coping with present climate extremes and variability, and, on the other hand, development of climate forecast tools and scenarios to evaluate sector-specific, incremental changes in risk over the next few decades. This review starts by describing the climate outlook for the next couple of decades and the implications for adaptation assessments. We then review ways in which climate risk information is already being used in adaptation assessments and evaluate the strengths and weaknesses of three groups of techniques. Next we identify knowledge gaps and opportunities for improving the production and uptake of climate risk information for the 2020s. We assert that climate change scenarios can meet some, but not all, of the needs of adaptation planning. Even then, the choice of scenario technique must be matched to the intended application, taking into account local constraints of time, resources, human capacity and supporting infrastructure. We also show that much greater attention should be given to improving and critiquing models used for climate impact assessment, as standard practice. Finally, we highlight the over-arching need for the scientific community to provide more information and guidance on adapting to the risks of climate variability and change over nearer time horizons (i.e. the 2020s). Although the focus of the review is on information provision and uptake in developing regions, it is clear that many developed countries are facing the same challenges.

Relevance: 100.00%

Abstract:

The potential of near infrared spectroscopy in conjunction with partial least squares regression to predict Miscanthus × giganteus and short rotation coppice willow quality indices was examined. Moisture, calorific value, ash and carbon content were predicted with a root mean square error of cross validation of 0.90% (R² = 0.99), 0.13 MJ/kg (R² = 0.99), 0.42% (R² = 0.58), and 0.57% (R² = 0.88), respectively. The moisture and calorific value prediction models had excellent accuracy while the carbon and ash models were fair and poor, respectively. The results indicate that near infrared spectroscopy has the potential to predict quality indices of dedicated energy crops; however, the models must be further validated on a wider range of samples prior to implementation. The utilization of such models would assist in the optimal use of the feedstock based on its biomass properties.
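
A sketch of the partial-least-squares-with-cross-validation workflow the abstract describes, using scikit-learn on synthetic spectra; the wavelength grid, component count and fold count are assumptions rather than the study's settings.

    # PLS regression with 10-fold cross-validation on stand-in NIR spectra.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict, KFold

    rng = np.random.default_rng(1)
    n_samples, n_wavelengths = 80, 200
    X = rng.normal(size=(n_samples, n_wavelengths))            # stand-in spectra
    true_coef = np.zeros(n_wavelengths); true_coef[50:60] = 0.3
    y = X @ true_coef + rng.normal(scale=0.1, size=n_samples)  # e.g. moisture content [%]

    pls = PLSRegression(n_components=5)
    y_cv = cross_val_predict(pls, X, y,
                             cv=KFold(n_splits=10, shuffle=True, random_state=1))
    rmsecv = np.sqrt(np.mean((y - y_cv.ravel()) ** 2))
    r2 = 1 - np.sum((y - y_cv.ravel()) ** 2) / np.sum((y - y.mean()) ** 2)
    print(f"RMSECV = {rmsecv:.2f}, R^2 = {r2:.2f}")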

Relevance: 100.00%

Abstract:

The objective of this study was to determine the potential of mid-infrared spectroscopy coupled with multidimensional statistical analysis for the prediction of processed cheese instrumental texture and meltability attributes. Processed cheeses (n = 32) of varying composition were manufactured in a pilot plant. Following two and four weeks' storage at 4 °C, samples were analysed using texture profile analysis, two meltability tests (computer vision, Olson and Price) and mid-infrared spectroscopy (4000-640 cm⁻¹). Partial least squares regression was used to develop predictive models for all measured attributes. Five attributes were successfully modelled with varying degrees of accuracy. The computer vision meltability model allowed for discrimination between high and low melt values (R² = 0.64). The hardness and springiness models gave approximate quantitative results (R² = 0.77) and the cohesiveness (R² = 0.81) and Olson and Price meltability (R² = 0.88) models gave good prediction results.

Relevance: 100.00%

Abstract:

The objective of this study was to determine the potential of mid-infrared spectroscopy in conjunction with partial least squares (PLS) regression to predict various quality parameters in Cheddar cheese. Cheddar cheeses (n = 24) were manufactured and stored at 8 °C for 12 mo. Mid-infrared spectra (640 to 4000 cm⁻¹) were recorded after 4, 6, 9, and 12 mo storage. At 4, 6, and 9 mo, the water-soluble nitrogen (WSN) content of the samples was determined and the samples were also evaluated for 11 sensory texture attributes using descriptive sensory analysis. The mid-infrared spectra were subjected to a number of pretreatments, and predictive models were developed for all parameters. Age was predicted using scatter-corrected, 1st derivative spectra with a root mean square error of cross-validation (RMSECV) of 1 mo, while WSN was predicted using 1st derivative spectra (RMSECV = 2.6%). The sensory texture attributes most successfully predicted were rubbery, crumbly, chewy, and mass-forming. These attributes were modeled using 2nd derivative spectra and had corresponding RMSECV values in the range of 2.5 to 4.2 on a scale of 0 to 100. It was concluded that mid-infrared spectroscopy has the potential to predict age, WSN, and several sensory texture attributes of Cheddar cheese.
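
The scatter-correction and derivative pretreatments mentioned above can be illustrated with standard normal variate (SNV) scaling and Savitzky-Golay derivatives; the window length and polynomial order below are assumed values, not those used in the study.

    # SNV scatter correction and Savitzky-Golay derivative pretreatments.
    import numpy as np
    from scipy.signal import savgol_filter

    def snv(spectra):
        # Standard normal variate: centre and scale each spectrum individually.
        return (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

    def derivative(spectra, order=1, window=11, poly=2):
        # Savitzky-Golay derivative along the wavenumber axis.
        return savgol_filter(spectra, window_length=window, polyorder=poly,
                             deriv=order, axis=1)

    spectra = np.random.default_rng(2).normal(size=(24, 400))   # stand-in mid-IR spectra
    X_1st = derivative(snv(spectra), order=1)   # scatter-corrected 1st derivative
    X_2nd = derivative(spectra, order=2)        # 2nd derivative
    print(X_1st.shape, X_2nd.shape)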

Relevance: 100.00%

Abstract:

This paper investigates the effect of choices of model structure and scale in development viability appraisal. The paper addresses two questions concerning the application of development appraisal techniques to viability modelling within the UK planning system. The first relates to the extent to which, given intrinsic input uncertainty, the choice of model structure significantly affects model outputs. The second concerns the extent to which, given intrinsic input uncertainty, the level of model complexity significantly affects model outputs. Monte Carlo simulation procedures are applied to a hypothetical development scheme in order to measure the effects of model aggregation and structure on model output variance. It is concluded that, given the particular scheme modelled and unavoidably subjective assumptions of input variance, simple and simplistic models may produce similar outputs to more robust and disaggregated models.
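
A minimal sketch of the kind of Monte Carlo residual-valuation appraisal the paper examines: sample uncertain inputs, compute residual land value, and inspect the output distribution. All figures and the simple model structure are invented for illustration.

    # Monte Carlo residual-valuation sketch with invented input distributions.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 100_000
    sale_value = rng.normal(10_000_000, 1_000_000, n)   # gross development value
    build_cost = rng.normal(6_000_000, 600_000, n)      # construction cost
    fees       = 0.10 * build_cost                      # professional fees
    profit     = 0.15 * sale_value                      # developer's required profit
    residual   = sale_value - build_cost - fees - profit  # residual land value

    print(f"mean residual value: {residual.mean():,.0f}")
    print(f"5th-95th percentile: {np.percentile(residual, 5):,.0f} "
          f"to {np.percentile(residual, 95):,.0f}")
    # Comparing this output variance across simpler and more disaggregated
    # model structures is the kind of experiment the paper reports.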

Relevance: 100.00%

Abstract:

The Aqua-Planet Experiment (APE) was first proposed by Neale and Hoskins (2000a) as a benchmark for atmospheric general circulation models (AGCMs) on an idealised water-covered Earth. The experiment and its aims are summarised, and its context within a modelling hierarchy used to evaluate complex models and to provide a link between realistic simulation and conceptual models of atmospheric phenomena is discussed. The simplified aqua-planet configuration bridges a gap in the existing hierarchy. It is designed to expose differences between models and to focus attention on particular phenomena and their response to changes in the underlying distribution of sea surface temperature.
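
The aqua-planet configuration prescribes zonally symmetric sea surface temperatures. The sketch below implements the commonly quoted "control" profile of Neale and Hoskins (2000), T = 27(1 - sin²(3·lat/2)) °C for |lat| < 60° and 0 °C poleward of that; the exact functional form is treated here as an assumption to be checked against the APE specification.

    # Idealised zonally symmetric SST profile (assumed "control" form).
    import numpy as np

    def control_sst(lat_deg):
        # SST [deg C] as a function of latitude [deg].
        lat = np.radians(np.asarray(lat_deg, dtype=float))
        return np.where(np.abs(lat) < np.pi / 3,
                        27.0 * (1.0 - np.sin(1.5 * lat) ** 2),
                        0.0)

    print(control_sst([0, 30, 60, 90]))   # approx. [27.0, 13.5, 0.0, 0.0]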

Relevance: 100.00%

Abstract:

Steady state and dynamic models have been developed and applied to the River Kennet system. Annual nitrogen exports from the land surface to the river have been estimated based on land use from the 1930s and the 1990s. Long term modelled trends indicate that there has been a large increase in nitrogen transport into the river system driven by increased fertiliser application associated with increased cereal production, increased population and increased livestock levels. The dynamic model INCA (Integrated Nitrogen in Catchments) has been applied to simulate the day-to-day transport of N from the terrestrial ecosystem to the riverine environment. This process-based model generates spatial and temporal data and reproduces the observed instream concentrations. Applying the model to current land use and 1930s land use indicates that there has been a major shift in the short term dynamics since the 1930s, with increased river and groundwater concentrations caused by both non-point source pollution from agriculture and point source discharges.
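
A highly simplified daily nitrogen mass balance (not the INCA model itself) illustrates the kind of terrestrial-to-river export calculation such process-based models perform; all rates and inputs are invented placeholders.

    # Toy daily soil-N store with fertiliser, livestock and sewage inputs,
    # first-order uptake/denitrification, and flow-dependent export to the river.
    import numpy as np

    days = 365
    flow = 1.0 + 0.8 * np.sin(np.linspace(0, 2 * np.pi, days))   # relative daily flow
    fertiliser = np.zeros(days); fertiliser[90:120] = 2.0        # spring application [kg N/ha/day]
    livestock, sewage = 0.05, 0.02                               # steady inputs [kg N/ha/day]
    uptake_rate, export_rate = 0.01, 0.02                        # first-order losses [1/day]

    store, exported = 5.0, []                                    # soil N store [kg N/ha]
    for d in range(days):
        store += fertiliser[d] + livestock + sewage
        store -= uptake_rate * store                             # plant uptake / denitrification
        out = export_rate * flow[d] * store                      # export to the river
        store -= out
        exported.append(out)

    print(f"annual N export to river: {sum(exported):.1f} kg N/ha")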

Relevance: 100.00%

Abstract:

Wernicke's aphasia (WA) is a condition which results in severely disrupted language comprehension following a lesion to the left temporo-parietal region. A phonological analysis deficit has traditionally been held to be at the root of the comprehension impairment in WA, a view consistent with current functional neuroimaging which finds areas in the superior temporal cortex responsive to phonological stimuli. However, behavioural evidence to support the link between a phonological analysis deficit and auditory comprehension has not yet been shown. This study extends seminal work by Blumstein et al. (1977) to investigate the relationship between acoustic-phonological perception, measured through phonological discrimination, and auditory comprehension in a case series of Wernicke's aphasia participants. A novel adaptive phonological discrimination task was used to obtain reliable thresholds of the phonological perceptual distance required between nonwords before they could be discriminated. Wernicke's aphasia participants showed significantly elevated thresholds compared to age- and hearing-matched control participants. Acoustic-phonological thresholds correlated strongly with auditory comprehension abilities in Wernicke's aphasia. In contrast, nonverbal semantic skills showed no relationship with auditory comprehension. The results are evaluated in the context of recent neurobiological models of language; they suggest that impaired acoustic-phonological perception underlies the comprehension impairment in Wernicke's aphasia and favour models of language that propose a leftward asymmetry in phonological analysis.
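
Adaptive thresholds of this general kind are often obtained with a staircase procedure; the sketch below runs a 2-down/1-up staircase against a simulated listener. The listener model, step size and stopping rule are illustrative assumptions, not the task used in the study.

    # 2-down/1-up adaptive staircase estimating a discrimination threshold.
    import numpy as np

    rng = np.random.default_rng(4)
    true_threshold = 3.0            # distance giving 50% correct for the simulated listener
    distance, step = 8.0, 1.0       # starting stimulus difference and step size
    correct_run, reversals, direction = 0, [], 0

    def listener_correct(d):
        # Probability of a correct discrimination rises with perceptual distance.
        p = 1.0 / (1.0 + np.exp(-(d - true_threshold)))
        return rng.random() < p

    while len(reversals) < 10:
        if listener_correct(distance):
            correct_run += 1
            if correct_run == 2:                  # two correct in a row -> make it harder
                correct_run = 0
                if direction == +1:
                    reversals.append(distance)    # record a reversal of direction
                direction = -1
                distance = max(0.5, distance - step)
        else:
            correct_run = 0
            if direction == -1:
                reversals.append(distance)
            direction = +1
            distance += step                      # one error -> make it easier

    # The 2-down/1-up rule converges near the ~71%-correct point, above true_threshold.
    print(f"estimated threshold: {np.mean(reversals[-6:]):.2f}")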

Relevance: 100.00%

Abstract:

The impending threat of global climate change and its regional manifestations is among the most important and urgent problems facing humanity. Society needs accurate and reliable estimates of changes in the probability of regional weather variations to develop science-based adaptation and mitigation strategies. Recent advances in weather prediction and in our understanding and ability to model the climate system suggest that it is both necessary and possible to revolutionize climate prediction to meet these societal needs. However, the scientific workforce and the computational capability required to bring about such a revolution is not available in any single nation. Motivated by the success of internationally funded infrastructure in other areas of science, this paper argues that, because of the complexity of the climate system, and because the regional manifestations of climate change are mainly through changes in the statistics of regional weather variations, the scientific and computational requirements to predict its behavior reliably are so enormous that the nations of the world should create a small number of multinational high-performance computing facilities dedicated to the grand challenges of developing the capabilities to predict climate variability and change on both global and regional scales over the coming decades. Such facilities will play a key role in the development of next-generation climate models, build global capacity in climate research, nurture a highly trained workforce, and engage the global user community, policy-makers, and stakeholders. We recommend the creation of a small number of multinational facilities with computer capability at each facility of about 20 petaflops in the near term, about 200 petaflops within five years, and 1 exaflop by the end of the next decade. Each facility should have sufficient scientific workforce to develop and maintain the software and data analysis infrastructure. Such facilities will make it possible to address the question of what resolution, both horizontal and vertical, is necessary in atmospheric and ocean models for more confident predictions at the regional and local level. Current limits on computing power have severely constrained such an investigation, which is now badly needed. These facilities will also provide the world's scientists with the computational laboratories for fundamental research on weather–climate interactions using 1-km resolution models and on atmospheric, terrestrial, cryospheric, and oceanic processes at even finer scales. Each facility should have enabling infrastructure including hardware, software, and data analysis support, and scientific capacity to interact with the national centers and other visitors. This will accelerate our understanding of how the climate system works and how to model it. It will ultimately enable the climate community to provide society with climate predictions, which are based on our best knowledge of science and the most advanced technology.

Relevance: 100.00%

Abstract:

Independent studies have demonstrated that flagella are associated with the invasive process of Salmonella enterica serotypes, and aflagellate derivatives of Salmonella enterica serotype Enteritidis are attenuated in murine and avian models of infection. One widely held view is that the motility afforded by flagella, probably aided by chemotactic responses, mediates the initial interaction between bacterium and host cell. The adherence and invasion properties of two S. Enteritidis wild-type strains and isogenic aflagellate mutants were assessed on HEp-2 and Div-1 cells that are of human and avian epithelial origin, respectively. Both aflagellate derivatives showed a significant reduction of invasion compared with wild type over the three hours of the assays. Complementation of the defective fliC allele recovered partially the wild-type phenotype. Examination of the bacterium-host cell interaction by electron and confocal microscopy approaches showed that wild-type bacteria induced ruffle formation and significant cytoskeletal rearrangements on HEp-2 cells within 5 minutes of contact. The aflagellate derivatives induced fewer ruffles than wild type. Ruffle formation on the Div-1 cell line was less pronounced than for HEp-2 cells for wild-type S. Enteritidis. Collectively, these data support the hypothesis that flagella play an active role in the early events of the invasive process.

Relevance: 100.00%

Abstract:

Undeniably, anticipation plays a crucial role in cognition. By what means, to what extent, and what it achieves remain open questions. In a recent BBS target article, Clark (in press) depicts an integrative model of the brain that builds on hierarchical Bayesian models of neural processing (Rao and Ballard, 1999; Friston, 2005; Brown et al., 2011), and their most recent formulation using the free-energy principle borrowed from thermodynamics (Feldman and Friston, 2010; Friston, 2010; Friston et al., 2010). Hierarchical generative models of cognition, such as those described by Clark, presuppose the manipulation of representations and internal models of the world, in as much detail as is perceptually available. Perhaps surprisingly, Clark acknowledges the existence of a “virtual version of the sensory data” (p. 4), but with no reference to some of the historical debates that shaped cognitive science, related to the storage, manipulation, and retrieval of representations in a cognitive system (Shanahan, 1997), or accounting for the emergence of intentionality within such a system (Searle, 1980; Preston and Bishop, 2002). Instead of demonstrating how this Bayesian framework responds to these foundational questions, Clark describes the structure and the functional properties of an action-oriented, multi-level system that is meant to combine perception, learning, and experience (Niedenthal, 2007).