143 results for Relaxation Processes

in CentAUR: Central Archive, University of Reading - UK


Relevance:

100.00%

Publisher:

Abstract:

Asynchronous Optical Sampling has the potential to improve the signal-to-noise ratio in THz transient spectrometry. The design of an inexpensive control scheme for synchronising two femtosecond-pulse frequency comb generators at an offset frequency of 20 kHz is discussed. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing recorded THz transients in the time and frequency domains is outlined. Finally, possibilities for femtosecond pulse shaping using genetic algorithms are mentioned.
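The timing arithmetic behind the offset-frequency scheme can be sketched as follows. The 20 kHz offset is taken from the abstract; the 1 GHz repetition rate is a hypothetical value chosen for illustration, not a figure from the paper.

```python
# Equivalent-time sampling arithmetic for ASOPS: two combs at repetition
# rates f_rep and f_rep + delta_f walk past each other, so each pulse pair
# samples the transient one small step later. The 1 GHz repetition rate is
# an assumed, illustrative value; the 20 kHz offset is from the abstract.

def asops_parameters(f_rep, delta_f):
    """Return (time step, scan window, scans per second) for two combs
    at f_rep and f_rep + delta_f."""
    dt = delta_f / (f_rep * (f_rep + delta_f))  # pulse-pair walk-off per shot
    window = 1.0 / f_rep                        # full inter-pulse period covered
    return dt, window, delta_f                  # scan rate equals the offset

dt, window, rate = asops_parameters(1e9, 20e3)
print(f"time step  : {dt * 1e15:.2f} fs")
print(f"scan window: {window * 1e12:.1f} ps")
print(f"scan rate  : {rate:.0f} scans/s")
```

A larger offset scans faster but coarsens the effective time step, which is the basic trade-off a synchronisation controller has to respect.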

Relevance:

70.00%

Publisher:

Abstract:

Asynchronous Optical Sampling (ASOPS) [1,2] and frequency comb spectrometry [3] based on dual Ti:sapphire resonators operated in a master/slave mode have the potential to improve the signal-to-noise ratio in THz transient and IR spectrometry. The multimode Brownian oscillator time-domain response function described by state-space models is a mathematically robust framework that can be used to describe the dispersive phenomena governed by Lorentzian, Debye and Drude responses. In addition, the optical properties of an arbitrary medium can be expressed as a linear combination of simple multimode Brownian oscillator functions. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing the recorded THz transients in the time or frequency domain will be outlined [4,5]. Since a femtosecond-duration pulse is capable of persistent excitation of the medium within which it propagates, such an approach is well justified. Several de-noising routines based on system identification will be shown. Furthermore, specifically developed apodization structures will be discussed. These are necessary because, owing to dispersion, the time-domain background and sample interferograms are non-symmetrical [6-8]. These procedures can lead to a more precise estimation of the complex insertion loss function. The algorithms are applicable to femtosecond spectroscopies across the EM spectrum. Finally, a methodology for femtosecond pulse shaping using genetic algorithms, aiming to map and control molecular relaxation processes, will be mentioned.

Relevance:

70.00%

Publisher:

Abstract:

Relaxation behavior was measured for dough, gluten and gluten protein fractions obtained from the U.K. biscuit-making flour, Riband, and the U.K. bread-making flour, Hereward. The relaxation spectrum, in which relaxation times (tau) are related to polymer molecular size, for dough showed a broad molecular size distribution, with two relaxation processes: a major peak at short times and a second peak at times longer than 10 s, which is thought to correspond to network structure, and which may be attributed to entanglements and physical cross-links of polymers. Relaxation spectra of glutens were similar to those for the corresponding doughs from both flours. Hereward gluten clearly showed a much more pronounced second peak in the relaxation spectrum and a higher relaxation modulus than Riband gluten at the same water content. Of the gluten protein fractions, gliadin and acetic acid soluble glutenin showed only the first relaxation process, but gel protein clearly showed both the first and second relaxation processes. The results show that the relaxation properties of dough depend on its gluten proteins and that gel protein is responsible for the network structure of dough and gluten.
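The two-peak relaxation spectrum described above is typically extracted by fitting relaxation data to a sum of exponential (Maxwell) modes, G(t) = sum_i g_i exp(-t/tau_i). A minimal sketch of that fit, using non-negative least squares on a log-spaced grid of candidate relaxation times; the two-mode synthetic signal loosely mirrors the "short-time" and ">10 s" processes, and all numerical values are illustrative, not data from the study:

```python
# Discrete relaxation spectrum (Prony series) from relaxation data.
# Synthetic two-mode signal; all numbers are hypothetical.
import numpy as np
from scipy.optimize import nnls

t = np.logspace(-2, 2, 200)                               # times, s
G = 1000.0 * np.exp(-t / 0.1) + 50.0 * np.exp(-t / 20.0)  # synthetic G(t), Pa

taus = np.logspace(-2, 2, 25)                # candidate relaxation times, s
A = np.exp(-t[:, None] / taus[None, :])      # design matrix: one column per mode
g, rnorm = nnls(A, G)                        # non-negative mode strengths g_i

# Modes with appreciable strength approximate the relaxation spectrum.
for tau_i, g_i in zip(taus, g):
    if g_i > 1.0:
        print(f"tau = {tau_i:8.3f} s   g = {g_i:7.1f} Pa")
```

The dominant short-time mode and the weaker long-time mode recovered by the fit correspond to the two peaks the abstract attributes to fast segmental motion and to the entangled network, respectively.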

Relevance:

60.00%

Publisher:

Abstract:

The linear viscoelastic (LVE) spectrum is one of the primary fingerprints of polymer solutions and melts, carrying information about most relaxation processes in the system. Many single chain theories and models start with predicting the LVE spectrum to validate their assumptions. However, until now, no reliable linear stress relaxation data were available from simulations of multichain systems. In this work, we propose a new efficient way to calculate a wide variety of correlation functions and mean-square displacements during simulations without significant additional CPU cost. Using this method, we calculate stress-stress autocorrelation functions for a simple bead-spring model of polymer melt for a wide range of chain lengths, densities, temperatures, and chain stiffnesses. The obtained stress-stress autocorrelation functions were compared with the single chain slip-spring model in order to obtain entanglement related parameters, such as the plateau modulus or the molecular weight between entanglements. Then, the dependence of the plateau modulus on the packing length is discussed. We have also identified three different contributions to the stress relaxation: bond-length relaxation, colloidal, and polymeric. Their dependence on the density and the temperature is demonstrated for short unentangled systems without inertia.
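The stress autocorrelation route mentioned above follows the Green-Kubo relation, G(t) proportional to the equilibrium average of sigma_xy(0)·sigma_xy(t). A minimal sketch of computing such an autocorrelation efficiently via FFT (the prefactor V/k_B·T is omitted, and an exponentially correlated random signal stands in for real simulation output):

```python
# FFT-based stress-stress autocorrelation, O(N log N) instead of O(N^2).
# The synthetic AR(1) signal is a stand-in for a sampled sigma_xy(t).
import numpy as np

def autocorrelation(x):
    """Unbiased autocorrelation of a 1-D signal via FFT."""
    n = len(x)
    f = np.fft.rfft(x - x.mean(), 2 * n)   # zero-pad to avoid circular wrap-around
    acf = np.fft.irfft(f * np.conj(f))[:n]
    return acf / np.arange(n, 0, -1)       # divide by the pair count at each lag

rng = np.random.default_rng(0)
n_steps, tau_corr = 20000, 50.0
a = np.exp(-1.0 / tau_corr)
noise = rng.normal(size=n_steps)
sigma = np.empty(n_steps)
sigma[0] = 0.0
for i in range(1, n_steps):               # exponentially correlated noise
    sigma[i] = a * sigma[i - 1] + np.sqrt(1.0 - a * a) * noise[i]

acf = autocorrelation(sigma)
print(f"ACF(0) = {acf[0]:.3f} (equals the variance of sigma)")
```

The zero-padding and per-lag normalisation are the details that make the FFT shortcut agree with the direct double-loop estimate, which is what keeps the extra CPU cost negligible during a long simulation.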

Relevance:

60.00%

Publisher:

Abstract:

A range of side chain liquid crystal copolymers has been prepared using mesogenic and non-mesogenic units. It is found that high levels of the non-mesogenic moieties may be introduced without completely disrupting the organization of the liquid crystal phase. Incorporation of this comonomer causes a marked reduction in the glass transition temperature (Tg), presumably as a result of enhanced backbone mobility, and a corresponding lowering of the nematic transition temperature, thereby restricting the temperature range for stability of the liquid crystal phase. The effect of the interactions between the various components of these side-chain polymers on their electro-optic responses is described. Infrared (i.r.) dichroism measurements have been made to determine the order parameters of the liquid crystalline side-chain polymers. By identifying a certain band (CN stretching) in the i.r. absorption spectrum, the order parameter of the mesogenic groups can be obtained. The temperature and composition dependence of the observed order parameter are related to the liquid crystal phase transitions and to the electro-optic response. It is found that the introduction of the non-mesogenic units into the polymer chain lowers the threshold voltage of the electro-optic response over and above that due to the reduction in the order parameter. The dynamic electro-optic responses are dominated by the temperature-dependent viscosity, and evidence is presented for relaxation processes involving the polymer backbone which are on a time scale greater than that for the mesogenic side-chain units.
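The order parameter from i.r. dichroism can be sketched with the standard relation for a transition moment parallel to the mesogen long axis (the CN stretch is commonly treated this way): S = (D - 1)/(D + 2), where D is the dichroic ratio of absorbances parallel and perpendicular to the director. The absorbance values below are hypothetical, not measurements from the paper.

```python
# Order parameter from i.r. dichroism, assuming the transition dipole
# lies along the mesogen long axis. Absorbance values are illustrative.

def order_parameter(a_par, a_perp):
    """S = (D - 1) / (D + 2) with dichroic ratio D = A_par / A_perp."""
    d = a_par / a_perp
    return (d - 1.0) / (d + 2.0)

print(f"S = {order_parameter(1.8, 0.6):.3f}")  # D = 3 gives S = 0.4
```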

Relevance:

30.00%

Publisher:

Abstract:

We have established the surface tension relaxation time at the liquid-solid interfaces of Lennard-Jones (LJ) liquids by means of direct measurements in molecular dynamics (MD) simulations. The main result is that the relaxation time is found to be almost independent of the molecular structure and viscosity of the liquids used in our study (over a seventy-fold range of viscosity), and lies in such a range that in slow hydrodynamic motion the interfaces are expected to be at equilibrium. The implications of our results for the modelling of dynamic wetting processes and the interpretation of dynamic contact angle data are discussed.

Relevance:

20.00%

Publisher:

Abstract:

A case of long-range transport of a biomass burning plume from Alaska to Europe is analyzed using a Lagrangian approach. This plume was sampled several times in the free troposphere over North America, the North Atlantic and Europe by three different aircraft during the IGAC Lagrangian 2K4 experiment, which was part of the ICARTT/ITOP measurement intensive in summer 2004. Measurements in the plume showed enhanced values of CO, VOCs and NOy, mainly in the form of PAN. Observed O3 levels increased by 17 ppbv over 5 days. A photochemical trajectory model, CiTTyCAT, was used to examine processes responsible for the chemical evolution of the plume. The model was initialized with upwind data and compared with downwind measurements. The influence of high aerosol loading on photolysis rates in the plume was investigated using in situ aerosol measurements in the plume and lidar retrievals of optical depth as input into a photolysis code (Fast-J), run in the model. Significant impacts on photochemistry are found, with a decrease of 18% in O3 production and 24% in O3 destruction over 5 days when including aerosols. The plume is found to be chemically active, with large O3 increases attributed primarily to PAN decomposition during descent of the plume toward Europe. The predicted O3 changes are very dependent on temperature changes during transport and also on water vapor levels in the lower troposphere, which can lead to O3 destruction. Simulation of mixing/dilution was necessary to reproduce observed pollutant levels in the plume. Mixing was simulated using background concentrations from measurements in air masses in close proximity to the plume, and mixing timescales (averaging 6.25 days) were derived from CO changes. Observed and simulated O3/CO correlations in the plume were also compared in order to evaluate the photochemistry in the model. Observed slopes change from negative to positive over 5 days.
This change, which can be attributed largely to photochemistry, is well reproduced by multiple model runs even if slope values are slightly underestimated, suggesting a small underestimation in modeled photochemical O3 production. The possible impact of this biomass burning plume on O3 levels in the European boundary layer was also examined by running the model for a further 5 days and comparing with data collected at surface sites, such as Jungfraujoch, which showed small O3 increases and elevated CO levels. The model predicts significant changes in O3 over the entire 10 day period due to photochemistry, but the signal is largely lost because of the effects of dilution. However, measurements in several other biomass burning plumes over Europe show that the O3 impact of Alaskan fires can be significant over Europe.
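The mixing-timescale estimate from CO changes can be sketched by assuming first-order dilution of plume CO into background air, C(t) = C_bg + (C_0 - C_bg)·exp(-t/tau), and inverting for tau from two samplings of the plume. The CO mixing ratios below are hypothetical illustrations; the abstract reports derived timescales averaging 6.25 days.

```python
# Dilution timescale tau from two CO observations of the same plume,
# assuming exponential relaxation toward a background mixing ratio.
# All mixing ratios (ppbv) are hypothetical.
import math

def mixing_timescale(c0, ct, c_bg, t_days):
    """tau (days) from CO at t = 0 (c0) and t = t_days (ct)."""
    return t_days / math.log((c0 - c_bg) / (ct - c_bg))

tau = mixing_timescale(c0=180.0, ct=120.0, c_bg=90.0, t_days=5.0)
print(f"tau = {tau:.2f} days")
```

The estimate is sensitive to the assumed background value, which is why the study took backgrounds from measurements in air masses close to the plume.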

Relevance:

20.00%

Publisher:

Abstract:

It is often assumed that ventilation of the atmospheric boundary layer is weak in the absence of fronts, but is this always true? In this paper we investigate the processes responsible for ventilation of the atmospheric boundary layer during a nonfrontal day, 9 May 2005, using the UK Met Office Unified Model. Pollution sources are represented by the constant emission of a passive tracer everywhere over land. The ventilation processes observed include shallow convection, turbulent mixing followed by large-scale ascent, a sea breeze circulation and coastal outflow. Vertical distributions of tracer are validated qualitatively against AMPEP (Aircraft Measurement of chemical Processing Export fluxes of Pollutants over the UK) CO aircraft measurements and are shown to agree impressively well. Budget calculations of tracers are performed in order to determine the relative importance of these ventilation processes. Coastal outflow and the sea breeze circulation were found to ventilate 26% of the boundary layer tracer by sunset, of which 2% was above 2 km. A combination of coastal outflow, the sea breeze circulation, turbulent mixing and large-scale ascent ventilated 46% of the boundary layer tracer, of which 10% was above 2 km. Finally, coastal outflow, the sea breeze circulation, turbulent mixing, large-scale ascent and shallow convection together ventilated 52% of the tracer into the free troposphere, of which 26% was above 2 km. Hence this study shows that significant ventilation of the boundary layer can occur in the absence of fronts (and thus during high-pressure events). Turbulent mixing and convection processes can double the amount of pollution ventilated from the boundary layer.

Relevance:

20.00%

Publisher:

Abstract:

There are various situations in which it is natural to ask whether a given collection of k functions, ρ_j(r_1,…,r_j), j = 1,…,k, defined on a set X, are the first k correlation functions of a point process on X. Here we describe some necessary and sufficient conditions on the ρ_j's for this to be true. Our primary examples are X = ℝ^d, X = ℤ^d, and X an arbitrary finite set. In particular, we extend a result by Ambartzumian and Sukiasian showing realizability at sufficiently small densities ρ_1(r). Typically if any realizing process exists there will be many (even an uncountable number); in this case we prove, when X is a finite set, the existence of a realizing Gibbs measure with k-body potentials which maximizes the entropy among all realizing measures. We also investigate in detail a simple example in which a uniform density ρ and a translation-invariant ρ_2 are specified on ℤ; there is a gap between our best upper bound on possible values of ρ and the largest ρ for which realizability can be established.

Relevance:

20.00%

Publisher:

Abstract:

According to linear response theory, all relaxation functions in the linear regime can be obtained using time correlation functions calculated under equilibrium. In this paper, we demonstrate that the cross correlations make a significant contribution to the partial stress relaxation functions in polymer melts. We present two illustrations in the context of polymer rheology using (1) Brownian dynamics simulations of a single chain model for entangled polymers, the slip-spring model, and (2) molecular dynamics simulations of a multichain model. Using the single chain model, we analyze the contribution of the confining potential to the stress relaxation and the plateau modulus. Although the idea is illustrated with a particular model, it applies to any single chain model that uses a potential to confine the motion of the chains. This leads us to question some of the assumptions behind the tube theory, especially the meaning of the entanglement molecular weight obtained from the plateau modulus. To shed some light on this issue, we study the contribution of the nonbonded excluded-volume interactions to the stress relaxation using the multichain model. The proportionality of the bonded/nonbonded contributions to the total stress relaxation (after a density-dependent "colloidal" relaxation time) provides some insight into the success of the tube theory in spite of its questionable assumptions. The proportionality indicates that the shape of the relaxation spectrum can indeed be reproduced using the tube theory, and the problem is reduced to that of finding the correct prefactor. (c) 2007 American Institute of Physics.
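The point that partial stress relaxation functions do not simply add can be seen directly: if sigma = sigma_b + sigma_nb (bonded plus nonbonded parts), then the total correlation decomposes as <sigma·sigma> = <s_b·s_b> + <s_nb·s_nb> + 2<s_b·s_nb>, and the cross term is generally nonzero. A toy demonstration with correlated random signals standing in for the two stress components (illustrative only, not a polymer model):

```python
# Lag-0 decomposition of a total correlation into two auto terms plus a
# cross term. The shared "common" component makes the cross term nonzero.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
common = rng.normal(size=n)                 # shared fluctuation
s_b = common + 0.5 * rng.normal(size=n)     # "bonded" stress component
s_nb = common + 0.5 * rng.normal(size=n)    # "nonbonded" component, correlated

total = np.mean((s_b + s_nb) ** 2)          # lag-0 value of <sigma sigma>
auto = np.mean(s_b ** 2) + np.mean(s_nb ** 2)
cross = 2.0 * np.mean(s_b * s_nb)
print(f"total {total:.3f} = auto {auto:.3f} + cross {cross:.3f}")
```

Ignoring the cross term would misattribute a large part of the total relaxation to the individual contributions, which is exactly the trap the abstract warns about when interpreting plateau moduli from partial stresses.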

Relevance:

20.00%

Publisher:

Abstract:

Many modelling studies examine the impacts of climate change on crop yield, but few explore either the underlying bio-physical processes, or the uncertainty inherent in the parameterisation of crop growth and development. We used a perturbed-parameter crop modelling method together with a regional climate model (PRECIS) driven by the 2071-2100 SRES A2 emissions scenario in order to examine processes and uncertainties in yield simulation. Crop simulations used the groundnut (i.e. peanut; Arachis hypogaea L.) version of the General Large-Area Model for annual crops (GLAM). Two sets of GLAM simulations were carried out: control simulations and fixed-duration simulations, where the impact of mean temperature on crop development rate was removed. Model results were compared to sensitivity tests using two other crop models of differing levels of complexity: CROPGRO, and the groundnut model of Hammer et al. [Hammer, G.L., Sinclair, T.R., Boote, K.J., Wright, G.C., Meinke, H., and Bell, M.J., 1995, A peanut simulation model: I. Model development and testing. Agron. J. 87, 1085-1093]. GLAM simulations were particularly sensitive to two processes. First, elevated vapour pressure deficit (VPD) consistently reduced yield. The same result was seen in some simulations using both other crop models. Second, GLAM crop duration was longer, and yield greater, when the optimal temperature for the rate of development was exceeded. Yield increases were also seen in one other crop model. Overall, the models differed in their response to super-optimal temperatures, and that difference increased with mean temperature; percentage changes in yield between current and future climates were as diverse as -50% and over +30% for the same input data. The first process has been observed in many crop experiments, whilst the second has not. 
Thus, we conclude that there is a need for: (i) more process-based modelling studies of the impact of VPD on assimilation, and (ii) more experimental studies at super-optimal temperatures. Using the GLAM results, central values and uncertainty ranges were projected for mean 2071-2100 crop yields in India. In the fixed-duration simulations, ensemble mean yields mostly rose by 10-30%. The full ensemble range was greater than this mean change (20-60% over most of India). In the control simulations, yield stimulation by elevated CO2 was more than offset by other processes, principally accelerated crop development rates at elevated, but sub-optimal, mean temperatures. Hence, the quantification of uncertainty can facilitate relatively robust indications of the likely sign of crop yield changes in future climates. (C) 2007 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Processes in the climate system that can either amplify or dampen the climate response to an external perturbation are referred to as climate feedbacks. Climate sensitivity estimates depend critically on radiative feedbacks associated with water vapor, lapse rate, clouds, snow, and sea ice, and global estimates of these feedbacks differ among general circulation models. By reviewing recent observational, numerical, and theoretical studies, this paper shows that there has been progress since the Third Assessment Report of the Intergovernmental Panel on Climate Change in (i) the understanding of the physical mechanisms involved in these feedbacks, (ii) the interpretation of intermodel differences in global estimates of these feedbacks, and (iii) the development of methodologies of evaluation of these feedbacks (or of some components) using observations. This suggests that continuing developments in climate feedback research will progressively help make it possible to constrain the GCMs’ range of climate feedbacks and climate sensitivity through an ensemble of diagnostics based on physical understanding and observations.

Relevance:

20.00%

Publisher:

Abstract:

The purpose of Research Theme 4 (RT4) was to advance understanding of the basic science issues at the heart of the ENSEMBLES project, focusing on the key processes that govern climate variability and change, and that determine the predictability of climate. Particular attention was given to understanding linear and non-linear feedbacks that may lead to climate surprises, and to understanding the factors that govern the probability of extreme events. Improved understanding of these issues will contribute significantly to the quantification and reduction of uncertainty in seasonal to decadal predictions and projections of climate change. RT4 exploited the ENSEMBLES integrations (stream 1) performed in RT2A as well as undertaking its own experimentation to explore key processes within the climate system. It was working at the cutting edge of problems related to climate feedbacks, the interaction between climate variability and climate change, especially how climate change pertains to extreme events, and the predictability of the climate system on a range of time-scales. The statistical methodologies developed for extreme event analysis are new and state-of-the-art. The RT4-coordinated experiments, which have been conducted with six different atmospheric GCMs forced by common time-invariant sea surface temperature (SST) and sea-ice fields (removing some sources of inter-model variability), are designed to help to understand model uncertainty (rather than scenario or initial condition uncertainty) in predictions of the response to greenhouse-gas-induced warming. RT4 links strongly with RT5 on the evaluation of the ENSEMBLES prediction system and feeds back its results to RT1 to guide improvements in the Earth system models and, through its research on predictability, to steer the development of methods for initialising the ensembles.

Relevance:

20.00%

Publisher:

Abstract:

Our understanding of the climate system has recently been revolutionized by the development of sophisticated computer models. The predictions of such models are used to formulate international protocols intended to mitigate the severity of global warming and its impacts. Yet these models are not perfect representations of reality, because they remove from explicit consideration many physical processes which are known to be key aspects of the climate system, but which are too small or fast to be modelled. The purpose of this paper is to give a personal perspective of the current state of knowledge regarding the problem of unresolved scales in climate models. A recent novel solution to the problem is discussed, in which it is proposed, somewhat counter-intuitively, that the performance of models may be improved by adding random noise to represent the unresolved processes.
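The closing idea, that random noise can stand in for unresolved processes, can be illustrated with a deliberately schematic one-variable toy (this is not any specific scheme from the literature): the "truth" contains a fast unresolved forcing; a deterministic truncation drops it entirely and loses all variability, while a stochastic parameterisation replaces it with white noise and recovers a realistic level of variance.

```python
# Toy stochastic parameterisation: dx/dt = -x + forcing, integrated with
# Euler steps. Dropping the unresolved forcing kills the variability;
# replacing it with white noise restores it. Purely schematic.
import numpy as np

rng = np.random.default_rng(2)
dt, n_steps = 0.01, 50_000

def step(x, forcing):
    """One Euler step of dx/dt = -x + forcing."""
    return x + dt * (-x + forcing)

x_det = x_sto = 0.0
xs_det, xs_sto = [], []
for _ in range(n_steps):
    x_det = step(x_det, 0.0)                          # unresolved term dropped
    x_sto = step(x_sto, rng.normal() / np.sqrt(dt))   # white noise stands in
    xs_det.append(x_det)
    xs_sto.append(x_sto)

print(f"variance without noise: {np.var(xs_det):.4f}")
print(f"variance with noise   : {np.var(xs_sto):.4f}")
```

The deterministic run sits at its fixed point with zero variance, whereas the stochastic run fluctuates with a stationary variance near 0.5, the value expected for this Ornstein-Uhlenbeck-like process.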