33 results for exponential wide band model


Relevance: 30.00%

Abstract:

Safety enforcement practitioners within Europe, and marketers, designers or manufacturers of consumer products, need to determine compliance with the legal test of "reasonable safety" for consumer goods, to reduce the "risks" of injury to a minimum. To enable freedom of movement of products, a method of safety appraisal is required that can be used as an "expert" system of hazard analysis by non-experts in the safety testing of consumer goods, implemented consistently throughout Europe. Safety testing approaches and the concepts of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contributions of risk assessment, hazard perception, and information processing. The model develops a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through a series of systematic checklists and matrices, and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select participating Trading Standards Departments, to explore the contribution of potential subjective influences, and to establish views on the usability and reliability of the model and any preferences among the risk assessment scoring systems used. The outcome of the two-stage hazard analysis and risk assessment process is examined to determine the consistency of hazard analysis results and of final decisions regarding the safety of the sample products, and to determine any correlation between decisions made using the model and those made using the alternative scoring methods. The research also identifies a number of opportunities for future work, and indicates a number of areas where further work has already begun.
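
A numerical risk score of the kind the abstract describes could, for instance, follow a conventional severity-by-likelihood scheme. The sketch below is a hypothetical illustration of that style of calculation; the model's actual checklists, matrices and weightings are not reproduced here, and the category names and values are invented.

```python
# Hypothetical severity x likelihood risk scoring; categories and values
# are invented for illustration, not taken from the model itself.
SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "severe": 4}
LIKELIHOOD = {"improbable": 1, "possible": 2, "probable": 3, "frequent": 4}

def hazard_score(severity: str, likelihood: str) -> int:
    """Score one identified hazard as severity x likelihood."""
    return SEVERITY[severity] * LIKELIHOOD[likelihood]

def product_risk_score(hazards) -> int:
    """Aggregate individual hazard scores into a final product score."""
    return sum(hazard_score(s, l) for s, l in hazards)

# Example: a severe-but-possible hazard plus a minor-but-probable one.
print(product_risk_score([("severe", "possible"), ("minor", "probable")]))  # 14
```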

Relevance: 30.00%

Abstract:

The soil-plant-moisture subsystem is an important component of the hydrological cycle. Over the last 20 or so years, a number of computer models of varying complexity have represented this subsystem with differing degrees of success. The aim of the present work has been to improve and extend an existing model. The new model is less site-specific, thus allowing the simulation of a wide range of soil types and profiles. Several processes not included in the original model are simulated by the inclusion of new algorithms, covering macropore flow, hysteresis and plant growth. Changes have also been made to the infiltration, water uptake and water flow algorithms. Using field data from various sources, regression equations have been derived which relate parameters in the suction-conductivity-moisture content relationships to easily measured soil properties such as particle-size distribution data. Independent tests have been performed on laboratory data produced by Hedges (1989). The parameters found by regression for the suction relationships were then used in equations describing the infiltration and macropore processes. An extensive literature review produced a new model for calculating plant growth from actual transpiration, which was itself partly determined by the root densities and leaf area indices derived by the plant growth model. The new infiltration model uses intensity/duration curves to disaggregate daily rainfall inputs into hourly amounts. The final model has been calibrated and tested against field data, and its performance compared to that of the original model. Simulations have also been carried out to investigate the effects of various parameters on infiltration, macropore flow, actual transpiration and plant growth. Qualitative comparisons have been made between these results and data given in the literature.
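
To illustrate the disaggregation step, the sketch below spreads a daily rainfall total over a storm with a simple triangular intensity profile. The profile shape and the 6 h storm duration are illustrative assumptions only; the thesis derives its hourly amounts from intensity/duration curves.

```python
# Hypothetical disaggregation of a daily rainfall total into hourly amounts.
# The triangular profile and 6 h storm duration are illustrative assumptions,
# standing in for the intensity/duration curves the model actually uses.
def disaggregate_daily_rainfall(daily_mm: float, storm_hours: int = 6) -> list:
    """Spread a daily total (mm) over a storm with a triangular profile."""
    peak = storm_hours / 2
    weights = [peak - abs(h + 0.5 - peak) for h in range(storm_hours)]
    total = sum(weights)
    hourly = [daily_mm * w / total for w in weights]
    return hourly + [0.0] * (24 - storm_hours)  # dry for the rest of the day

print(disaggregate_daily_rainfall(12.0)[:6])
# [0.67, 2.0, 3.33, 3.33, 2.0, 0.67] mm over the six storm hours
```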

Relevance: 30.00%

Abstract:

The 21-day experimental gingivitis model, an established noninvasive model of inflammation in response to increasing bacterial accumulation in humans, is designed to enable the study of both the induction and resolution of inflammation. Here, we have analyzed gingival crevicular fluid, an oral fluid comprising a serum transudate and tissue exudates, by LC-MS/MS using Fourier transform ion cyclotron resonance mass spectrometry and iTRAQ isobaric mass tags, to establish meta-proteomic profiles of inflammation-induced changes in proteins in healthy young volunteers. Across the course of experimentally induced gingivitis, we identified 16 bacterial and 186 human proteins. The abundances of the bacterial proteins identified did not vary temporally, but Fusobacterium outer membrane proteins were detected; Fusobacterium species have previously been associated with periodontal health or disease. The human proteins identified spanned a wide range of compartments (both extracellular and intracellular) and functions, including serum proteins, proteins displaying antibacterial properties, and proteins with functions associated with cellular transcription, DNA binding, the cytoskeleton, cell adhesion, and cilia. PolySNAP3 clustering software was used in a multilayered analytical approach. Clusters of proteins that associated with changes in the clinical parameters included neuronal and synapse-associated proteins.

Relevance: 30.00%

Abstract:

Although event-related potentials (ERPs) are widely used to study sensory, perceptual and cognitive processes, it remains unknown whether they are phase-locked signals superimposed upon the ongoing electroencephalogram (EEG) or result from phase-alignment of the EEG. Previous attempts to discriminate between these hypotheses have been unsuccessful, but here a new test is presented, based on the prediction that ERPs generated by phase-alignment will be associated with event-related changes in frequency, whereas evoked ERPs will not. Using empirical mode decomposition (EMD), which allows measurement of narrow-band changes in the EEG without predefining frequency bands, evidence was found for transient frequency slowing in recognition memory ERPs but not in simulated data derived from the evoked model. Furthermore, the timing of phase-alignment was frequency dependent, with the earliest alignment occurring at high frequencies. Based on these findings, the Firefly model was developed, which proposes that both evoked and induced power changes derive from frequency-dependent phase-alignment of the ongoing EEG. Simulated data derived from the Firefly model provided a close match with empirical data, and the model was able to account for (i) the shape and timing of ERPs at different scalp sites, (ii) the event-related desynchronization in alpha and synchronization in theta, and (iii) changes in the power density spectrum from the pre-stimulus baseline to the post-stimulus period. The Firefly model therefore provides not only a unifying account of event-related changes in the EEG but also a possible mechanism for cross-frequency information processing.
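
The frequency measurement at the heart of this test can be illustrated with the Hilbert transform, which EMD-based analyses commonly apply to each narrow-band component to obtain instantaneous frequency. The sketch below uses synthetic data with a built-in transient slowing; it illustrates the measurement idea only, not the study's actual pipeline.

```python
# Measuring instantaneous frequency of a narrow-band signal via the Hilbert
# transform, the kind of measurement used to detect transient frequency
# slowing. Synthetic data only; not the study's actual analysis code.
import numpy as np
from scipy.signal import hilbert

fs = 250.0                                  # sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)
# A 10 Hz oscillation that transiently slows to 7 Hz mid-epoch.
freq = np.where((t > 0.8) & (t < 1.2), 7.0, 10.0)
phase = 2 * np.pi * np.cumsum(freq) / fs    # integrate frequency to phase
x = np.sin(phase)

analytic = hilbert(x)                       # analytic signal
inst_phase = np.unwrap(np.angle(analytic))
inst_freq = np.diff(inst_phase) * fs / (2 * np.pi)
print(inst_freq[int(1.0 * fs)])             # ~7 Hz inside the slowed window
```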

Relevance: 30.00%

Abstract:

Algae are a potential new biomass source for energy production, but there is limited information on their pyrolysis and its kinetics. The main aim of this thesis is to investigate the pyrolytic behaviour and kinetics of Chlorella vulgaris, a green microalga. Under pyrolysis conditions, these microalgae show capabilities comparable to terrestrial biomass for energy and chemicals production, and evidence from a preliminary pyrolysis run in an intermediate pilot-scale reactor supports their applicability in existing pyrolysis reactors. Thermal decomposition of Chlorella vulgaris occurs over a wide temperature range (200-550°C) through multi-step reactions. To evaluate the kinetic parameters of the pyrolysis process, two approaches, isothermal and non-isothermal experiments, are applied in this work. A newly developed Pyrolysis-Mass Spectrometry (Py-MS) technique has the potential for isothermal measurements with a short run time and a small sample size requirement. The equipment and procedure are assessed through the kinetic evaluation of the thermal decomposition of polyethylene and lignocellulose-derived materials (cellulose, hemicellulose, and lignin). For the non-isothermal experiments, a Thermogravimetry-Mass Spectrometry (TG-MS) technique is used. Evolved gas analysis provides information on the evolution of volatiles, and these data lead to a multi-component model. The kinetic triplets (apparent activation energy, pre-exponential factor, and apparent reaction order) from the isothermal experiments are 57 kJ/mol, 5.32 (logA, min-1) and 1.21-1.45 for the low temperature region; 9 kJ/mol, 1.75 (logA, min-1) and 1.45 for the middle region; and 40 kJ/mol, 3.88 (logA, min-1) and 1.45-1.15 for the high temperature region. The kinetic parameters from the non-isothermal experiments vary with the different fractions in the algal biomass: apparent activation energies range over 73-207 kJ/mol, pre-exponential factors over 5-16 (logA, min-1), and apparent reaction orders over 1.32-2.00. The kinetic procedures reported in this thesis can be applied to other kinds of biomass and algae in future work.
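
For orientation, a kinetic triplet implies a conversion rate through the standard n-th order Arrhenius rate law, dα/dt = A·exp(−Ea/RT)·(1−α)^n, which is assumed here for illustration. The sketch below evaluates the reported low-temperature-region triplet at an example temperature and conversion.

```python
# Worked example of the rate implied by the reported low-temperature-region
# triplet (Ea = 57 kJ/mol, log10 A = 5.32 with A in min^-1, n in 1.21-1.45).
# The n-th order rate law is the standard form, assumed here for illustration.
import math

R = 8.314          # gas constant, J/(mol K)
Ea = 57e3          # apparent activation energy, J/mol
A = 10 ** 5.32     # pre-exponential factor, min^-1
n = 1.3            # apparent reaction order, within the reported range

def rate(alpha: float, T: float) -> float:
    """n-th order conversion rate d(alpha)/dt (min^-1) at temperature T (K)."""
    return A * math.exp(-Ea / (R * T)) * (1 - alpha) ** n

print(rate(0.1, 523.15))   # rate at 250 °C and 10% conversion, ~0.37 min^-1
```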

Relevance: 30.00%

Abstract:

Constructing and executing distributed systems that can adapt to their operating context in order to sustain provided services and service qualities are complex tasks. Managing the adaptation of multiple, interacting services is particularly difficult, since these services tend to be distributed across the system, interdependent, and sometimes tangled with other services. Furthermore, the exponential growth in the number of potential system configurations, derived from the variabilities of each service, needs to be handled. Current practices of writing low-level reconfiguration scripts as part of the system code to handle runtime adaptation are both error-prone and time-consuming, and make adaptive systems difficult to validate and evolve. In this paper, we propose to combine model-driven and aspect-oriented techniques to better cope with the complexities of adaptive system construction and execution, and to handle the exponential growth in the number of possible configurations. Combining these techniques allows us to use high-level domain abstractions, simplify the representation of variants and limit the combinatorial explosion of possible configurations. In our approach we also use models at runtime to generate the adaptation logic, by comparing the current configuration of the system to a composed model representing the configuration we want to reach. © 2008 Springer-Verlag Berlin Heidelberg.
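
The models-at-runtime step can be pictured as a model comparison that yields reconfiguration actions. The sketch below is a deliberately simplified illustration of that idea; the component names and the flat dictionary representation are invented stand-ins, not the paper's aspect-oriented machinery.

```python
# Hypothetical models-at-runtime comparison: derive adaptation actions by
# diffing the current configuration model against a target configuration.
# Component and variant names are invented for illustration.
current = {"logging": "verbose", "cache": "local", "codec": "h264"}
target = {"logging": "minimal", "cache": "distributed", "codec": "h264"}

def diff_configurations(current: dict, target: dict) -> list:
    """Generate reconfiguration actions from a model comparison."""
    actions = []
    for component, variant in target.items():
        if current.get(component) != variant:
            actions.append(f"reconfigure {component} -> {variant}")
    for component in current.keys() - target.keys():
        actions.append(f"remove {component}")
    return actions

print(diff_configurations(current, target))
# ['reconfigure logging -> minimal', 'reconfigure cache -> distributed']
```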

Relevance: 30.00%

Abstract:

We argue that, for certain constrained domains, elaborate model transformation technologies, implemented from scratch in general-purpose programming languages, are unnecessary for model-driven engineering; instead, lightweight configuration of commercial off-the-shelf productivity tools suffices. In particular, in the CancerGrid project, we have been developing model-driven techniques for the generation of software tools to support clinical trials. A domain metamodel captures the community's best practice in trial design. A scientist authors a trial protocol, modelling their trial by instantiating the metamodel; customized software artifacts to support trial execution are generated automatically from the scientist's model. The metamodel is expressed as an XML Schema, in such a way that it can be instantiated by completing a form to generate a conformant XML document. The same process works at a second level for trial execution: among the artifacts generated from the protocol are models of the data to be collected, and the clinician conducting the trial instantiates such models in reporting observations, again by completing a form to create a conformant XML document representing the data gathered during that observation. Simple standard form management tools are all that is needed. Our approach is applicable to a wide variety of information-modelling domains: not just clinical trials, but also electronic public sector computing, customer relationship management, document workflow, and so on. © 2012 Springer-Verlag.
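
The form-to-document step can be pictured as below: data entered through a generated form becomes a conformant XML instance document. The element and attribute names here are invented for illustration and are not CancerGrid's actual metamodel.

```python
# Hypothetical sketch of a trial observation, entered via a form, becoming
# an XML instance document. Names are invented, not CancerGrid's metamodel.
import xml.etree.ElementTree as ET

observation = ET.Element("observation", trial="TRIAL-001")
ET.SubElement(observation, "patientId").text = "P-042"
ET.SubElement(observation, "measurement",
              name="systolicBP", unit="mmHg").text = "128"

print(ET.tostring(observation, encoding="unicode"))
# <observation trial="TRIAL-001"><patientId>P-042</patientId>
#   <measurement name="systolicBP" unit="mmHg">128</measurement></observation>
```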

Relevance: 30.00%

Abstract:

This chapter explains a functional integral approach to an impurity in the Tomonaga–Luttinger model. The Tomonaga–Luttinger model of one-dimensional (1D) strongly correlated electrons gives a striking example of non-Fermi-liquid behavior. For simplicity, the chapter considers only a single-mode Tomonaga–Luttinger model, with one species of right- and left-moving electrons, thus omitting spin indices and eventually considering the simplest linearized model of a single-valley parabolic electron band. Standard operator bosonization is one of the most elegant methods developed in theoretical physics. The main advantage of bosonization, in either its standard or functional form, is that including the quartic electron–electron interaction does not substantially change the free action. The chapter demonstrates how to develop the formalism of bosonization based on the functional integral representation of observable quantities within the Keldysh formalism.
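
For reference, in one common convention (e.g., Giamarchi's), the bosonization identity for right/left movers and the resulting quadratic boson Hamiltonian read as follows; the chapter's Keldysh functional integral formulation differs in formalism but describes the same physics.

```latex
% Bosonization identity for right/left movers (r = +1 / -1) and the
% quadratic Tomonaga-Luttinger Hamiltonian, in one common convention.
\psi_{r}(x) \simeq \frac{U_{r}}{\sqrt{2\pi\alpha}}\,
  e^{i r k_{F} x}\, e^{-i\,(r\phi(x) - \theta(x))},
\qquad
H = \frac{1}{2\pi}\int \mathrm{d}x
  \left[ uK\,(\partial_{x}\theta)^{2}
       + \frac{u}{K}\,(\partial_{x}\phi)^{2} \right].
```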

Relevance: 30.00%

Abstract:

In this paper, the authors use an exponential generalized autoregressive conditional heteroscedastic (EGARCH) error-correction model (ECM), that is, an EGARCH-ECM, to estimate the pass-through effects of foreign exchange (FX) rates and producers' prices for 20 U.K. export sectors. The long-run adjustment of export prices to FX rates and producers' prices ranges from -1.02% (for the Textiles sector) to -17.22% (for the Meat sector). The contemporaneous pricing-to-market (PTM) coefficient ranges from -72.84% (for the Fuels sector) to -8.05% (for the Textiles sector). Short-run FX rate pass-through is not complete even after several months. Rolling EGARCH-ECMs show that the short- and long-run effects of FX rates and producers' prices fluctuate substantially, as do the asymmetry and volatility estimates, before equilibrium is achieved.
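
For reference, the conditional variance equation of a standard EGARCH(1,1) (Nelson, 1991) is shown below; the paper's exact specification, including the error-correction mean equation, may differ in detail.

```latex
% EGARCH(1,1) conditional variance equation (Nelson, 1991), with
% standardized innovation z_t = epsilon_t / sigma_t; gamma captures the
% asymmetric (leverage) response referred to in the abstract.
\ln\sigma_{t}^{2} = \omega + \beta\,\ln\sigma_{t-1}^{2}
  + \alpha\left(\lvert z_{t-1}\rvert - \mathbb{E}\lvert z_{t-1}\rvert\right)
  + \gamma\, z_{t-1},
\qquad z_{t} = \varepsilon_{t}/\sigma_{t}.
```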

Relevance: 30.00%

Abstract:

This paper provides the most comprehensive evidence to date on whether or not monetary aggregates are valuable for forecasting US inflation in the early to mid-2000s. We explore a wide range of different definitions of money, including different methods of aggregation and different collections of included monetary assets. We use non-linear, artificial intelligence techniques, namely recurrent neural networks, evolution strategies and kernel methods, in our forecasting experiment. In the experiment, these three methodologies compete to find the best-fitting US inflation forecasting models and are then compared to forecasts from a naive random walk model. The best models were non-linear autoregressive models based on kernel methods. Our findings do not provide much support for the usefulness of monetary aggregates in forecasting inflation. There is evidence in the literature that evolutionary methods can be used to evolve kernels; hence, our future work should combine the evolutionary and kernel methods to get the benefits of both.
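
The naive benchmark is simple enough to state in a few lines: next period's forecast is this period's value. The sketch below scores that rule on invented toy data; it states the benchmark only, not the paper's experiment.

```python
# Naive random walk benchmark: y_hat[t+1] = y[t]. Data are invented toy
# monthly inflation rates (%), used only to show the scoring mechanics.
inflation = [2.1, 2.3, 2.6, 2.4, 2.8, 3.0]

forecasts = inflation[:-1]      # forecast for t+1 is the value at t
actuals = inflation[1:]
rmse = (sum((f - a) ** 2 for f, a in zip(forecasts, actuals))
        / len(actuals)) ** 0.5
print(f"random-walk RMSE: {rmse:.3f}")
```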

Relevance: 30.00%

Abstract:

A travelling-wave model of a semiconductor optical amplifier-based non-linear loop mirror is developed to investigate the importance of travelling-wave effects and gain/phase dynamics in predicting device behaviour. A constant effective carrier recovery lifetime approximation is found to be reasonably accurate (±10%) over a wide range of control pulse energies. Based on this approximation, a heuristic model is developed for maximum computational efficiency. The models are applied to a particular configuration involving feedback.
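
For orientation, travelling-wave SOA models of this kind typically integrate a gain-dynamics rate equation of the following standard form along the device; it is shown here as background, not as the paper's exact formulation.

```latex
% Standard SOA gain-dynamics rate equation: the gain g recovers toward the
% small-signal gain g_0 with effective carrier lifetime tau_c and is
% depleted by the optical power P relative to the saturation energy E_sat.
\frac{\partial g(z,t)}{\partial t}
  = \frac{g_{0} - g(z,t)}{\tau_{c}}
  - \frac{g(z,t)\,P(z,t)}{E_{\mathrm{sat}}}
```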

Relevance: 30.00%

Abstract:

Transmembrane proteins play crucial roles in many important physiological processes. The intracellular domain of membrane proteins is key to their function, interacting with a wide variety of cytosolic proteins. It is therefore important to examine this interaction. A recently developed method to study these interactions, based on the use of liposomes as a model membrane, involves the covalent coupling of the cytoplasmic domains of membrane proteins to the liposome membrane. This allows for the analysis of interaction partners requiring both protein and membrane lipid binding. This thesis further establishes the liposome recruitment system and utilises it to examine the intracellular interactome of the amyloid precursor protein (APP), best known for its proteolytic cleavage, which results in the production and accumulation of amyloid beta fragments, the main constituent of amyloid plaques in Alzheimer's disease pathology. Despite this, the physiological function of APP remains largely unclear. Through the use of the proteo-liposome recruitment system, two novel interactions of APP's intracellular domain (AICD) are examined with a view to gaining greater insight into APP's physiological function. The first novel interaction is between AICD and the mTOR complex, a serine/threonine protein kinase that integrates signals from nutrients and growth factors. The kinase domain of mTOR binds directly to AICD, and the N-terminal amino acids of AICD are crucial for this interaction. The second novel interaction is between AICD and the endosomal PIKfyve complex, a lipid kinase involved in the production of phosphatidylinositol-3,5-bisphosphate (PI(3,5)P2) from phosphatidylinositol-3-phosphate, which has a role in controlling endosome dynamics. The scaffold protein Vac14 of the PIKfyve complex binds directly to AICD, and the C-terminus of AICD is important for its interaction with the PIKfyve complex. Using a recently developed intracellular PI(3,5)P2 probe, it is shown that APP controls the formation of PI(3,5)P2-positive vesicular structures and that the PIKfyve complex is involved in the trafficking and degradation of APP. Both of these novel APP interactors have important implications for both APP function and Alzheimer's disease. The proteo-liposome recruitment method is further validated through its use to examine the recruitment and assembly of the AP-2/clathrin coat from purified components onto two membrane proteins containing different sorting motifs. Taken together, this thesis highlights the proteo-liposome recruitment system as a valuable tool for the study of the intracellular interactome of membrane proteins. It allows the protein to be mimicked in its native configuration, thereby identifying weaker interactions that are not detected by more conventional methods, and also detecting interactions that are mediated by membrane phospholipids.

Relevance: 30.00%

Abstract:

Visible light communications (VLC) is a technology with enormous potential for a wide range of applications within next-generation transmission and broadcasting technologies. VLC offers simultaneous illumination and data communications by intensity-modulating the optical power emitted by LEDs operating in the visible range of the electromagnetic spectrum (~370-780 nm). The major challenge in VLC systems to date has been improving transmission speeds, considering the low bandwidths available with commercial LED devices. Thus, to improve spectral usage, the research community has increasingly turned to advanced modulation formats such as orthogonal frequency-division multiplexing. In this article we introduce a new modulation scheme into the VLC domain, multiband carrierless amplitude and phase modulation (m-CAP), and describe in detail its performance in the context of band-limited systems.
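
The core of m-CAP is a Hilbert pair of pulse-shaping filters per subband: a baseband pulse multiplied by cosine and sine carriers at the subband centre frequency. The sketch below builds such pairs from a raised-cosine pulse; the parameter values (m = 4, 15% roll-off, normalized subband frequencies) are illustrative assumptions, not the article's configuration.

```python
# Hypothetical m-CAP filter construction: each of m subbands gets an
# in-phase/quadrature Hilbert pair (raised-cosine pulse times cos/sin
# carriers). Parameters are illustrative only.
import numpy as np

def cap_filter_pair(fc: float, beta: float, span: int, sps: int):
    """Return the in-phase/quadrature filter pair for one CAP subband.

    fc is the subband centre frequency normalized to the symbol rate;
    beta is the raised-cosine roll-off; span is the filter length in
    symbols; sps is samples per symbol.
    """
    t = np.arange(-span / 2, span / 2, 1 / sps)          # time in symbols
    rc = (np.sinc(t) * np.cos(np.pi * beta * t)
          / (1 - (2 * beta * t) ** 2 + 1e-12))           # raised cosine
    return rc * np.cos(2 * np.pi * fc * t), rc * np.sin(2 * np.pi * fc * t)

m, sps = 4, 16
pairs = [cap_filter_pair(fc=(k + 0.5), beta=0.15, span=10, sps=sps)
         for k in range(m)]                              # m subband pairs
print(len(pairs), pairs[0][0].shape)                     # 4 (160,)
```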

Relevance: 30.00%

Abstract:

Surface quality is important in engineering, and a vital aspect of it is surface roughness, since roughness plays an important role in the wear resistance, ductility, tensile strength and fatigue strength of machined parts. This paper reports on a research study on the development of a geometrical model for surface roughness prediction when face milling with square inserts. The model is based on a geometrical analysis of the recreation of the tool trail left on the machined surface. The model has been validated with experimental data obtained for high-speed milling of an aluminum alloy (Al 7075-T7351) over a wide range of cutting speeds, feeds per tooth and axial depths of cut, and for two values of tool nose radius (0.8 mm and 2.5 mm), using the Taguchi method as the design of experiments. The experimental roughness was obtained by measuring the surface roughness of the milled surfaces with a non-contact profilometer. The developed model can be used for any combination of workpiece material and tool when tool flank wear is not considered, and is suitable for any tool diameter, number of teeth and tool nose radius. The results show that the developed model achieved excellent performance, with almost 98% accuracy in predicting the surface roughness when compared to the experimental data. © 2014 The Society of Manufacturing Engineers.
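
A feel for the geometry comes from the textbook first-order relation for a round tool nose, Rt ≈ f²/(8r), evaluated below for the study's two nose radii at an assumed feed of 0.2 mm per tooth. This simple relation is shown for orientation only; the paper's model reconstructs the full tool trail and is more general.

```python
# Textbook geometrical roughness relation for a round tool nose,
# R_t ≈ f^2 / (8 r). The 0.2 mm feed is an assumed example value.
def theoretical_peak_to_valley(feed_mm: float, nose_radius_mm: float) -> float:
    """Ideal peak-to-valley roughness (mm) from feed and nose radius."""
    return feed_mm ** 2 / (8 * nose_radius_mm)

for r in (0.8, 2.5):                            # the two nose radii studied
    rt_um = theoretical_peak_to_valley(0.2, r) * 1000
    print(f"r = {r} mm: Rt ~ {rt_um:.1f} um")   # 6.3 um and 2.0 um
```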

Relevance: 30.00%

Abstract:

Contemporary models of contrast integration across space assume that pooling operates uniformly over the target region. For sparse stimuli, where high-contrast regions are separated by areas containing no signal, this strategy may be sub-optimal because it pools more noise than signal as area increases. Little is known about the behaviour of human observers when detecting such stimuli. We performed an experiment in which three observers detected regular textures of various areas and six levels of sparseness. Stimuli were regular grids of horizontal grating micropatches, each 1 cycle wide. We varied the ratio of signals (marks) to gaps (spaces), with mark:space ratios ranging from 1:0 (a dense texture with no spaces) to 1:24. To compensate for the decline in sensitivity with increasing distance from fixation, we adjusted the stimulus contrast as a function of eccentricity based on previous measurements [Baldwin, Meese & Baker, 2012, J Vis, 12(11):23]. We used the resulting area summation functions and psychometric slopes to test several filter-based models of signal combination. A MAX model failed to predict the thresholds but did a good job on the slopes. Blanket summation of stimulus energy improved the threshold fit but did not predict the observed increase in slope with mark:space ratio. Our best model used a template matched to the sparseness of the stimulus and pooled the squared contrast signal over space. Templates for regular patterns have also recently been proposed to explain the regular appearance of slightly irregular textures (Morgan et al., 2012, Proc R Soc B, 279, 2754-2760).
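
The three pooling rules under test can be contrasted in a few lines. The sketch below applies them to a toy sparse response pattern, with marks carrying signal and spaces carrying only small noise; the numbers are invented stand-ins for filter responses, not the study's data.

```python
# Toy contrast-response vector for a sparse texture: even positions are
# "marks" (signal), odd positions are "spaces" (noise only). Numbers are
# invented for illustration.
import numpy as np

resp = np.array([0.9, 0.10, 0.8, -0.05, 1.1, 0.12])
template = np.array([1.0, 0.0, 1.0, 0.0, 1.0, 0.0])  # matched to sparseness

max_rule = resp.max()                            # MAX over filter responses
blanket_energy = np.sum(resp ** 2)               # pools the gap noise too
template_energy = np.sum(template * resp ** 2)   # pools signal locations only

print(max_rule, blanket_energy, template_energy)  # 1.1, ~2.69, ~2.66
```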