Abstract:
Acrylamide and pyrazine formation, as influenced by the incorporation of different amino acids, was investigated in sealed low-moisture asparagine-glucose model systems. Added amino acids, with the exception of glycine and cysteine and at an equimolar concentration to asparagine, increased the rate of acrylamide formation. The strong correlation between the unsubstituted pyrazine and acrylamide suggests the promotion of the formation of Maillard reaction intermediates, in particular glyoxal, as the determining mode of action. At increased amino acid concentrations, diverse effects were observed. The initial rates of acrylamide formation remained high for valine, alanine, phenylalanine, tryptophan, glutamine, and leucine, while a significant mitigating effect, as evident from the acrylamide yields after 60 min of heating at 160 degrees C, was observed for proline, tryptophan, glycine, and cysteine. The secondary amine-containing amino acids, proline and tryptophan, had the most profound mitigating effect on acrylamide after 60 min of heating. The relative importance of the competing effect of added amino acids on alpha-dicarbonyls and of acrylamide-amino acid alkylation reactions is discussed and accompanied by data on the relative formation rates of selected amino acid-AA adducts.
Abstract:
The effect of different sugars and glyoxal on the formation of acrylamide in low-moisture starch-based model systems was studied, and kinetic data were obtained. Glucose was more effective than fructose, tagatose, or maltose in acrylamide formation, and the importance of glyoxal as a key sugar fragmentation intermediate was confirmed. Glyoxal formation was greater in model systems containing asparagine and glucose than in those containing fructose. A solid phase microextraction GC-MS method was employed to determine quantitatively the formation of pyrazines in model reaction systems. Substituted pyrazine formation was more evident in model systems containing fructose; however, the unsubstituted homologue, which was the only pyrazine identified in the headspace of glyoxal-asparagine systems, was formed at higher yields when aldoses were used as the reducing sugar. Highly significant correlations were obtained for the relationship between pyrazine and acrylamide formation. The importance of the tautomerization of the asparagine-carbonyl decarboxylated Schiff base in the relative yields of pyrazines and acrylamide is discussed.
Abstract:
The evolvability of a software artifact is its capacity for producing heritable or reusable variants; the inverse quality is the artifact's inertia, or resistance to evolutionary change. Evolvability in software systems may arise from engineering and/or self-organising processes. We describe our 'Conditional Growth' simulation model of software evolution and show how it can be used to investigate evolvability from a self-organisation perspective. The model is derived from the Bak-Sneppen family of 'self-organised criticality' simulations. It shows good qualitative agreement with Lehman's 'laws of software evolution' and reproduces phenomena that have been observed empirically. The model suggests interesting predictions about the dynamics of evolvability and implies that much of the observed variability in software evolution can be accounted for by comparatively simple self-organising processes.
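For readers unfamiliar with the Bak-Sneppen family of simulations the abstract refers to, a minimal sketch of the classic model follows. This is only the generic Bak-Sneppen dynamics (least-fit site and its neighbours are randomised each step), not the authors' 'Conditional Growth' model; the parameters are illustrative.

```python
import random

def bak_sneppen(n_species=64, steps=10_000, seed=42):
    """Minimal Bak-Sneppen model: each step, the least-fit site and its
    two neighbours (periodic boundary) receive new random fitness values."""
    rng = random.Random(seed)
    fitness = [rng.random() for _ in range(n_species)]
    for _ in range(steps):
        i = min(range(n_species), key=fitness.__getitem__)  # least-fit site
        for j in (i - 1, i, (i + 1) % n_species):  # negative index wraps
            fitness[j] = rng.random()
    return fitness

final = bak_sneppen()
```

Despite its simplicity, this update rule drives the system toward a self-organised critical state with avalanche dynamics, which is the qualitative behaviour the abstract draws on.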
Abstract:
This paper identifies the major challenges in the area of pattern formation. The work is also motivated by the need to develop a single framework to surmount these challenges. A framework based on the control of macroscopic parameters is proposed. The issue of transformation of patterns is specifically considered. A definition of transformation and four special cases, namely elementary and geometrical transformations achieved by repositioning all or some robots in the pattern, are provided. Two feasible tools for pattern transformation are introduced: a macroscopic parameter method and a mathematical tool, the Moebius transformation, also known as the linear fractional transformation. The realization of the unifying framework, considering planning and communication, is reported.
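To make the second tool concrete, here is a minimal sketch of a Moebius (linear fractional) transformation applied to robot positions encoded as complex numbers. The coordinates and coefficients are invented for illustration and are not taken from the paper.

```python
import cmath

def moebius(z, a, b, c, d):
    """Linear fractional transformation f(z) = (a*z + b) / (c*z + d),
    well-defined only when a*d - b*c != 0."""
    if a * d - b * c == 0:
        raise ValueError("degenerate transformation: ad - bc must be non-zero")
    return (a * z + b) / (c * z + d)

# Hypothetical pattern: four robots evenly spaced on the unit circle.
robots = [cmath.exp(2j * cmath.pi * k / 4) for k in range(4)]

# With c = 0 the map reduces to rotation, scaling and translation,
# repositioning the whole pattern at once.
new_pattern = [moebius(z, a=0.5j, b=1, c=0, d=1) for z in robots]
```

Because Moebius transformations map circles and lines to circles and lines, a single choice of (a, b, c, d) can transform an entire circular formation coherently, which is what makes the tool attractive for pattern transformation.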
Abstract:
The SPE taxonomy of evolving software systems, first proposed by Lehman in 1980, is re-examined in this work. The primary concepts of software evolution are related to generic theories of evolution, particularly Dawkins' concept of a replicator, to the hermeneutic tradition in philosophy and to Kuhn's concept of paradigm. These concepts provide the foundations that are needed for understanding the phenomenon of software evolution and for refining the definitions of the SPE categories. In particular, this work argues that a software system should be defined as of type P if its controlling stakeholders have made a strategic decision that the system must comply with a single paradigm in its representation of domain knowledge. The proposed refinement of SPE is expected to provide a more productive basis for developing testable hypotheses and models about possible differences in the evolution of E- and P-type systems than is provided by the original scheme. Copyright (C) 2005 John Wiley & Sons, Ltd.
Abstract:
Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Up to a few years ago its appearance had only been associated with two- or three-body-scattering features (e.g. hail flares and mirror images) involving highly reflective surfaces. Recent atmospheric research aimed at a better understanding of the water cycle and the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment of high frequency radars in space. Examples are the TRMM 13.5 GHz, the CloudSat 94 GHz, the upcoming EarthCARE 94 GHz, and the GPM dual 13-35 GHz radars. These systems are able to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution and reduces the size and weight of the radar systems. On the other hand, higher frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g. raindrops or hail particles), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single scattering approximation may be problematic. Errors will be large when the radiation emitted from the radar, after interacting more than once with the medium, still contributes substantially to the received power. This is the case if the transport mean-free-path becomes comparable with the instrument footprint (determined by the antenna beam-width and the platform altitude). This situation resembles what has already been experienced in lidar observations, but with a predominance of wide- versus small-angle scattering events.
At millimeter wavelengths, hydrometeors diffuse radiation rather isotropically compared to the visible or near infrared region, where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by space-borne millimeter radars. This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis given to situations in which the multiple scattering contributions become comparable to or overwhelm the single scattering signal. We show evidence of multiple scattering effects from air-borne and from CloudSat observations, i.e. unique signatures which cannot be explained by single scattering theory. Ideas on how to identify and tackle multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined. This work represents a reference guide for studies focused on modeling the radiation transport and on interpreting data from high frequency space-borne radar systems that probe highly opaque scattering media such as thick ice clouds or precipitating clouds.
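The footprint comparison invoked above can be sketched numerically. A small-angle approximation gives footprint diameter ≈ platform altitude × beamwidth (in radians); the CloudSat-like numbers below are illustrative, not values from the paper.

```python
import math

def footprint_diameter(altitude_m, beamwidth_deg):
    """Small-angle approximation of the surface footprint of a
    nadir-pointing space-borne radar: diameter ~ altitude * beamwidth."""
    return altitude_m * math.radians(beamwidth_deg)

# Illustrative CloudSat-like numbers: ~700 km orbit, ~0.12 degree beamwidth,
# giving a footprint on the order of 1.5 km. Multiple scattering becomes
# important when the transport mean-free-path shrinks to this scale.
d = footprint_diameter(700e3, 0.12)
```

This back-of-the-envelope scale explains why the effect matters for kilometre-scale spaceborne footprints but is negligible for ground-based radars with metre-scale beams at close range.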
Abstract:
Progress is reported in the development of a new synthesis method for the design of filters and coatings for use in spaceborne infrared optics. This method uses the Golden Section optimization routine to search, over designated dielectric thin film combinations, for the coating design which fulfills the specified spectral requirements. The final design is the one that uses the fewest layers for the thin film materials given in the starting design. This synthesis method has successfully been used to design broadband anti-reflection coatings on infrared substrates. The 6 micrometer to 18 micrometer anti-reflection coating for the germanium optics of the HIRDLS instrument, to be flown on the NASA EOS-Chem satellite, is given as an example. By correctly defining the target function to describe any specific type of filter in the optimization part of the method, this synthesis method may be used to design general filters for use in spaceborne infrared optics.
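The Golden Section routine mentioned above is a standard one-dimensional optimizer. A minimal sketch follows; the quadratic merit function stands in for a coating's spectral error and is purely illustrative, not the paper's target function.

```python
import math

PHI = (math.sqrt(5) - 1) / 2  # inverse golden ratio, ~0.618

def golden_section_min(f, lo, hi, tol=1e-6):
    """Golden-section search for the minimum of a unimodal f on [lo, hi].
    Each iteration shrinks the bracket by the golden ratio."""
    a, b = lo, hi
    c = b - PHI * (b - a)
    d = a + PHI * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b, d = d, c          # minimum lies in [a, d]; reuse c
            c = b - PHI * (b - a)
        else:
            a, c = c, d          # minimum lies in [c, b]; reuse d
            d = a + PHI * (b - a)
    return (a + b) / 2

# Toy merit function with its minimum at x = 2.0.
x_min = golden_section_min(lambda x: (x - 2.0) ** 2 + 1.0, 0.0, 5.0)
```

In a synthesis method like the one described, such a routine would be applied to one design parameter (e.g. a layer thickness) at a time, with the target function measuring deviation from the required spectral response.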
Abstract:
One of the most pervading concepts underlying computational models of information processing in the brain is linear input integration of rate-coded uni-variate information by neurons. After a suitable learning process, this results in neuronal structures that statically represent knowledge as a vector of real-valued synaptic weights. Although this general framework has contributed to the many successes of connectionism, in this paper we argue that for all but the most basic of cognitive processes, a more complex, multi-variate dynamic neural coding mechanism is required - knowledge should not be spatially bound to a particular neuron or group of neurons. We conclude the paper with a discussion of a simple experiment that illustrates dynamic knowledge representation in a spiking neuron connectionist system.
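The static scheme the authors argue against can be sketched in a few lines: knowledge lives entirely in a fixed weight vector, and the output rate is a weighted sum of input rates. The weights, inputs and rectified-threshold output below are illustrative, not taken from the paper.

```python
def linear_neuron(inputs, weights, threshold=1.0):
    """Classic static rate-coded model: the neuron's 'knowledge' is the
    weight vector; output is a rectified weighted sum of input rates."""
    activation = sum(x * w for x, w in zip(inputs, weights))
    return max(0.0, activation - threshold)

# Two input rates of 1.0 through unit weights exceed the threshold by 1.0.
rate = linear_neuron([1.0, 1.0], [1.0, 1.0])
```

The paper's point is that in this model the representation is spatially bound to the weight vector of one unit, whereas spiking systems can carry information in the timing and dynamics of activity across many units.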
Abstract:
Information modelling is a topic that has been researched a great deal, but many questions around it remain unsolved. An information model is essential in the design of a database, which is the core of an information system. Currently, most databases deal only with information that represents facts, or asserted information. The ability to capture semantic aspects has to be improved, and other types of information, such as temporal and intentional information, should also be considered. Semantic Analysis, a method of information modelling, offers a way to handle these various aspects of information. It employs domain knowledge and communication acts as sources for information modelling. It lends itself to a uniform structure whereby semantic, temporal and intentional information can be captured, which builds a sound foundation for building a semantic temporal database.
Abstract:
With the advance of information technology capabilities, and the growing importance of human-computer interfaces within society, there has been a significant increase in research activity within the field of human-computer interaction (HCI). This paper summarizes some of the work undertaken to date, paying particular attention to methods applicable to on-line control and monitoring systems such as those employed by The National Grid Company plc.