61 results for MACROSCOPIC QUANTUM PHENOMENA IN MAGNETIC SYSTEMS
Abstract:
The soil fauna is often a neglected group in many large-scale studies of farmland biodiversity due to difficulties in extracting organisms efficiently from the soil. This study assesses the relative efficiency of the simple and cheap sampling method of handsorting against Berlese-Tullgren funnel and Winkler apparatus extraction. Soil cores were taken from grassy arable field margins and wheat fields in Cambridgeshire, UK, and the efficiencies of the three methods in assessing the abundances and species densities of soil macroinvertebrates were compared. Handsorting in most cases was as efficient at extracting the majority of the soil macrofauna as the Berlese-Tullgren funnel and Winkler bag methods, although it underestimated the species densities of the woodlice and adult beetles. There were no obvious biases among the three methods for the particular vegetation types sampled and no significant differences in the size distributions of the earthworms and beetles. Proportionally fewer damaged earthworms were recorded in larger (25 x 25 cm) soil cores than in smaller ones (15 x 15 cm). Handsorting has many benefits, including targeted extraction, minimum disturbance to the habitat and shorter sampling periods, and may be the most appropriate method for studies of farmland biodiversity when a high number of soil cores needs to be sampled. (C) 2008 Elsevier Masson SAS. All rights reserved.
Abstract:
The presence of a grass strip was found to be beneficial to soil macrofauna, increasing the species densities and abundances of earthworms, woodlice and staphylinid beetles. The biodiversity of the three main feeding groups - predators, soil ingesters and litter consumers - was also significantly higher in the grass strips than in the field edges without strips, indicating that establishment of grassy margins in arable fields may enhance ecosystem services such as soil fertility and pest control. The grass strip habitat contained a large number of species of soil macrofauna, being second only to hedgerow habitat, with 10% of the total species list for the farm found only within the margins. Of the rare species recorded on the farm, five of the nine were from the grass strips, four of which were found only there. This study shows that establishing grassy strips in the margins of arable fields increases the biodiversity of the soil macrofauna, both within fields (alpha diversity) and across the farm (beta diversity). (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
The horticultural industry was instrumental in the early development and exploitation of genetic techniques over a century ago. This review will describe recent advances in a range of in vitro methods and their application to plant breeding, with special emphasis on horticultural crops. These methods include improvements in the efficiency of haploid breeding techniques in many fruit and vegetable species using either microspore-derived or ovule-derived plants. Significant molecular information is now available to supplement these essentially empirical approaches, and this may enable the more predictable application of these technologies in previously intransigent crops. Similarly, there are now improved techniques for the isolation of somatic hybrids, by application of either in vitro fertilisation or the culture of excised ovules from interspecific crosses. In addition to examples taken from the traditional scientific literature, emphasis will also be given to the use of patent databases as a valuable source of information on recent novel technologies developed in the commercial world.
Abstract:
Acrylamide and pyrazine formation, as influenced by the incorporation of different amino acids, was investigated in sealed low-moisture asparagine-glucose model systems. Added amino acids, with the exception of glycine and cysteine and at an equimolar concentration to asparagine, increased the rate of acrylamide formation. The strong correlation between the unsubstituted pyrazine and acrylamide suggests the promotion of the formation of Maillard reaction intermediates, and in particular glyoxal, as the determining mode of action. At increased amino acid concentrations, diverse effects were observed. The initial rates of acrylamide formation remained high for valine, alanine, phenylalanine, tryptophan, glutamine, and leucine, while a significant mitigating effect, as evident from the acrylamide yields after 60 min of heating at 160 degrees C, was observed for proline, tryptophan, glycine, and cysteine. The secondary amine-containing amino acids, proline and tryptophan, had the most profound mitigating effect on acrylamide after 60 min of heating. The relative importance of the competing effect of added amino acids for alpha-dicarbonyls and acrylamide-amino acid alkylation reactions is discussed and accompanied by data on the relative formation rates of selected amino acid-AA adducts.
Abstract:
The effect of different sugars and glyoxal on the formation of acrylamide in low-moisture starch-based model systems was studied, and kinetic data were obtained. Glucose was more effective than fructose, tagatose, or maltose in acrylamide formation, whereas the importance of glyoxal as a key sugar fragmentation intermediate was confirmed. Glyoxal formation was greater in model systems containing asparagine and glucose rather than fructose. A solid phase microextraction GC-MS method was employed to determine quantitatively the formation of pyrazines in model reaction systems. Substituted pyrazine formation was more evident in model systems containing fructose; however, the unsubstituted homologue, which was the only pyrazine identified in the headspace of glyoxal-asparagine systems, was formed at higher yields when aldoses were used as the reducing sugar. Highly significant correlations were obtained for the relationship between pyrazine and acrylamide formation. The importance of the tautomerization of the asparagine-carbonyl decarboxylated Schiff base in the relative yields of pyrazines and acrylamide is discussed.
Abstract:
The SPE taxonomy of evolving software systems, first proposed by Lehman in 1980, is re-examined in this work. The primary concepts of software evolution are related to generic theories of evolution, particularly Dawkins' concept of a replicator, to the hermeneutic tradition in philosophy and to Kuhn's concept of paradigm. These concepts provide the foundations that are needed for understanding the phenomenon of software evolution and for refining the definitions of the SPE categories. In particular, this work argues that a software system should be defined as of type P if its controlling stakeholders have made a strategic decision that the system must comply with a single paradigm in its representation of domain knowledge. The proposed refinement of SPE is expected to provide a more productive basis for developing testable hypotheses and models about possible differences in the evolution of E- and P-type systems than is provided by the original scheme. Copyright (C) 2005 John Wiley & Sons, Ltd.
Abstract:
Although extensively studied within the lidar community, the multiple scattering phenomenon has always been considered a rare curiosity by radar meteorologists. Up to a few years ago its appearance had only been associated with two- or three-body-scattering features (e.g. hail flares and mirror images) involving highly reflective surfaces. Recent atmospheric research aimed at a better understanding of the water cycle and the role played by clouds and precipitation in affecting the Earth's climate has driven the deployment of high frequency radars in space. Examples are the TRMM 13.5 GHz, the CloudSat 94 GHz, the upcoming EarthCARE 94 GHz, and the GPM dual 13-35 GHz radars. These systems are able to detect the vertical distribution of hydrometeors and thus provide crucial feedback for radiation and climate studies. The shift towards higher frequencies increases the sensitivity to hydrometeors, improves the spatial resolution and reduces the size and weight of the radar systems. On the other hand, higher frequency radars are affected by stronger extinction, especially in the presence of large precipitating particles (e.g. raindrops or hail particles), which may eventually drive the signal below the minimum detection threshold. In such circumstances the interpretation of the radar equation via the single scattering approximation may be problematic. Errors will be large when the radiation emitted from the radar still contributes substantially to the received power after interacting more than once with the medium. This is the case if the transport mean free path becomes comparable with the instrument footprint (determined by the antenna beamwidth and the platform altitude). This situation resembles what has already been experienced in lidar observations, but with a predominance of wide- versus small-angle scattering events. At millimeter wavelengths, hydrometeors diffuse radiation rather isotropically compared to the visible or near-infrared region, where scattering is predominantly in the forward direction. A complete understanding of radiation transport modeling and data analysis methods under wide-angle multiple scattering conditions is mandatory for a correct interpretation of echoes observed by space-borne millimeter radars. This paper reviews the status of research in this field. Different numerical techniques currently implemented to account for higher order scattering are reviewed and their weaknesses and strengths highlighted. Examples of simulated radar backscattering profiles are provided, with particular emphasis given to situations in which the multiple scattering contributions become comparable to or overwhelm the single scattering signal. We show evidence of multiple scattering effects from airborne and CloudSat observations, i.e. unique signatures which cannot be explained by single scattering theory. Ideas on how to identify and tackle the multiple scattering effects are discussed. Finally, perspectives and suggestions for future work are outlined. This work represents a reference guide for studies focused on modeling the radiation transport and on interpreting data from high frequency space-borne radar systems that probe highly opaque scattering media such as thick ice clouds or precipitating clouds.
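The criterion highlighted above (multiple scattering becomes important when the transport mean free path is comparable with the instrument footprint) can be illustrated with a short back-of-the-envelope sketch. The numbers and the simplified diffusion-style formula below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Illustrative only: compare an approximate footprint with a transport mean free path.
# All numbers below are hypothetical placeholders, not data from the paper.

def footprint_diameter(altitude_m, beamwidth_deg):
    """Approximate footprint diameter: platform altitude times the 3 dB beamwidth (small-angle)."""
    return altitude_m * np.deg2rad(beamwidth_deg)

def transport_mean_free_path(k_sca, k_abs, g):
    """Common diffusion-style approximation: 1 / (k_abs + k_sca * (1 - g)), in metres for k in 1/m."""
    return 1.0 / (k_abs + k_sca * (1.0 - g))

# Hypothetical space-borne 94 GHz geometry and a dense ice-cloud layer
footprint = footprint_diameter(altitude_m=700e3, beamwidth_deg=0.11)   # ~1.3 km
l_tr = transport_mean_free_path(k_sca=2e-3, k_abs=5e-4, g=0.1)         # ~0.4 km

# Multiple scattering is likely to matter when the transport mean free path
# becomes comparable with (or smaller than) the instrument footprint.
print(f"footprint = {footprint/1e3:.1f} km, transport MFP = {l_tr/1e3:.1f} km")
print("multiple scattering likely significant:", l_tr <= footprint)
```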
Abstract:
Progress is reported in the development of a new synthesis method for the design of filters and coatings for use in spaceborne infrared optics. This method uses the Golden Section optimization routine to search, using designated dielectric thin film combinations, for the coating design that fulfills the specified spectral requirements. The final design is the one that uses the fewest layers for the given thin film materials in the starting design. This synthesis method has successfully been used to design broadband anti-reflection coatings on infrared substrates. The 6 micrometers to 18 micrometers anti-reflection coating for the germanium optics of the HIRDLS instrument, to be flown on the NASA EOS-Chem satellite, is given as an example. By correctly defining the target function to describe any specific type of filter in the optimization part of the method, this synthesis method may be used to design general filters for use in spaceborne infrared optics.
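As a minimal sketch of the search idea described above, the following code uses a golden-section search to choose a single layer thickness that minimises band-averaged reflectance at normal incidence. The refractive indices, wavelength band, and merit function are hypothetical placeholders; they do not reproduce the HIRDLS design or the multi-layer synthesis procedure of the paper.

```python
import numpy as np

PHI = (np.sqrt(5.0) - 1.0) / 2.0  # inverse golden ratio

def reflectance_single_layer(d_um, wl_um, n_layer, n_sub, n_inc=1.0):
    """Normal-incidence reflectance of one lossless layer via the characteristic matrix."""
    delta = 2.0 * np.pi * n_layer * d_um / wl_um
    B = np.cos(delta) + (1j * np.sin(delta) / n_layer) * n_sub
    C = 1j * n_layer * np.sin(delta) + np.cos(delta) * n_sub
    r = (n_inc * B - C) / (n_inc * B + C)
    return np.abs(r) ** 2

def merit(d_um, band_um, n_layer, n_sub):
    """Band-averaged reflectance: the quantity the search tries to minimise."""
    return float(np.mean([reflectance_single_layer(d_um, wl, n_layer, n_sub) for wl in band_um]))

def golden_section_minimise(f, a, b, tol=1e-4):
    """Standard golden-section search for a one-dimensional minimum on [a, b]."""
    c, d = b - PHI * (b - a), a + PHI * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            b = d
        else:
            a = c
        c, d = b - PHI * (b - a), a + PHI * (b - a)
    return 0.5 * (a + b)

# Hypothetical example: one layer (n = 2.0) on a germanium-like substrate (n = 4.0),
# antireflection merit averaged over a 6-18 micrometer band.
band = np.linspace(6.0, 18.0, 25)
best_d = golden_section_minimise(lambda d: merit(d, band, 2.0, 4.0), 0.0, 3.0)
print(f"optimum layer thickness = {best_d:.3f} micrometers")
```

In the paper's method the target function is redefined per filter type and the layer count is kept as small as possible; the sketch shows only the one-dimensional search step that such a procedure could build on.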
Abstract:
One of the most pervasive concepts underlying computational models of information processing in the brain is linear input integration of rate-coded univariate information by neurons. After a suitable learning process, this results in neuronal structures that statically represent knowledge as a vector of real-valued synaptic weights. Although this general framework has contributed to the many successes of connectionism, in this paper we argue that for all but the most basic cognitive processes, a more complex, multivariate dynamic neural coding mechanism is required; knowledge should not be spatially bound to a particular neuron or group of neurons. We conclude the paper with a discussion of a simple experiment that illustrates dynamic knowledge representation in a spiking neuron connectionist system.
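To make the contrast concrete, here is a minimal sketch of the static rate-coded view criticised above next to a simple leaky integrate-and-fire neuron whose state evolves in time. Both are standard textbook forms, not the paper's specific model.

```python
import numpy as np

def rate_coded_unit(rates, weights, bias=0.0):
    """Static view: knowledge as a fixed weight vector, output is a
    nonlinearity applied to a weighted sum of input firing rates."""
    return 1.0 / (1.0 + np.exp(-(np.dot(weights, rates) + bias)))

def leaky_integrate_and_fire(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Minimal spiking neuron: the membrane state evolves in time, so information
    can live in spike timing rather than only in static weights."""
    v, spike_times = 0.0, []
    for i, current in enumerate(input_current):
        v += dt / tau * (-v + current)   # leaky integration of the input
        if v >= v_thresh:
            spike_times.append(i * dt)   # record the spike time
            v = v_reset
    return spike_times

print(rate_coded_unit(np.array([5.0, 2.0]), np.array([0.3, -0.1])))
print(leaky_integrate_and_fire(np.full(200, 1.5)))
```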
Abstract:
Information modelling is a topic that has been researched a great deal, but many questions around it remain unsolved. An information model is essential in the design of a database, which is the core of an information system. Currently, most databases deal only with information that represents facts, or asserted information. The ability to capture semantic aspects has to be improved, and other types of information, such as temporal and intentional information, should also be considered. Semantic Analysis, a method of information modelling, has offered a way to handle various aspects of information. It employs domain knowledge and communication acts as sources for information modelling. It lends itself to a uniform structure whereby semantic, temporal and intentional information can be captured, which provides a sound foundation for building a semantic temporal database.
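As a rough illustration of the idea of one uniform structure holding semantic, temporal and intentional information together, the sketch below defines a hypothetical record type; the field names and the example are assumptions for illustration, not the schema prescribed by Semantic Analysis.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class SemanticTemporalRecord:
    """Hypothetical uniform unit: a semantic assertion, the period over which
    it holds, and the communication act (intention) that introduced it."""
    agent: str                  # entity the assertion is about
    predicate: str              # semantic content, e.g. "employed-by"
    related: str                # the related entity
    valid_from: date            # temporal information: start of validity
    valid_to: Optional[date]    # None means the assertion still holds
    speech_act: str             # intentional information: assert / request / promise ...

record = SemanticTemporalRecord("Alice", "employed-by", "Acme Ltd",
                                valid_from=date(2003, 5, 1), valid_to=None,
                                speech_act="assert")
print(record)
```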