47 results for Perturb and Observe

in CentAUR: Central Archive University of Reading - UK


Relevance: 80.00%

Abstract:

This chapter takes the example of local African beekeeping to explore how the forest can act as an important locus for men's work in Western Tanzania. Here we scrutinise how beekeeping enables its practitioners to situate themselves in the forest locality and observe how the social relationships, interactions and everyday practices entailed in living and working together are a means through which beekeepers generate a sense of belonging and identity. As part and parcel of this process, men transmit their skills to a new generation, thus reproducing themselves and their social environment.

Relevance: 80.00%

Abstract:

In this study, we examine the options market reaction to bank loan announcements for the population of US firms with traded options and loan announcements during 1996-2010. We find evidence of a significant options market reaction to bank loan announcements in terms of levels and changes in short-term implied volatility and its term structure: short-term implied volatility decreases significantly, and the slope of its term structure (TSIV) increases significantly, as a result of loan announcements. Our findings appear to be more pronounced for firms with greater information asymmetry, lower credit ratings, and loans with longer maturities and higher spreads. The evidence is consistent with loan announcements providing reassurance for investors in the short term; over longer time horizons, however, the increase in the TSIV slope indicates that investors become increasingly unsure about the potential risks of loan repayment or the uses of the proceeds.
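
As a hedged illustration of the term-structure measure discussed above (the abstract does not give the exact construction), one common proxy for the TSIV slope is the difference between long- and short-dated at-the-money implied volatilities, and the announcement effect is the change in that slope from the pre- to the post-announcement window. The option tenors and volatility numbers below are hypothetical.

    # Illustrative TSIV slope change around a loan announcement.
    # Implied volatilities are annualised, at-the-money; all numbers are hypothetical.

    def tsiv_slope(iv_short: float, iv_long: float) -> float:
        """Slope proxy: long-dated minus short-dated ATM implied volatility."""
        return iv_long - iv_short

    pre  = {"iv_30d": 0.42, "iv_365d": 0.38}   # short-term IV above long-term IV before the news
    post = {"iv_30d": 0.36, "iv_365d": 0.37}   # short-term IV falls after the announcement

    d_short_iv = post["iv_30d"] - pre["iv_30d"]                      # decrease in short-term IV
    d_slope = (tsiv_slope(post["iv_30d"], post["iv_365d"])
               - tsiv_slope(pre["iv_30d"], pre["iv_365d"]))          # increase in TSIV slope

    print(f"Change in 30-day implied volatility: {d_short_iv:+.2%}")
    print(f"Change in TSIV slope:                {d_slope:+.2%}")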

Relevance: 30.00%

Abstract:

We present the results of stable carbon and nitrogen isotope analysis of bone collagen for 155 individuals buried at the Later Medieval (13th to early 16th century AD) Gilbertine priory of St. Andrew, Fishergate in the city of York (UK). The data show significant variation in the consumption of marine foods between males and females as well as between individuals buried in different areas of the priory. Specifically, individuals from the crossing of the church and the cloister garth had consumed significantly less marine protein than those from other locations. Isotope data for four individuals diagnosed with diffuse idiopathic skeletal hyperostosis (DISH) are consistent with a diet rich in animal protein. We also observe that isotopic signals of individuals with perimortem sharp force trauma are unusual in the context of the Fishergate dataset. We discuss possible explanations for these patterns and suggest that there may have been a specialist hospital or a local tradition of burying victims of violent conflict at the priory. The results demonstrate how the integration of archaeological, osteological, and isotopic data can provide novel information about Medieval burial and society.

Relevance: 30.00%

Abstract:

Negative correlations between task performance in dynamic control tasks and verbalizable knowledge, as assessed by a post-task questionnaire, have been interpreted as dissociations that indicate two antagonistic modes of learning, one being “explicit”, the other “implicit”. This paper views the control tasks as finite-state automata and offers an alternative interpretation of these negative correlations. It is argued that “good controllers” observe fewer different state transitions and, consequently, can answer fewer post-task questions about system transitions than can “bad controllers”. Two experiments demonstrate the validity of the argument by showing the predicted negative relationship between control performance and the number of explored state transitions, and the predicted positive relationship between the number of explored state transitions and questionnaire scores. However, the experiments also elucidate important boundary conditions for the critical effects. We discuss the implications of these findings, and of other problems arising from the process control paradigm, for conclusions about implicit versus explicit learning processes.
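
To make the state-transition argument concrete, the sketch below (an illustrative toy, not the authors' control tasks) drives a small random finite-state automaton with two strategies: a "good controller" that repeats a short, fixed input routine, and a "bad controller" that chooses inputs at random. Counting the distinct (state, input, next-state) triples each one encounters shows how better control can coincide with exposure to fewer of the system's transitions, and hence fewer answerable post-task questions.

    import random

    random.seed(1)

    # A small random finite-state automaton: transition[state][input] -> next state.
    STATES, INPUTS = range(6), range(3)
    transition = {s: {a: random.choice(STATES) for a in INPUTS} for s in STATES}

    def run(policy, steps=100, start=0):
        """Run a policy and return the set of distinct transitions it observes."""
        seen, state = set(), start
        for t in range(steps):
            action = policy(state, t)
            nxt = transition[state][action]
            seen.add((state, action, nxt))
            state = nxt
        return seen

    def good(state, t):      # fixed short routine: narrow, repetitive exploration
        return (0, 1, 0)[t % 3]

    def bad(state, t):       # random inputs: broad exploration of the automaton
        return random.choice(INPUTS)

    print("distinct transitions, good controller:", len(run(good)))
    print("distinct transitions, bad controller: ", len(run(bad)))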

Relevance: 30.00%

Abstract:

The spatial distribution of aerosol chemical composition and the evolution of the Organic Aerosol (OA) fraction are investigated based upon airborne measurements of aerosol chemical composition in the planetary boundary layer across Europe. Sub-micron aerosol chemical composition was measured using a compact Time-of-Flight Aerosol Mass Spectrometer (cToF-AMS). A range of sampling conditions were evaluated, including relatively clean background conditions, polluted conditions in North-Western Europe and the near-field to far-field outflow from such conditions. Ammonium nitrate and OA were found to be the dominant chemical components of the sub-micron aerosol burden, with mass fractions ranging from 20-50% each. Ammonium nitrate was found to dominate in North-Western Europe during episodes of high pollution, reflecting the enhanced NOx and ammonia sources in this region. OA was ubiquitous across Europe and concentrations generally exceeded sulphate by 30-160%. A factor analysis of the OA burden was performed in order to probe the evolution across this large range of spatial and temporal scales. Two separate Oxygenated Organic Aerosol (OOA) components were identified: one representing an aged OOA, termed Low Volatility-OOA (LV-OOA), and another representing a fresher OOA, termed Semi-Volatile-OOA (SV-OOA), on the basis of their mass spectral similarity to previous studies. The factors derived from different flights were not chemically identical but rather reflect the range of OA composition sampled during a particular flight. Significant chemical processing of the OA was observed downwind of major sources in North-Western Europe, with the LV-OOA component becoming increasingly dominant as the distance from source and the degree of photochemical processing increased. The measurements suggest that the ageing of OA can be viewed as a continuum, with a progression from a less oxidised, semi-volatile component to a highly oxidised, less-volatile component. Substantial amounts of pollution were observed far downwind of continental Europe, with OA and ammonium nitrate being the major constituents of the sub-micron aerosol burden. Such anthropogenically perturbed air masses can significantly perturb regional climate far downwind of major source regions.
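
The factor analysis referred to above is, in AMS studies, typically a positive matrix factorisation of the time series of organic mass spectra. As a hedged stand-in, the sketch below uses scikit-learn's non-negative matrix factorisation to show how a spectra matrix can be decomposed into two factor profiles and their time-varying contributions; the two-factor synthetic data are purely illustrative and do not represent the campaign measurements.

    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)

    # Synthetic data: 200 time steps x 50 m/z channels built from two "true" profiles,
    # standing in for an LV-OOA-like and an SV-OOA-like factor.
    profiles = rng.random((2, 50))
    contributions = rng.random((200, 2))
    spectra = contributions @ profiles + 0.01 * rng.random((200, 50))

    # Decompose into two non-negative factors (W: time series, H: factor mass spectra).
    model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(spectra)   # factor contributions over time
    H = model.components_              # factor profiles (pseudo mass spectra)

    print("reconstruction error:", round(model.reconstruction_err_, 3))
    print("mean factor contributions:", W.mean(axis=0))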

Relevance: 30.00%

Abstract:

By making use of TOVS Path-B satellite retrievals and ECMWF reanalyses, correlations between bulk microphysical properties of large-scale semi-transparent cirrus (visible optical thickness between 0.7 and 3.8) and thermodynamic and dynamic properties of the surrounding atmosphere have been studied on a global scale. These clouds constitute about half of all high clouds. The global averages (from 60°N to 60°S) of mean ice crystal diameter, De, and ice water path (IWP) of these clouds are 55 μm and 30 g m−2, respectively. The IWP of these cirrus increases slightly with cloud-top temperature, whereas De of cold cirrus does not depend on this parameter. Correlations between De and IWP of large-scale cirrus seem to differ between the midlatitudes and the tropics. However, we observe in general stronger correlations of De and IWP with atmospheric humidity and winds deduced from the ECMWF reanalyses: De and IWP both increase with increasing atmospheric water vapour. There is also a clear distinction between different dynamical situations: in humid situations, IWP is on average about 10 g m−2 larger in regions with strong large-scale vertical updraft only than in regions with strong large-scale horizontal winds only, whereas the mean De of cold large-scale cirrus decreases by about 10 μm if both strong large-scale updraft and horizontal winds are present.

Relevance: 30.00%

Abstract:

Aircraft OH and HO2 measurements made over West Africa during the AMMA field campaign in summer 2006 have been investigated using a box model constrained to observations of long-lived species and physical parameters. "Good" agreement was found for HO2 (modelled to observed gradient of 1.23 ± 0.11). However, the model significantly overpredicts OH concentrations. The reasons for this are not clear, but may reflect instrumental instabilities affecting the OH measurements. Within the model, HOx concentrations in West Africa are controlled by relatively simple photochemistry, with production dominated by ozone photolysis and reaction of O(1D) with water vapour, and loss processes dominated by HO2 + HO2 and HO2 + RO2. Isoprene chemistry was found to influence forested regions. In contrast to several recent field studies in very low NOx and high isoprene environments, we do not observe any dependence of model success for HO2 on isoprene and attribute this to efficient recycling of HOx through RO2 + NO reactions under the moderate NOx concentrations (5–300 ppt NO in the boundary layer, median 76 ppt) encountered during AMMA. This suggests that some of the problems with understanding the impact of isoprene on atmospheric composition may be limited to the extreme low range of NOx concentrations.
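
As a hedged sketch of the "relatively simple photochemistry" described above, a steady-state HO2 concentration can be estimated by balancing HOx production from ozone photolysis against the dominant self- and cross-reaction losses. The partitioning factor and rate coefficients below are generic placeholders rather than values from the study.

    \[ P_{\mathrm{HO_x}} = 2\, f\, j_{\mathrm{O(^1D)}}\,[\mathrm{O_3}], \qquad
       f = \frac{k_{\mathrm{H_2O}}[\mathrm{H_2O}]}
                {k_{\mathrm{H_2O}}[\mathrm{H_2O}] + k_{\mathrm{N_2}}[\mathrm{N_2}] + k_{\mathrm{O_2}}[\mathrm{O_2}]} \]

    \[ L_{\mathrm{HO_x}} \approx 2 k_1 [\mathrm{HO_2}]^2 + k_2 [\mathrm{HO_2}][\mathrm{RO_2}], \qquad
       P_{\mathrm{HO_x}} = L_{\mathrm{HO_x}} \;\Rightarrow\;
       [\mathrm{HO_2}] = \frac{-k_2[\mathrm{RO_2}] + \sqrt{k_2^2[\mathrm{RO_2}]^2 + 8 k_1 P_{\mathrm{HO_x}}}}{4 k_1} \]

Here j_O(1D) is the ozone photolysis frequency, f is the fraction of O(1D) that reacts with water vapour rather than being quenched, and k1, k2 are generic rate coefficients; how the RO2 term is counted depends on whether RO2 is included in the HOx family, so this balance is only a sketch.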

Relevance: 30.00%

Abstract:

A full assessment of para-virtualization is important, because without knowledge of the various overheads, users cannot judge whether using virtualization is a good idea or not. In this paper we are interested in assessing the overheads of running various benchmarks on bare metal as well as under para-virtualization. The idea is to see what the overheads of para-virtualization are, as well as the overheads of turning on monitoring and logging. The knowledge gained from assessing various benchmarks on these different systems will help a range of users understand the use of virtualization systems. In this paper we assess the overheads of using Xen, VMware, KVM and Citrix (see Table 1). These virtualization systems are used extensively by cloud users. We use various Netlib benchmarks, which have been developed by the University of Tennessee at Knoxville (UTK) and Oak Ridge National Laboratory (ORNL). In order to assess these virtualization systems, we run the benchmarks on bare metal, then under para-virtualization, and finally with monitoring and logging turned on. The latter is important because users are interested in the Service Level Agreements (SLAs) used by cloud providers, and logging is a means of assessing the services bought and used from commercial providers. We assess the virtualization systems on three different platforms: the Thamesblue supercomputer, the Hactar cluster and an IBM JS20 blade server (see Table 2), all of which are servers available at the University of Reading.

A functional virtualization system is multi-layered and is driven by its privileged components. Virtualization systems can host multiple guest operating systems, each of which runs in its own domain, and the system schedules virtual CPUs and memory within each Virtual Machine (VM) to make the best use of the available resources. The guest operating system schedules each application accordingly. Virtualization can be deployed as full virtualization or para-virtualization. Full virtualization provides a total abstraction of the underlying physical system and creates a new virtual system in which the guest operating systems can run. No modifications are needed in the guest OS or application; the guest OS or application is not aware of the virtualized environment and runs normally. Para-virtualization requires modification of the guest operating systems that run on the virtual machines; these guest operating systems are aware that they are running on a virtual machine, and this provides near-native performance. Both para-virtualization and full virtualization can be deployed across various virtualized systems. Para-virtualization is OS-assisted virtualization: some modifications are made in the guest operating system to enable better performance. In this kind of virtualization, the guest operating system is aware that it is running on virtualized hardware and not on the bare hardware. In para-virtualization, the device drivers in the guest operating system coordinate with the device drivers of the host operating system and reduce the performance overheads. The use of para-virtualization [0] is intended to avoid the bottleneck associated with slow hardware interrupts that exists when full virtualization is employed.

It has been shown [0] that para-virtualization does not impose a significant performance overhead in high performance computing, and this in turn has implications for the use of cloud computing for hosting HPC applications. The "apparent" improvement in virtualization has led us to formulate the hypothesis that certain classes of HPC applications should be able to execute in a cloud environment with minimal performance degradation. In order to support this hypothesis, it is first necessary to define exactly what is meant by a "class" of application, and secondly it is necessary to observe application performance both within a virtual machine and when executing on bare hardware. A further potential complication is the need for cloud service providers to support Service Level Agreements (SLAs), so that system utilisation can be audited.
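
A minimal sketch of the overhead calculation implied above: run the same benchmark command on bare metal and inside the para-virtualized guest (and again with monitoring/logging enabled), then report the slowdown relative to bare metal. The benchmark binary name and the guest hostname are hypothetical stand-ins, not part of the paper's actual harness.

    import subprocess, time, statistics

    def mean_runtime(cmd, repeats=5):
        """Run a benchmark command several times and return its mean wall-clock time in seconds."""
        times = []
        for _ in range(repeats):
            start = time.perf_counter()
            subprocess.run(cmd, check=True, capture_output=True)
            times.append(time.perf_counter() - start)
        return statistics.mean(times)

    def overhead_pct(baseline_s, candidate_s):
        """Percentage slowdown of a candidate configuration relative to the bare-metal baseline."""
        return 100.0 * (candidate_s - baseline_s) / baseline_s

    if __name__ == "__main__":
        benchmark = ["./linpack_bench"]    # hypothetical Netlib-style benchmark binary
        bare = mean_runtime(benchmark)     # run on the bare-metal host
        # Re-run the same command inside the para-virtualized guest (e.g. over SSH),
        # and once more with monitoring/logging switched on, then compare:
        # virt = mean_runtime(["ssh", "guest-vm", "./linpack_bench"])
        # print(f"para-virtualization overhead: {overhead_pct(bare, virt):.1f}%")
        print(f"bare-metal mean runtime: {bare:.2f}s")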

Relevance: 30.00%

Abstract:

Rationalizing non-participation as a resource deficiency in the household, this paper identifies strategies for milk-market development in the Ethiopian highlands. The additional amounts of covariates required for positive marketable surplus - 'distances-to-market' - are computed from a model in which production and sales are correlated; sales are left-censored at some unobserved threshold; production efficiencies are heterogeneous; and the data are in the form of a panel. Incorporating these features into the modelling exercise is important because they are fundamental to the data-generating environment. There are four reasons. First, because production and sales decisions are enacted within the same household, both decisions are affected by the same exogenous shocks, and production and sales are therefore likely to be correlated. Second, because selling involves time and time is arguably the most important resource available to a subsistence household, the minimum sales amount is not zero but, rather, some unobserved threshold that lies beyond zero. Third, the potential existence of heterogeneous abilities in management, ones that lie latent from the econometrician's perspective, suggests that production efficiencies should be permitted to vary across households. Fourth, we observe a single set of households during multiple visits in a single production year. The results convey clearly that institutional and production innovations alone are insufficient to encourage participation. Market-precipitating innovation requires complementary inputs, especially improvements in human capital and reductions in risk. Copyright (c) 2008 John Wiley & Sons, Ltd.
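
The data-generating environment described above can be illustrated with a small simulation (this is not the authors' estimator): household production and sales shocks are drawn with positive correlation, and sales are recorded only when the latent marketable surplus clears an unobserved, strictly positive threshold. All parameter values are arbitrary.

    import numpy as np

    rng = np.random.default_rng(42)
    n_households, n_visits = 200, 4

    # Correlated household-level shocks to production and to the propensity to sell.
    cov = [[1.0, 0.6], [0.6, 1.0]]
    shocks = rng.multivariate_normal([0.0, 0.0], cov, size=(n_households, n_visits))

    efficiency = rng.normal(0.0, 0.5, size=n_households)         # latent, heterogeneous ability
    production = 5.0 + efficiency[:, None] + shocks[..., 0]      # output per visit
    latent_sales = 0.5 * production - 1.0 + shocks[..., 1]       # latent marketable surplus

    threshold = 1.5                                              # unobserved minimum sale worth the trip
    observed_sales = np.where(latent_sales > threshold, latent_sales, 0.0)

    print("share of visits with zero recorded sales:",
          round(float((observed_sales == 0).mean()), 2))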

Relevance: 30.00%

Abstract:

Genealogical data have been used very widely to construct indices with which to examine the contribution of plant breeding programmes to the maintenance and enhancement of genetic resources. In this paper we use such indices to examine changes in the genetic diversity of the winter wheat crop in England and Wales between 1923 and 1995. We find that, except for one period characterized by the dominance of imported varieties, the genetic diversity of the winter wheat crop has been remarkably stable. This agrees with many studies of plant breeding programmes elsewhere. However, underlying the stability of the winter wheat crop is accelerating varietal turnover without any significant diversification of the genetic resources used. Moreover, the changes we observe are more directly attributable to changes in the varietal shares of the area under winter wheat than to the genealogical relationship between the varieties sown. We argue, therefore, that while genealogical indices reflect how well plant breeders have retained and exploited the resources with which they started, these indices suffer from a critical limitation. They do not reflect the proportion of the available range of genetic resources which has been effectively utilized in the breeding programme: complex crosses of a given set of varieties can yield high indices, and yet disguise the loss (or non-utilization) of a large proportion of the available genetic diversity.
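
The abstract does not spell out the index used, but a standard genealogical measure of crop diversity is one minus the area-weighted average coefficient of parentage (COP) between the varieties sown. The sketch below computes such an index for a hypothetical COP matrix and set of varietal area shares, which is also where the effect of changing shares (rather than pedigrees) shows up.

    import numpy as np

    # Hypothetical coefficient-of-parentage matrix for four varieties (1.0 on the diagonal).
    cop = np.array([
        [1.00, 0.25, 0.10, 0.05],
        [0.25, 1.00, 0.30, 0.10],
        [0.10, 0.30, 1.00, 0.20],
        [0.05, 0.10, 0.20, 1.00],
    ])

    def weighted_diversity(cop: np.ndarray, shares: np.ndarray) -> float:
        """Diversity = 1 - sum_ij s_i s_j COP_ij (area-share-weighted mean kinship)."""
        shares = shares / shares.sum()
        return 1.0 - float(shares @ cop @ shares)

    even_shares = np.array([0.25, 0.25, 0.25, 0.25])
    skewed_shares = np.array([0.70, 0.20, 0.05, 0.05])   # one variety dominates the area

    print("diversity, even shares:  ", round(weighted_diversity(cop, even_shares), 3))
    print("diversity, skewed shares:", round(weighted_diversity(cop, skewed_shares), 3))

With the same pedigrees, the skewed area shares give a markedly lower index, illustrating why changes in varietal shares can drive the index independently of the genealogical relationships between the varieties sown.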

Relevance: 30.00%

Abstract:

Nested clade phylogeographic analysis (NCPA) is a popular method for reconstructing the demographic history of spatially distributed populations from genetic data. Although some parts of the analysis are automated, there is no unique and widely followed algorithm for doing this in its entirety, beginning with the data, and ending with the inferences drawn from the data. This article describes a method that automates NCPA, thereby providing a framework for replicating analyses in an objective way. To do so, a number of decisions need to be made so that the automated implementation is representative of previous analyses. We review how the NCPA procedure has evolved since its inception and conclude that there is scope for some variability in the manual application of NCPA. We apply the automated software to three published datasets previously analyzed manually and replicate many details of the manual analyses, suggesting that the current algorithm is representative of how a typical user will perform NCPA. We simulate a large number of replicate datasets for geographically distributed, but entirely random-mating, populations. These are then analyzed using the automated NCPA algorithm. Results indicate that NCPA tends to give a high frequency of false positives. In our simulations we observe that 14% of the clades give a conclusive inference that a demographic event has occurred, and that 75% of the datasets have at least one clade that gives such an inference. This is mainly due to the generation of multiple statistics per clade, of which only one is required to be significant to apply the inference key. We survey the inferences that have been made in recent publications and show that the most commonly inferred processes (restricted gene flow with isolation by distance and contiguous range expansion) are those that are commonly inferred in our simulations. However, published datasets typically yield a richer set of inferences with NCPA than obtained in our random-mating simulations, and further testing of NCPA with models of structured populations is necessary to examine its accuracy.
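
The multiple-testing mechanism described above can be illustrated with a back-of-the-envelope calculation, assuming (unrealistically) independent test statistics; the numbers of statistics and clades below are illustrative, not those of the study. If each clade yields several statistics and a single significant one is enough to trigger an inference, the per-clade and per-dataset false-positive rates inflate quickly.

    # Probability of at least one "significant" result under the null,
    # assuming independent tests at level alpha (an idealisation).
    alpha = 0.05
    stats_per_clade = 3        # illustrative number of distance statistics per clade
    clades_per_dataset = 10    # illustrative number of nested clades per dataset

    p_clade = 1 - (1 - alpha) ** stats_per_clade           # >= 1 statistic significant in a clade
    p_dataset = 1 - (1 - p_clade) ** clades_per_dataset    # >= 1 clade triggers an inference

    print(f"per-clade false-positive rate:   {p_clade:.1%}")
    print(f"per-dataset false-positive rate: {p_dataset:.1%}")

With these illustrative settings the rates happen to land near the 14% and 75% figures reported above, which is consistent with the generation of multiple statistics per clade being the main driver of the false positives.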

Relevance: 30.00%

Abstract:

Inelastic neutron scattering spectroscopy has been used to observe and characterise hydrogen on the carbon component of a Pt/C catalyst. INS provides the complete vibration spectrum of coronene, regarded as a molecular model of a graphite layer. The vibrational modes are assigned with the aid of ab initio density functional theory calculations and the INS spectra by the a-CLIMAX program. A spectrum for which the H modes of coronene have been computationally suppressed, a carbon-only coronene spectrum, is a better representation of the spectrum of a graphite layer than is coronene itself. Dihydrogen dosing of a Pt/C catalyst caused amplification of the surface modes of carbon, an effect described as H riding on carbon. From the enhancement of the low energy carbon modes (100-600 cm(-1)) it is concluded that spillover hydrogen becomes attached to dangling bonds at the edges of graphitic regions of the carbon support. (C) 2003 Elsevier Science B.V. All rights reserved.

Relevance: 30.00%

Abstract:

The aim of this introductory paper, and of this special issue of Cognition and Emotion, is to stimulate debate about theoretical issues that will inform child anxiety research in the coming years. Papers included in this special issue have arisen from an Economic and Social Research Council (ESRC, UK) funded seminar series, which we called Child Anxiety Theory and Treatment (CATTS). We begin with an overview of the CATTS project before discussing (1) the application of adult models of anxiety to children, and (2) the role of parents in child anxiety. We explore the utility of adult models of anxiety for child populations before discussing the problems that are associated with employing them uncritically in this context. The study of anxiety in children provides the opportunity to observe the trajectory of anxiety and to identify variables that causally influence its development. Parental influences are of particular interest and new and imaginative strategies are required to isolate the complex network of causal relationships therein. We conclude by suggesting that research into the causes and developmental course of anxiety in children should be developed further. We also propose that, although much is known about the role of parents in the development of anxiety, it would be useful for research in this area to move towards an examination of the specific processes involved. We hope that these views represent a constructive agenda for people in the field to consider when planning future research.

Relevance: 30.00%

Abstract:

Establishing a molecular-level understanding of enantioselectivity and chiral resolution at organic-inorganic interfaces is a key challenge in the field of heterogeneous catalysis. As a model system, we investigate the adsorption geometry of serine on Cu{110} using a combination of low-energy electron diffraction (LEED), scanning tunneling microscopy (STM), X-ray photoelectron spectroscopy (XPS), and near-edge X-ray absorption fine structure (NEXAFS) spectroscopy. The chirality of enantiopure chemisorbed layers, where serine is in its deprotonated (anionic) state, is expressed at three levels: (i) the molecules form dimers whose orientation with respect to the substrate depends on the molecular chirality, (ii) dimers of l- and d-enantiomers aggregate into superstructures with chiral (−1 2; 4 0) lattices, respectively, which are mirror images of each other, and (iii) small islands have elongated shapes with the dominant direction depending on the chirality of the molecules. Dimer and superlattice formation can be explained in terms of intra- and interdimer bonds involving carboxylate, amino, and β−OH groups. The stability of the layers increases with the size of ordered islands. In racemic mixtures, we observe chiral resolution into small ordered enantiopure islands, which appears to be driven by the formation of homochiral dimer subunits and the directionality of interdimer hydrogen bonds. These islands show the same enantiospecific elongated shapes as those in low-coverage enantiopure layers.

Relevance: 30.00%

Abstract:

We performed atomistic molecular dynamics simulations of anionic and cationic micelles in the presence of poly(ethylene oxide) (PEO) to understand why nonionic water-soluble polymers such as PEO interact strongly with anionic micelles but only weakly with cationic micelles. Our micelles include sodium n-dodecyl sulfate (SDS), n-dodecyl trimethylammonium chloride (DTAC), n-dodecyl ammonium chloride (DAC), and micelles in which we artificially reverse the sign of partial charges in SDS and DTAC. We observe that the polymer interacts hydrophobically with anionic SDS but only weakly with cationic DTAC and DAC, in agreement with experiment. However, the polymer also interacts with the artificial anionic DTAC but fails to interact hydrophobically with the artificial cationic SDS, illustrating that large headgroup size does not explain the weak polymer interaction with cationic micelles. In addition, we observe through simulation that this preference for interaction with anionic micelles still exists in a dipolar "dumbbell" solvent, indicating that water structure and hydrogen bonding alone cannot explain this preferential interaction. Our simulations suggest that direct electrostatic interactions between the micelle and polymer explain the preference for interaction with anionic micelles, even though the polymer overall carries no net charge. This is possible given the asymmetric distribution of negative charges on smaller atoms and positive charges on larger units in the polymer chain.