13 results for semi free-choice
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
A common method for testing preference for objects is to determine which of a pair of objects is approached first in a paired-choice paradigm. In comparison, many studies of preference for environmental enrichment (EE) devices have used paradigms in which total time spent with each of a pair of objects is used to determine preference. While each of these paradigms gives a specific measure of the preference for one object in comparison to another, neither method allows comparisons between multiple objects simultaneously. Since it is possible that several EE objects would be placed in a cage together to improve animal welfare, it is important to determine measures for rats' preferences in conditions that mimic this potential home cage environment. While it would be predicted that each type of measure would produce similar rankings of objects, this has never been tested empirically. In this study, we compared two paradigms: EE objects were either presented in pairs (paired-choice comparison) or four objects were presented simultaneously (simultaneous presentation comparison). We used frequency of first interaction and time spent with each object to rank the objects in the paired-choice experiment, and time spent with each object to rank the objects in the simultaneous presentation experiment. We also considered the behaviours elicited by the objects to determine if these might be contributing to object preference. We demonstrated that object ranking based on time spent with objects from the paired-choice experiment predicted object ranking in the simultaneous presentation experiment. Additionally, we confirmed that behaviours elicited were an important determinant of time spent with an object. This provides convergent evidence that both paired choice and simultaneous comparisons provide valid measures of preference for EE objects in rats.
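As a hypothetical illustration of the ranking comparison this abstract describes (the numbers below are invented, not the study's data), one could rank objects by total interaction time in each paradigm and check agreement with a rank correlation:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical data: total seconds each of four EE objects was used,
# aggregated over animals, under the two presentation paradigms.
objects = ["A", "B", "C", "D"]
time_paired = np.array([310.0, 150.0, 520.0, 90.0])        # paired-choice totals
time_simultaneous = np.array([280.0, 170.0, 495.0, 60.0])  # all four objects at once

# Rank 1 = most-used object in each paradigm.
rank_paired = (-time_paired).argsort().argsort() + 1
rank_simultaneous = (-time_simultaneous).argsort().argsort() + 1

# Agreement between the two rankings (Spearman rank correlation).
rho, p = spearmanr(rank_paired, rank_simultaneous)
print(dict(zip(objects, zip(rank_paired, rank_simultaneous))), f"rho={rho:.2f}")
```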
Abstract:
Choices not only reflect our preferences, but they also affect our behavior. The phenomenon of choice-induced preference change has been of interest to cognitive dissonance researchers in social psychology, and more recently, it has attracted the attention of researchers in economics and neuroscience. Preference modulation after the mere act of making a choice has been repeatedly demonstrated over the last 50 years by an experimental paradigm called the “free-choice paradigm.” However, Chen and Risen (2010) pointed out a serious methodological flaw in this paradigm, arguing that evidence for choice-induced preference change is still insufficient. Despite the flaw, studies using the traditional free-choice paradigm continue to be published without addressing the criticism. Here, aiming to draw more attention to this issue, we briefly explain the methodological problem, and then describe simple simulation studies that illustrate how the free-choice paradigm produces a systematic pattern of preference change consistent with cognitive dissonance, even without any change in true preference. Our simulations also show how a different level of noise in each phase of the free-choice paradigm independently contributes to the magnitude of artificial preference change. Furthermore, we review ways of addressing the critique and provide a meta-analysis to show the effect size of choice-induced preference change after addressing the critique. Finally, we review and discuss, based on the results of the simulation studies, how the criticism affects our interpretation of past findings generated from the free-choice paradigm. We conclude that the use of the conventional free-choice paradigm should be avoided in future research and that the validity of past findings from studies using this paradigm should be empirically re-established.
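Since the abstract's argument hinges on this statistical artifact, here is a minimal simulation sketch of it (our own illustration, not the authors' code; all parameter values are arbitrary). True preferences never change between phases, yet items chosen between equally-rated alternatives show a positive "spread" at re-rating, purely because noisy ratings regress toward the stable true preference:

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_items = 200, 80
noise_sd = 1.0  # rating noise in each phase; true preferences are held fixed

spreads = []
for _ in range(n_subjects):
    true_pref = rng.normal(0, 1, n_items)                    # stable true preferences
    rating1 = true_pref + rng.normal(0, noise_sd, n_items)   # phase 1: rate all items
    rating2 = true_pref + rng.normal(0, noise_sd, n_items)   # phase 3: re-rate all items

    # Phase 2: choose within pairs that received (nearly) equal phase-1 ratings;
    # the choice itself is driven by the underlying true preference.
    order = np.argsort(rating1)
    for i, j in zip(order[0::2], order[1::2]):               # adjacent ratings -> "hard" pairs
        chosen, rejected = (i, j) if true_pref[i] > true_pref[j] else (j, i)
        # "Spread": rating change of the chosen item minus that of the rejected item.
        spreads.append((rating2[chosen] - rating1[chosen]) -
                       (rating2[rejected] - rating1[rejected]))

print(f"mean spread = {np.mean(spreads):.3f}")  # reliably > 0 despite no true change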
Abstract:
The type and quantity of fertilizer supplied to a crop will differ between organic and conventional farming practices. Altering the type of fertilizer a plant is provided with can influence a plant’s foliar nitrogen levels, as well as the composition and concentration of defence compounds, such as glucosinolates. Many natural enemies of insect herbivores can respond to headspace volatiles emitted by the herbivores’ host plant in response to herbivory. We propose that manipulating fertilizer type may also influence the headspace volatile profiles of plants, and as a result, the tritrophic interactions that occur between plants, their insect pests and those pests’ natural enemies. Here, we investigate a tritrophic system consisting of cabbage plants, Brassica oleracea, a parasitoid, Diaeretiella rapae, and one of its hosts, the specialist cabbage aphid Brevicoryne brassicae. Brassica oleracea plants were provided with either no additional fertilization or one of three types of fertilizer: Nitram (ammonium nitrate), John Innes base or organic chicken manure. We investigated whether these changes would alter the rate of parasitism of aphids on those plants and whether any differences in parasitism could be explained by differences in the attractiveness of the plants to D. rapae or the rate at which D. rapae attacked aphids. In free-choice experiments, there were significant differences in the percentage of B. brassicae parasitized by D. rapae between B. oleracea plants grown in different fertilizer treatments. In a series of dual-choice Y-tube olfactometry experiments, D. rapae females discriminated between B. brassicae-infested and undamaged plants, but parasitoids did not discriminate between similarly infested plants grown in different fertilizer treatments. Correspondingly, in attack rate experiments, there were no differences in the rate at which D. rapae attacked B. brassicae on B. oleracea plants grown in different fertilizer treatments. These findings are of direct relevance to sustainable and conventional farming practices.
Abstract:
In Britain, managed grass lawns are the most traditional and widespread of the garden and landscape practices in use today. Grass lawns are coming under increasing challenge, as they tend to support a low level of biodiversity and can require substantial additional inputs to maintain. Here we apply a novel approach to the traditional monocultural lawnscape by replacing grasses entirely with clonal perennial forbs. We monitored changes in plant coverage and species composition over a two-year period, and here we report the results of a study comparing plant origin (native, non-native and mixed) and mowing regime. This allows us to assess the viability of this construct as an alternative to traditional grass lawns. Grass-free lawns provided a similar level of plant cover to grass lawns. Both the mowing regime and the combination of species used affected this outcome, with native plant species showing the highest survival rates, and mowing at 4 cm producing the greatest amount of ground coverage and plant species diversity within grass-free lawns. Grass-free lawns required over 50% less mowing than a traditionally managed grass lawn. Observations suggest that plant forms that exhibited a) a relatively fast growth rate, b) a relatively large individual leaf area, and c) an average leaf height substantially above the cut to be applied were unsuitable for use in grass-free lawns. With an equivalent level of ground coverage to grass lawns, increased plant diversity and a reduced need for mowing, the grass-free lawn can be seen as a species-diverse, lower-input and potentially highly ornamental alternative to the traditional lawn format.
Abstract:
The gradualist approach to trade liberalization views the uniform tariffs implied by MFN status as an important step on the path to free trade. We investigate whether a regime of uniform tariffs is preferable to discriminatory tariffs when countries engage in non-cooperative interaction in multilateral trade. The analysis includes product differentiation and asymmetric costs. We show that, with cost asymmetry, the countries will disagree on the choice of tariff regime. When the choice of import tariffs and export subsidies is made sequentially, the uniform tariff regime may not be sustainable, because of an incentive to deviate to a discriminatory regime. Hence, an international body is needed to ensure compliance with tariff agreements.
Abstract:
Free radicals from the one-electron oxidation of the antimalarial drug pyronaridine have been studied by pulse radiolysis. The results show that pyronaridine is readily oxidised to an intermediate semi-iminoquinone radical by inorganic and organic free radicals, including those derived from tryptophan and acetaminophen. The pyronaridine radical is rapidly reduced by both ascorbate and caffeic acid. The results indicate that the one-electron reduction potential of the pyronaridine radical at neutral pH lies between those of acetaminophen (707 mV) and caffeic acid (534 mV). The pyronaridine radical decays to produce the iminoquinone, detected by electrospray mass spectrometry, in a second-order process that density functional theory (DFT) calculations (UB3LYP/6-31+G*) suggest is a disproportionation reaction. Important calculated dimensions of pyronaridine, its phenoxyl and aminyl radicals, and the iminoquinone are presented.
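To make the kinetics concrete (a standard textbook sketch, not equations from the paper; PND is our shorthand for pyronaridine and E denotes the one-electron reduction potential of each radical couple), a disproportionation with second-order decay can be written as:

```latex
% Disproportionation: two radicals yield the parent drug plus the iminoquinone
2\,\mathrm{PND}^{\bullet} \;\longrightarrow\; \mathrm{PND} + \text{iminoquinone}

% Second-order decay of the radical and its integrated rate law
-\frac{\mathrm{d}[\mathrm{PND}^{\bullet}]}{\mathrm{d}t} = 2k\,[\mathrm{PND}^{\bullet}]^{2}
\quad\Longrightarrow\quad
\frac{1}{[\mathrm{PND}^{\bullet}]_{t}} = \frac{1}{[\mathrm{PND}^{\bullet}]_{0}} + 2kt

% Bracketing of the reduction potential at pH 7 (values quoted in the abstract)
E(\text{caffeic acid}) = 534~\mathrm{mV}
\;<\; E(\mathrm{PND}^{\bullet}/\mathrm{PND})
\;<\; E(\text{acetaminophen}) = 707~\mathrm{mV}
```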
Abstract:
We introduce the perspex machine which unifies projective geometry and Turing computation and results in a supra-Turing machine. We show two ways in which the perspex machine unifies symbolic and non-symbolic AI. Firstly, we describe concrete geometrical models that map perspexes onto neural networks, some of which perform only symbolic operations. Secondly, we describe an abstract continuum of perspex logics that includes both symbolic logics and a new class of continuous logics. We argue that an axiom in symbolic logic can be the conclusion of a perspex theorem. That is, the atoms of symbolic logic can be the conclusions of sub-atomic theorems. We argue that perspex space can be mapped onto the spacetime of the universe we inhabit. This allows us to discuss how a robot might be conscious, feel, and have free will in a deterministic, or semi-deterministic, universe. We ground the reality of our universe in existence. On a theistic point, we argue that preordination and free will are compatible. On a theological point, we argue that it is not heretical for us to give robots free will. Finally, we give a pragmatic warning as to the double-edged risks of creating robots that do, or alternatively do not, have free will.
Abstract:
This paper examines the normal force between two opposing polyelectrolyte brushes and the interpenetration of their chains that is responsible for sliding friction. It focuses on the special case of semi-dilute brushes in a salt-free theta solvent, for which Zhulina and Borisov [J. Chem. Phys., 107, 5952 (1997)] have derived analytical predictions using the classical strong-stretching theory (SST) introduced by Semenov and developed by Milner, Witten and Cates. Interestingly, the SST predicts that the brushes contract, maintaining a polymer-free gap as they are compressed together, which provides an explanation for the ultra-low frictional forces observed in experiment. We examine the degree to which the SST predictions are affected by chain fluctuations by employing self-consistent field theory (SCFT). While the normal force is relatively unaffected, fluctuations are found to have a strong impact on brush interpenetration. Even so, the contraction of the brushes does significantly delay the onset of interpenetration, implying that a sizeable normal force can be achieved before the sliding friction becomes significant.
Abstract:
It is well known that gut bacteria contribute significantly to host homeostasis, providing a range of benefits such as immune protection and vitamin synthesis. They also supply the host with a considerable amount of nutrients, making this ecosystem an essential metabolic organ. In the context of increasing evidence of the link between the gut flora and the metabolic syndrome, understanding the metabolic interaction between the host and its gut microbiota is becoming an important challenge of modern biology [1-4]. Colonization (also referred to as the normalization process) designates the establishment of micro-organisms in a former germ-free animal. While it is a natural process occurring at birth, it is also used in adult germ-free animals to control the gut floral ecosystem and further determine its impact on the host metabolism. A common procedure to control the colonization process is to use the gavage method with a single micro-organism or a mixture of micro-organisms. This method results in a very quick colonization but has the disadvantage of being extremely stressful [5]. It is therefore useful to minimize the stress and to obtain a slower colonization process, so as to observe gradually the impact of bacterial establishment on the host metabolism. In this manuscript, we describe a procedure to assess the modification of hepatic metabolism during a gradual colonization process using a non-destructive metabolic profiling technique. We propose to monitor gut microbial colonization by assessing the gut microbial metabolic activity reflected in the urinary excretion of microbial co-metabolites, measured by 1H NMR-based metabolic profiling. This allows an appreciation of the stability of gut microbial activity beyond the stable establishment of the gut microbial ecosystem usually assessed by monitoring fecal bacteria by DGGE (denaturing gradient gel electrophoresis) [6]. The colonization takes place in a conventional open environment and is initiated by dirty litter soiled by conventional animals, which serve as controls. Since rodents are coprophagous, this ensures a homogeneous colonization, as previously described [7]. Hepatic metabolic profiling is measured directly from an intact liver biopsy using 1H High Resolution Magic Angle Spinning NMR spectroscopy. This semi-quantitative technique offers a quick way to assess, without damaging the cell structure, major metabolites such as triglycerides, glucose and glycogen, in order to further estimate the complex interaction between the colonization process and the hepatic metabolism [7-10]. This method can also be applied to any tissue biopsy [11,12].
Abstract:
The planning of semi-autonomous vehicles in traffic scenarios is a relatively new problem that contributes towards the goal of making road travel by vehicles free of human drivers. An algorithm needs to ensure optimal real-time planning of multiple vehicles (moving in either direction along a road) in the presence of a complex obstacle network. Unlike other approaches, here we assume that speed lanes are not present and that different lanes do not need to be maintained for inbound and outbound traffic. Our basic hypothesis is to carry forward the planning task to ensure that a sufficient distance is maintained by each vehicle from all other vehicles, obstacles and road boundaries. We present here a 4-layer planning algorithm that consists of road selection (for selecting the individual roads of traversal to reach the goal), pathway selection (a strategy to avoid and/or overtake obstacles, road diversions and other blockages), pathway distribution (to select the position of a vehicle at every instant of time in a pathway), and trajectory generation (for generating a curve smooth enough to allow the maximum possible speed). Cooperation between vehicles is handled separately at the different levels, the aim being to maximize the separation between vehicles. Simulated results exhibit smooth, efficient and safe driving of vehicles in multiple scenarios, along with typical vehicle behaviours including following and overtaking.
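A toy sketch of the 4-layer hierarchy named in the abstract (road selection, pathway selection, pathway distribution, trajectory generation). All data structures and heuristics below are our own illustration of the layering, not the paper's implementation:

```python
import heapq
from typing import Dict, List

def road_selection(roads: Dict[str, Dict[str, float]], start: str, goal: str) -> List[str]:
    """Layer 1: choose the sequence of roads leading to the goal (Dijkstra)."""
    frontier, seen = [(0.0, start, [start])], set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in roads.get(node, {}).items():
            heapq.heappush(frontier, (cost + w, nxt, path + [nxt]))
    return []

def pathway_selection(obstacle_offsets: List[float], road_width: float) -> List[float]:
    """Layer 2: pick a side of each obstacle, keeping clear of the road boundary."""
    margin = road_width / 4
    return [o - margin if o > 0 else o + margin for o in obstacle_offsets]

def pathway_distribution(road_length: float, speed: float, dt: float) -> List[float]:
    """Layer 3: longitudinal position of the vehicle at every instant of time."""
    steps = int(road_length / (speed * dt))
    return [i * speed * dt for i in range(steps + 1)]

def trajectory_generation(lateral: List[float]) -> List[float]:
    """Layer 4: smooth the lateral profile so the curve permits higher speed."""
    padded = [lateral[0]] + lateral + [lateral[-1]]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(padded) - 1)]

roads = {"A": {"B": 2.0, "C": 5.0}, "B": {"C": 1.0}, "C": {}}
print(road_selection(roads, "A", "C"))                 # ['A', 'B', 'C']
print(trajectory_generation([0.0, 1.5, -1.0, 0.0]))    # smoothed avoidance curve
```

In this layered design, cooperation (as the abstract notes) would be handled separately at each level, e.g. by penalising positions in layer 3 that reduce inter-vehicle separation.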
Abstract:
We propose first a simple task for eliciting attitudes toward risky choice, the SGG lottery-panel task, which consists of a series of lotteries constructed to compensate riskier options with higher risk-return trade-offs. Using the Principal Component Analysis technique, we show that the SGG lottery-panel task is capable of capturing two dimensions of individual risky decision making, i.e. subjects’ average risk taking and their sensitivity towards variations in risk-return. From the results of a large experimental dataset, we confirm that the task systematically captures a number of regularities, such as: a tendency towards risk-averse behavior (only around 10% of choices are compatible with risk neutrality); an attraction to certain payoffs compared to low-risk lotteries, compatible with the over-(under-)weighting of small (large) probabilities predicted in PT; and gender differences, i.e. males being consistently less risk averse than females but both genders being similarly responsive to increases in the risk premium. Another interesting result is that in hypothetical choices most individuals increase their risk taking in response to an increase in the return to risk, as predicted by PT, while across panels with real rewards we see even more changes, but opposite to the expected pattern of riskier choices for higher risk-returns. Therefore, we conclude from our data that an “economic anomaly” emerges in the real-reward choices, opposite to the hypothetical choices. These findings are in line with Camerer's (1995) view that although in many domains paid subjects probably do exert extra mental effort which improves their performance, choice over money gambles is not likely to be a domain in which effort will improve adherence to rational axioms (p. 635). Finally, we demonstrate that both dimensions of risk attitudes, average risk taking and sensitivity towards variations in the return to risk, are desirable not only to describe behavior under risk but also to explain behavior in other contexts, as illustrated by an example. In the second study, we propose three additional treatments intended to elicit risk attitudes under high-stakes and mixed-outcome (gains and losses) lotteries. Using a dataset obtained from a hypothetical implementation of the tasks, we show that the new treatments are able to capture both dimensions of risk attitudes. This new dataset allows us to describe several regularities, both at the aggregate and the within-subjects level. We find that in every treatment over 70% of choices show some degree of risk aversion and only between 0.6% and 15.3% of individuals are consistently risk neutral within the same treatment. We also confirm the existence of gender differences in the degree of risk taking, that is, in all treatments females prefer safer lotteries compared to males. Regarding our second dimension of risk attitudes, we observe, in all treatments, an increase in risk taking in response to risk premium increases. Treatment comparisons reveal other regularities, such as a lower degree of risk taking in large-stake treatments compared to low-stake treatments and a lower degree of risk taking when losses are incorporated into the large-stake lotteries. These results are compatible with previous findings in the literature on stake-size effects (e.g., Binswanger, 1980; Antoni Bosch-Domènech & Silvestre, 1999; Hogarth & Einhorn, 1990; Holt & Laury, 2002; Kachelmeier & Shehata, 1992; Kühberger et al., 1999; B. J. Weber & Chapman, 2005; Wik et al., 2007) and domain effects (e.g., Brooks and Zank, 2005; Schoemaker, 1990; Wik et al., 2007). For small-stake treatments, by contrast, we find that the effect of incorporating losses into the outcomes is not so clear: at the aggregate level an increase in risk taking is observed, but also more dispersion in the choices, whilst at the within-subjects level the effect weakens. Finally, regarding responses to the risk premium, we find that compared to gains-only treatments sensitivity is lower in the mixed-lottery treatments (SL and LL). In general, sensitivity to risk-return is more affected by the domain than by the stake size. After having described the properties of risk attitudes as captured by the SGG risk elicitation task and its three new versions, it is important to recall that the danger of using unidimensional descriptions of risk attitudes goes beyond the incompatibility with modern economic theories like PT, CPT, etc., all of which call for tests with multiple degrees of freedom. Being faithful to this recommendation, the contribution of this essay is an empirically and endogenously determined bi-dimensional specification of risk attitudes, useful for describing behavior under uncertainty and for explaining behavior in other contexts. Hopefully, this will contribute to the creation of large datasets containing a multidimensional description of individual risk attitudes, while at the same time allowing for a robust context, compatible with present and even future more complex descriptions of human attitudes towards risk.
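As a hedged illustration of the PCA step described above (simulated data and invented parameter values, not the authors' dataset or code): two latent traits, average risk taking and sensitivity to the risk premium, generate choices across lottery panels, and the first two principal components recover them.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)

# Simulate choices: each subject picks one of 6 lotteries (0 = safest) in each
# of 4 panels; panels differ in their risk premium. Both latent traits are invented.
n_subjects, n_panels = 300, 4
avg_risk = rng.normal(2.5, 1.0, n_subjects)       # latent average risk taking
sensitivity = rng.normal(0.8, 0.3, n_subjects)    # latent response to the risk premium
premium = np.array([0.0, 0.5, 1.0, 1.5])          # risk-return of each panel

choices = np.clip(
    avg_risk[:, None] + sensitivity[:, None] * premium
    + rng.normal(0, 0.5, (n_subjects, n_panels)),
    0, 5).round()

pca = PCA(n_components=2)
scores = pca.fit_transform(choices)
print("explained variance ratios:", pca.explained_variance_ratio_.round(2))
# PC1 loads roughly evenly on all panels (average risk taking); PC2 contrasts
# low- vs high-premium panels (sensitivity to variations in risk-return).
print("loadings:\n", pca.components_.round(2))
```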
Abstract:
Due to their broad differentiation potential and their persistence into adulthood, human neural crest-derived stem cells (NCSCs) harbour great potential for autologous cellular therapies, which include the treatment of neurodegenerative diseases and the replacement of complex tissues containing various cell types, as in the case of musculoskeletal injuries. The use of serum-free approaches often results in insufficient proliferation of stem cells, and foetal calf serum entails the use of xenogeneic medium components. Thus, there is much need for alternative cultivation strategies. In this study we describe for the first time a novel, human blood plasma-based semi-solid medium for the cultivation of human NCSCs. We cultivated human neural crest-derived inferior turbinate stem cells (ITSCs) within a blood plasma matrix, where they revealed higher proliferation rates compared to a standard serum-free approach. Three-dimensionality of the matrix was investigated using helium ion microscopy. ITSCs grew within the matrix, as revealed by laser scanning microscopy. Genetic stability and maintenance of stemness characteristics were assured in 3D-cultivated ITSCs, as demonstrated by an unchanged expression profile and the capability for self-renewal. ITSCs pre-cultivated in the 3D matrix differentiated efficiently into ectodermal and mesodermal cell types, particularly including osteogenic cell types. Furthermore, ITSCs cultivated as described here could be easily infected with lentiviruses directly in the substrate for potential tracing or gene-therapeutic approaches. Taken together, the use of human blood plasma as an additive for a completely defined medium points towards a personalisable and autologous cultivation of human neural crest-derived stem cells under clinical-grade conditions.