64 results for contingent fees
Abstract:
From 1948 to 1994, the agricultural sector was afforded special treatment in the GATT. We analyse the extent to which this agricultural exceptionalism was curbed as a result of the GATT Uruguay Round Agreement on Agriculture, discuss why it was curbed and finally explore the implications of this for EU policy making. We argue that, in particular, two major changes in GATT institutions brought about restrictions on agricultural exceptionalism. First, the Uruguay Round was a 'single undertaking' in which progress on other dossiers was contingent upon an outcome on agriculture. The EU had keenly supported this new decision rule in the GATT. Within the EU this led to the MacSharry reforms of the Common Agricultural Policy (CAP) in 1992, paving the way for a trade agreement on agriculture within the GATT. Second, under the new quasi-judicial dispute settlement procedure, countries are expected to bring their policies into conformity with WTO rules or face retaliatory trade sanctions. This has brought about a greater willingness on the part of the EU to submit its farm policy to WTO disciplines.
Abstract:
Despite decades of research, it remains controversial whether ecological communities converge towards a common structure determined by environmental conditions irrespective of assembly history. Here, we show experimentally that the answer depends on the level of community organization considered. In a 9-year grassland experiment, we manipulated initial plant composition on abandoned arable land and subsequently allowed natural colonization. Initial compositional variation caused plant communities to remain divergent in species identities, even though these same communities converged strongly in species traits. This contrast between species divergence and trait convergence could not be explained by dispersal limitation or community neutrality alone. Our results show that the simultaneous operation of trait-based assembly rules and species-level priority effects drives community assembly, making it both deterministic and historically contingent, but at different levels of community organization.
Abstract:
Given the growing impact of human activities on the sea, managers are increasingly turning to marine protected areas (MPAs) to protect marine habitats and species. Many MPAs have been unsuccessful, however, and lack of income has been identified as a primary reason for failure. In this study, data from a global survey of 79 MPAs in 36 countries were analysed and attempts made to construct predictive models to determine the income requirements of any given MPA. Statistical tests were used to uncover possible patterns and relationships in the data, with two basic approaches. In the first of these, an attempt was made to build an explanatory "bottom-up" model of the cost structures that might be required to pursue various management activities. This proved difficult in practice owing to the very broad range of applicable data, spanning many orders of magnitude. In the second approach, a "top-down" regression model was constructed using logarithms of the base data, in order to address the breadth of the data ranges. This approach suggested that MPA size and visitor numbers together explained 46% of the minimum income requirements (P < 0.001), with area being the slightly more influential factor. The significance of area to income requirements was of little surprise, given its profile in the literature. However, the relationship between visitors and income requirements might go some way to explaining why northern hemisphere MPAs with apparently high incomes still claim to be under-funded. The relationship between running costs and visitor numbers has important implications not only in determining a realistic level of funding for MPAs, but also in assessing from where funding might be obtained. Since a substantial proportion of the income of many MPAs appears to be utilized for amenity purposes, a case may be made for funds to be provided from the typically better resourced government social and educational budgets as well as environmental budgets. Similarly, visitor fees, already an important source of funding for some MPAs, might have a broader role to play in how MPAs are financed in the future.
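To illustrate the "top-down" approach described in this abstract, the sketch below fits a log-log regression of minimum income requirement on MPA area and visitor numbers. It is a minimal illustration only: the column names and data are hypothetical and do not come from the study's survey.

```python
# Illustrative sketch (not the authors' code): a "top-down" log-log regression
# of minimum income requirement on MPA area and visitor numbers.
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical survey extract: area in km^2, visitors per year, income in USD/year
mpa = pd.DataFrame({
    "area_km2":        [12, 450, 3.5, 1200, 75, 30, 900, 8],
    "visitors_per_yr": [2000, 150000, 500, 60000, 12000, 4000, 90000, 800],
    "min_income_usd":  [15000, 900000, 4000, 1500000, 120000, 30000, 1100000, 7000],
})

# Log-transform so that data spanning several orders of magnitude can be modelled linearly
X = sm.add_constant(np.log(mpa[["area_km2", "visitors_per_yr"]]))
y = np.log(mpa["min_income_usd"])

fit = sm.OLS(y, X).fit()
print(fit.params)     # elasticities of income requirement w.r.t. area and visitors
print(fit.rsquared)   # the paper reports R^2 of about 0.46 on its own data
```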
Abstract:
Pesticide residue in vegetables is a major food safety issue in Thailand. A range of vegetable products (organic/pesticide-free/hydroponic) has emerged in Thai markets that guarantee compliance with maximum residue limits. The Government of Thailand is eager to extend the benefits of this suite of alternative vegetables to the entire population, particularly the semi-urban/rural segments that are often bypassed by such speciality products. However, little information is available to guide such an effort, particularly with regard to up-country consumer attitudes, shopping and consumption habits and willingness to pay premiums for such produce. This research aims to fill this gap in knowledge. It reports the results of a survey of vegetable consumption and shopping habits and attitudes of 608 consumers in northeast Thailand. Willingness to pay premiums for pesticide residue limit compliant vegetables is also assessed by using a contingent valuation method, and determinants of willingness to pay are examined using an ordered probit empirical model. Results indicate that, given adequate awareness of relative risks, even up-country consumers are willing to pay market premium levels for these products, and that inadequate availability, rather than lack of demand, is the constraining factor. Willingness to pay is found to increase with income, age and supermarket sourcing of vegetables. We also discuss the challenge of improving availability at mainstream outlets.
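The following is a minimal sketch of the kind of ordered probit that can be used to examine WTP determinants, using statsmodels. The variable names, categories and respondents are invented for illustration; this is not the authors' dataset or specification.

```python
# Sketch of an ordered probit for willingness-to-pay categories (hypothetical data)
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

# Hypothetical respondents: WTP premium band (0 = none, 1 = low, 2 = high)
df = pd.DataFrame({
    "wtp_band":    [0, 1, 2, 1, 2, 0, 2, 1, 0, 2],
    "income_thb":  [8000, 12000, 25000, 15000, 30000, 9000, 28000, 14000, 7500, 26000],
    "age":         [25, 34, 51, 40, 47, 29, 55, 38, 23, 49],
    "supermarket": [0, 0, 1, 1, 1, 0, 1, 0, 0, 1],   # 1 = usually buys at a supermarket
})

model = OrderedModel(df["wtp_band"],
                     df[["income_thb", "age", "supermarket"]],
                     distr="probit")
res = model.fit(method="bfgs", disp=False)
print(res.params)   # positive coefficients would mirror the reported income/age/supermarket effects
```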
Abstract:
An important element of the developing field of proteomics is to understand protein-protein interactions and other functional links amongst genes. Across-species correlation methods for detecting functional links work on the premise that functionally linked proteins will tend to show a common pattern of presence and absence across a range of genomes. We describe a maximum likelihood statistical model for predicting functional gene linkages. The method detects independent instances of the correlated gain or loss of pairs of proteins on phylogenetic trees, reducing the high rates of false positives observed in conventional across-species methods that do not explicitly incorporate a phylogeny. We show, in a dataset of 10,551 protein pairs, that the phylogenetic method improves by up to 35% on across-species analyses at identifying known functionally linked proteins. The method shows that protein pairs with at least two to three correlated events of gain or loss are almost certainly functionally linked. Contingent evolution, in which one gene's presence or absence depends upon the presence of another, can also be detected phylogenetically, and may identify genes whose functional significance depends upon its interaction with other genes. Incorporating phylogenetic information improves the prediction of functional linkages. The improvement derives from having a lower rate of false positives and from detecting trends that across-species analyses miss. Phylogenetic methods can easily be incorporated into the screening of large-scale bioinformatics datasets to identify sets of protein links and to characterise gene networks.
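For contrast with the phylogenetic model, the sketch below scores a single protein pair with the naive across-species association test that the authors improve upon: it tabulates co-presence and co-absence across genomes and ignores the phylogeny entirely. The presence/absence profiles are made up for illustration.

```python
# Naive across-species baseline: association of presence/absence profiles,
# treating genomes as independent observations (hypothetical data).
from scipy.stats import fisher_exact

# Presence (1) / absence (0) of two proteins across ten genomes
protein_a = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]
protein_b = [1, 1, 0, 1, 0, 1, 0, 0, 0, 1]

both    = sum(1 for a, b in zip(protein_a, protein_b) if a and b)
only_a  = sum(1 for a, b in zip(protein_a, protein_b) if a and not b)
only_b  = sum(1 for a, b in zip(protein_a, protein_b) if b and not a)
neither = sum(1 for a, b in zip(protein_a, protein_b) if not a and not b)

odds_ratio, p_value = fisher_exact([[both, only_a], [only_b, neither]])
print(p_value)  # shared ancestry inflates apparent association, hence the high
                # false-positive rate the phylogenetic maximum-likelihood model corrects
```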
Abstract:
This paper explores how the concept of 'social capital' relates to the teaching of speaking and listening. The argument draws on Bourdieu's notion that a common language is an illusion but posits that an understanding of the grammar of speech can be productive in the development of both an understanding of what constitutes effective speech and the development of competence in speaking. It is argued that applying structuralist notions of written grammar is an inadequate approach to understanding speech acts or enhancing the creative use of speech. An analysis is made of how typical features of speech relate to dramatic dialogue and how the meaning of what is said is contingent upon aural and visual signifiers. On this basis a competent speaker is seen as being one who produces expressions appropriate for a range of situations by intentionally employing such signifiers. The paper draws on research into the way drama teachers make explicit reference to and use of semiotics and dramatic effectiveness in order to improve students' performance and by so doing empower them to increase their social capital. Ultimately, it is concluded that helping students identify, analyse and employ the aural, visual and verbal grammar of spoken English is not an adjunct to the subject of drama, but an intrinsic part of understanding the art form. What is called for is a re-appraisal by drama teachers of their own understanding of concepts relating to speech acts in order to enhance this area of their work.
Abstract:
Objective: To compare the frequency of nail biting in 4 settings (interventions) designed to elicit the functions of nail biting and to compare the results with a self-report questionnaire about the functions of nail biting. Design: Randomised allocation of participants to order of conditions. Setting: University Psychology Department. Subjects: Forty undergraduates who reported biting their nails. Interventions: Left alone (boredom), solving maths problems (frustration), reprimanded for nail biting (contingent attention), continuous conversation (noncontingent attention). Main outcome measures: Number of times the undergraduates bit their nails. Results: Nail biting occurred most often in two conditions, boredom and frustration. Conclusion: Nail biting in young adults occurs as a result of boredom or working on difficult problems, which may reflect a particular emotional state. It occurs least often when people are engaged in social interaction or when they are reprimanded for the behavior.
Abstract:
A representative community sample of primiparous depressed women and a nondepressed control group were assessed while in interaction with their infants at 2 months postpartum. At 3 months, infants were assessed on the Still-face perturbation of face-to-face interaction, and a subsample completed an Instrumental Learning paradigm. Compared to nondepressed women, depressed mothers' interactions were both less contingent and less affectively attuned to infant behavior. Postnatal depression did not adversely affect the infant's performance in either the Still-face perturbation or the Instrumental Learning assessment. Maternal responsiveness in interactions at 2 months predicted the infant's performance in the Instrumental Learning assessment but not in the Still-face perturbation. The implications of these findings for theories of infant cognitive and emotional development are discussed.
Abstract:
The potential of clarification questions (CQs) to act as a form of corrective input for young children's grammatical errors was examined. Corrective responses were operationalized as those occasions when child speech shifted from erroneous to correct (E → C) contingent on a clarification question. It was predicted that E → C sequences would prevail over shifts in the opposite direction (C → E), as can occur in the case of non-error-contingent CQs. This prediction was tested via a standard intervention paradigm, whereby every 60 s a sequence of two clarification requests (either specific or general) was introduced into conversation with a total of 45 2- and 4-year-old children. For 10 categories of grammatical structure, E → C sequences predominated over their C → E counterparts, with levels of E → C shifts increasing after two clarification questions. Children were also more reluctant to repeat erroneous forms than their correct counterparts, following the intervention of CQs. The findings provide support for Saxton's prompt hypothesis, which predicts that error-contingent CQs bear the potential to cue recall of previously acquired grammatical forms.
Abstract:
The associative sequence learning model proposes that the development of the mirror system depends on the same mechanisms of associative learning that mediate Pavlovian and instrumental conditioning. To test this model, two experiments used the reduction of automatic imitation through incompatible sensorimotor training to assess whether mirror system plasticity is sensitive to contingency (i.e., the extent to which activation of one representation predicts activation of another). In Experiment 1, residual automatic imitation was measured following incompatible training in which the action stimulus was a perfect predictor of the response (contingent) or not at all predictive of the response (noncontingent). A contingency effect was observed: There was less automatic imitation, indicative of more learning, in the contingent group. Experiment 2 replicated this contingency effect and showed that, as predicted by associative learning theory, it can be abolished by signaling trials in which the response occurs in the absence of an action stimulus. These findings support the view that mirror system development depends on associative learning and indicate that this learning is not purely Hebbian. If this is correct, associative learning theory could be used to explain, predict, and intervene in mirror system development.
Abstract:
The academic discipline of television studies has been constituted by the claim that television is worth studying because it is popular. Yet this claim has also entailed a need to defend the subject against the triviality that is associated with the television medium because of its very popularity. This article analyses the many attempts in the later twentieth and twenty-first centuries to constitute critical discourses about television as a popular medium. It focuses on how the theoretical currents of Television Studies emerged and changed in the UK, where a disciplinary identity for the subject was founded by borrowing from related disciplines, yet argued for the specificity of the medium as an object of criticism. Eschewing technological determinism, moral pathologization and sterile debates about television's supposed effects, UK writers such as Raymond Williams addressed television as an aspect of culture. Television theory in Britain has been part of, and also separate from, the disciplinary fields of media theory, literary theory and film theory. It has focused its attention on institutions, audio-visual texts, genres, authors and viewers according to the ways that research problems and theoretical inadequacies have emerged over time. But a consistent feature has been the problem of moving from a descriptive discourse to an analytical and evaluative one, and from studies of specific texts, moments and locations of television to larger theories. By discussing some historically significant critical work about television, the article considers how academic work has constructed relationships between the different kinds of objects of study. The article argues that a fundamental tension between descriptive and politically activist discourses has confused academic writing about ›the popular‹. Television study in Britain arose not to supply graduate professionals to the television industry, nor to perfect the instrumental techniques of allied sectors such as advertising and marketing, but to analyse and critique the medium's aesthetic forms and to evaluate its role in culture. Since television cannot be made by ›the people‹, the empowerment that discourses of television theory and analysis aimed for was focused on disseminating the tools for critique. Recent developments in factual entertainment television (in Britain and elsewhere) have greatly increased the visibility of ›the people‹ in programmes, notably in docusoaps, game shows and other participative formats. This has led to renewed debates about whether such ›popular‹ programmes appropriately represent ›the people‹ and how factual entertainment that is often despised relates to genres hitherto considered to be of high quality, such as scripted drama and socially-engaged documentary television. A further aspect of this problem of evaluation is how television globalisation has been addressed, and the example that the issue has crystallised around most is the reality TV contest Big Brother. Television theory has been largely based on studying the texts, institutions and audiences of television in the Anglophone world, and thus in specific geographical contexts. The transnational contexts of popular television have been addressed as spaces of contestation, for example between Americanisation and national or regional identities. Commentators have been ambivalent about whether the discipline's role is to celebrate or critique television, and whether to do so within a national, regional or global context. 
In the discourses of the television industry, ›popular television‹ is a quantitative and comparative measure, and because of the overlap between the programming with the largest audiences and the scheduling of established programme types at the times of day when the largest audiences are available, it has a strong relationship with genre. The measurement of audiences and the design of schedules are carried out in predominantly national contexts, but the article refers to programmes like Big Brother that have been broadcast transnationally, and programmes that have been extensively exported, to consider in what ways they too might be called popular. Strands of work in television studies have at different times attempted to diagnose what is at stake in the most popular programme types, such as reality TV, situation comedy and drama series. This has centred on questions of how aesthetic quality might be discriminated in television programmes, and how quality relates to popularity. The interaction of the designations ›popular‹ and ›quality‹ is exemplified in the ways that critical discourse has addressed US drama series that have been widely exported around the world, and the article shows how the two critical terms are both distinct and interrelated. In this context and in the article as a whole, the aim is not to arrive at a definitive meaning for ›the popular‹ inasmuch as it designates programmes or indeed the medium of television itself. Instead the aim is to show how, in historically and geographically contingent ways, these terms and ideas have been dynamically adopted and contested in order to address a multiple and changing object of analysis.
Abstract:
The UK Food Standards Agency convened a group of expert scientists to review current research investigating the optimal dietary intake for n-9 cis-monounsaturated fatty acids (MUFA). The aim was to review the mechanisms underlying the reported beneficial effects of MUFA on CHD risk, and to establish priorities for future research. The issue of optimal MUFA intake is contingent upon optimal total fat intake; however, there is no consensus of opinion on what the optimal total fat intake should be. Thus, it was recommended that a large multi-centre study should look at the effects on CHD risk of MUFA replacement of saturated fatty acids in relation to varying total fat intakes; this study should be of sufficient size to take account of genetic variation, sex, physical activity and stage of life factors, as well as being of sufficient duration to account for adaptation to diets. Recommendations for studies investigating the mechanistic effects of MUFA were also made. Methods of manipulating the food chain to increase MUFA at the expense of saturated fatty acids were also discussed.
Abstract:
Higher animal welfare standards increase costs along the supply chain of certified animal-friendly products (AFP). Since the market outcome of certified AFP depends on consumer confidence toward supply chain operators complying with these standards, the role of trust in consumer willingness-to-pay (WTP) for AFP is paramount. Results from a contingent valuation survey administered in five European Union countries show that WTP estimates were sensitive to robust measures of consumer trust for certified AFP. Deriving the WTP effect of a single food category on total food expenditure is difficult for survey respondents; hence, a budget approach was employed to facilitate this process.
Abstract:
There is concern that insect pollinators, such as honey bees, are currently declining in abundance, and are under serious threat from environmental changes such as habitat loss and climate change, the use of pesticides in intensive agriculture, and emerging diseases. This paper aims to evaluate how much public support there would be for preventing further decline and maintaining the current number of bee colonies in the UK. The contingent valuation method (CVM) was used to obtain the willingness to pay (WTP) for a theoretical pollinator protection policy. Respondents were asked whether they would be willing to pay to support such a policy and, if so, how much they would pay. Results show that the mean WTP to support the bee protection policy was £1.37/week/household. Based on there being 24.9 million households in the UK, this is equivalent to £1.77 billion per year. This total value can demonstrate to policy makers the importance of maintaining the overall pollination service. We compare this total with estimates obtained using a simple market valuation of pollination for the UK.
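The aggregate figure follows from a straightforward scaling of the household WTP; a back-of-envelope check using only the numbers quoted in the abstract:

```python
# Aggregating the mean household WTP to a national annual total (figures from the abstract)
wtp_per_week_per_household = 1.37      # GBP
households_uk = 24.9e6
weeks_per_year = 52

annual_total = wtp_per_week_per_household * weeks_per_year * households_uk
print(f"£{annual_total / 1e9:.2f} billion per year")   # ≈ £1.77 billion
```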
Abstract:
We use contingent valuation (CV) and choice experiment (CE) methods to assess cattle farmers’ attitudes to and willingness to pay (WTP) for a bovine tuberculosis (bTB) cattle vaccine, to help inform vaccine development and policy. A survey questionnaire was administered by means of telephone interviews to a stratified sample of 300 cattle farmers in annually bTB-tested areas in England and Wales. Farmers felt that bTB was a major risk for the cattle industry and that there was a high risk of their cattle getting the disease. The CE estimate produced a mean WTP of £35 per animal per single dose for a vaccine that is 90% effective at reducing the risk of a bTB breakdown, and an estimated £55 for such a vaccine backed by 100% insurance of loss if a breakdown should occur. The CV estimate produced a mean WTP of nearly £17 per dose per animal per year for a vaccine (including 100% insurance), which, given the average lifespan of cattle, is comparable to the CE estimate. These WTP estimates are substantially higher than the expected cost of a vaccine, which suggests that farmers in high-risk bTB ‘hotspot’ areas perceive a substantial net benefit from buying the vaccine.
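A rough consistency check of the CV and CE figures quoted above: multiplying the annual CV figure by an average productive lifespan gives a per-animal value of the same order as the CE estimate. The lifespan used below is an assumption for illustration only, not a figure reported in the study.

```python
# Comparing the annual CV estimate with the one-off CE estimate under an assumed lifespan
cv_wtp_per_animal_per_year = 17.0    # GBP, vaccine including 100% insurance (from the abstract)
assumed_avg_lifespan_years = 3.0     # hypothetical assumption, not from the study

lifetime_equivalent = cv_wtp_per_animal_per_year * assumed_avg_lifespan_years
print(f"≈ £{lifetime_equivalent:.0f} per animal")   # same order as the £55 CE estimate with insurance
```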