34 results for Explaining intention to play
in CentAUR: Central Archive, University of Reading - UK
Abstract:
The article investigates how purchasing intentions among a sample of Italian consumers are influenced by different levels of risk perception and by their trust in food-safety information provided by different sources, such as the food industry, government agencies, or consumers' associations. The assessment of the determinants of intention to purchase was carried out by estimating a causal model for the chicken case, in which attitudes, subjective norms, and perceived risk play a major role in determining buyers' behavior. In particular, the role of trust in influencing risk perception is highlighted, either as a general construct or as specific constructs targeting the food chain, policy actors, and the media. [EconLit citations: Q130, Q190, D120]. (C) 2008 Wiley Periodicals, Inc.
Abstract:
Previous research has shown that people's evaluations of explanations about medication and their intention to comply with the prescription are detrimentally affected by the inclusion of information about adverse side effects of the medication. The present study (Experiment 1) examined which particular aspects of information about side effects (their number, likelihood of occurrence, or severity) are likely to have the greatest effect on people's satisfaction, perception of risk, and intention to comply, as well as how the information about side effects interacts with information about the severity of the illness for which the medication was prescribed. Across all measures, it was found that manipulations of side effect severity had the greatest impact on people's judgements, followed by manipulations of side effect likelihood and then number. Experiments 2 and 3 examined how the severity of the diagnosed illness and information about negative side effects interact with two other factors suggested by Social Cognition models of health behaviour to affect people's intention to comply: namely, perceived benefit of taking the prescribed drug, and the perceived level of control over preventing or alleviating the side effects. It was found that providing people with a statement about the positive benefit of taking the medication had relatively little effect on judgements, whereas informing them about how to reduce the chances of experiencing the side effects had an overall beneficial effect on ratings.
Abstract:
This study investigates whether, and how, people's perception of risk and intended health behaviours are affected by whether a medicine is prescribed for themselves or for a young child. The question is relevant to the issue of whether it is beneficial to produce medicines information that is tailored to particular subgroups of the population, such as parents of young children. In the experiment, participants read scenarios which referred either to themselves or their (imagined) 1-year-old child, and were required to make a number of risk judgements. The results showed that both parents and non-parents were less satisfied, perceived side effects to be more severe and more likely to occur, risk to health to be higher, and said that they would be less likely to take (or give) the medicine when the recipient was the child. On the basis of the findings, it is suggested that it may well be beneficial to tailor materials to broader classes of patient type.
Abstract:
This essay traces the development of Otto Neurath’s ideas that led to the publication of one of the first series of children’s books produced by the Isotype Institute in the late 1940s, the Visual History of Mankind. Described in its publicity material as ‘new in content’ and ‘new in method’, it embodied much of Otto Neurath’s thinking about visual education, and also coincided with other educational ideas in the UK in the 1930s and 1940s. It exemplified the Isotype Institute’s approach: teamwork, thinking about the needs of younger readers, clear explanation, and accessible content. Further, drawing on correspondence, notes and drawings from the Otto and Marie Neurath Isotype Collection at the University of Reading, the essay presents insights into the making of the books and the people involved, the costs of production and the influence of this on design decisions, and how the books were received by teachers and children.
Abstract:
Research has been conducted on methodological issues concerning the Theory of Planned Behaviour (TPB), determining an appropriate measurement (direct and indirect) of constructs and selecting a plausible scaling technique (unipolar or bipolar) for the constructs important in explaining farm-level tree planting in Pakistan: attitude, subjective norm, perceived behavioural control and intention. Unipolar scoring of beliefs showed higher correlations among the TPB constructs than the bipolar scaling technique. Both direct and indirect methods yielded significant results in explaining intention to perform farm forestry, except for the belief-based measure of perceived behavioural control (PBC), which was statistically non-significant. A need to examine the scoring of PBC more carefully is expressed.
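The scaling issue above can be made concrete. A minimal sketch, with invented 1-7 responses and not taken from the study: the indirect TPB measure of a construct is a sum of belief-strength × outcome-evaluation products, and re-scoring the same raw answers as unipolar (1..7) or bipolar (-3..+3) changes the composite, which is one reason the two scaling techniques can yield different correlations.

```python
# Hedged sketch (invented data, illustrative only): indirect TPB composite
# under unipolar vs bipolar scoring of the same raw 1..7 responses.
def composite(beliefs, evaluations, bipolar=False):
    """Sum of belief*evaluation products under unipolar or bipolar scoring."""
    shift = 4 if bipolar else 0  # bipolar maps 1..7 onto -3..+3
    return sum((b - shift) * (e - shift) for b, e in zip(beliefs, evaluations))

beliefs = [6, 2, 7, 5]      # raw belief-strength responses (invented)
evaluations = [7, 3, 6, 4]  # raw outcome-evaluation responses (invented)
uni = composite(beliefs, evaluations)               # unipolar scoring -> 110
bi = composite(beliefs, evaluations, bipolar=True)  # bipolar scoring  -> 14
```

Note how a weakly held, negatively evaluated belief (2, 3) adds a positive amount under unipolar scoring but also a positive amount under bipolar scoring for a different reason (a double negative), so the two composites are not linear rescalings of each other.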
Abstract:
Background: Currently, all pharmacists and technicians registered with the Royal Pharmaceutical Society of Great Britain must complete a minimum of nine Continuing Professional Development (CPD) records (entries) each year. From September 2010 a new regulatory body, the General Pharmaceutical Council, will oversee the regulation (including revalidation) of all pharmacy registrants in Great Britain. CPD may provide part of the supporting evidence that a practitioner submits to the regulator as part of the revalidation process. Gaps in knowledge necessitated further research to examine the usefulness of CPD in pharmacy revalidation.
Project aims: The overall aims of this project were to summarise pharmacy professionals’ past involvement in CPD, examine the usability of current CPD entries for the purpose of revalidation, and examine the impact of ‘revalidation standards’ and a bespoke Outcomes Framework on the conduct and construction of CPD entries for the future revalidation of pharmacy professionals. We completed a comprehensive review of the literature; devised, validated and tested the impact of a new CPD Outcomes Framework and related training material in an empirical investigation involving volunteer pharmacy professionals; and spoke with our participants to bring meaning and understanding to the process of CPD conduct and recording and to gain feedback on the study itself.
Key findings: The comprehensive literature review identified perceived barriers to CPD and resulted in recommendations that could potentially rectify pharmacy professionals’ perceptions and facilitate participation in CPD. The CPD Outcomes Framework can be used to score CPD entries. Compared to a control (CPD and ‘revalidation standards’ only), we found that training participants to apply the CPD Outcomes Framework resulted in entries that scored significantly higher in the context of a quantitative method of CPD assessment.
Feedback from participants who had received the CPD Outcomes Framework was positive, and a number of useful suggestions were made about improvements to the Framework and related training. Entries scored higher because participants had consciously applied concepts linked to the CPD Outcomes Framework, whereas entries scored lower where participants had been unable to apply the concepts of the Framework for a variety of reasons, including limitations posed by the ‘Plan & Record’ template. Feedback about the nature of the ‘revalidation standards’ and their application to CPD was not positive, and participants had in the main not sought to apply the standards to their CPD entries, although those in the intervention group were more likely to have referred to the revalidation standards for their CPD. As assessors, we too found the process of selecting and assigning ‘revalidation standards’ to individual CPD entries burdensome and somewhat unspecific. We believe that addressing the perceived barriers and drawing on the facilitators will help deal with the apparent lack of engagement with the revalidation standards, and we have been able to make a set of relevant recommendations. We devised a model to explain and tell the story of CPD behaviour. Based on the concepts of purpose, action and results, the model centres on explaining two types of CPD behaviour, one following the traditional CE pathway and the other a more genuine CPD pathway. Entries which scored higher when we applied the CPD Outcomes Framework were more likely to follow the CPD pathway in the model. Significantly, while participants following both models of practice took part in this study, the CPD Outcomes Framework was able to change people’s CPD behaviour to make it more in line with the CPD pathway.
The CPD Outcomes Framework (in defining the CPD criteria), the training pack (in teaching the basis and use of the Framework) and the process of assessment (in using the CPD Outcomes Framework) would have interacted to improve participants’ CPD through a collective process. Participants were keen to receive a curriculum against which CE-type activities in particular could be conducted; another important observation relates to whether CE has any role to play in pharmacy professionals’ revalidation. We would recommend that the CPD Outcomes Framework be used in the revalidation of pharmacy professionals in the future, provided the requirement to submit nine CPD entries per annum is re-examined and expressed more clearly in relation to what specifically participants are being asked to submit, i.e. the ratio of CE to CPD entries. We can foresee a benefit in setting more regular intervals which would act as deadlines for CPD submission in the future. On the whole, there is value in using CPD for the purpose of pharmacy professionals’ revalidation in the future.
Abstract:
Using experiments with an atmospheric general circulation model, the climate impacts of a basin-scale warming or cooling of the North Atlantic Ocean are investigated. Multidecadal fluctuations with this pattern were observed during the twentieth century, and similar variations, but with larger amplitude, are believed to have occurred in the more distant past. It is found that in all seasons the response to warming the North Atlantic is strongest, in the sense of highest signal-to-noise ratio, in the Tropics. However, there is a large seasonal cycle in the climate impacts. The strongest response is found in boreal summer and is associated with suppressed precipitation and elevated temperatures over the lower-latitude parts of North and South America. In August-September-October there is a significant reduction in the vertical shear in the main development region for Atlantic hurricanes. In winter and spring, temperature anomalies over land in the extratropics are governed by dynamical changes in circulation rather than simply reflecting a thermodynamic response to the warming or cooling of the ocean. The tropical climate response is primarily forced by the tropical SST anomalies, and the major features are in line with simple models of the tropical circulation response to diabatic heating anomalies. The extratropical climate response is influenced by both tropical and higher-latitude SST anomalies and exhibits nonlinear sensitivity to the sign of the SST forcing. Comparisons with multidecadal changes in sea level pressure observed in the twentieth century support the conclusion that the impact of North Atlantic SST change is most important in summer, but also suggest a significant influence in lower latitudes in autumn and winter. Significant climate impacts are not restricted to the Atlantic basin, implying that the Atlantic Ocean could be an important driver of global decadal variability.
The strongest remote impacts are found to occur in the tropical Pacific region in June-August and September-November. Surface anomalies in this region have the potential to excite coupled ocean-atmosphere feedbacks, which are likely to play an important role in shaping the ultimate climate response.
Abstract:
The tropospheric response to midlatitude SST anomalies has been investigated through a series of aquaplanet simulations using a high-resolution version of the Hadley Centre atmosphere model (HadAM3) under perpetual equinox conditions. Model integrations show that increases in the midlatitude SST gradient generally lead to stronger storm tracks that are shifted slightly poleward, consistent with changes in the lower-tropospheric baroclinicity. The large-scale atmospheric response is, however, highly sensitive to the position of the SST gradient anomaly relative to that of the subtropical jet in the unperturbed atmosphere. In particular, when SST gradients are increased very close to the subtropical jet, the Hadley cell and subtropical jet are strengthened while the storm track and eddy-driven jet are shifted equatorward. Conversely, if the subtropical SST gradients are reduced and the midlatitude gradients increased, then the storm track shows a strong poleward shift and a well-separated eddy-driven jet is produced. The sign of the SST anomaly is shown to play a secondary role in determining the overall tropospheric response. These findings are used to provide a new and consistent interpretation of some previous GCM studies concerning the atmospheric response to midlatitude SST anomalies.
Abstract:
Under global warming, the predicted intensification of the global freshwater cycle will modify the net freshwater flux at the ocean surface. Since the freshwater flux maintains ocean salinity structures, changes to the density-driven ocean circulation are likely. A modified ocean circulation could further alter the climate, potentially allowing rapid changes, as seen in the past. The relevant feedback mechanisms and timescales are poorly understood in detail, however, especially at low latitudes where the effects of salinity are relatively subtle. In an attempt to resolve some of these outstanding issues, we present an investigation of the climate response of the low-latitude Pacific region to changes in freshwater forcing. Initiated from the present-day thermohaline structure, a control run of a coupled ocean-atmosphere general circulation model is compared with a perturbation run in which the net freshwater flux is prescribed to be zero over the ocean. Such an extreme experiment helps to elucidate the general adjustment mechanisms and their timescales. The atmospheric greenhouse gas concentrations are held constant, and we restrict our attention to the adjustment of the upper 1,000 m of the Pacific Ocean between 40°N and 40°S, over 100 years. In the perturbation run, changes to the surface buoyancy, near-surface vertical mixing and mixed-layer depth are established within 1 year. Subsequently, relative to the control run, the surface of the low-latitude Pacific Ocean in the perturbation run warms by an average of 0.6°C, and the interior cools by up to 1.1°C, after a few decades. This vertical re-arrangement of the ocean heat content is shown to be achieved by a gradual shutdown of the heat flux due to isopycnal (i.e. along surfaces of constant density) mixing, the vertical component of which is downwards at low latitudes. This heat transfer depends crucially upon the existence of density-compensating temperature and salinity gradients on isopycnal surfaces. 
The timescale of the thermal changes in the perturbation run is therefore set by the timescale for the decay of isopycnal salinity gradients in response to the eliminated freshwater forcing, which we demonstrate to be around 10-20 years. Such isopycnal heat flux changes may play a role in the response of the low-latitude climate to a future accelerated freshwater cycle. Specifically, the mechanism appears to represent a weak negative sea surface temperature feedback, which we speculate might partially shield from view the anthropogenically-forced global warming signal at low latitudes. Furthermore, since the surface freshwater flux is shown to play a role in determining the ocean's thermal structure, it follows that evaporation and/or precipitation biases in general circulation models are likely to cause sea surface temperature biases.
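The density-compensation mechanism described above can be illustrated with a linear equation of state (an assumption made here purely for illustration; the coupled model uses a full nonlinear equation of state, and the coefficient values below are typical textbook magnitudes, not the model's):

```python
# Hedged sketch: density compensation under a linear equation of state,
# rho = rho0 * (1 - alpha*dT + beta*dS).  Coefficient values are assumed
# typical magnitudes for seawater, not taken from the study.
RHO0 = 1025.0   # reference density, kg/m^3
ALPHA = 2.0e-4  # thermal expansion coefficient, 1/K (assumed)
BETA = 7.6e-4   # haline contraction coefficient, kg/g (assumed)

def density_anomaly(dT, dS):
    """Density change (kg/m^3) for temperature anomaly dT and salinity anomaly dS."""
    return RHO0 * (-ALPHA * dT + BETA * dS)

# A warm anomaly is density-compensated when a matching salinity increase
# cancels its effect, so the water parcel stays on the same isopycnal surface.
dT = 1.0                 # +1 K warmer
dS = ALPHA / BETA * dT   # compensating salinity increase
compensated = density_anomaly(dT, dS)    # ~0: remains on the isopycnal
uncompensated = density_anomaly(dT, 0)   # < 0: pure warming makes water lighter
```

Because compensated temperature and salinity gradients carry no density signal, isopycnal mixing can transport heat (downward at low latitudes, as in the abstract) without altering the density field; removing the freshwater forcing erodes the salinity gradients and hence this heat flux.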
Abstract:
Synthetic aperture radar (SAR) data have proved useful in remote sensing studies of deserts, enabling different surfaces to be discriminated by differences in roughness properties. Roughness is characterized in SAR backscatter models using the standard deviation of surface heights (σ), correlation length (L) and autocorrelation function (ρ(ξ)). Previous research has suggested that these parameters are of limited use for characterizing surface roughness, and are often unreliable due to the collection of too few roughness profiles, or under-sampling in terms of resolution or profile length (L_p). This paper reports on work aimed at establishing the effects of L_p and sampling resolution on SAR backscatter estimations and site discrimination. Results indicate significant relationships between the average roughness parameters and L_p, but large variability in roughness parameters prevents any clear understanding of these relationships. Integral equation model simulations demonstrate limited change with L_p and under-estimate backscatter relative to SAR observations. However, modelled and observed backscatter conform in pattern and magnitude for C-band systems but not for L-band data. Variation in surface roughness alone does not explain variability in site discrimination. Other factors (possibly sub-surface scattering) appear to play a significant role in controlling backscatter characteristics at lower frequencies.
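The two roughness parameters named above can be estimated from a measured height profile. A minimal sketch, not the authors' code, using an invented synthetic profile and adopting one common convention: the correlation length is the lag at which the normalized autocorrelation first falls below 1/e.

```python
# Hedged sketch (synthetic data): estimate RMS height sigma and correlation
# length L from a 1-D surface height profile sampled every dx metres.
import math

def roughness_parameters(heights, dx):
    """Return (sigma, L): RMS height and 1/e-crossing correlation length."""
    n = len(heights)
    mean = sum(heights) / n
    dev = [h - mean for h in heights]
    sigma = math.sqrt(sum(d * d for d in dev) / n)
    var = sigma * sigma
    L = None
    for lag in range(1, n):
        # normalized autocorrelation estimate at this lag
        rho = sum(dev[i] * dev[i + lag] for i in range(n - lag)) / ((n - lag) * var)
        if rho < 1.0 / math.e:
            L = lag * dx
            break
    return sigma, L

# Invented smooth cosine surface, 200 samples at 1 cm spacing.
profile = [math.cos(0.1 * i) for i in range(200)]
sigma, L = roughness_parameters(profile, dx=0.01)
```

The under-sampling issue in the abstract falls out of this estimator directly: if the profile length is short relative to L, few independent lags enter the autocorrelation sum, so both σ and L estimates become unstable.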
Abstract:
Mostly because of a lack of observations, fundamental aspects of the St. Lawrence Estuary's wintertime response to forcing remain poorly understood. The results of a field campaign over the winter of 2002/03 in the estuary are presented. The response of the system to tidal forcing is assessed through the use of harmonic analyses of temperature, salinity, sea level, and current observations. The analyses confirm previous evidence for the presence of semidiurnal internal tides, albeit at greater depths than previously observed for ice-free months. The low-frequency tidal streams were found to be mostly baroclinic in character and to produce an important neap tide intensification of the estuarine circulation. Despite stronger atmospheric momentum forcing in winter, the response is found to be less coherent with the winds than seen in previous studies of ice-free months. The tidal residuals show the cold intermediate layer in the estuary is renewed rapidly (~14 days) in late March by the advection of a wedge of near-freezing waters from the Gulf of St. Lawrence. In situ processes appeared to play a lesser role in the renewal of this layer. In particular, significant wintertime deepening of the estuarine surface mixed layer was prevented by surface stability, which remained high throughout the winter. The observations also suggest that the bottom circulation was intensified during winter, with the intrusion in the deep layer of relatively warm Atlantic waters, such that the 3°C isotherm rose from below 150 m to near 60 m.
Abstract:
Given the growing impact of human activities on the sea, managers are increasingly turning to marine protected areas (MPAs) to protect marine habitats and species. Many MPAs have been unsuccessful, however, and lack of income has been identified as a primary reason for failure. In this study, data from a global survey of 79 MPAs in 36 countries were analysed and attempts made to construct predictive models to determine the income requirements of any given MPA. Statistical tests were used to uncover possible patterns and relationships in the data, with two basic approaches. In the first of these, an attempt was made to build an explanatory "bottom-up" model of the cost structures that might be required to pursue various management activities. This proved difficult in practice owing to the very broad range of applicable data, spanning many orders of magnitude. In the second approach, a "top-down" regression model was constructed using logarithms of the base data, in order to address the breadth of the data ranges. This approach suggested that MPA size and visitor numbers together explained 46% of the minimum income requirements (P < 0.001), with area being the slightly more influential factor. The significance of area to income requirements was of little surprise, given its profile in the literature. However, the relationship between visitors and income requirements might go some way to explaining why northern hemisphere MPAs with apparently high incomes still claim to be under-funded. The relationship between running costs and visitor numbers has important implications not only in determining a realistic level of funding for MPAs, but also in assessing from where funding might be obtained. Since a substantial proportion of the income of many MPAs appears to be utilized for amenity purposes, a case may be made for funds to be provided from the typically better resourced government social and educational budgets as well as environmental budgets. 
Similarly, visitor fees, already an important source of funding for some MPAs, might have a broader role to play in how MPAs are financed in the future. (C) 2007 Elsevier Ltd. All rights reserved.
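The "top-down" approach described above amounts to ordinary least squares on log-transformed data. A minimal sketch with invented MPA records (the records, helper name and coefficients below are illustrative, not from the survey):

```python
# Hedged sketch: fit log(income) ~ log(area) + log(visitors) by ordinary
# least squares via the normal equations.  All data are invented.
import math

def ols(X, y):
    """Least-squares coefficients for y = X beta, X given as a list of rows."""
    k = len(X[0])
    # Normal equations: A = X^T X, b = X^T y
    A = [[sum(r[i] * r[j] for r in X) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(k)]
    # Gaussian elimination with partial pivoting
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in range(k - 1, -1, -1):
        beta[r] = (b[r] - sum(A[r][c] * beta[c] for c in range(r + 1, k))) / A[r][r]
    return beta

# Invented records: (area in km^2, annual visitors, minimum income in USD)
mpas = [(10, 5_000, 60_000), (120, 40_000, 250_000),
        (900, 150_000, 900_000), (4, 1_000, 25_000), (50, 20_000, 140_000)]
X = [[1.0, math.log(a), math.log(v)] for a, v, _ in mpas]
y = [math.log(inc) for _, _, inc in mpas]
intercept, b_area, b_visitors = ols(X, y)
```

Working in logarithms, as the study does, keeps data spanning many orders of magnitude on a comparable scale and turns the fitted relationship into a multiplicative (power-law) model of income requirements.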
Abstract:
This study suggests a statistical strategy for explaining how food purchasing intentions are influenced by different levels of risk perception and trust in food safety information. The modelling process is based on Ajzen's Theory of Planned Behaviour and includes trust and risk perception as additional explanatory factors. Interaction and endogeneity across these determinants are explored through a system of simultaneous equations, while the SPARTA equation is estimated through an ordered probit model. Furthermore, parameters are allowed to vary as a function of socio-demographic variables. The application explores chicken purchasing intentions both in a standard situation and conditional on a hypothetical salmonella scare. Data were collected through a nationally representative UK-wide survey of 533 respondents in face-to-face, in-home interviews. Empirical findings show that interactions exist among the determinants of planned behaviour and that socio-demographic variables improve the model's performance. Attitudes emerge as the key determinant of intention to purchase chicken, while trust in food safety information provided by the media reduces the likelihood of purchasing. (C) 2006 Elsevier Ltd. All rights reserved.
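An ordered probit, as used here for the SPARTA equation, maps a latent purchase-intention index onto ordered response categories via estimated cutpoints. A minimal sketch of that mapping, with invented coefficients and cutpoints (not the study's estimates):

```python
# Hedged sketch: ordered probit category probabilities.  A latent intention
# y* = x'beta + e (standard-normal e) falls in category j with probability
# Phi(c_j - x'beta) - Phi(c_{j-1} - x'beta).  All numbers are invented.
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cutpoints):
    """Category probabilities for latent index xb and ordered cutpoints."""
    bounds = [-math.inf] + list(cutpoints) + [math.inf]
    return [phi(bounds[j + 1] - xb) - phi(bounds[j] - xb)
            for j in range(len(bounds) - 1)]

# Invented respondent: a positive attitude raises the latent index, while
# trust in media-provided food safety information lowers it.
beta_attitude, beta_trust_media = 0.8, -0.5
xb = beta_attitude * 1.2 + beta_trust_media * 0.4
probs = ordered_probit_probs(xb, cutpoints=[-1.0, -0.3, 0.4, 1.2])  # 5 categories
```

The probabilities necessarily sum to one across the ordered intention categories, which is what lets the model be estimated by maximum likelihood on the observed category choices.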
Abstract:
The stated benefits and perceived risks of genetic modification (GM) cover very diverse issues, such as food safety, world food security, and the environment, that may differentially affect consumer acceptance. In this research, we hypothesize that consumers perceive up to eight dimensions: risks to business (farmers, agribusiness, etc.), benefits to business, risks and benefits to the environment, risks and benefits to the developing world, and risks and benefits to self and family. Moral concerns are also recognized. First, using data collected in 2002 in the United States, France, and the UK, we investigate these different dimensions. Second, we analyze the extent to which the dimensions of risk-benefit perceptions can be explained by general attitudes widely used to explain food purchase behavior (such as general attitude to the environment, to technology, etc.), as well as by perceived knowledge of GM, level of education, and trust in various sources of information. In all locations, the majority of consumers perceive only a medium level of risk from GM products. Attitude to technology is the most important attitude variable: those with a positive attitude to technology in general also have a positive attitude to GM technology. More Americans than Europeans fall into this category. Those who trust government and the food industry tend to think GM technology is less risky, whereas those who trust activists believe the opposite. Americans are more trusting of the former, Europeans of the latter. Level of education is positively associated with benefit perceptions and negatively associated with moral concerns. Location continues to play a limited independent role in explaining perceptions even after these factors have been taken into account.
Abstract:
A large number of processes are involved in the pathogenesis of atherosclerosis but it is unclear which of them play a rate-limiting role. One way of resolving this problem is to investigate the highly non-uniform distribution of disease within the arterial system; critical steps in lesion development should be revealed by identifying arterial properties that differ between susceptible and protected sites. Although the localisation of atherosclerotic lesions has been investigated intensively over much of the 20th century, this review argues that the factor determining the distribution of human disease has only recently been identified. Recognition that the distribution changes with age has, for the first time, allowed it to be explained by variation in transport properties of the arterial wall; hitherto, this view could only be applied to experimental atherosclerosis in animals. The newly discovered transport variations which appear to play a critical role in the development of adult disease have underlying mechanisms that differ from those elucidated for the transport variations relevant to experimental atherosclerosis: they depend on endogenous NO synthesis and on blood flow. Manipulation of transport properties might have therapeutic potential. Copyright (C) 2004 S. Karger AG, Basel.