992 results for Methodological importance
Abstract:
In The Conduct of Inquiry in International Relations, Patrick Jackson situates methodologies in International Relations in relation to their underlying philosophical assumptions. One of his aims is to map International Relations debates in a way that ‘capture[s] current controversies’ (p. 40). This ambition is overstated: whilst Jackson’s typology is useful as a clarificatory tool, (re)classifying existing scholarship in International Relations is more problematic. One problem with Jackson’s approach is that he tends to run together the philosophical assumptions which decisively differentiate his methodologies (by stipulating a distinctive warrant for knowledge claims) and the explanatory strategies that are employed to generate such knowledge claims, suggesting that the latter are entailed by the former. In fact, the explanatory strategies which Jackson associates with each methodology reflect conventional practice in International Relations just as much as they reflect philosophical assumptions. This makes it more difficult to identify each methodology at work than Jackson implies. I illustrate this point through a critical analysis of Jackson’s controversial reclassification of Waltz as an analyticist, showing that whilst Jackson’s typology helps to expose inconsistencies in Waltz’s approach, it does not fully support the proposed reclassification. The conventional aspect of methodologies in International Relations also raises questions about the limits of Jackson’s ‘engaged pluralism’.
Abstract:
The present study addressed the hypothesis that emotional stimuli relevant to survival or reproduction (biologically emotional stimuli) automatically affect cognitive processing (e.g., attention, memory), while those relevant to social life (socially emotional stimuli) require elaborative processing to modulate attention and memory. Results of our behavioral studies showed that (1) biologically emotional images hold attention more strongly than do socially emotional images, (2) memory for biologically emotional images was enhanced even with limited cognitive resources, but (3) memory for socially emotional images was enhanced only when people had sufficient cognitive resources at encoding. Neither images’ subjective arousal nor their valence modulated these patterns. A subsequent functional magnetic resonance imaging study revealed that biologically emotional images induced stronger activity in the visual cortex and greater functional connectivity between the amygdala and visual cortex than did socially emotional images. These results suggest that the interconnection between the amygdala and visual cortex supports enhanced attention allocation to biological stimuli. In contrast, socially emotional images evoked greater activity in the medial prefrontal cortex (MPFC) and yielded stronger functional connectivity between the amygdala and MPFC than did biological images. Thus, it appears that emotional processing of social stimuli involves elaborative processing requiring frontal lobe activity.
Abstract:
Choices not only reflect our preferences; they also affect our behavior. The phenomenon of choice-induced preference change has long interested cognitive dissonance researchers in social psychology, and more recently it has attracted the attention of researchers in economics and neuroscience. Preference modulation after the mere act of making a choice has been repeatedly demonstrated over the last 50 years with an experimental paradigm called the “free-choice paradigm.” However, Chen and Risen (2010) pointed out a serious methodological flaw in this paradigm, arguing that evidence for choice-induced preference change is still insufficient. Despite the flaw, studies using the traditional free-choice paradigm continue to be published without addressing the criticism. Here, aiming to draw more attention to this issue, we briefly explain the methodological problem and then describe simple simulation studies that illustrate how the free-choice paradigm produces a systematic pattern of preference change consistent with cognitive dissonance, even without any change in true preference. Our simulation also shows how the level of noise in each phase of the free-choice paradigm independently contributes to the magnitude of the artificial preference change. Furthermore, we review ways of addressing the critique and provide a meta-analysis showing the effect size of choice-induced preference change once the critique is addressed. Finally, based on the results of the simulation studies, we review and discuss how the criticism affects the interpretation of past findings generated with the free-choice paradigm. We conclude that the conventional free-choice paradigm should be avoided in future research and that the validity of past findings from studies using this paradigm should be empirically re-established.
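The selection artifact identified by Chen and Risen can be reproduced in a few lines. The sketch below is our own illustration, not the paper's code: true preferences are held fixed across all phases, yet pairing items by their noisy phase-1 ratings and letting choices depend on true preference plus independent noise produces a positive "spread of alternatives", exactly the pattern the paradigm treats as dissonance-driven preference change.

```python
import numpy as np

rng = np.random.default_rng(42)

n_pairs = 5000
# True preferences: fixed, and never change between phases.
v = rng.standard_normal(2 * n_pairs)

sigma = 1.0  # measurement noise in each phase
r1 = v + sigma * rng.standard_normal(v.size)   # phase 1: initial ratings

# Build "difficult" pairs from items with similar phase-1 ratings,
# as the free-choice paradigm does (adjacent items after sorting by r1).
order = np.argsort(r1)
a, b = order[0::2], order[1::2]

# Phase 2: choices driven by true preference plus independent noise.
choice_noise = sigma * rng.standard_normal(n_pairs)
pick_a = (v[a] - v[b] + choice_noise) > 0
chosen = np.where(pick_a, a, b)
rejected = np.where(pick_a, b, a)

# Phase 3: re-rating with fresh noise; true preferences are unchanged.
r2 = v + sigma * rng.standard_normal(v.size)

# "Spread of alternatives": chosen items appear to gain, rejected to lose,
# purely because the choice carries information about v that r1 missed.
spread = (r2[chosen] - r1[chosen]) - (r2[rejected] - r1[rejected])
```

The mean of `spread` comes out positive even though no preference changed; raising `sigma` in any single phase inflates the artifact, mirroring the abstract's point about phase-specific noise.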
Abstract:
In the present research, we conducted 4 studies designed to examine the hypothesis that perceived competence moderates the relation between performance-approach and performance-avoidance goals. Each study yielded supportive data, indicating that the correlation between the 2 goals is lower when perceived competence is high. This pattern was observed at the between- and within-subject level of analysis, with correlational and experimental methods and using both standard and novel achievement goal assessments, multiple operationalizations of perceived competence, and several different types of focal tasks. The findings from this research contribute to the achievement goal literature on theoretical, applied, and methodological fronts and highlight the importance of and need for additional empirical work in this area.
Abstract:
We present a new Bayesian econometric specification for a hypothetical Discrete Choice Experiment (DCE) incorporating respondent ranking information about attribute importance. Our results indicate that a DCE debriefing question that asks respondents to rank the importance of attributes helps to explain the resulting choices. We also examine how the mode of survey delivery (online and mail) impacts model performance, finding that results are not substantively affected by the mode of survey delivery. We conclude that the ranking data are a complementary source of information about respondent utility functions within hypothetical DCEs.
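As a minimal sketch of the modelling setting, the snippet below computes conditional-logit choice probabilities for a two-alternative DCE task and shows one simple way a stated importance ranking could enter a Bayesian specification. All numbers, the attribute names, and the rank-to-prior mapping are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

# Hypothetical attribute part-worths (e.g. price, travel time, quality).
beta = np.array([-0.8, -0.3, 0.6])

# Attribute levels of the two alternatives in one choice task.
X = np.array([[2.0, 1.0, 1.0],
              [1.0, 2.0, 0.0]])

def choice_probs(X, beta):
    """Conditional (multinomial) logit probabilities over alternatives."""
    u = X @ beta                 # deterministic utilities
    e = np.exp(u - u.max())      # numerically stable softmax
    return e / e.sum()

p = choice_probs(X, beta)

# One way a debriefing ranking (1 = most important) could inform the
# prior: tighten the prior around zero for attributes ranked less
# important. This mapping is purely illustrative.
rank = np.array([1, 3, 2])       # hypothetical respondent ranking
prior_sd = 1.0 / rank
```

In a full Bayesian estimation these rank-scaled prior standard deviations would feed into the priors on `beta`; here the point is only that the ranking supplies extra, choice-independent information about the utility function.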
Abstract:
A primitive equation model is used to study the sensitivity of baroclinic wave life cycles to the initial latitude-height distribution of humidity. Diabatic heating is parametrized only as a consequence of condensation in regions of large-scale ascent. Experiments are performed in which the initial relative humidity is a simple function of model level, and in some cases latitude bands are specified which are initially relatively dry. It is found that the presence of moisture can either increase or decrease the peak eddy kinetic energy of the developing wave, depending on the initial moisture distribution. A relative abundance of moisture at mid-latitudes tends to weaken the wave, while a relative abundance at low latitudes tends to strengthen it. This sensitivity exists because competing processes are at work. These processes are described in terms of energy box diagnostics. The most realistic case lies on the cusp of this sensitivity. Further physical parametrizations are then added, including surface fluxes and upright moist convection. These have the effect of increasing wave amplitude, but the sensitivity to initial conditions of relative humidity remains. Finally, 'control' and 'doubled CO2' life cycles are performed, with initial conditions taken from the time-mean zonal-mean output of equilibrium GCM experiments. The attenuation of the wave resulting from reduced baroclinicity is more pronounced than any effect due to changes in initial moisture.
Abstract:
Deindustrialization, stagnant real incomes of production workers, and increasing inequality are latter-day features of many economies. It is common to assume that such developments pressure policymakers to relax environmental standards. However, when heavily polluting industries become less important economically, their political importance also tends to diminish. Consequently, a regulator may increase the stringency of environmental policies. Like some other studies, we find that declining industrial employment translates into stricter environmental standards. In contrast to previous studies, but consistent with our argument, we find that greater income inequality is associated with policies that promote a cleaner environment. (JEL Q58, P16, J31, C23)
Abstract:
Galactic cosmic rays are one of the major sources of ion production in the troposphere and stratosphere. Recent studies have shown that ions form electrically charged clusters which may grow to become cloud droplets. Aerosol particles charge by the attachment of ions and electrons. The collision efficiency between a particle and a water droplet increases if the particle is electrically charged, and thus aerosol-cloud interactions can be enhanced. Because these microphysical processes may change the radiative properties of clouds and impact Earth's climate, it is important to evaluate their quantitative effects. Five independently developed models have been coupled to investigate this. The first model estimates cloud height from the dew point temperature and the temperature profile. The second simulates cloud droplet growth from aerosol particles using the cloud parcel concept. The third calculates the scavenging rate of aerosol particles from the collision efficiency between charged particles and droplets. The fourth calculates the electric field and the charge distribution on water droplets and aerosols within the cloud. The fifth simulates the global electric circuit (GEC), computing the conductivity and ionic concentration in the atmosphere over the altitude range 0–45 km. The first four models are coupled to calculate the cloud height and cloud boundary conditions, followed by droplet growth, the charge distribution on aerosols and cloud droplets, and finally scavenging; these are then incorporated with the GEC model. The simulations are verified against experimental data on charged aerosols at various altitudes. Our calculations show an effect of aerosol charging on the CCN concentration within the cloud, because charging increases the scavenging of particles in the size range 0.1 µm to 1 µm.
Abstract:
Urban land surface models (LSMs) are commonly evaluated for short periods (a few weeks to months) because of limited observational data. This makes it difficult to distinguish the impact of initial conditions on model performance or to consider the response of a model to a range of possible atmospheric conditions. Drawing on results from the first urban LSM comparison, these two issues are considered. Assessment shows that the initial soil moisture has a substantial impact on performance. Models initialised with soils that are too dry are unable to adjust their surface sensible and latent heat fluxes to realistic values until there is sufficient rainfall. Models initialised with soils that are too wet are unable to restrict their evaporation appropriately for periods in excess of a year. This has implications for short-term evaluation studies and implies the need for soil moisture measurements to improve data assimilation and model initialisation. In contrast, initial conditions influencing thermal storage have a much shorter adjustment timescale than soil moisture. Most models partition too much of the radiative energy at the surface into the sensible heat flux, at the probable expense of the net storage heat flux.
Abstract:
The purpose of the current article is to support the investigation of linguistic relativity in second language acquisition and to sketch the methodological and theoretical prerequisites for developing the domain into a full research program. We identify and discuss three theoretical-methodological components that we believe are needed to succeed in this enterprise. First, we highlight the importance of using nonverbal methods to study linguistic relativity effects in second language (L2) speakers. The use of nonverbal tasks is necessary in order to avoid the circularity that arises when inferences about nonverbal behavior are made on the basis of verbal evidence alone. Second, we identify and delineate the likely cognitive mechanisms underpinning cognitive restructuring in L2 speakers by introducing the theoretical framework of associative learning. By doing so, we demonstrate that the extent and nature of cognitive restructuring in L2 speakers is essentially a function of variation in individual learners’ trajectories. Third, we offer an in-depth discussion of the factors (e.g., L2 proficiency and L2 use) that characterize those trajectories, anchoring them to the framework of associative learning and reinterpreting their relative strength in predicting L2 speaker cognition.
Abstract:
Dynamical downscaling is frequently used to investigate the dynamical variables of extra-tropical cyclones, for example precipitation, using very high-resolution models nested within coarser-resolution models to understand the processes that lead to intense precipitation. It is also used in climate change studies, using long time series to investigate trends in precipitation, or to look at the small-scale dynamical processes in specific case studies. This study investigates some of the problems associated with dynamical downscaling and looks at the optimum configuration for obtaining a precipitation field whose distribution and intensity match observations. It uses the Met Office Unified Model run in limited-area mode with grid spacings of 12, 4 and 1.5 km, driven by boundary conditions from the ECMWF Operational Analysis, to produce high-resolution simulations of the summer 2007 UK flooding events. The numerical weather prediction model is initiated at varying times before the peak precipitation is observed to test the importance of the initialisation and boundary conditions, and how long the simulation can be run for. The results are verified against rain gauge data and show that the model intensities are most similar to observations when the model is initialised 12 hours before the peak precipitation is observed. It was also shown that using non-gridded datasets makes verification more difficult, with the density of observations also affecting the intensities observed. It is concluded that the simulations are able to produce realistic precipitation intensities when driven by the coarser-resolution data.
Abstract:
In this paper, we study the role of the volatility risk premium in the forecasting performance of implied volatility. We introduce a non-parametric and parsimonious approach to adjust the model-free implied volatility for the volatility risk premium and implement this methodology using more than 20 years of options and futures data on three major energy markets. Using regression models and statistical loss functions, we find compelling evidence that the risk-premium-adjusted implied volatility significantly outperforms other models, including its unadjusted counterpart. Our main finding holds for different choices of volatility estimators and competing time-series models, underlining the robustness of our results.
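The intuition behind a non-parametric risk-premium adjustment can be sketched in a few lines. The toy example below is our own illustration on synthetic data, not the paper's method or data: implied volatility carries a constant premium over realized volatility, and subtracting a rolling mean of past (implied minus realized) gaps removes that bias from the forecast.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic series (illustrative numbers, not market data): implied vol
# equals realized vol plus a constant risk premium and small noise.
T = 500
rv = 0.20 + 0.02 * rng.standard_normal(T)            # realized volatility
premium = 0.03
iv = rv + premium + 0.005 * rng.standard_normal(T)   # implied volatility

# Non-parametric adjustment: subtract the rolling mean of past
# (implied - realized) gaps from today's implied volatility.
window = 60
err_raw, err_adj = [], []
for t in range(window, T):
    prem_hat = np.mean(iv[t - window:t] - rv[t - window:t])
    err_raw.append((iv[t] - rv[t]) ** 2)             # unadjusted forecast error
    err_adj.append((iv[t] - prem_hat - rv[t]) ** 2)  # adjusted forecast error

mse_raw = float(np.mean(err_raw))
mse_adj = float(np.mean(err_adj))
```

Under these assumptions the adjusted forecast's mean squared error is well below the unadjusted one, because the rolling gap estimate absorbs the premium bias while adding only a small estimation-noise cost.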
Abstract:
The application of the Water Framework Directive (WFD) in the European Union (EU) targets certain threshold levels for the concentration of various nutrients, nitrogen and phosphorus being the most important. In the EU, agri-environmental measures constitute a significant component of Pillar 2—Rural Development Policies in both financial and regulatory terms. Environmental measures are also linked to Pillar 1 payments through cross-compliance and the greening proposals. This paper, drawing on work carried out in the REFRESH FP7 project, aims to show how an INtegrated CAtchment model of plant/soil system dynamics and in-stream biogeochemical and hydrological dynamics can be used to assess the cost-effectiveness of agri-environmental measures in relation to the nutrient concentration targets set by the WFD, especially in the presence of important habitats. We present the procedures (methodological steps, challenges and problems) for assessing the cost-effectiveness of agri-environmental measures under the baseline situation and under climate and land use change scenarios. Furthermore, we present results of an application of this methodology to the Louros watershed in Greece and discuss the likely uses and future extensions of the modelling approach. Finally, we attempt to demonstrate the importance of this methodology for designing and incorporating alternative environmental practices in Pillar 1 and 2 measures.