915 results for Database search Evidential value Bayesian decision theory Influence diagrams


Relevance:

40.00%

Publisher:

Abstract:

This paper investigates the frequency of extreme events for three LIFFE futures contracts for the calculation of minimum capital risk requirements (MCRRs). We propose a semiparametric approach in which the tails are modelled by the Generalized Pareto Distribution and smaller risks are captured by the empirical distribution function. We compare the capital requirements from this approach with those calculated from the unconditional density and from a conditional density - a GARCH(1,1) model. Our primary finding is that, both in-sample and for a hold-out sample, our extreme value approach yields results superior to those of the other two models, which do not explicitly model the tails of the return distribution. Since the use of these internal models will be permitted under the EC-CAD II, they could be widely adopted in the near future for determining capital adequacy. Hence, close scrutiny of competing models is required to avoid a potentially costly misallocation of capital resources while at the same time ensuring the safety of the financial system.
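The semiparametric scheme described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Student-t returns stand in for a LIFFE futures series, and the 95% threshold, the `var_semiparametric` helper and the random seed are all assumptions.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)
# Hypothetical fat-tailed daily returns standing in for a LIFFE futures series.
returns = rng.standard_t(df=4, size=5000) * 0.01

losses = -returns                        # losses, so large risks sit in the right tail
u = np.quantile(losses, 0.95)            # threshold: top 5% of losses go to the GPD
exceedances = losses[losses > u] - u

# Fit the Generalized Pareto Distribution to the threshold exceedances.
xi, _, beta = genpareto.fit(exceedances, floc=0)

def var_semiparametric(p):
    """p-quantile of the loss distribution: empirical body, GPD tail."""
    if p <= 0.95:
        return np.quantile(losses, p)    # smaller risks: empirical distribution
    n, n_u = losses.size, exceedances.size
    # Standard EVT quantile formula for the GPD tail above the threshold.
    return u + (beta / xi) * ((n * (1 - p) / n_u) ** (-xi) - 1)

var_99 = var_semiparametric(0.99)        # a 99% MCRR-style risk quantile
```

The split at the 95th percentile mirrors the paper's idea that only the tail needs a parametric model; quantiles below the threshold come straight from the data.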

Relevance:

40.00%

Publisher:

Abstract:

This paper examines the implications of policy fracture and arm's length governance within the decision-making processes currently shaping curriculum design within the English education system. In particular it argues that an unresolved ‘ideological fracture’ at government level has been passed down to school leaders, whose response to the dilemma is distorted by the target-driven agenda of arm's length agencies. Drawing upon the findings of a large-scale online survey of history teaching in English secondary schools, this paper illustrates the problems that occur when policy making is divorced from curriculum theory, and in particular from any consideration of the nature of knowledge. Drawing on the social realist theory of knowledge elaborated by Young (2008), we argue that the rapid spread of alternative curricular arrangements, implemented in the absence of an understanding of curriculum theory, undermines the value of disciplined thinking to the detriment of many young people, particularly those in areas of social and economic deprivation.

Relevance:

40.00%

Publisher:

Abstract:

Purpose: This is a cross-national study which investigates changes in purchase intentions of UK versus Chinese consumers following exposure to successive e-WOM comments in the form of positive and negative user reviews for experience versus search products. Design/methodology/approach: A 2 (e-WOM valence and order: negative vs. positive most recent) × 2 (product type: experience vs. search) × 3 (purchase intentions at t1, t2, t3) repeated measures factorial design is used to test a set of hypotheses developed from the literature. Findings: Chinese consumers are susceptible to recent e-WOM comments regardless of their valence, while UK consumers anchor on negative information regardless of the order in which it is acquired. This holds particularly for experience products. Originality/value: This cross-national study contributes to the scarce literature on the impact of e-WOM on consumer purchase decisions by comparing UK and Chinese consumers. We suggest that culture moderates the development of product evaluations following exposure to e-WOM.

Relevance:

40.00%

Publisher:

Abstract:

Literacy as a social practice is integrally linked with social, economic and political institutions and processes. As such, it has a material base which is fundamentally constituted in power relations. Literacy is therefore interwoven with the text and context of everyday living in which multi-levelled meanings are organically produced at both the individual and the societal level. This paper argues that if language thus mediates social reality, then it follows that literacy defined as a social practice cannot really be addressed as a reified, neutral activity but that it should take account of the social, cultural and political processes in which literacy practices are embedded. Drawing on the work of key writers within the field, the paper foregrounds the primary role of the state in defining the forms and levels of literacy required and made available at particular moments within society. In a case study of the social construction of literacy meanings in pre-revolutionary Iran, it explores the view that the discourse about societal literacy levels has historically constituted a key terrain in which the struggle for control over meaning has taken place. This struggle, it is argued, sets the interests of the state to maintain ideological and political control over the production of knowledge within the culture and society over and against the needs identified by the individual for personal development, empowerment and liberation. In an overall sense, the paper examines existing theoretical perspectives on societal literacy programmes in terms of the scope that they provide for analyses that encompass the multi-levelled power relations that shape and influence dominant discourses on the relative value of literacy for both the individual and society.

Relevance:

40.00%

Publisher:

Abstract:

The assessment of building energy efficiency is one of the most effective measures for reducing building energy consumption. This paper proposes a holistic method (HMEEB) for assessing and certifying building energy efficiency based on the D-S (Dempster-Shafer) theory of evidence and the Evidential Reasoning (ER) approach. HMEEB has three main features: (i) it provides both a method to assess and certify building energy efficiency, and serves as an analytical tool to identify improvement opportunities; (ii) it combines a wealth of information on building energy efficiency assessment, including identification of indicators and a weighting mechanism; and (iii) it provides a method to identify and deal with inherent uncertainties within the assessment procedure. This paper demonstrates the robustness, flexibility and effectiveness of the proposed method, using two examples to assess the energy efficiency of two residential buildings, both located in the ‘Hot Summer and Cold Winter’ zone in China. The proposed certification method provides detailed recommendations for policymakers in the context of carbon emission reduction targets and promoting energy efficiency in the built environment. The method is transferable to other countries and regions, with the indicator weighting system modified to reflect local climatic, economic and social factors.
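The Dempster-Shafer evidence combination underlying the ER approach can be illustrated with a minimal sketch. The grades, mass assignments and the `dempster_combine` helper below are hypothetical, not taken from HMEEB itself:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dict: frozenset of grades -> mass)."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb              # mass falling on the empty set
    k = 1.0 - conflict                       # Dempster normalisation constant
    return {s: w / k for s, w in combined.items()}

# Two hypothetical assessors grading one building's energy efficiency.
A, B, AB = frozenset({'A'}), frozenset({'B'}), frozenset({'A', 'B'})
m1 = {A: 0.6, AB: 0.4}   # assessor 1: mostly grade A, some uncertainty
m2 = {B: 0.3, AB: 0.7}   # assessor 2: leans B but largely undecided
m = dempster_combine(m1, m2)   # combined belief over the grades
```

Mass left on the composite set `{A, B}` is exactly the "inherent uncertainty" the method keeps explicit rather than forcing into a single grade.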

Relevance:

40.00%

Publisher:

Abstract:

Conventional economic theory, applied to information released by listed companies, equates ‘useful’ with ‘price-sensitive’. Stock exchange rules accordingly prohibit the selective, private communication of price-sensitive information. Yet, even in the absence of such communication, UK equity fund managers routinely meet privately with the senior executives of the companies in which they invest. Moreover, they consider these brief, formal and formulaic meetings to be their most important sources of investment information. In this paper we ask how that can be. Drawing on interview and observation data with fund managers and CFOs, we find evidence for three, non-mutually exclusive explanations: that the characterisation of information in conventional economic theory is too restricted, that fund managers fail to act with the rationality that conventional economic theory assumes, and/or that the primary value of the meetings for fund managers is not related to their investment decision making but to the claims of superior knowledge made to clients in marketing their active fund management expertise. Our findings suggest a disconnect between economic theory and economic policy based on that theory, as well as a corresponding limitation in research studies that test information-usefulness by assuming it to be synonymous with price-sensitivity. We draw implications for further research into the role of tacit knowledge in equity investment decision-making, and also into the effects of the principal–agent relationship between fund managers and their clients.

Relevance:

40.00%

Publisher:

Abstract:

A continuous tropospheric and stratospheric vertically resolved ozone time series, from 1850 to 2099, has been generated to be used as forcing in global climate models that do not include interactive chemistry. A multiple linear regression analysis of SAGE I+II satellite observations and polar ozonesonde measurements is used for the stratospheric zonal mean dataset during the well-observed period from 1979 to 2009. In addition to terms describing the mean annual cycle, the regression includes terms representing equivalent effective stratospheric chlorine (EESC) and the 11-yr solar cycle variability. The EESC regression fit coefficients, together with pre-1979 EESC values, are used to extrapolate the stratospheric ozone time series backward to 1850. While a similar procedure could be used to extrapolate into the future, coupled chemistry climate model (CCM) simulations indicate that future stratospheric ozone abundances are likely to be significantly affected by climate change, and capturing such effects through a regression model approach is not feasible. Therefore, the stratospheric ozone dataset is extended into the future (merged in 2009) with multimodel mean projections from 13 CCMs that performed a simulation until 2099 under the SRES (Special Report on Emission Scenarios) A1B greenhouse gas scenario and the A1 adjusted halogen scenario in the second round of the Chemistry-Climate Model Validation (CCMVal-2) Activity. The stratospheric zonal mean ozone time series is merged with a three-dimensional tropospheric data set extracted from simulations of the past by two CCMs (CAM3.5 and GISS-PUCCINI) and of the future by one CCM (CAM3.5). The future tropospheric ozone time series continues the historical CAM3.5 simulation until 2099 following the four different Representative Concentration Pathways (RCPs).
Generally good agreement is found between the historical segment of the ozone database and satellite observations, although it should be noted that total column ozone is overestimated in the southern polar latitudes during spring and tropospheric column ozone is slightly underestimated. Vertical profiles of tropospheric ozone are broadly consistent with ozonesondes and in-situ measurements, with some deviations in regions of biomass burning. The tropospheric ozone radiative forcing (RF) from the 1850s to the 2000s is 0.23 W m−2, lower than previous results. The lower value is mainly due to (i) a smaller increase in biomass burning emissions; (ii) a larger influence of stratospheric ozone depletion on upper tropospheric ozone at high southern latitudes; and possibly (iii) a larger influence of clouds (which act to reduce the net forcing) compared to previous radiative forcing calculations. Over the same period, decreases in stratospheric ozone, mainly at high latitudes, produce a RF of −0.08 W m−2, which is more negative than the central Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) value of −0.05 W m−2, but which is within the stated range of −0.15 to +0.05 W m−2. The more negative value is explained by the fact that the regression model simulates significant ozone depletion prior to 1979, in line with the increase in EESC and as confirmed by CCMs, while the AR4 assumed no change in stratospheric RF prior to 1979. A negative RF of similar magnitude persists into the future, although its location shifts from high latitudes to the tropics. This shift is due to increases in polar stratospheric ozone, but decreases in tropical lower stratospheric ozone, related to a strengthening of the Brewer-Dobson circulation, particularly through the latter half of the 21st century.
Differences in trends in tropospheric ozone among the four RCPs are mainly driven by different methane concentrations, resulting in a range of tropospheric ozone RFs between 0.4 and 0.1 W m−2 by 2100. The ozone dataset described here has been released for the Coupled Model Intercomparison Project (CMIP5) model simulations in netCDF Climate and Forecast (CF) Metadata Convention at the PCMDI website (http://cmip-pcmdi.llnl.gov/).
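The regression step described above (annual cycle plus EESC and solar-cycle terms, with the EESC coefficient reused for backward extrapolation) can be sketched on synthetic data. Everything below is an illustrative stand-in: the EESC and solar proxies, the coefficients and the noise are invented, and the real dataset fits zonal mean observations rather than a single series.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(1979, 2010, 1 / 12)            # monthly time axis, 1979-2009

# Hypothetical proxies standing in for the real EESC and solar-cycle indices.
eesc = np.clip((t - 1979) / 20, 0, 1)        # normalised chlorine loading
solar = np.sin(2 * np.pi * (t - 1979) / 11)  # 11-yr solar cycle

# Design matrix: constant + mean annual cycle (sin/cos) + EESC + solar terms.
X = np.column_stack([np.ones_like(t),
                     np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
                     eesc, solar])

# Synthetic "observed" ozone: EESC-driven depletion plus solar and annual signals.
ozone = (300 - 20 * eesc + 3 * solar + 2 * np.sin(2 * np.pi * t)
         + rng.normal(0, 1, t.size))

coef, *_ = np.linalg.lstsq(X, ozone, rcond=None)

# Backward extrapolation: evaluate the fit at 1850 with the pre-1979 EESC
# value (zero in this toy parameterisation) and no solar anomaly.
x_1850 = np.array([1.0, np.sin(2 * np.pi * 1850.0),
                   np.cos(2 * np.pi * 1850.0), 0.0, 0.0])
ozone_1850 = x_1850 @ coef
```

Because climate-change feedbacks are absent from such a fit, the same trick is not used forward in time; that is where the CCM projections take over.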

Relevance:

40.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to explore, from a practical point-of-view, a number of key strategic issues that critically influence organisations' competitiveness. Design/methodology/approach – The paper is based on a semi-structured interview with Mr Paul Walsh, CEO of Diageo. Diageo is a highly successful company and Mr Walsh has played a central role in making Diageo the number one branded drinks company in the world. Findings – The paper discusses the key attributes of a successful merger, lessons from a complex cross-border acquisition, the rationale for strategic alliances with competitors, distinctive resources, and the role of corporate social responsibility. Research limitations/implications – It is not often that management scholars have the opportunity to discuss with the CEOs of large multinationals the rationale behind key strategic decisions. In this paper these issues are explored from the perspective of a CEO of a large and successful company. The lessons, while not generalisable, offer unique insights to students of management and management researchers. Originality/value – The paper offers a bridge between theory and practice. It demonstrates that from Diageo's perspective the distinctive capabilities are intangible. It also offers insight into how to successfully execute strategic decisions. In terms of originality it offers a view from the top, which is often missing from strategy research.

Relevance:

40.00%

Publisher:

Abstract:

More than thirty years ago, Wind's seminal review of research in market segmentation culminated with a research agenda for the subject area. In the intervening period, research has focused on the development of segmentation bases and models, segmentation research techniques and the identification of statistically sound solutions. Practical questions about implementation and the integration of segmentation into marketing strategy have received less attention, even though practitioners are known to struggle with the actual practice of segmentation. This special issue is motivated by this tension between theory and practice, which has shaped and continues to influence the research priorities for the field. Although many years may have elapsed since Wind's original research agenda, pressing questions about effectiveness and productivity apparently remain; namely: (i) concerns about the link between segmentation and performance, and its measurement; and (ii) the notion that productivity improvements arising from segmentation are only achievable if the segmentation process is effectively implemented. These were the central themes of the call for papers for this special issue, which aims to develop our understanding of segmentation value, productivity and strategies, and managerial issues and implementation.

Relevance:

40.00%

Publisher:

Abstract:

Social tagging has become very popular across the Internet as well as in research. The main idea behind tagging is to allow users to provide metadata for web content from their perspective to facilitate categorization and retrieval. There are many factors that influence users' tag choice. Many studies have been conducted to reveal these factors by analysing tagging data. This paper uses two theories to identify these factors, namely semiotics theory and activity theory. The former treats tags as signs and the latter treats tagging as an activity. The paper uses both theories to analyse tagging behaviour by explaining all aspects of a tagging system, including tags, tagging system components and the tagging activity. The theoretical analysis produced a framework that was used to identify a number of factors. These factors can be considered as categories that can be consulted to steer users' tag choices in order to support particular tagging behaviour, such as cross-lingual tagging.

Relevance:

40.00%

Publisher:

Abstract:

In probabilistic decision tasks, an expected value (EV) of a choice is calculated, and after the choice has been made, this can be updated based on a temporal difference (TD) prediction error between the EV and the reward magnitude (RM) obtained. The EV is measured as the probability of obtaining a reward × RM. To understand the contribution of different brain areas to these decision-making processes, functional magnetic resonance imaging activations related to EV versus RM (or outcome) were measured in a probabilistic decision task. Activations in the medial orbitofrontal cortex were correlated with both RM and with EV and confirmed in a conjunction analysis to extend toward the pregenual cingulate cortex. From these representations, TD reward prediction errors could be produced. Activations in areas that receive from the orbitofrontal cortex including the ventral striatum, midbrain, and inferior frontal gyrus were correlated with the TD error. Activations in the anterior insula were correlated negatively with EV, occurring when low reward outcomes were expected, and also with the uncertainty of the reward, implicating this region in basic and crucial decision-making parameters, low expected outcomes, and uncertainty.
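The two quantities driving the analysis reduce to simple arithmetic; the sketch below uses made-up probabilities and magnitudes to show how a TD prediction error follows from the EV definition above:

```python
# EV and TD prediction error as defined in the abstract; the probability and
# reward magnitude below are illustrative values, not the task's parameters.
def expected_value(p_reward, magnitude):
    """EV = probability of obtaining a reward x reward magnitude (RM)."""
    return p_reward * magnitude

def td_error(ev, outcome):
    """TD prediction error: obtained reward magnitude minus the expectation."""
    return outcome - ev

ev = expected_value(0.25, 8.0)   # 25% chance of an 8-unit reward -> EV = 2.0
delta_win = td_error(ev, 8.0)    # reward delivered: positive surprise, +6.0
delta_miss = td_error(ev, 0.0)   # no reward: negative prediction error, -2.0
```

The fMRI contrasts in the study separate regions tracking EV before the outcome from regions tracking this signed error once the outcome is known.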

Relevance:

40.00%

Publisher:

Abstract:

Purpose – The purpose of this paper is to shed light on the practice of incomplete corporate disclosure of quantitative greenhouse gas (GHG) emissions and to investigate whether external stakeholder pressure influences the existence, and separately, the completeness of voluntary GHG emissions disclosures by 431 European companies. Design/methodology/approach – A classification of reporting completeness is developed with respect to the scope, type and reporting boundary of GHG emissions based on the guidelines of the GHG Protocol, Global Reporting Initiative and the Carbon Disclosure Project. Logistic regression analysis is applied to examine whether proxies for exposure to climate change concerns from different stakeholder groups influence the existence and/or completeness of quantitative GHG emissions disclosure. Findings – From 2005 to 2009, on average only 15 percent of companies that disclose GHG emissions report them in a manner that the authors consider complete. Results of regression analyses suggest that external stakeholder pressure is a determinant of the existence but not the completeness of emissions disclosure. Findings are consistent with stakeholder theory arguments that companies respond to external stakeholder pressure to report GHG emissions, but also with legitimacy theory claims that firms can use carbon disclosure, in this case the incomplete reporting of emissions, as a symbolic act to address legitimacy exposures. Practical implications – Bringing corporate GHG emissions disclosure in line with recommended guidelines will require either more direct stakeholder pressure or, perhaps, a mandated disclosure regime. In the meantime, users of the data will need to carefully consider the relevance of the reported data and develop the necessary competencies to detect and control for its incompleteness. A more troubling concern is that stakeholders may instead grow to accept less than complete disclosure.
Originality/value – The paper represents the first large-scale empirical study into the completeness of companies’ disclosure of quantitative GHG emissions and is the first to analyze these disclosures in the context of stakeholder pressure and its relation to legitimation.
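The logistic regression setup can be sketched on simulated data. The pressure proxies, coefficients and the plain gradient-ascent fit below are illustrative assumptions, not the paper's variables or estimation procedure:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 431                                    # matches the study's sample of firms

# Hypothetical standardised stakeholder-pressure proxies (e.g. media exposure,
# regulatory pressure); the names and effect sizes here are invented.
X = rng.normal(size=(n, 2))
true_beta = np.array([1.5, 0.0])           # only the first proxy matters
logits = 0.5 + X @ true_beta
discloses = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit the logit by plain gradient ascent on the average log-likelihood
# (a minimal sketch; a study would use a standard statistics package).
Xd = np.column_stack([np.ones(n), X])      # add an intercept column
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(Xd @ beta)))     # predicted disclosure probability
    beta += 0.1 * Xd.T @ (discloses - p) / n

# beta[1] recovers the positive effect of pressure on disclosure existence.
```

In the paper the same model form is run twice, once with existence and once with completeness as the binary outcome, which is how the existence/completeness asymmetry is detected.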

Relevância:

40.00% 40.00%

Publicador:

Resumo:

The challenge of moving past the classic Window Icons Menus Pointer (WIMP) interface, i.e. by turning it ‘3D’, has resulted in much research and development. To evaluate the impact of 3D on the ‘finding a target picture in a folder’ task, we built a 3D WIMP interface that allowed the systematic manipulation of visual depth, visual aids, and the semantic category distribution of targets versus non-targets, and the detailed measurement of lower-level stimuli features. Across two separate experiments, one large sample web-based experiment, to understand associations, and one controlled lab environment, using eye tracking to understand user focus, we investigated how visual depth, use of visual aids, use of semantic categories, and lower-level stimuli features (i.e. contrast, colour and luminance) impact how successfully participants are able to search for, and detect, the target image. Moreover, in the lab-based experiment, we captured pupillometry measurements to allow consideration of the influence of increasing cognitive load as a result of either an increasing number of items on the screen, or the inclusion of visual depth. Our findings showed that increasing the visible layers of depth, and inclusion of converging lines, did not impact target detection times, errors, or failure rates. Low-level features, including colour, luminance, and number of edges, did correlate with differences in target detection times, errors, and failure rates. Our results also revealed that semantic sorting algorithms significantly decreased target detection times. Increased semantic contrast between a target and its neighbours correlated with an increase in detection errors.
Finally, pupillometric data did not provide evidence of any correlation between the number of visible layers of depth and pupil size; however, using structural equation modelling, we demonstrated that cognitive load does influence detection failure rates when there is luminance contrast between the target and its surrounding neighbours. Results suggest that WIMP interaction designers should consider stimulus-driven factors, which were shown to influence the efficiency with which a target icon can be found in a 3D WIMP interface.

Relevance:

40.00%

Publisher:

Abstract:

This study investigated the impact of thermoplastic extrusion on the nutritive quality of bovine rumen protein. Proximal composition, amino acid profile and in vivo true protein digestibility in rats were determined for raw (RBR) and extruded (EBR) rumen. Raw and extruded bovine rumen presented high percentages of protein (more than 95% on a dry basis). Neither raw nor extruded proteins had any limiting amino acid, and the RBR and EBR amino acid scores were, respectively, 1.28 (leucine) and 1.25 (methionine plus cystine). Extrusion significantly reduced true protein digestibility from 97.7% to 93.1% (p < 0.001), but protein digestibility-corrected amino acid scores for both proteins (RBR and EBR) were 100%. Animal growth presented comparable profiles using raw and extruded rumen. In conclusion, thermoplastic extrusion did not affect the protein quality of bovine rumen, and this does not hinder the use of this material as a food ingredient.
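The digestibility-corrected score reported above follows from a one-line calculation; the `pdcaas` helper is a sketch using the abstract's own figures:

```python
# Protein digestibility-corrected amino acid score (PDCAAS): score of the
# limiting amino acid x true digestibility, capped at 1.0 (i.e. 100%).
def pdcaas(amino_acid_score, true_digestibility):
    return min(amino_acid_score * true_digestibility, 1.0)

raw_score = pdcaas(1.28, 0.977)       # raw rumen (leucine-based score)
extruded_score = pdcaas(1.25, 0.931)  # extruded rumen (Met + Cys score)
# Both products cap at 1.0, matching the 100% reported in the abstract.
```

This is why the drop in digestibility from 97.7% to 93.1% does not change the corrected score: both uncapped products (1.25 and 1.16) still exceed 1.0.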