828 results for expected value of information
Abstract:
The objective of the current research is to investigate brand value generation in the context of high-technology companies. The study examines the impact of long-term brand development strategies: advertising investments, R&D investments, R&D intensity, new products developed, and design. The empirical part of the study involved collecting primary and secondary data on 36 companies operating in the high-technology sector and rated by the Interbrand consultancy as having the most valuable brands. The data covered six consecutive years, from 2008 to 2013, and were analyzed using fixed-effects and random-effects models (panel data analysis). The analysis showed a positive long-run effect of advertising and R&D investments on the brand value of high-technology companies. The impact of the remaining three strategies was not confirmed, and further investigation is required.
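As a rough illustration of the panel-data method mentioned above, the sketch below fits fixed-effects and random-effects models to a simulated firm-year panel. It assumes the linearmodels package; the variable names (adv_spend, rd_spend, brand_value) and all numbers are illustrative placeholders, not the study's data.

```python
# Hypothetical fixed-effects vs. random-effects comparison on a simulated panel
# of 36 firms over 2008-2013 (mirroring the study's dimensions, not its data).
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS, RandomEffects

rng = np.random.default_rng(0)
firms, years = range(36), range(2008, 2014)
idx = pd.MultiIndex.from_product([firms, years], names=["firm", "year"])
df = pd.DataFrame({
    "adv_spend": rng.lognormal(2.0, 0.5, len(idx)),  # advertising investment (illustrative)
    "rd_spend": rng.lognormal(2.0, 0.5, len(idx)),   # R&D investment (illustrative)
}, index=idx)
df["brand_value"] = 1.5 * df["adv_spend"] + 0.8 * df["rd_spend"] + rng.normal(0, 1, len(idx))

exog = df[["adv_spend", "rd_spend"]]
fe = PanelOLS(df["brand_value"], exog, entity_effects=True).fit()  # within (fixed-effects) estimator
re = RandomEffects(df["brand_value"], exog).fit()                  # random-effects (GLS) estimator
print(fe.params)
print(re.params)
```

In practice the two sets of estimates would be compared (for example with a Hausman-type test) before choosing a specification.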
Abstract:
The goal of the thesis was to gain an understanding of organizational buying behavior (OBB) and its effect from the selling perspective, and to generate a basis for verifying customer value propositions for Actiw Oy. The first objective was to discover the buying decision criteria of current customers in order to understand the buying motives that had initially led to the investment. The second objective was to understand how the buying decision criteria and customer experiences can be turned into customer value propositions. The research was based on 16 customer interviews, focused on obtaining information on the buying center and the value of the solution. The thesis reviews the main theories of OBB and the theory behind customer value management. Based on the customer interviews, the currently used customer value propositions were tested and categorized into points-of-parity and points-of-difference. The interviews confirmed customer behavior in new-task and modified-rebuy situations and also corroborated the internally developed customer value propositions. The main finding of the study was that the value propositions can be presented more specifically for each new case instead of presenting all benefits at the same time.
Abstract:
An investor can either conduct independent analysis or rely on the analyses of others. Stock analysts provide markets with expectations regarding particular securities. However, analysts have different capabilities and resources, of which investors are seldom cognizant. The local advantage refers to the advantage stemming from cultural or geographical proximity to the securities analyzed. Research has confirmed that local agents are generally more accurate or produce excess returns. This thesis tests the investment value of the local advantage for Finnish stocks using target price data. The empirical section investigates the local advantage from several aspects. Local analysts were found to focus on certain sectors, generally those located close to consumer markets. Market reactions to target price revisions were generally insignificant, with the exception of local positive target prices. Both local and foreign target prices were overly optimistic and exhibited signs of herding. Neither group could be identified as a leader or follower of new information. Additionally, foreign price change expectations were more in line with quantitative models and concepts such as beta or mean reversion in returns. Local analysts were more accurate than foreign analysts in five out of nine sectors, and vice versa in one. These sectors were somewhat in line with coverage decisions and supported the idea that the local advantage stems from proximity to markets, not to headquarters. The accuracy advantage depended on the sample years and the measure used. Local analysts ranked the magnitudes of price changes more accurately for optimistic target prices, and foreign analysts for pessimistic ones. Directional accuracy for both groups was under 50%, and target prices held no linear predictive power. The investment value of target prices was tested by forming mean-variance efficient portfolios. Consistent with the differing accuracy of expectation levels, the foreign portfolio performed better when short sales were allowed and the local portfolio better when they were disallowed. Both local and non-local portfolios performed worse than a passive index fund, albeit not statistically significantly, in line with the previously reported low overall accuracy and differing accuracy profiles. Refraining from estimating individual stock returns altogether produced statistically significantly higher Sharpe ratios than either the local or the foreign portfolio. The proposed method of testing the investment value of target prices from different groups suffered from some inconsistencies. Nevertheless, these results are of interest to investors seeking the advice of security analysts.
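To make the portfolio comparison concrete, here is a minimal sketch of forming mean-variance (tangency) portfolios from two sets of return expectations and comparing Sharpe ratios, with and without short sales. The expected returns, covariance matrix, and risk-free rate are simulated placeholders, not the thesis data.

```python
# Compare Sharpe ratios of tangency portfolios built from "local" and "foreign"
# return forecasts, with short sales allowed (closed form) and disallowed (numerical).
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, rf = 9, 0.02                           # e.g. one asset per sector; risk-free rate
mu_local = rng.normal(0.08, 0.03, n)      # "local analyst" expected returns (illustrative)
mu_foreign = rng.normal(0.08, 0.03, n)    # "foreign analyst" expected returns (illustrative)
A = rng.normal(0.0, 0.1, (n, n))
cov = A @ A.T + 0.05 * np.eye(n)          # positive-definite covariance matrix

def sharpe(w, mu):
    return (w @ mu - rf) / np.sqrt(w @ cov @ w)

def tangency(mu, long_only):
    if not long_only:                     # closed form: w proportional to inv(cov) @ (mu - rf)
        w = np.linalg.solve(cov, mu - rf)
        return w / w.sum()
    res = minimize(lambda w: -sharpe(w, mu), np.full(n, 1.0 / n),  # maximise Sharpe, w >= 0
                   bounds=[(0.0, 1.0)] * n,
                   constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}])
    return res.x

for label, mu in [("local", mu_local), ("foreign", mu_foreign)]:
    for long_only in (False, True):
        w = tangency(mu, long_only)
        print(label, "long-only" if long_only else "short sales", round(sharpe(w, mu), 3))
```

A real test would evaluate the portfolios on realised out-of-sample returns rather than on the forecasts themselves.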
Abstract:
In Canada, freedom of information must be viewed in the context of governing: how do you deal with an abundance of information while balancing a diversity of competing interests? How can you ensure people are informed enough to participate in crucial decision-making, yet willing enough to let some administrative matters be dealt with in camera without their involvement in every detail? In an age when taxpayers' coalition groups are on the rise, and the government is encouraging the establishment of Parent Council groups for schools, the issues and challenges presented by access to information and protection of privacy legislation are real ones. The province of Ontario's decision to extend freedom of information legislation to local governments does not ensure, or equate to, full public disclosure of all facts, nor does it necessarily guarantee complete public comprehension of an issue. The mere fact that local governments, like school boards, decide to collect, assemble or record some information and not to collect other information implies that a prior decision was made by "someone" about what was important to record or keep. That in itself means that not all the facts are going to be disclosed, regardless of the presence of legislation. The resulting lack of information can lead to public mistrust and a lack of confidence in those who govern. This is completely contrary to the spirit of the legislation, which was to provide interested members of the community with facts so that values such as political accountability and trust could be ensured and meaningful criticism and input obtained on matters affecting the whole community. This thesis first reviews the historical reasons for adopting freedom of information legislation, reasons which are rooted in our parliamentary system of government. However, the same reasoning for enacting such legislation cannot be applied carte blanche to the municipal level of government in Ontario, or more specifically to the programs, policies or operations of a school board. The purpose of this thesis is to examine whether the Municipal Freedom of Information and Protection of Privacy Act, 1989 (MFIPPA) was a necessary step to ensure greater openness from school boards. Based on a review of the Orders made by the Office of the Information and Privacy Commissioner/Ontario, it also assesses how successfully freedom of information legislation has been implemented at the municipal level of government. The Orders provide an opportunity to review what problems school boards have encountered and what guidance the Commissioner has offered. Reference is made to a value framework as an administrative tool for critically analyzing the suitability of MFIPPA to school boards. The conclusion is drawn that MFIPPA appears to have inhibited rather than facilitated openness in local government. This may be attributed to several factors, including the general uncertainty, confusion and discretion in interpreting various provisions and exemptions in the Act. Some of the uncertainty is due to the fact that an insufficient number of school board staff are familiar with the Act. The complexity of the Act and its legalistic procedures have over-formalized the processes of exchanging information. In addition, there appears to be a concern among municipal officials that granting any access to information may violate the personal privacy rights of others. These concerns translate into indecision and extreme caution in responding to inquiries.
The result is delay in responding to information requests and a lack of uniformity in the responses given. However, the mandatory review of the legislation does afford an opportunity to address some of these problems and to make this complex Act more suitable for application to school boards. In order for the Act to function more efficiently and effectively, legislative changes must be made to MFIPPA. It is important that the recommendations for improving the Act be adopted before the government extends this legislation to any other public entities.
Abstract:
Knowledge of how water is perceived, used and managed in a community is critical to the endeavour of water governance. Surveys of individuals residing in a community offer a valuable avenue for gaining information about several of these aspects of water. This paper draws upon experiences in three First Nation communities to explore the value of surveys in illuminating water issues and informing water decision-making. Findings from surveys in Six Nations of the Grand River, Mississaugas of the New Credit, and Oneida First Nation of the Thames reveal rich information about how surveys can provide insights into: the connection of individuals to the land, water and their community; reasons for valuing water; perceptions of water quality and issues surrounding water-related advisories; and the degree of satisfaction with water management and governance at different scales. Community partners reflected upon the findings of the survey for their community, and dialogue was then broadened across the cases as the partners identified benefits and challenges associated with the surveys. Community surveys offer an important tool in the resource manager's toolbox for understanding social perceptions of water and can provide valuable insights that may assist in improving its governance.
Abstract:
This thesis examines the quality of the credit ratings issued by the three major credit rating agencies: Moody's, Standard & Poor's and Fitch. If credit ratings are informative, then the prices of underlying credit instruments such as fixed-income securities and credit default insurance should change to reflect the new credit risk information. Using data on 246 major fixed-income issuers spanning January 2000 to December 2011, we find that credit default swap (CDS) spreads do not react to changes in credit ratings. Hence, the credit ratings of all three agencies are not price informative. CDS prices are mostly determined by historical CDS prices, while ratings are mostly determined by historical ratings. We find that credit ratings are marginally more sensitive to CDS spreads than CDS spreads are to ratings.
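As a hedged sketch of the price-informativeness test described above, the following regresses daily CDS spread changes on their own lag and a rating-change indicator using simulated data; the variable names and numbers are placeholders rather than the study's dataset or exact methodology.

```python
# Do CDS spread changes react to rating actions, or mainly to their own history?
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
T = 500
d_cds = rng.normal(0, 5, T)               # daily CDS spread changes, in basis points
d_cds[1:] += 0.4 * d_cds[:-1]             # inject persistence from the previous day
rating_change = rng.binomial(1, 0.02, T)  # 1 on days with a (simulated) rating action

df = pd.DataFrame({"d_cds": d_cds, "rating_change": rating_change})
df["d_cds_lag"] = df["d_cds"].shift(1)
df = df.dropna()

X = sm.add_constant(df[["d_cds_lag", "rating_change"]])
result = sm.OLS(df["d_cds"], X).fit()
print(result.summary())
# An insignificant rating_change coefficient alongside a significant lag would
# mirror the finding that spreads track their own history rather than ratings.
```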
Abstract:
Ever since Sen's (1993; 1997) criticism of the notion of internal consistency, or menu independence, of choice, there has been a widespread perception that the standard revealed preference approach to the theory of rational choice has difficulty coping with the existence of external norms, or with the information a menu of choice might convey to a decision-maker, viz., the epistemic value of a menu. This paper provides a brief survey of possible responses to these criticisms of traditional rational choice theory. It is shown that a novel concept of norm-conditional rationalizability can neatly accommodate external norms within the standard framework of rationalizability theory. Furthermore, we illustrate that there are several ways of incorporating considerations of the epistemic value of opportunity sets into a generalized model of rational choice theory.
Abstract:
In a small country like Austria, stimulus from abroad, at the European or international level, is needed so that decision-makers can see how useful national initiatives in the field of educational documentation are. Working comparatively in a European context helps one to better understand one's own position and provides a basis for making decisions and for developing new projects in the future. Interest in using educational information and documentation has grown over recent decades. Today, ministers of education can no longer carry out their work without drawing on the data that have been compiled in the respective countries and made available through European networks such as EURYDICE, the European education information network (http://www.eurydice.org).
Abstract:
University students suffer from variable sleep patterns, including insomnia;[1] furthermore, the highest incidence of herbal use appears to be among college graduates.[2] Our objective was to test perceptions of the safety and value of herbal versus conventional medicine for the treatment of insomnia in a non-pharmacy student population. We used an experimental design and bespoke vignettes that relayed the same effectiveness information to test our hypothesis that students would give higher ratings of safety and value to the herbal product than to the conventional medicine. We also tested the hypothesis that the addition of side-effect information would lower perceptions of the safety and value of the herbal product to a greater extent than those of the conventional medicine.
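One way the two hypotheses above could be analysed is a 2x2 between-subjects ANOVA (product type x side-effect information) on the safety ratings, where the main effect of product type addresses the first hypothesis and the interaction term addresses the second. The sketch below uses simulated ratings and statsmodels; the effect sizes and sample size are invented for illustration only.

```python
# Simulated 2x2 vignette experiment: product (herbal vs conventional) x side-effect info.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(3)
n_per_cell = 50                   # participants per condition (illustrative)
rows = []
for product in ("herbal", "conventional"):
    for side_effects in ("absent", "present"):
        base = 7.0 if product == "herbal" else 6.0      # hypothesis 1: herbal rated safer
        drop = 0.0
        if side_effects == "present":
            drop = 1.5 if product == "herbal" else 0.5  # hypothesis 2: larger drop for herbal
        rows += [{"product": product, "side_effects": side_effects,
                  "safety": base - drop + rng.normal(0, 1)} for _ in range(n_per_cell)]
df = pd.DataFrame(rows)

model = smf.ols("safety ~ C(product) * C(side_effects)", data=df).fit()
print(anova_lm(model, typ=2))     # a significant interaction would support hypothesis 2
```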
Abstract:
This paper presents the findings from a study into the current exploitation of computer-supported collaborative working (CSCW) in design for the built environment in the UK. The research is based on responses to a web-based questionnaire. Members of various professions, including civil engineers, architects, building services engineers, and quantity surveyors, were invited to complete the questionnaire. The responses reveal important trends in the breadth and size of project teams at a time when new pressures are emerging regarding team integration and efficiency. The findings suggest that while CSCW systems may improve project management (e.g., via project documentation) and the exchange of information between team members, they have yet to significantly support those activities that characterize integrated collaborative working between disparate specialists. The authors conclude by combining the findings with a wider discussion of the application of CSCW to design activity, appealing for CSCW to go beyond multidisciplinary working to achieve interdisciplinary working.
Abstract:
More data will be produced in the next five years than in the entire history of humankind, a digital deluge that marks the beginning of the Century of Information. Through a year-long consultation with UK researchers, a coherent strategy has been developed to nurture Century-of-Information Research (CIR); it crystallises the ideas developed by the e-Science Directors' Forum Strategy Working Group. This paper is an abridged version of their latest report, which can be found at http://wikis.nesc.ac.uk/escienvoy/Century_of_Information_Research_Strategy and which also records the consultation process and the affiliations of the authors. This document is derived from a paper presented at the Oxford e-Research Conference 2008 and takes into account suggestions made in the ensuing panel discussion. The goals of the CIR Strategy are to facilitate the growth of UK research and innovation that is data- and computationally intensive, and to develop a new culture of 'digital-systems judgement' that will equip research communities, businesses, government and society as a whole with the skills essential to compete and prosper in the Century of Information. The CIR Strategy identifies a national requirement for a balanced programme of coordination, research, infrastructure, translational investment and education to empower UK researchers, industry, government and society. The Strategy is designed to deliver an environment which meets the needs of UK researchers so that they can respond agilely to challenges, create knowledge and skills, and lead new kinds of research. It is a call to action for those engaged in research, those providing data and computational facilities, those governing research and those shaping education policies. The ultimate aim is to help researchers strengthen the international competitiveness of the UK research base and increase its contribution to the economy. The objectives of the Strategy are to better enable UK researchers across all disciplines to contribute world-leading fundamental research; to accelerate the translation of research into practice; and to develop improved capabilities, facilities and context for research and innovation. It envisages a culture that is better able to grasp the opportunities provided by the growing wealth of digital information. Computing has, of course, already become a fundamental tool in all research disciplines. The UK e-Science programme (2001-06), since emulated internationally, pioneered the invention and use of new research methods, and a new wave of innovations in the digital-information technologies which have enabled them. The Strategy argues that the UK must now harness and leverage its own, plus the now global, investment in digital-information technology in order to spread the benefits as widely as possible in research, education, industry and government. Implementing the Strategy would deliver the computational infrastructure and its benefits as envisaged in the Science & Innovation Investment Framework 2004-2014 (July 2004), and in the reports developing those proposals.
To achieve this, the Strategy proposes the following actions: support the continuous innovation of digital-information research methods; provide easily used, pervasive and sustained e-Infrastructure for all research; enlarge the productive research community which exploits the new methods efficiently; generate capacity, propagate knowledge and develop skills via new curricula; and develop coordination mechanisms to improve the opportunities for interdisciplinary research and to make digital-infrastructure provision more cost effective. To gain the best value for money, strategic coordination is required across a broad spectrum of stakeholders. A coherent strategy is essential in order to establish and sustain the UK as an international leader in well-curated national data assets and computational infrastructure, expertly used to shape policy, support decisions, empower researchers and roll out the results for the wider benefit of society. The value of data as a foundation for wellbeing and a sustainable society must be appreciated; national resources must be more wisely directed to the collection, curation, discovery, widening of access, analysis and exploitation of these data. Every researcher must be able to draw on skills, tools and computational resources to develop insights, test hypotheses and translate inventions into productive use, or to extract knowledge in support of governmental decision making. This foundation, plus the skills developed, will launch significant advances in research, in business, in professional practice and in government, with many consequent benefits for UK citizens. The Strategy presented here addresses these complex and interlocking requirements.
Abstract:
Calliandra calothyrsus is a tree legume native to Mexico and Central America. The species has attracted considerable attention for its capacity to produce both fuelwood and foliage for either green manure or fodder. Its high content of proanthocyanidins (condensed tannins) and the associated low digestibility have, however, limited its use as a feed for ruminants, and there is also a widespread perception that wilting the leaves further reduces their nutritive value. Nevertheless, there has been increasing uptake of calliandra as fodder in certain regions, notably the Central Highlands of Kenya. The present study, conducted in Embu, Kenya, investigated the effects of provenance, wilting, cutting frequency and seasonal variation both in the laboratory (in vitro digestibility, crude protein, neutral detergent fibre, extractable and bound proanthocyanidins) and in on-station animal production trials with growing lambs and lactating goats. The local Kenyan landrace of calliandra (Embu) and a closely related Guatemalan provenance (Patulul) were found to be significantly different from, and superior to, a provenance from Nicaragua (San Ramon) in most of the laboratory traits measured, as well as in animal production and feed efficiency. Cutting frequency had no important effect on quality, and although all quality traits displayed seasonal variation, there was little discernible pattern to it. Wilting had a much less negative effect than expected, and for lambs fed calliandra as a supplement to a low-quality basal feed (maize stover), wilting actually gave higher live-weight gain and feed efficiency. Conversely, with a high-quality basal diet (Napier grass), wilting enhanced intake but not live-weight gain, so feed efficiency was greater for fresh material. The difference between fresh and wilted leaves was not great enough to justify the current widespread recommendation that calliandra should always be fed fresh.
Abstract:
The management of information in engineering organisations faces a particular challenge in the ever-increasing volume of information. It has been recognised that an effective methodology is required to evaluate information in order to avoid information overload and to retain the right information for reuse. Taking as a starting point a number of current tools and techniques that attempt to obtain 'the value' of information, it is proposed that an assessment or filter mechanism for information needs to be developed. This paper addresses this issue, firstly, by briefly reviewing the information overload problem, the definition of value, and related research on the value of information in various areas. A "characteristic"-based framework for information evaluation is then introduced, using the key characteristics identified from related work as an example. A Bayesian network method is introduced into the framework to link the characteristics to information value so that the quality and value of information can be calculated quantitatively. The training and verification process for the model is then described using 60 real engineering documents as a sample. The model gives reasonably accurate results; the differences between the model's calculations and the training judgements are summarised, and their potential causes are discussed. Finally, several further issues, including challenges to the framework and the implementation of this evaluation method, are raised.
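As a loose illustration of how a Bayesian network can link document characteristics to information value, the toy model below uses two binary characteristics feeding a binary value node and queries it by variable elimination. It assumes the pgmpy package; the characteristic names (Relevance, Currency) and all probabilities are invented placeholders, not the paper's trained model.

```python
# Toy Bayesian network: two document characteristics -> information value.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

model = BayesianNetwork([("Relevance", "Value"), ("Currency", "Value")])

cpd_rel = TabularCPD("Relevance", 2, [[0.5], [0.5]])  # state 0 = low, 1 = high
cpd_cur = TabularCPD("Currency", 2, [[0.5], [0.5]])
cpd_val = TabularCPD(
    "Value", 2,
    [[0.9, 0.6, 0.7, 0.2],    # P(Value = low  | Relevance, Currency)
     [0.1, 0.4, 0.3, 0.8]],   # P(Value = high | Relevance, Currency)
    evidence=["Relevance", "Currency"], evidence_card=[2, 2])

model.add_cpds(cpd_rel, cpd_cur, cpd_val)
assert model.check_model()

infer = VariableElimination(model)
print(infer.query(["Value"], evidence={"Relevance": 1, "Currency": 1}))
```

In the paper's framework the conditional probabilities would instead be trained against expert judgements of real documents, as described above.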