931 results for information value
Abstract:
Problem: This dissertation presents a literature-based framework for communication in science (with the elements partners, purposes, message, and channel), which it then applies in and amends through an empirical study of how geoscientists use two social computing technologies (SCTs), blogging and Twitter (both general use and tweeting from conferences). How are these technologies used, and what value do scientists derive from them?
Method: The empirical part used a two-pronged qualitative study, drawing on (1) purposive samples of ~400 blog posts and ~1,000 tweets and (2) a purposive sample of 8 geoscientist interviews. Blog posts, tweets, and interviews were coded using the framework, adding new codes as needed. The results were aggregated into 8 geoscientist case studies, and general patterns were derived through cross-case analysis.
Results: A detailed picture of how geoscientists use blogs and Twitter emerged, including a number of new functions not served by traditional channels. Some highlights: geoscientists use SCTs for communication among themselves as well as with the public. Blogs serve persuasion and personal knowledge management; Twitter often amplifies the signal of traditional communications such as journal articles. Blogs include tutorials for peers, reviews of basic science concepts, and book reviews. Twitter includes links to readings, requests for assistance, and discussions of politics and religion. Twitter at conferences provides live coverage of sessions.
Conclusions: Both blogs and Twitter are routine parts of scientists' communication toolbox: blogs for in-depth, well-prepared essays, Twitter for faster and broader interactions. Both play important roles in supporting community building, mentoring, and learning and teaching. The Framework of Communication in Science was a useful tool for studying these two SCTs in this domain.
The results should encourage science administrators to facilitate SCT use by the scientists in their organizations, and information providers to treat SCT documents as an important source of information.
Abstract:
The widespread impact of exotic fishes, especially Oreochromis niloticus and Lates niloticus, together with overfishing in the Victoria and Kyoga lake basins during the 1950s and 1960s, caused endemic species such as the previously most important Oreochromis esculentus to become virtually extinct in the two lakes by the 1970s. Based on reports of the presence of this native species in some satellite lakes within the two lake basins, a set of satellite lakes in the Victoria basin (the Nabugabo lakes: Kayanja and Kayugi) was sampled between 1997 and 2002 with the objective of assessing their value as conservation sites for O. esculentus. Other satellite lakes (Mburo and Kachera), also in the Victoria basin, and Lemwa, Kawi and Nabisojjo, in the Kyoga basin, were sampled for comparison. Among the Nabugabo lakes, O. esculentus was more abundant in Lake Kayanja (20.1% of the total fish catch by weight) than in Lake Kayugi (1.4%). The largest fish examined (38.7 cm TL) was caught in Lake Kayugi (also the largest in all satellite lakes sampled), while the smallest (6.6 cm TL) was from Lake Kayanja. Fish from Lake Kayugi had a higher condition factor K (1.89±0.02) than those from Lake Kayanja (1.53±0.01), which was the second highest among the satellite lakes, after Lake Kawi (1.92±0.2). Diatoms, especially Aulacoseira, previously known to be the best food for O. esculentus in Lake Victoria, were the most frequently encountered food item (93.2%) in fish stomachs from Lake Kayugi. In Lake Kayanja the dominant food item was the blue-green alga Planktolyngbya, while Microcystis was the most abundant diet item in fish from the other satellite lakes. There were more male than female fish (ratios 1:0.91 and 1:0.79 in lakes Kayugi and Kayanja, respectively), comparable to the situation in Lake Victoria before the species was depleted. The highest mean fecundity (771±218 eggs) was recorded in Lake Kayugi, compared with Lake Kayanja (399±143).
Based on the results from Lake Kayugi, where diatoms dominated the diet of O. esculentus and where the largest, most fecund and healthiest fish were found, this lake would be the most valuable site for the conservation of O. esculentus and the best source of fish for restocking and captive propagation. This lake is therefore recommended for protection from overexploitation and misuse.
Abstract:
Australian forest industries have a long history of export trade in a wide range of products, from woodchips (for paper manufacturing) and sandalwood (essential oils, carving and incense) to high-value musical instruments, flooring and outdoor furniture. For the high-value group, fluctuating environmental conditions brought on by changes in temperature and relative humidity can lead to performance problems due to consequential swelling, shrinkage and/or distortion of the wood elements. A survey determined the types of value-added products exported, including species and dimensions, packaging used, and export markets. Data loggers were installed with shipments to monitor temperature and relative humidity conditions. These data were converted to timber equilibrium moisture content values to provide an indication of the environment that the wood elements would be acclimatising to. The results of the initial survey indicated that the primary high-value wood export products included guitars, flooring, decking and outdoor furniture. The destination markets were mainly located in the northern hemisphere, particularly the United States of America, China, Hong Kong, Europe (including the United Kingdom), Japan, Korea and the Middle East. Other regions importing Australian-made wooden articles were south-east Asia, New Zealand and South Africa. Different timber species have differing rates of swelling and shrinkage, so the types of timber were also recorded during the survey. Results from this work determined that the major species were ash-type eucalypts from south-eastern Australia (commonly referred to in the market as Tasmanian oak), jarrah from Western Australia, and spotted gum, hoop pine, white cypress, blackbutt, brush box and Sydney blue gum from Queensland and New South Wales. The environmental conditions data indicated that microclimates in shipping containers can fluctuate extensively during shipping.
Conditions at the time of manufacturing were usually between 10 and 12% equilibrium moisture content; however, conditions during shipping could range from 5% (very dry) to 20% (very humid). The packaging systems used were reported to be efficient at protecting the wooden articles from damage during transit. The research highlighted the potential risk of wood components ‘moving’ in response to periods of drier or more humid conditions than those at the time of manufacturing, and the importance of engineering a packaging system that can account for the environmental conditions experienced in shipping containers. Examples of potential dimensional changes in wooden components were calculated from published unit shrinkage data for key species and the climatic data returned from the logging equipment. The information highlighted the importance of good design to account for possible timber movement during shipping. A timber movement calculator was developed to allow designers to input component species, dimensions, site of manufacture and destination to validate their product design.
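The movement calculation described in this abstract can be sketched roughly as follows; the unit-shrinkage figure and board dimensions below are illustrative assumptions, not values from the study, which would draw on published unit shrinkage tables for each species.

```python
def timber_movement_mm(dimension_mm, unit_shrinkage_pct, emc_origin_pct, emc_dest_pct):
    """Approximate dimensional change of a wood component.

    unit_shrinkage_pct is the percentage change in dimension per 1%
    change in equilibrium moisture content (EMC), taken from published
    species tables. A positive result indicates swelling, a negative
    result shrinkage.
    """
    return dimension_mm * (unit_shrinkage_pct / 100.0) * (emc_dest_pct - emc_origin_pct)

# Illustrative example: an 85 mm wide board with an assumed unit
# shrinkage of 0.36 %/% EMC, manufactured at 11% EMC and shipped
# into a humid 18% EMC environment.
change = timber_movement_mm(85.0, 0.36, 11.0, 18.0)  # about 2.1 mm of swelling
```

A calculator of the kind the abstract describes would wrap such a formula with species lookup tables and climate data for the manufacturing site and destination.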
Abstract:
Most economic transactions nowadays depend on the effective exchange of information, in which digital resources play a huge role. New actors are coming into existence all the time, so organizations are facing difficulties in keeping their current customers and attracting new customer segments and markets. Companies are trying to find the key to their success, and creating superior customer value seems to be one solution. Digital technologies can be used to deliver value to customers in ways that extend customers’ normal conscious experiences in the context of time and space. By creating customer value, companies can gain the increased loyalty of existing customers and better ways to serve new customers effectively. Based on these assumptions, the objective of this study was to design a framework to enable organizations to create customer value in digital business. The research was carried out as a literature review and an empirical study, which consisted of a web-based survey and semi-structured interviews. The data from the empirical study was analyzed as mixed research with qualitative and quantitative methods. These methods were used because the object of the study was to gain a deeper understanding of an existing phenomenon; the study therefore combined statistical procedures with a qualitative description of value creation. The framework was designed first based on the literature and then updated based on the findings from the empirical study. As a result, relationship, understanding the customer, focusing on the core product or service, product or service quality, incremental innovations, service range, corporate identity, and networks were chosen as the top elements of customer value creation. Measures for these elements were identified. With the measures, companies can manage the elements of value creation when dealing with present and future customers and also manage the operations of the company.
In conclusion, creating customer value requires understanding the customer and a lot of information sharing, which can be eased by digital resources. Understanding the customer helps to produce products and services that fulfill customers’ needs and desires. This could result in increased sales and make it easier to establish efficient processes.
Abstract:
Collecting and analyzing consumer data is essential in today’s data-driven business environment. However, consumers are becoming more aware of the value of the information they can provide to companies, and are therefore more reluctant to share it for free. Companies thus need to find ways to motivate consumers to disclose personal information. The main research question of the study was formed as “How can companies motivate consumers to disclose personal information?” and it was further divided into two sub-questions: 1) What types of benefits motivate consumers to disclose personal information? 2) How does the disclosure context affect the consumers’ information disclosure behavior? The conceptual framework consisted of a classification of extrinsic and intrinsic benefits, and moderating factors, which were recognized on the basis of prior research in the field. The study was conducted using qualitative research methods. The primary data was collected by interviewing ten representatives from eight companies. The data was analyzed and reported according to predetermined themes. The findings of the study confirm that consumers can be motivated to disclose personal information by offering different types of extrinsic (monetary saving, time saving, self-enhancement, and social adjustment) and intrinsic (novelty, pleasure, and altruism) benefits. However, not all the benefits are equally useful ways to convince the customer to disclose information. Moreover, different factors in the disclosure context can either alleviate or increase the effectiveness of the benefits and the consumers’ motivation to disclose personal information. Such factors include the consumer’s privacy concerns, perceived trust towards the company, the relevance of the requested information, personalization, website elements (especially security, usability, and aesthetics of a website), and the consumer’s shopping motivation. This study makes several contributions.
It is essential that companies recognize the most attractive benefits for their business and their customers, and that they understand how the disclosure context affects the consumer’s information disclosure behavior. The likelihood of information disclosure can be increased, for example, by offering benefits that meet the consumers’ needs and preferences, improving the relevance of the requested information, stating the reasons for data collection, creating and maintaining a trustworthy image of the company, and enhancing the quality of the company’s website.
Abstract:
We can categorically state that information is the main ingredient of the culture of a people. Information is defined here as a set of processed data that is needed and has value and meaning for individuals and for developing countries, where data is any fact. Information can also be considered an indispensable international resource. Scientific and technological development is a consequence of how relevant information is handled. The "heap" is a source of information that libraries must consider and integrate, because it gives rise to more information and new technology. The need for improved information systems has become critical in recent years because information grows in quantity and in organizational complexity. Consequently, some libraries have tried to coordinate functions, redistribute resources, identify needs, and work together to give easier access to information.
Abstract:
Universities rely on Information Technology (IT) projects to support and enhance their core strategic objectives of teaching, research, and administration. The researcher’s literature review found that the level of IT funding and resources in universities is not adequate to meet IT demands: universities receive more IT project requests than they can execute, so they must fund IT projects selectively. The objectives of IT projects in universities vary; an IT project that benefits the teaching functions may not benefit the administrative functions, which makes IT project selection challenging. To aid IT decision making, many universities in the United States of America (USA) have formed IT Governance (ITG) processes. ITG is an IT decision-making and accountability framework whose purpose is to align the IT efforts in an organization with its strategic objectives, realize the value of IT investments, meet expected performance criteria, and manage risks and resources (Weill & Ross, 2004). ITG in universities is relatively new, and it is not well known how ITG processes are aiding nonprofit universities in selecting the right IT projects and managing the performance of those projects. This research adds to the body of knowledge regarding IT project selection under a governance structure, the maturity of IT projects, and IT project performance in nonprofit universities. The case study research methodology was chosen for this exploratory research. Convenience sampling was used to choose cases from two large research universities with decentralized colleges and two small, centralized universities. Data were collected on nine IT projects from these four universities through interviews and university documents.
The multi-case analysis was complemented by Qualitative Comparative Analysis (QCA) to systematically analyze how the IT conditions lead to an outcome. This research found that IT projects were selected in a more informed manner in the centralized universities. ITG was more authoritative in the small centralized universities: the ITG committees included the key decision makers, decision-making roles and responsibilities were better defined, and the frequency of ITG communication was higher. In the centralized universities, the business units and colleges brought IT requests to the ITG committees, which in turn prioritized the requests and allocated funds and resources to the IT projects. ITG committee members in the centralized universities had a higher awareness of university-wide IT needs, and the IT projects tended to align with the strategic objectives. On the other hand, the decentralized colleges and business units in the large universities were influential and often bypassed the ITG processes. The decentralized units often chose “pet” IT projects and executed them within a silo, without bringing them to the attention of the ITG committees. While these IT projects met departmental objectives, they did not always align with the university’s strategic objectives. This research found that IT project maturity in the university could be increased by following project management methodologies. IT project management maturity was higher in projects executed by the centralized universities, where a full-time project manager with greater project management expertise was assigned to manage the project. IT projects executed under the guidance of a Project Management Office (PMO) exhibited higher project management maturity, as the PMO set the standards and controls for the project.
IT projects in the decentralized colleges, managed by part-time project managers with lower project management expertise, exhibited lower project management maturity; these projects were often managed by business or technical leads who lacked project management expertise. This research found that the higher the IT project management maturity, the better the project performance: projects with higher maturity had shorter delays, fewer missed requirements, and fewer IT system errors. This research found that the quality of IT decisions in the university could be improved by centralizing the IT decision-making processes, and that IT project management maturity could be improved by following project management methodologies. Stakeholder management and communication were found to be critical for the success of IT projects in the university. It is hoped that the findings from this research will help university leaders make strategic IT decisions, and universities’ IT project managers make IT project decisions.
Abstract:
Prior research shows that electronic word of mouth (eWOM) wields considerable influence over consumer behavior. However, as the volume and variety of eWOM grows, firms are faced with challenges in analyzing and responding to this information. In this dissertation, I argue that to meet the new challenges and opportunities posed by the expansion of eWOM and to more accurately measure its impacts on firms and consumers, we need to revisit our methodologies for extracting insights from eWOM. This dissertation consists of three essays that further our understanding of the value of social media analytics, especially with respect to eWOM. In the first essay, I use machine learning techniques to extract semantic structure from online reviews. These semantic dimensions describe the experiences of consumers in the service industry more accurately than traditional numerical variables. To demonstrate the value of these dimensions, I show that they can be used to substantially improve the accuracy of econometric models of firm survival. In the second essay, I explore the effects on eWOM of online deals, such as those offered by Groupon, the value of which to both consumers and merchants is controversial. Through a combination of Bayesian econometric models and controlled lab experiments, I examine the conditions under which online deals affect online reviews and provide strategies to mitigate the potential negative eWOM effects resulting from online deals. In the third essay, I focus on how eWOM can be incorporated into efforts to reduce foodborne illness, a major public health concern. I demonstrate how machine learning techniques can be used to monitor hygiene in restaurants through crowd-sourced online reviews. I am able to identify instances of moral hazard within the hygiene inspection scheme used in New York City by leveraging a dictionary specifically crafted for this purpose. 
To the extent that online reviews provide some visibility into the hygiene practices of restaurants, I show how losses from information asymmetry may be partially mitigated in this context. Taken together, this dissertation contributes by revisiting and refining the use of eWOM in the service sector through a combination of machine learning and econometric methodologies.
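The dictionary-based monitoring mentioned in the third essay can be illustrated with a minimal sketch; the term list and scoring function below are hypothetical stand-ins, not the dissertation's actual purpose-built dictionary or model.

```python
import re

# Hypothetical hygiene-related terms; the dissertation's dictionary
# was purpose-built and presumably far larger.
HYGIENE_TERMS = {"dirty", "filthy", "roach", "roaches", "vomit", "sick", "smell"}

def hygiene_signal(review_text):
    """Fraction of a review's words that match the hygiene dictionary.

    A crude per-review score; aggregating such scores across a
    restaurant's reviews could flag candidates for inspection.
    """
    words = re.findall(r"[a-z]+", review_text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in HYGIENE_TERMS)
    return hits / len(words)
```

In practice such a score would feed a supervised classifier trained against actual inspection outcomes rather than being used on its own.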
Abstract:
This research evaluates the performance of 73 Colombian collective investment funds (FICs) focused on equities from 2005 to 2015. To quantify the value generated by these funds relative to their respective benchmarks, Jensen's alpha is calculated using two regression methodologies: Ordinary Least Squares (OLS) and quantile regression. The study also analyzes whether these funds show evidence of market timing, using two models: a quadratic effect and an interactive binary variable. In addition, the study proposes the creation of a private company in Colombia that would provide investors with accurate information on the characteristics and historical performance of these collective investment funds, as Morningstar Inc. does in the United States. This would allow investors to select the funds with the best prospects and, as expected, would make this market more efficient and attractive to potential new investors.
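The OLS version of the Jensen's alpha estimate described above can be sketched as follows; the return series in the test are synthetic, and the study's quantile-regression variant is not shown.

```python
def jensens_alpha(fund_returns, benchmark_returns, risk_free):
    """OLS estimate of Jensen's alpha and beta.

    Regresses the fund's excess returns on the benchmark's excess
    returns: R_p - R_f = alpha + beta * (R_m - R_f) + error.
    A positive alpha indicates value added relative to the benchmark.
    """
    y = [rp - rf for rp, rf in zip(fund_returns, risk_free)]
    x = [rm - rf for rm, rf in zip(benchmark_returns, risk_free)]
    n = len(x)
    mean_x = sum(x) / n
    mean_y = sum(y) / n
    # Closed-form simple-regression slope and intercept.
    beta = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
            / sum((xi - mean_x) ** 2 for xi in x))
    alpha = mean_y - beta * mean_x
    return alpha, beta
```

With real monthly fund and benchmark return series (and a risk-free proxy such as a treasury rate), this yields the point estimate; the quantile-regression methodology instead estimates alpha at different points of the return distribution.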
Abstract:
Part 10: Sustainability and Trust
Abstract:
Part 8: Business Strategies Alignment
Abstract:
Part 8: Business Strategies Alignment