1000 results for "information".
Abstract:
The study contributes to our understanding of the forces that drive the stock market by investigating how different types of investors react to new financial statement information. Using the extremely comprehensive official register of shareholdings in Finland, we find that the majority of investors are more likely to sell (buy) stocks in a company after a positive (negative) earnings surprise, and show a bias towards buying after the disclosure of new financial statement information. Large investors, on the other hand, show behavior opposite to that of the majority of investors in the market. Further, foreign investors show behavior similar to that of domestic investors. We suggest investor overconfidence and asymmetric information as possible explanations for the documented behavior.
Abstract:
Inspired by the recent debate in the financial press, we set out to investigate whether financial analysts warn their preferred customers of possible earnings forecast revisions. The issue is explored by monitoring investors' trading behavior during the weeks prior to analyst earnings forecast revisions, using the unique official stock transactions data set from Finland. In summary, we find no evidence that large investors are systematically warned of earnings forecast revisions. However, the results indicate that the very largest investors show trading behavior partly consistent with being informed of future earnings forecast revisions.
Abstract:
We study the problem of guessing the realization of a finite alphabet source, when some side information is provided, in a setting where the only knowledge the guesser has about the source and the correlated side information is that the joint source is one among a family. We define a notion of redundancy, identify a quantity that measures this redundancy, and study its properties. We then identify good guessing strategies that minimize the supremum redundancy (over the family). The minimum value measures the richness of the uncertainty class.
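The abstract does not reproduce the formal definitions. As a background sketch only, one common way to formalize redundancy in guessing with side information is shown below, where G(x|y) denotes the rank of x in the guessing order used when the side information is y, and G*_P denotes the optimal order (decreasing P(x|y)) when the true joint law P is known; the notation and normalization here are assumptions, not taken from the paper.

```latex
% One plausible formalization (the paper's exact normalization may differ):
% redundancy of a strategy G at a joint law P, and its minimax value over
% the uncertainty class \mathcal{P}.
\[
  R_\rho(G, P) \;=\; \frac{1}{\rho}\log \mathbb{E}_P\!\big[G(X \mid Y)^{\rho}\big]
                \;-\; \frac{1}{\rho}\log \mathbb{E}_P\!\big[G^{*}_{P}(X \mid Y)^{\rho}\big],
  \qquad \rho > 0,
\]
\[
  \inf_{G}\;\sup_{P \in \mathcal{P}} R_\rho(G, P).
\]
```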
Abstract:
In this paper, we introduce the three-user cognitive radio channels with asymmetric transmitter cooperation, and derive achievable rate regions under several scenarios depending on the type of cooperation and decoding capability at the receivers. Two of the most natural cooperation mechanisms for the three-user channel are considered here: cumulative message sharing (CMS) and primary-only message sharing (PMS). In addition to the message sharing mechanism, the achievable rate region is critically dependent on the decoding capability at the receivers. Here, we consider two scenarios for the decoding capability, and derive an achievable rate region for each one of them by employing a combination of superposition and Gel'fand-Pinsker coding techniques. Finally, to provide a numerical example, we consider the Gaussian channel model to plot the rate regions. In terms of achievable rates, CMS turns out to be a better scheme than PMS. However, the practical aspects of implementing such message-sharing schemes remain to be investigated.
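For background (not the paper's result), the achievability arguments mentioned above build on the classical single-user Gel'fand-Pinsker rate for a channel whose state S is known non-causally at the encoder only; the three-user regions in the paper combine this with superposition coding and are considerably more involved.

```latex
% Classical Gel'fand--Pinsker capacity with non-causal encoder state knowledge;
% U is an auxiliary random variable and the channel input is a function x = f(u, s).
\[
  C \;=\; \max_{p(u \mid s),\; x = f(u,s)} \Big[\, I(U;Y) \;-\; I(U;S) \,\Big].
\]
```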
Abstract:
In this paper we address the problem of transmitting correlated sources over a fast fading multiple access channel (MAC) with partial channel state information available at both the encoders and the decoder. We provide sufficient conditions for transmission with given distortions. These conditions are then specialized to a Gaussian MAC (GMAC). We provide the optimal power allocation strategy and compare its performance under various levels of channel state information.
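As a point of reference (not the paper's conditions), the classical Cover-El Gamal-Salehi sufficient conditions for lossless transmission of correlated sources (U1, U2) over a discrete memoryless MAC p(y|x1, x2), with inputs generated as p(x1|u1)p(x2|u2) and the common-part auxiliary variable omitted, read as follows; the paper's conditions extend this style of result to lossy reconstruction over a fading GMAC with partial CSI.

```latex
% Cover--El Gamal--Salehi sufficient conditions (lossless case, common part omitted).
\begin{align*}
  H(U_1 \mid U_2) &< I(X_1; Y \mid X_2, U_2),\\
  H(U_2 \mid U_1) &< I(X_2; Y \mid X_1, U_1),\\
  H(U_1, U_2)     &< I(X_1, X_2; Y).
\end{align*}
```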
Abstract:
The research question of this thesis was how knowledge can be managed with information systems. Information systems can support but not replace knowledge management. Systems can mainly store epistemic organisational knowledge included in content, and process data and information. Additional value can be achieved by adding communication technology to the systems, but not all communication can be managed. A new layer between communication and manageable information was named knowformation. The knowledge management literature was surveyed, together with notions of information from philosophy, physics, communication theory, and information systems science. Positivism, post-positivism, and critical theory were studied, but knowformation in extended organisational memory appeared to be socially constructed. A memory management model of an extended enterprise (M3.exe) and the knowformation concept were the findings of iterative case studies covering data, information and knowledge management systems. The cases ranged from groups to an extended organisation. The systems were investigated, and administrators, users (knowledge workers) and managers were interviewed. Model building required alternative sets of data, information and knowledge instead of the traditional pyramid, and the explicit-tacit dichotomy was also reconsidered. As human knowledge is the final aim of all data and information in the systems, the distinction between management of information and management of people was harmonised. Information systems were classified as the core of organisational memory, and the content of the systems lies in practice between communication and presentation. Firstly, the epistemic criterion of knowledge is required neither in the knowledge management literature nor of the content of the systems. Secondly, systems deal mostly with containers, whereas the knowledge management literature deals with applied knowledge. The construction of reality based on system content and communication also supports the knowformation concept. Knowformation belongs to the memory management model of an extended enterprise (M3.exe), which is divided into horizontal and vertical key dimensions. Vertically, processes deal with content that can be managed, whereas communication can only be supported, mainly by infrastructure. Horizontally, the right-hand side of the model contains the systems and the left-hand side the content, which should be independent of each other. A strategy based on the model was defined.
Abstract:
A method was developed for the relative radiometric calibration of a single multitemporal Landsat TM image, of several overlapping multitemporal images, and of several multitemporal images covering different geographic locations. The radiometrically calibrated difference images were used for detecting rapid changes on forest stands. The nonparametric kernel method was applied for change detection. The accuracy of the change detection was estimated by inspecting the image analysis results in the field. The change classification was applied to control the quality of the continuously updated forest stand information. The aim was to ensure that all man-made changes and any forest damage were correctly updated, including the attribute and stand delineation information. The image analysis results were compared with the registered treatments and the stand information base. Stands with discrepancies between these two information sources were recommended for field inspection.
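As an illustration of the general approach (not the study's actual implementation), the sketch below normalizes a later Landsat TM band to an earlier acquisition with a simple linear regression and then flags change pixels whose difference values fall in the low-density tail of a nonparametric kernel density estimate. The array names, the regression-based normalization and the 1st-percentile decision rule are assumptions made for the example.

```python
# Minimal sketch: relative radiometric normalization of one band to a
# reference acquisition, then kernel-density-based change detection on the
# difference image. Illustrative only; parameters are assumptions.
import numpy as np
from scipy.stats import gaussian_kde

def relative_normalization(reference, target):
    """Fit reference ~ a*target + b on common valid pixels and return the
    target band rescaled to the reference band's radiometry."""
    mask = np.isfinite(reference) & np.isfinite(target)
    a, b = np.polyfit(target[mask], reference[mask], deg=1)
    return a * target + b

def kernel_change_mask(reference, normalized_target, quantile=0.01, sample=50_000):
    """Flag pixels whose difference value has low probability density under a
    kernel density estimate fitted to a random sample of the differences
    (the bulk of which is assumed to represent 'no change')."""
    diff = normalized_target - reference
    valid = diff[np.isfinite(diff)].ravel()
    rng = np.random.default_rng(0)
    subset = rng.choice(valid, size=min(sample, valid.size), replace=False)
    kde = gaussian_kde(subset)                       # density of typical differences
    threshold = np.quantile(kde(subset), quantile)   # low-density cut-off
    density = kde(diff.ravel()).reshape(diff.shape)
    return density < threshold                       # True where change is unusual

# Usage with two co-registered band arrays from different acquisition dates:
# t2_norm = relative_normalization(band_t1, band_t2)
# changed = kernel_change_mask(band_t1, t2_norm)
```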
Abstract:
The world of mapping has changed. Earlier, only professional experts were responsible for map production, but today ordinary people without any training or experience can become map-makers. The number of online mapping sites and the number of volunteer mappers have increased significantly. The development of technology, such as satellite navigation systems, Web 2.0, broadband Internet connections, and smartphones, has played a key role in enabling the rise of volunteered geographic information (VGI). As opening governmental data to the public is a current topic in many countries, the opening of high-quality geographical data has a central role in this study. The aim of this study is to investigate the quality of spatial data produced by volunteers by comparing it with the map data produced by public authorities, to follow what occurs when spatial data are opened to users, and to become acquainted with the user profile of these volunteer mappers. A central part of this study is the OpenStreetMap project (OSM), whose aim is to create a map of the entire world by volunteers. Anyone can become an OpenStreetMap contributor, and the data created by the volunteers are free for anyone to use, without restrictive copyrights or licence charges. In this study OpenStreetMap is investigated from two viewpoints. In the first part of the study, the aim was to investigate the quality of volunteered geographic information. A pilot project was implemented by following what occurs when high-resolution aerial imagery is released freely to the OpenStreetMap contributors. The quality of VGI was investigated by comparing the OSM datasets with the map data of the National Land Survey of Finland (NLS). The quality of OpenStreetMap data was assessed by inspecting the positional accuracy and completeness of the road datasets, as well as the differences in the attribute data between the studied datasets. The OSM community was also analysed, and the development of the OpenStreetMap map data was investigated by visual analysis. The aim of the second part of the study was to analyse the user profile of OpenStreetMap contributors, and to investigate how the contributors act when collecting data and editing OpenStreetMap. The aim was also to investigate what motivates users to map and how they perceive the quality of volunteered geographic information. The second part of the study was implemented by conducting a web inquiry among the OpenStreetMap contributors. The results of the study show that the quality of OpenStreetMap data, compared with the data of the National Land Survey of Finland, can be considered good. OpenStreetMap differs from the map of the National Land Survey especially in its degree of uncertainty; for example, the completeness and uniformity of the map are not known. The results also reveal that opening spatial data notably increased the amount of data in the study area, and both the positional accuracy and completeness improved significantly. The study confirms the earlier argument that only a few contributors have created the majority of the data in OpenStreetMap. The inquiry addressed to the OpenStreetMap users revealed that data are most often collected on foot or by bicycle using a GPS device, or by editing the map with the help of aerial imagery.
According to the responses, the users take part in the OpenStreetMap project because they want to make maps better and want to produce maps that contain up-to-date information that cannot be found on any other map. Almost all of the users make use of the maps themselves, the most popular methods being downloading the map to a navigator or a mobile device. The users regard the quality of OpenStreetMap as good, especially because of the up-to-dateness and accuracy of the map.
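As an illustration of how such a comparison is commonly computed (not the study's actual procedure), the sketch below derives two typical quality measures for OSM roads against a reference road dataset: completeness as a length ratio and positional accuracy via a buffer overlay. The file names, the ETRS-TM35FIN projection choice and the 10 m buffer width are assumptions made for the example.

```python
# Minimal sketch: completeness and buffer-overlay positional accuracy of OSM
# roads against a reference road dataset. Illustrative only; inputs assumed.
import geopandas as gpd

osm_roads = gpd.read_file("osm_roads.shp").to_crs(epsg=3067)   # ETRS-TM35FIN (metres)
ref_roads = gpd.read_file("nls_roads.shp").to_crs(epsg=3067)

# Completeness: relative total road length of OSM versus the reference data.
completeness = osm_roads.length.sum() / ref_roads.length.sum()

# Positional accuracy (buffer overlay): share of OSM road length that falls
# within a fixed-width buffer around the reference roads.
buffer_zone = ref_roads.buffer(10).unary_union
within_share = osm_roads.intersection(buffer_zone).length.sum() / osm_roads.length.sum()

print(f"Completeness (length ratio): {completeness:.2%}")
print(f"OSM length within 10 m of reference roads: {within_share:.2%}")
```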