898 results for information content


Relevance: 60.00%

Abstract:

Digital watermarking aims at embedding information in digital data. The watermark is usually required to be imperceptible, unremovable and to have a high information content. Unfortunately, these three requirements are contradictory: for example, making a watermark more robust renders it more perceptible and/or less informative. For Gaussian data and additive white Gaussian noise, an optimal but impractical scheme has already been devised. Since then, many practical schemes have tried to approach the theoretical limits. This paper investigates improvements to current state-of-the-art embedding schemes.
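The trade-off the abstract describes can be illustrated with a minimal additive spread-spectrum sketch (not the paper's scheme; the pattern, `alpha` and the correlation detector are illustrative assumptions): raising the embedding strength makes the bit easier to recover after a noise attack, but directly increases the embedding distortion.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 10_000
host = rng.normal(0.0, 1.0, n)        # Gaussian host data
pattern = rng.choice([-1.0, 1.0], n)  # pseudo-random spreading pattern (shared secret)
bit = 1                               # the embedded bit (+1 or -1)
alpha = 0.1                           # embedding strength: robustness vs. perceptibility

watermarked = host + alpha * bit * pattern
attacked = watermarked + rng.normal(0.0, 0.5, n)   # additive white Gaussian noise attack

# Correlation detector: the sign of the correlation with the pattern recovers the bit
detected = 1 if float(np.dot(attacked, pattern)) > 0 else -1
distortion = float(np.mean((watermarked - host) ** 2))  # imperceptibility proxy (= alpha**2)
print(detected, distortion)
```

Doubling `alpha` quadruples the distortion while only doubling the detector's margin against noise, which is the contradiction among the three requirements in miniature.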

Relevance: 60.00%

Abstract:

This thesis initially presents an 'assay' of the literature pertaining to individual differences in human-computer interaction. A series of experiments is then reported, designed to investigate the association between a variety of individual characteristics and various computer task and interface factors. Predictor variables included age, computer expertise, and psychometric tests of spatial visualisation, spatial memory, logical reasoning, associative memory, and verbal ability. These were studied in relation to a variety of computer-based tasks, including: (i) word processing and its component elements; (ii) the location of target words within passages of text; (iii) the navigation of networks and menus; (iv) command generation using menus and command line interfaces; (v) the search and selection of icons and text labels; and (vi) information retrieval. A measure of self-reported workload was also included in several of these experiments. The main experimental findings included: (i) an interaction between spatial ability and the manipulation of semantic but not spatial interface content; (ii) verbal ability being predictive of only certain task components of word processing; (iii) age differences in word processing and information retrieval speed but not accuracy; (iv) evidence of compensatory strategies being employed by older subjects; (v) evidence of performance strategy differences which disadvantaged high-spatial subjects in conditions of low spatial information content; (vi) interactive effects of associative memory, expertise and command strategy; (vii) an association between logical reasoning and word processing but not information retrieval; (viii) an interaction between expertise and cognitive demand; and (ix) a stronger association between cognitive ability and novice performance than expert performance.

Relevance: 60.00%

Abstract:

SPOT simulation imagery was acquired for a test site in the Forest of Dean in Gloucestershire, U.K. These data were qualitatively and quantitatively evaluated for their potential application in forest resource mapping and management. A variety of techniques are described for enhancing the image with the aim of providing species-level discrimination within the forest. Visual interpretation of the imagery was more successful than automated classification. The heterogeneity within the forest classes, and in particular between the forest and urban classes, resulted in poor discrimination using traditional 'per-pixel' automated methods of classification. Different means of assessing classification accuracy are proposed. Two techniques for measuring textural variation were investigated in an attempt to improve classification accuracy. The first of these, a sequential segmentation method, was found to be beneficial. The second, a parallel segmentation method, resulted in little improvement, though this may be related to a combination of the image resolution and the size of the texture extraction area. The effect on classification accuracy of combining the SPOT simulation imagery with other data types is investigated. A grid-cell encoding technique was selected as most appropriate for storing digitised topographic (elevation, slope) and ground truth data. Topographic data were shown to improve species-level classification, though with sixteen classes overall accuracies were consistently below 50%. Neither sub-division into age groups nor the incorporation of principal components and a band ratio significantly improved classification accuracy. It is concluded that SPOT imagery will not permit species-level classification within forested areas as diverse as the Forest of Dean. The imagery will be most useful as part of a multi-stage sampling scheme. The use of texture analysis is highly recommended for extracting the maximum information content from the data. Incorporation of the imagery into a GIS will both aid discrimination and provide a useful management tool.
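Two conventional ways of assessing classification accuracy in remote sensing, of the general kind the abstract alludes to, are overall accuracy and Cohen's kappa computed from a confusion matrix. The matrix below is a toy illustration, not data from the study:

```python
import numpy as np

# Toy confusion matrix: rows = reference (ground truth), columns = classified
cm = np.array([
    [50,  5,  5],
    [ 8, 40, 12],
    [ 4,  6, 30],
])

total = cm.sum()
overall_accuracy = float(np.trace(cm) / total)   # fraction of correctly classified pixels

# Cohen's kappa corrects overall accuracy for agreement expected by chance
row = cm.sum(axis=1)
col = cm.sum(axis=0)
expected = float(np.sum(row * col) / total**2)
kappa = (overall_accuracy - expected) / (1 - expected)
print(round(overall_accuracy, 3), round(kappa, 3))
```

Kappa is usually the more conservative of the two, which matters when class proportions are as unbalanced as in a mixed forest/urban scene.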

Relevance: 60.00%

Abstract:

We address the question of how to communicate among distributed processes values such as real numbers, continuous functions and geometrical solids with arbitrary precision, yet efficiently. We extend the established concept of lazy communication using streams of approximants by introducing explicit queries. We formalise this approach using protocols of a query-answer nature. Such protocols enable processes to provide valid approximations with certain accuracy and focusing on certain locality as demanded by the receiving processes through queries. A lattice-theoretic denotational semantics of channel and process behaviour is developed. The query space is modelled as a continuous lattice in which the top element denotes the query demanding all the information, whereas other elements denote queries demanding partial and/or local information. Answers are interpreted as elements of lattices constructed over suitable domains of approximations to the exact objects. An unanswered query is treated as an error and denoted using the top element. The major novel characteristic of our semantic model is that it reflects the dependency of answers on queries. This enables the definition and analysis of an appropriate concept of convergence rate, by assigning an effort indicator to each query and a measure of information content to each answer. Thus we capture not only what function a process computes, but also how a process transforms the convergence rates from its inputs to its outputs. In future work these indicators can be used to capture further computational complexity measures. A robust prototype implementation of our model is available.
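The demand-driven, query-answer style of communication can be made concrete with a toy sketch: the receiving process sends a precision query and the channel answers with a valid enclosure of the exact value. The `sqrt2_channel` name and the bisection strategy are illustrative assumptions, not the paper's protocol:

```python
from fractions import Fraction

# A "channel" carrying a real number (here sqrt(2)) as interval approximants.
# A query asks for an enclosure of width <= eps; the answer is a valid interval.
def sqrt2_channel(eps: Fraction) -> tuple[Fraction, Fraction]:
    """Answer a precision query by bisecting until the enclosure is narrow enough."""
    lo, hi = Fraction(1), Fraction(2)       # invariant: lo**2 <= 2 < hi**2
    while hi - lo > eps:
        mid = (lo + hi) / 2
        if mid * mid <= 2:
            lo = mid
        else:
            hi = mid
    return lo, hi

# The receiver drives the precision through its query (demand-driven evaluation)
lo, hi = sqrt2_channel(Fraction(1, 1000))
print(float(lo), float(hi))
```

A coarser query costs fewer bisection steps, which is the intuition behind attaching an effort indicator to each query and an information-content measure to each answer.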

Relevance: 60.00%

Abstract:

Divisia money is a monetary aggregate that gives each component asset an assigned weight. We use an evolutionary neural network to calculate new Divisia weights for each component, utilising Bank of England monetary data for the U.K. We propose a new monetary aggregate using our newly derived weights to carry out quantitative inflation prediction. The results show that this new monetary aggregate has better inflation forecasting performance than the traditionally constructed Bank of England Divisia money. This result is important for monetary policymakers, as improved construction of monetary aggregates will yield tighter relationships between key macroeconomic variables and, ultimately, greater macroeconomic control. Research is ongoing to establish the extent of the increased information content and parameter stability of this new monetary aggregate.
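For context, the traditionally constructed Divisia index (which the paper's neural-network weights replace) uses the Törnqvist-Theil approximation: each component's growth is weighted by its average expenditure share across the two periods. A sketch with purely illustrative numbers:

```python
import numpy as np

# Component quantities and user-cost prices for two periods (toy numbers)
q = np.array([[100.0, 50.0, 20.0],     # period t-1
              [104.0, 51.0, 22.0]])    # period t
p = np.array([[0.02, 0.05, 0.10],
              [0.02, 0.06, 0.09]])     # user costs

expenditure = p * q
shares = expenditure / expenditure.sum(axis=1, keepdims=True)
avg_shares = 0.5 * (shares[0] + shares[1])    # Törnqvist weights

# Divisia (log-change) growth rate of the aggregate
growth = float(np.sum(avg_shares * (np.log(q[1]) - np.log(q[0]))))
print(round(growth, 6))
```

The weights sum to one by construction; the evolutionary approach in the paper searches for alternative weightings with better inflation-forecasting performance.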

Relevance: 60.00%

Abstract:

The expansion of the Internet has made the task of searching a crucial one. Internet users, however, have to make a great effort to formulate a search query that returns the required results. Many methods have been devised to assist in this task by helping users modify their query to give better results. In this paper we propose an interactive method for query expansion. It is based on the observation that documents often contain terms with high information content, which can summarise their subject matter. We present experimental results which demonstrate that our approach significantly shortens the time required to accomplish a given task through web searches.
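The idea of ranking candidate terms by information content can be sketched with a simple inverse-document-frequency score (a stand-in for whatever measure the paper actually uses; the corpus and scoring below are illustrative assumptions):

```python
import math
from collections import Counter

# Toy corpus; terms with high information content (rare across documents)
# are natural candidates to suggest to the user for query expansion.
docs = [
    "neural network training in practice",
    "network protocol design in practice",
    "watermark embedding in image data",
    "image compression and image coding",
]
df = Counter()
for d in docs:
    df.update(set(d.split()))          # document frequency of each term

def info_content(term: str) -> float:
    """Inverse document frequency as a simple information-content score."""
    return math.log(len(docs) / df[term])

retrieved = "watermark embedding in image data".split()
ranked = sorted(set(retrieved), key=info_content, reverse=True)
print(ranked)
```

Terms occurring in a single document score highest, while function words like "in" fall to the bottom, which is why such terms can summarise a document's subject matter.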

Relevance: 60.00%

Abstract:

This study examines the information content of alternative implied volatility measures for the 30 components of the Dow Jones Industrial Average Index from 1996 until 2007. Along with the popular Black-Scholes and "model-free" implied volatility expectations, the recently proposed corridor implied volatility (CIV) measures are explored. For all pair-wise comparisons, it is found that a CIV measure closely related to the model-free implied volatility nearly always delivers the most accurate forecasts for the majority of the firms. This finding remains consistent for different forecast horizons, volatility definitions, loss functions and forecast evaluation settings.
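For reference, the Black-Scholes implied volatility used as one of the compared measures is obtained by numerically inverting the pricing formula; since the call price is increasing in volatility, bisection suffices. A minimal sketch (parameters are illustrative):

```python
import math

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))  # standard normal CDF
    return S * N(d1) - K * math.exp(-r * T) * N(d2)

def implied_vol(price, S, K, T, r, lo=1e-4, hi=5.0, tol=1e-8):
    """Invert the pricing formula by bisection (price is monotone in sigma)."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if bs_call(S, K, T, r, mid) < price:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Round-trip check: price at sigma = 0.25, then recover the volatility
target = bs_call(100, 100, 0.5, 0.02, 0.25)
print(round(implied_vol(target, 100, 100, 0.5, 0.02), 4))
```

Model-free and corridor measures instead aggregate prices across a range of strikes, so no single-formula inversion of this kind is involved there.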

Relevance: 60.00%

Abstract:

The paper describes the information and software content of a computer knowledge bank on medical diagnostics. The classes of its users and the tasks they can solve are described. The information content of the bank comprises three ontologies: an ontology of observations in medical diagnostics, an ontology of knowledge bases (diseases) in medical diagnostics, and an ontology of case records. It also contains three corresponding classes of information resources for every division of medicine: observation bases, knowledge bases, and databases (with data about patients). The software content consists of editors for the different kinds of information (ontologies, bases of observations, knowledge and data) and a program that performs medical diagnostics.

Relevance: 60.00%

Abstract:

Fluoroscopic images exhibit severe signal-dependent quantum noise, due to the reduced X-ray dose involved in image formation, that is generally modelled as Poisson-distributed. However, image gray-level transformations, commonly applied by fluoroscopic devices to enhance contrast, modify the noise statistics and the relationship between image noise variance and expected pixel intensity. Image denoising is essential to improve the quality of fluoroscopic images and their clinical information content. Simple average filters are commonly employed in real-time processing, but they tend to blur edges and details. An extensive comparison of advanced denoising algorithms specifically designed for signal-dependent noise (AAS, BM3Dc, HHM, TLS) and for independent additive noise (AV, BM3D, K-SVD) is presented. Simulated test images degraded by various levels of Poisson quantum noise and real clinical fluoroscopic images were considered. Typical gray-level transformations (e.g. white compression) were also applied in order to evaluate their effect on the denoising algorithms. Performance of the algorithms was evaluated in terms of peak signal-to-noise ratio (PSNR), signal-to-noise ratio (SNR), mean square error (MSE), structural similarity index (SSIM) and computational time. On average, the filters designed for signal-dependent noise provided better image restorations than those assuming additive white Gaussian noise (AWGN). The collaborative denoising strategy was found to be the most effective in denoising both simulated and real data, also in the presence of image gray-level transformations. White compression, by inherently reducing the greater noise variance of brighter pixels, appeared to help the denoising algorithms perform more effectively.
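The evaluation setup described is easy to sketch: degrade a clean image with Poisson quantum noise, apply the simple moving-average baseline mentioned above, and compare PSNR before and after. The test image and filter are illustrative, not the paper's data:

```python
import numpy as np

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio in dB against a reference image."""
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return 10.0 * np.log10(peak**2 / mse)

rng = np.random.default_rng(1)
clean = np.tile(np.linspace(20, 200, 64), (64, 1))   # smooth 64x64 test image
noisy = rng.poisson(clean).astype(float)             # signal-dependent quantum noise

# 3x3 moving-average filter: the simple real-time baseline
pad = np.pad(noisy, 1, mode="edge")
denoised = sum(pad[i:i + 64, j:j + 64] for i in range(3) for j in range(3)) / 9.0

print(round(psnr(clean, noisy), 1), round(psnr(clean, denoised), 1))
```

Note how the Poisson model makes noise variance equal the expected intensity, so brighter regions are noisier, which is exactly the property that white compression attenuates and that the signal-dependent filters exploit.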

Relevance: 60.00%

Abstract:

The current research activities of the Institute of Mathematics and Informatics at the Bulgarian Academy of Sciences (IMI-BAS) include the study and application of knowledge-based methods for the creation, integration and development of multimedia digital libraries with applications in cultural heritage. This report presents IMI-BAS's developments in digital library management systems and portals, i.e. the Bulgarian Iconographical Digital Library, the Bulgarian Folklore Digital Library and the Bulgarian Folklore Artery, developed during several national and international projects: "Digital Libraries with Multimedia Content and its Application in Bulgarian Cultural Heritage" (contract 8/21.07.2005 between IMI-BAS and the State Agency for Information Technologies and Communications); FP6/IST/P-027451 project LOGOS "Knowledge-on-Demand for Ubiquitous Learning" (EU FP6, IST, Priority 2.4.13 "Strengthening the Integration of the ICT research effort in an Enlarged Europe"); NSF project D-002-189 SINUS "Semantic Technologies for Web Services and Technology Enhanced Learning"; and NSF project IO-03-03/2006 "Development of Digital Libraries and Information Portal with Virtual Exposition 'Bulgarian Folklore Heritage'". The presented prototypes aim to provide flexible and effective access to multimedia presentations of cultural heritage artefacts and collections, maintaining different forms and formats of the digitised information content and rich functionality for interaction. The developments are the result of long-standing interests and work in technological developments in information systems, knowledge processing and content management systems.
The current research activities aim at creating innovative solutions for assembling multimedia digital libraries for collaborative use in specific cultural heritage contexts, maintaining their semantic interoperability and creating new services for dynamic aggregation of their resources, access improvement, personalisation, intelligent curation of content, and content protection. The investigations are directed towards the development of distributed tools for aggregating heterogeneous content and ensuring semantic compatibility with the European digital library EUROPEANA, thus providing pan-European access to rich digitised collections of Bulgarian cultural heritage.

Relevance: 60.00%

Abstract:

This paper examines investors' reactions to dividend reductions or omissions conditional on past earnings and dividend patterns for a sample of eighty-two U.S. firms that incurred an annual loss. We document that the market reaction for firms with long patterns of past earnings and dividend payouts is significantly more negative than for firms with less-established past earnings and dividend records. Our results can be explained by the following line of reasoning. First, consistent with DeAngelo, DeAngelo, and Skinner (1992), a loss following a long stream of earnings and dividend payments represents an unreliable indicator of future earnings; thus, established firms have lower loss reliability than less-established firms. Second, because current earnings and dividend policy are substitute means of forecasting future earnings, lower loss reliability increases the information content of dividend reductions. Therefore, given the presence of a loss, the longer the stream of prior earnings and dividend payments, (1) the lower the loss reliability and (2) the more reliably dividend cuts are perceived as an indication that earnings difficulties will persist in the future.

Relevance: 60.00%

Abstract:

This review discusses the connection between quantitative changes in environmental factors and oribatid communities. From the available studies it can be clearly seen how various characteristics of oribatid communities are modified by changes in moisture, temperature, heavy-metal concentration, organic-matter content and level of disturbance. The most important question concerning the application of oribatids as indicators is to clarify what information content natural oribatid coenological patterns possess from the aspect of bioindication. Most of the variables listed above can be measured directly, since rapid methods are available to quantify soil parameters. Responses of oribatids are worth studying in a more complex approach. Even now we have extensive knowledge of how communities change due to modifications of different factors. These pieces of information necessitate the elaboration of methods which render oribatid communities suitable for prognosticating to what extent a given site can be considered near-natural or degraded, based on the oribatid composition of a single sample taken from that area. Answering this question requires extensive and coordinated work.

Relevance: 60.00%

Abstract:

The main inputs to the hippocampus arise from the entorhinal cortex (EC) and form a loop involving the dentate gyrus, CA3 and CA1 hippocampal subfields and then back to EC. Since the discovery in the 1950s that the hippocampus is involved in memory formation, this region and its circuitry have been extensively studied. Beyond memory, the hippocampus has also been found to play an important role in spatial navigation. In rats and mice, place cells show a close relation between firing rate and the animal's position in a restricted area of the environment, the so-called place field. The firing of place cells peaks at the center of the place field and decreases as the animal moves away from it, suggesting the existence of a rate code for space. Nevertheless, many studies have described the emergence of hippocampal network oscillations at multiple frequencies depending on behavioral state, which are believed to be important for temporal coding. In particular, theta oscillations (5-12 Hz) exhibit a spatio-temporal relation with place cells known as phase precession, in which place cells consistently change the theta phase of spiking as the animal traverses the place field. Moreover, current theories state that CA1, the main output stream of the hippocampus, combines inputs from EC and CA3 through network oscillations of different frequencies, namely high gamma (60-100 Hz; HG) and low gamma (30-50 Hz; LG), respectively, which tend to be nested in different phases of the theta cycle. In the present dissertation we use a freely available online dataset to perform extensive computational analyses aimed at reproducing classical and recent results about the activity of place cells in the hippocampus of freely moving rats. In particular, we revisit the debate over whether phase precession is due to changes in firing frequency or space alone, and conclude that the phenomenon cannot be explained by either factor independently but by their joint influence.
We also perform novel analyses investigating further characteristics of place cells in relation to network oscillations. We show that the strength of theta modulation of spikes only marginally affects the spatial information content of place cells, while the mean spiking theta phase has no influence on spatial information. Further analyses reveal that place cells are also modulated by theta when they fire outside the place field. Moreover, we find that the firing of place cells within the theta cycle is modulated by HG and LG amplitude in both CA1 and EC, matching cross-frequency coupling results found at the local field potential level. Additionally, phase-amplitude coupling in CA1 associated with spikes inside the place field is characterized by amplitude modulation in the 40-80 Hz range. We conclude that place cell firing is embedded in large network states reflected in local field potential oscillations, and suggest that their activity might be seen as a dynamic state rather than a fixed property of the cell.
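"Spatial information content" in the place-cell literature is conventionally quantified with the Skaggs information measure, I = Σᵢ pᵢ (rᵢ/r̄) log₂(rᵢ/r̄), where pᵢ is occupancy probability and rᵢ the firing rate in spatial bin i. Assuming that is the measure meant here, a minimal sketch:

```python
import numpy as np

def spatial_information(occupancy, rates):
    """Skaggs spatial information in bits per spike:
    I = sum_i p_i * (r_i / r_mean) * log2(r_i / r_mean)."""
    p = occupancy / occupancy.sum()          # occupancy probability per spatial bin
    r_mean = float(np.sum(p * rates))        # occupancy-weighted mean rate
    ratio = rates / r_mean
    valid = rates > 0                        # bins with no spikes contribute zero
    return float(np.sum(p[valid] * ratio[valid] * np.log2(ratio[valid])))

occupancy = np.ones(10)                                # uniform time per bin
flat = np.full(10, 2.0)                                # no spatial tuning
field = np.array([0.1] * 4 + [8.0, 8.0] + [0.1] * 4)   # sharp place field

si_flat = spatial_information(occupancy, flat)
si_field = spatial_information(occupancy, field)
print(si_flat, si_field)
```

A spatially untuned cell carries zero bits per spike, while a sharply tuned cell carries substantially more, which is why the measure is a natural target for the theta-modulation analyses described above.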

Relevance: 60.00%

Abstract:

Various physical systems have dynamics that can be modeled by percolation processes. Percolation is used to study issues ranging from fluid diffusion through disordered media to the fragmentation of a computer network caused by hacker attacks. A common feature of all of these systems is the presence of two non-coexistent regimes associated with certain properties of the system; for example, the disordered medium may or may not allow the flow of the fluid, depending on its porosity. The change from one regime to the other characterizes the percolation phase transition. The standard way of analyzing this transition uses the order parameter, a variable related to some characteristic of the system that is zero in one regime and nonzero in the other. The proposal introduced in this thesis is that this phase transition can be investigated without the explicit use of the order parameter, but rather through the Shannon entropy, a measure of the degree of uncertainty in the information content of a probability distribution. The proposal is evaluated in the context of cluster formation in random graphs, and we apply the method to both classical (Erdős–Rényi) and explosive percolation. It is based on computing the entropy contained in the cluster-size probability distribution, and the results show that the critical point of the transition is related to the derivatives of the entropy. Furthermore, the difference between the smooth and abrupt aspects of the classical and explosive transitions, respectively, is reinforced by the observation that the entropy reaches a maximum at the classical critical point, whereas no such correspondence occurs in the explosive case.
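The entropy-based approach can be sketched directly: build an Erdős–Rényi graph G(n, p), collect the cluster-size distribution (here, the probability that a uniformly chosen node belongs to a cluster of a given size, one common convention, which may differ in detail from the thesis), and compute its Shannon entropy. The union-find construction is an illustrative choice:

```python
import math
import random

def er_cluster_entropy(n, p, seed=0):
    """Shannon entropy (bits) of the cluster-size distribution of G(n, p)."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):                      # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for i in range(n):                # sample each possible edge with probability p
        for j in range(i + 1, n):
            if rng.random() < p:
                ri, rj = find(i), find(j)
                if ri != rj:
                    parent[ri] = rj

    sizes = {}                        # size of each cluster
    for i in range(n):
        r = find(i)
        sizes[r] = sizes.get(r, 0) + 1

    # P(random node lies in a cluster of size s) = (nodes in size-s clusters) / n
    counts = {}
    for s in sizes.values():
        counts[s] = counts.get(s, 0) + s
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Near the critical point p_c = 1/n the size distribution is broad (high entropy);
# deep in the supercritical regime one giant cluster dominates (low entropy).
n = 400
e_crit = er_cluster_entropy(n, 1.0 / n)
e_super = er_cluster_entropy(n, 8.0 / n)
print(round(e_crit, 3), round(e_super, 3))
```

The entropy peak near the classical critical point is the fingerprint the thesis exploits in place of the order parameter.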
