952 results for Data recovery (Computer science)
Abstract:
In this work, we propose a novel network coding enabled NDN architecture for the delivery of scalable video. Our scheme uses network coding to address a problem that arises in the original NDN protocol, where optimal use of bandwidth and caching resources requires coordinating the forwarding decisions. To optimize the performance of the proposed network coding based NDN protocol and render it appropriate for the transmission of scalable video, we devise a novel rate allocation algorithm that decides on the optimal rates of Interest messages sent by clients and intermediate nodes. This algorithm guarantees that the achieved flow of Data objects maximizes the average quality of the video delivered to the client population. To support the handling of Interest messages and Data objects when intermediate nodes perform network coding, we modify the standard NDN protocol and introduce the use of Bloom filters, which efficiently store additional information about the Interest messages and Data objects. The proposed architecture is evaluated for the transmission of scalable video over PlanetLab topologies. The evaluation shows that the proposed scheme performs very close to the optimal performance.
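The abstract does not give implementation details, but the role of the Bloom filters is easy to illustrate. Below is a minimal Python sketch of a Bloom filter summarizing handled Data objects; the bit-array size, hash count, and NDN-style names are hypothetical, not taken from the paper.

import hashlib

class BloomFilter:
    """Minimal Bloom filter: a fixed-size bit array with k hash functions.

    Membership queries may yield false positives but never false negatives,
    which is why a node can use it to compactly summarize the Data objects
    (or Interest names) it has already handled.
    """

    def __init__(self, num_bits=1024, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item):
        # Derive k bit positions from salted SHA-256 digests of the item.
        for salt in range(self.num_hashes):
            digest = hashlib.sha256(f"{salt}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

# Hypothetical usage: a node marks coded Data objects it has forwarded.
seen = BloomFilter()
seen.add("/video/segment7/coded-gen3")
print("/video/segment7/coded-gen3" in seen)  # True
print("/video/segment7/coded-gen9" in seen)  # False (with high probability)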
Abstract:
It is a challenge to measure the impact of releasing data to the public, since the effects may not be directly linked to particular open data activities, or substantial impact may only occur several years after publishing the data. This paper proposes a framework to assess the impact of releasing open data by applying the Social Return on Investment (SROI) approach. SROI was developed for organizations intending to generate social and environmental benefits, and thus fits the purpose of most open data initiatives. We link the four steps of SROI (input, output, outcome, impact) with the 14 high-value data categories of the G8 Open Data Charter to create a matrix of open data examples, activities, and impacts in each of the data categories. This Impact Monitoring Framework helps data providers navigate the impact space of open data, laying out the conceptual basis for further research.
Abstract:
Quality data are not only relevant for successful Data Warehousing or Business Intelligence applications; they are also a precondition for the efficient and effective use of Enterprise Resource Planning (ERP) systems. ERP professionals in all kinds of businesses are concerned with data quality issues, as a survey conducted by the Institute of Information Systems at the University of Bern has shown. Using results of this survey, this paper demonstrates why data quality problems can occur in modern ERP systems and suggests how ERP researchers and practitioners can handle issues around the quality of data in an ERP software environment.
Abstract:
Project justification is regarded as one of the major methodological deficits in Data Warehousing practice. The special nature of Data Warehousing benefits and the large portion of infrastructure-related activities are stated as reasons for applying inappropriate methods, performing incomplete evaluations, or even omitting justifications entirely. In this paper, the economic justification of Data Warehousing projects is analyzed, and first results from a large academia-industry collaboration project in the field of non-technical issues of Data Warehousing are presented. As conceptual foundations, the role of the Data Warehouse system in corporate application architectures is analyzed, and the specific properties of Data Warehousing projects are discussed. Based on an applicability analysis of traditional approaches to the economic justification of IT projects, basic steps and responsibilities for the justification of Data Warehousing projects are derived.
Abstract:
Postprint
Abstract:
Robotics is one of the most active research areas, and building robots requires bringing together a large number of disciplines. With these premises, one problem is the management of information from multiple heterogeneous sources. Each component, hardware or software, produces data of a different nature: temporal frequency, processing needs, size, type, etc. Nowadays, technologies and software engineering paradigms such as service-oriented architectures are applied to solve this problem in other areas. This paper proposes the use of these technologies to implement a robotic control system based on services. Such a system allows the integration and collaborative operation of the different elements that make up a robotic system.
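As a rough illustration of the service-oriented idea described above (not the paper's actual implementation; all service and topic names are hypothetical), here is a minimal Python sketch of a service bus that decouples heterogeneous robot components:

from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ServiceBus:
    """Toy service bus: components register services by name and exchange
    messages without knowing each other's implementation, which is the key
    decoupling idea behind a service-oriented robotic control system."""
    services: Dict[str, Callable] = field(default_factory=dict)
    subscribers: Dict[str, List[Callable]] = field(default_factory=dict)

    def register(self, name, handler):
        # Expose a callable service under a well-known name.
        self.services[name] = handler

    def call(self, name, *args, **kwargs):
        return self.services[name](*args, **kwargs)

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, data):
        # Fan out data to every component interested in this topic.
        for callback in self.subscribers.get(topic, []):
            callback(data)

bus = ServiceBus()
# A hardware component exposes a service; a planner consumes another's data.
bus.register("lidar/scan", lambda: [1.2, 0.8, 3.4])   # fake range readings
bus.subscribe("pose", lambda p: print("planner got pose", p))
bus.publish("pose", {"x": 0.5, "y": 1.0, "theta": 0.1})
print(bus.call("lidar/scan"))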
Abstract:
We propose an original method to geoposition an audio/video stream with multiple emitters that are at the same time receivers of the mixed signal. The resulting method is suitable for those cases where a list of positions within a designated area is encoded with a degree of precision adjusted to the visualization capabilities, and it is also easily extensible to support new requirements. This method extends a previously proposed protocol without incurring any performance penalty.
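The abstract leaves the encoding unspecified; the following Python sketch shows one plausible reading of "precision adjusted to the visualization capabilities": quantizing coordinates inside a designated bounding box to a configurable number of bits per axis. The bounding box and bit widths are made-up examples, not values from the paper.

def encode_position(lat, lon, bbox, bits=16):
    """Quantize (lat, lon) inside a bounding box to `bits` bits per axis.

    Fewer bits mean coarser positions and smaller messages, matching the
    idea of adjusting precision to the receiver's visualization needs.
    """
    min_lat, min_lon, max_lat, max_lon = bbox
    scale = (1 << bits) - 1
    qlat = round((lat - min_lat) / (max_lat - min_lat) * scale)
    qlon = round((lon - min_lon) / (max_lon - min_lon) * scale)
    return qlat, qlon

def decode_position(qlat, qlon, bbox, bits=16):
    # Inverse mapping: recover approximate coordinates from the integers.
    min_lat, min_lon, max_lat, max_lon = bbox
    scale = (1 << bits) - 1
    return (min_lat + qlat / scale * (max_lat - min_lat),
            min_lon + qlon / scale * (max_lon - min_lon))

area = (41.35, 2.10, 41.47, 2.23)          # hypothetical designated area
q = encode_position(41.403, 2.174, area, bits=12)
print(q, decode_position(*q, area, bits=12))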
Abstract:
3D sensors provide valuable information for mobile robotic tasks like scene classification or object recognition, but these sensors often produce noisy data that makes it impossible to apply classical keypoint detection and feature extraction techniques. Therefore, noise removal and downsampling have become essential steps in 3D data processing. In this work, we propose the use of a 3D filtering and downsampling technique based on a Growing Neural Gas (GNG) network. The GNG method is able to deal with outliers present in the input data and can represent 3D spaces, obtaining an induced Delaunay triangulation of the input space. Experiments show how state-of-the-art keypoint detectors improve their performance when the GNG output representation is used as input data. Descriptors extracted on the improved keypoints achieve better matching in robotics applications such as 3D scene registration.
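For readers unfamiliar with GNG, here is a heavily reduced Python sketch of a Fritzke-style Growing Neural Gas used as a point-cloud downsampler. It omits several steps of the full algorithm (for example, pruning isolated nodes), and all parameter values are illustrative, not the paper's.

import numpy as np

def gng_downsample(points, max_nodes=100, iters=10000, eps_b=0.05,
                   eps_n=0.005, age_max=50, insert_every=100,
                   alpha=0.5, decay=0.995, seed=0):
    """Reduced GNG sketch: the final node positions form a compact,
    denoised representation of the noisy input cloud."""
    rng = np.random.default_rng(seed)
    nodes = [points[rng.integers(len(points))].astype(float) for _ in range(2)]
    errors = [0.0, 0.0]
    edges = {}                               # (i, j) with i < j -> age

    def key(i, j):
        return (min(i, j), max(i, j))

    for step in range(1, iters + 1):
        x = points[rng.integers(len(points))]
        d = [float(np.sum((n - x) ** 2)) for n in nodes]
        order = np.argsort(d)
        s1, s2 = int(order[0]), int(order[1])
        errors[s1] += d[s1]
        # Move the winner and its topological neighbors toward the sample.
        nodes[s1] += eps_b * (x - nodes[s1])
        for (i, j) in list(edges):
            if s1 in (i, j):
                other = j if i == s1 else i
                nodes[other] += eps_n * (x - nodes[other])
                edges[(i, j)] += 1
        edges[key(s1, s2)] = 0               # refresh or create the edge
        edges = {e: a for e, a in edges.items() if a <= age_max}
        # Periodically grow: split around the node with the largest error.
        if step % insert_every == 0 and len(nodes) < max_nodes:
            q = max(range(len(nodes)), key=lambda n: errors[n])
            nbrs = [j if i == q else i for (i, j) in edges if q in (i, j)]
            if nbrs:
                f = max(nbrs, key=lambda n: errors[n])
                nodes.append((nodes[q] + nodes[f]) / 2.0)
                errors[q] *= alpha
                errors[f] *= alpha
                errors.append(errors[q])
                del edges[key(q, f)]
                edges[key(q, len(nodes) - 1)] = 0
                edges[key(f, len(nodes) - 1)] = 0
        errors = [e * decay for e in errors]
    return np.array(nodes)

# Noisy plane patch: the GNG nodes give a cleaner, downsampled version.
cloud = np.c_[np.random.rand(5000, 2), 0.02 * np.random.randn(5000)]
print(gng_downsample(cloud, max_nodes=50, iters=5000).shape)  # about (50, 3)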
Abstract:
Imprecise manipulation of source code (semi-parsing) is useful for tasks such as robust parsing, error recovery, lexical analysis, and rapid development of parsers for data extraction. An island grammar precisely defines only a subset of a language's syntax (islands), while the rest of the syntax (water) is defined imprecisely. Usually, water is defined as the negation of islands. Albeit simple, such a definition of water is naive and impedes the composition of islands. When developing an island grammar, sooner or later a language engineer has to create water tailored to each individual island. Such an approach is fragile, because the water can change with any change to the grammar. It is time-consuming, because the water is defined manually by an engineer rather than automatically. Finally, an island surrounded by water cannot be reused, because the water has to be defined anew for every grammar. In this paper we propose a new technique of island parsing: bounded seas. Bounded seas are composable, robust, reusable, and easy to use because island-specific water is created automatically. Our work focuses on applications of island parsing to data extraction from source code. We have integrated bounded seas into a parser combinator framework as a demonstration of their composability and reusability.
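To make the islands-and-water terminology concrete, here is a toy Python parser-combinator sketch. Note that its "sea" naively skips characters until the island matches, which is exactly the fragile, hand-tuned water handling that bounded seas are designed to replace; the bounded-sea algorithm itself is not reproduced here.

import re

def island(pattern):
    """Parser for an 'island': a precisely defined fragment of interest."""
    regex = re.compile(pattern)
    def parse(text, pos):
        m = regex.match(text, pos)
        return (m.group(), m.end()) if m else None
    return parse

def sea(island_parser):
    """Naive 'water' handling: skip characters until the island matches.
    Bounded seas improve on exactly this step by deriving the water's
    boundaries automatically from the surrounding grammar."""
    def parse(text, pos):
        for start in range(pos, len(text) + 1):
            result = island_parser(text, start)
            if result:
                return result
        return None
    return parse

# Extract a method signature from source code while ignoring everything else.
src = "/* noise */ int add(int a, int b) { return a + b; } // more noise"
method = sea(island(r"\w+\s+\w+\([^)]*\)"))
print(method(src, 0))   # ('int add(int a, int b)', 33)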
Abstract:
"UIUCDCS-R-75-725"
Abstract:
Thesis (M.S.)--University of Illinois at Urbana-Champaign.
Abstract:
Originally presented as the author's thesis (M.S.), University of Illinois at Urbana-Champaign.
Abstract:
Thesis (M.S.)--University of Illinois at Urbana-Champaign.
Abstract:
Thesis (M.S.)--University of Illinois at Urbana-Champaign.