7 results for data gathering algorithm

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

80.00%

Publisher:

Abstract:

Data gathering, either for event recognition or for monitoring applications, is the primary purpose of sensor network deployments. In many cases, data is acquired periodically and autonomously and simply logged onto secondary storage (e.g. flash memory), either for delayed offline analysis or for on-demand burst transfer. Moreover, operational data such as connectivity information and node and network state is typically kept as well. Naturally, measurement and/or connectivity logging comes at a cost, and the space for doing so is limited. Finding a good representative model of the data and encoding the information cleverly, that is, compressing it, can make the best use of the available space. In this paper, we explore the design space of data compression for wireless sensor and mesh networks by profiling common, publicly available algorithms. Several goals, such as low overhead in terms of memory use and compression time as well as a decent compression ratio, have to be well balanced in order to find a simple yet effective compression scheme.
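The profiling the abstract describes, measuring compression ratio against compression time for several off-the-shelf codecs, can be sketched as follows. This is a minimal illustration using Python's standard-library codecs as stand-ins for the algorithms profiled in the paper; the synthetic "sensor log" and the codec list are assumptions for the example, not the paper's actual data or candidate set.

```python
import bz2
import lzma
import time
import zlib

def profile(name, compress, data):
    """Measure compression ratio and wall-clock time for one codec."""
    start = time.perf_counter()
    packed = compress(data)
    elapsed = time.perf_counter() - start
    ratio = len(packed) / len(data)  # smaller is better
    return name, ratio, elapsed

# Synthetic "sensor log": periodic readings with small variation, as bytes.
sample = b"".join(b"%04d" % (1000 + i % 7) for i in range(5000))

for codec, fn in [("zlib", zlib.compress),
                  ("bz2", bz2.compress),
                  ("lzma", lzma.compress)]:
    name, ratio, secs = profile(codec, fn, sample)
    print(f"{name:5s} ratio={ratio:.3f} time={secs * 1000:.1f} ms")
```

On highly periodic data like this, all three codecs compress well; the trade-off the paper targets is that the stronger codecs (e.g. lzma) typically cost more time and memory than a sensor node can spare.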

Relevance:

80.00%

Publisher:

Abstract:

Mainstream IDEs such as Eclipse support developers in managing software projects mainly by offering static views of the source code. Such a static perspective neglects any information about runtime behavior. However, object-oriented programs heavily rely on polymorphism and late binding, which makes them difficult to understand based on their static structure alone. Developers thus resort to debuggers or profilers to study the system's dynamics. However, the information provided by these tools is volatile and hence cannot be exploited to ease the navigation of the source space. In this paper we present an approach to augment the static source perspective with dynamic metrics such as precise runtime type information or memory and object allocation statistics. Dynamic metrics can improve the understanding of the behavior and structure of a system. We rely on aspect-based dynamic data gathering to analyze running Java systems. By solving concrete use cases we illustrate how dynamic metrics directly available in the IDE are useful. We also report comprehensively on the efficiency of our approach to gathering dynamic metrics.
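The core idea of gathering precise runtime type information by weaving advice around method calls can be illustrated with a loose analogue. The paper works on Java with aspects; the sketch below substitutes a Python decorator as the "advice", and the `observed_types` store and `render` function are invented for the example.

```python
import functools
from collections import defaultdict

# Map: function name -> set of observed argument-type tuples.
observed_types = defaultdict(set)

def record_types(fn):
    """Advice-like wrapper that records the precise runtime types of the
    arguments at every call, loosely mimicking aspect-based gathering."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        observed_types[fn.__name__].add(
            tuple(type(a).__name__ for a in args))
        return fn(*args, **kwargs)
    return wrapper

@record_types
def render(item):
    return str(item)

render(42)
render("hello")
print(observed_types["render"])  # records ('int',) and ('str',)
```

An IDE could then annotate the static declaration of `render` with the concrete types actually observed at runtime, which is exactly the kind of augmented source perspective the abstract describes.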

Relevance:

80.00%

Publisher:

Abstract:

Images of an object under different illumination are known to provide strong cues about the object's surface. A mathematical formalization of how to recover the normal map of such a surface leads to the so-called uncalibrated photometric stereo problem. In its simplest instance, this problem can be reduced to the task of identifying only three parameters: the so-called generalized bas-relief (GBR) ambiguity. The challenge is to find additional general assumptions about the object that identify these parameters uniquely. Current approaches are not consistent, i.e., they provide different solutions when run multiple times on the same data. To address this limitation, we propose exploiting local diffuse reflectance (LDR) maxima, i.e., points in the scene where the normal vector is parallel to the illumination direction (see Fig. 1). We demonstrate several noteworthy properties of these maxima: a closed-form solution, computational efficiency, and GBR consistency. An LDR maximum yields a simple closed-form solution corresponding to a semicircle in the GBR parameter space (see Fig. 2); because as few as two diffuse maxima in different images identify a unique solution, the GBR parameters can be identified very efficiently; finally, the algorithm is consistent, as it always returns the same solution given the same data. Our algorithm is also remarkably robust: it can obtain an accurate estimate of the GBR parameters even with extremely high levels of outliers among the detected maxima (up to 80% of the observations). The method is validated on real data and achieves state-of-the-art results.
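One ingredient of the approach, detecting candidate LDR maxima, can be sketched simply: for a Lambertian surface, image intensity peaks where the normal aligns with the light, so local intensity maxima are candidate points. This is only a toy detector under that assumption; the paper's actual closed-form semicircle solution for the GBR parameters, and its threshold choices, are not reproduced here.

```python
def local_diffuse_maxima(image, threshold=0.9):
    """Flag pixels that are local maxima in their 3x3 neighbourhood and
    sufficiently bright. For a Lambertian surface, such points are
    candidates where the surface normal is parallel to the light."""
    h, w = len(image), len(image[0])
    maxima = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = [image[j][i]
                     for j in (y - 1, y, y + 1)
                     for i in (x - 1, x, x + 1)]
            if image[y][x] >= max(patch) and image[y][x] > threshold:
                maxima.append((y, x))
    return maxima

# Toy 5x5 intensity image with a single bright peak at (2, 2).
img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0
print(local_diffuse_maxima(img))  # [(2, 2)]
```

In the paper, each such maximum then contributes one constraint (a semicircle) in the three-dimensional GBR parameter space, and intersecting the constraints from two images pins down the ambiguity.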

Relevance:

80.00%

Publisher:

Abstract:

A robust understanding of Antarctic Ice Sheet deglacial history since the Last Glacial Maximum is important in order to constrain ice sheet and glacial-isostatic adjustment models, and to explore the forcing mechanisms responsible for ice sheet retreat. Such understanding can be derived from a broad range of geological and glaciological datasets, and recent decades have seen an upsurge in such data gathering around the continent and sub-Antarctic islands. Here, we report a new synthesis of those datasets, based on an accompanying series of reviews of the geological data, organised by sector. We present a series of time-slice maps for 20 ka, 15 ka, 10 ka and 5 ka, including grounding-line position and ice sheet thickness changes, along with a clear assessment of levels of confidence. The reconstruction shows that the Antarctic Ice Sheet did not everywhere reach the continental shelf edge at its maximum, that initial retreat was asynchronous, and that the spatial pattern of deglaciation was highly variable, particularly on the inner shelf. The deglacial reconstruction is consistent with a moderate overall excess ice volume and with a relatively small Antarctic contribution to meltwater pulse 1a. We discuss key areas of uncertainty both around the continent and by time interval, and we highlight potential priorities for future work. The synthesis is intended to be a resource for the modelling and glacial geological community.

Relevance:

80.00%

Publisher:

Abstract:

An E15 Initiative think piece: Investment incentives rank among the most important policy instruments governments employ to influence the locational decisions of multinational firms. In the wake of the recent increase in locational competition and the growing impact of investment incentives and support measures for state-owned enterprises (SOEs), the need for enhanced disciplines on investment incentives has gained political and academic salience. This think piece explores the evolution of investment incentives from a development and rule-making perspective. It summarises the existing literature and examines current practices and recent trends in FDI flows and the use of various investment incentives. This is followed by a discussion of the reasons for the observed stalemate in attempts at disciplinary rule-making. The paper concludes by putting forth recommendations for data gathering and transparency that could further the move toward improved global governance founded on the increasing complementarities of trade, investment, and competition law and policy as the core pillars of a more open, inclusive, and just world economy.

Relevance:

40.00%

Publisher:

Abstract:

A tandem mass spectral database system consists of a library of reference spectra and a search program. State-of-the-art search programs show a high tolerance for variability in compound-specific fragmentation patterns produced by collision-induced decomposition and enable sensitive and specific 'identity search'. In this communication, the performance characteristics of two search algorithms combined with the 'Wiley Registry of Tandem Mass Spectral Data, MSforID' (Wiley Registry MSMS, John Wiley and Sons, Hoboken, NJ, USA) were evaluated. The search algorithms tested were the MSMS search algorithm implemented in the NIST MS Search program 2.0g (NIST, Gaithersburg, MD, USA) and the MSforID algorithm (John Wiley and Sons, Hoboken, NJ, USA). Sample spectra were acquired on different instruments, thus covering a broad range of possible experimental conditions, or were generated in silico. For each algorithm, more than 30,000 matches were performed. Statistical evaluation of the library search results revealed that, in principle, both search algorithms can be combined with the Wiley Registry MSMS to create a reliable identification tool. It appears, however, that a higher degree of spectral similarity is necessary to obtain a correct match with the NIST MS Search program. This characteristic has a positive effect on specificity, as it helps to avoid false positive matches (type I errors), but reduces sensitivity. Thus, particularly with sample spectra acquired on instruments whose setup differs from tandem-in-space-type fragmentation, a comparably higher number of false negative matches (type II errors) was observed when searching the Wiley Registry MSMS.
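The 'identity search' the abstract evaluates amounts to scoring a sample spectrum against each library entry and returning matches above a threshold. The sketch below uses a generic cosine (dot-product) similarity over {m/z: intensity} peak lists as a stand-in; the actual NIST and MSforID match functions are proprietary and differ from this, and the peak lists and threshold here are invented for illustration.

```python
import math

def cosine_similarity(spec_a, spec_b):
    """Cosine similarity between two spectra given as {m/z: intensity}
    dicts -- a generic stand-in for the evaluated match functions."""
    mzs = set(spec_a) | set(spec_b)
    dot = sum(spec_a.get(mz, 0.0) * spec_b.get(mz, 0.0) for mz in mzs)
    norm_a = math.sqrt(sum(v * v for v in spec_a.values()))
    norm_b = math.sqrt(sum(v * v for v in spec_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def identity_search(sample, library, min_score=0.7):
    """Rank library entries by similarity, keeping those above threshold.
    Raising min_score trades sensitivity for specificity, the balance
    discussed in the abstract."""
    hits = [(name, cosine_similarity(sample, ref))
            for name, ref in library.items()]
    return sorted((h for h in hits if h[1] >= min_score),
                  key=lambda h: h[1], reverse=True)

# Hypothetical mini-library and a sample spectrum sharing caffeine's peaks.
library = {"caffeine": {138: 100.0, 110: 40.0, 195: 15.0},
           "aspirin":  {163: 100.0, 121: 30.0}}
sample = {138: 95.0, 110: 45.0}
print(identity_search(sample, library))  # caffeine matches, aspirin does not
```

The abstract's finding maps directly onto `min_score`: the NIST program behaves like a search with a stricter effective threshold, which suppresses false positives but misses correct matches when sample spectra deviate from the library (false negatives).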