5 results for Bitrate overhead

at Universidad de Alicante


Relevance:

10.00%

Publisher:

Abstract:

This study analyzes the repeatability, reproducibility and accuracy of a new hyperspectral system based on a pushbroom sensor as a means of measuring the spectral features and color of materials and objects. The hyperspectral system consisted of a CCD camera, a spectrograph and an objective lens. An additional linear moving system allowed the mechanical scanning of the complete scene. A uniform overhead luminaire with a daylight configuration was used to irradiate the scene using d:45 geometry. We followed the guidelines of the ASTM E2214-08 Standard Practice for Specifying and Verifying the Performance of Color-Measuring Instruments, which defines the standards and latest multidimensional procedures. The results obtained are analyzed in depth and compared to those recently reported by other authors for spectrophotometers and multispectral systems. It can be concluded that hyperspectral systems are reliable and can be used in industry to perform spectral and color readings with a high spatial resolution.
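
The color readings described above are ultimately derived from per-pixel spectral reflectance curves. As a rough illustration of that step, the following sketch integrates a reflectance curve against an illuminant and color-matching functions to obtain CIE XYZ tristimulus values; the spectral tables and band count are hypothetical placeholders, since the abstract does not publish the system's actual processing code.

```c
/* Hedged sketch: converting a per-pixel spectral reflectance curve (as a
 * pushbroom hyperspectral system would deliver) into CIE XYZ tristimulus
 * values. The illuminant and color-matching tables below are truncated
 * placeholders, not the real CIE data; this only illustrates the kind of
 * color computation involved. */
#include <stdio.h>

#define N_BANDS 5                    /* real systems use hundreds of bands */

/* Placeholder spectral data sampled at N_BANDS wavelengths. */
static const double illuminant[N_BANDS] = {0.9, 1.0, 1.1, 1.0, 0.9};
static const double xbar[N_BANDS] = {0.10, 0.30, 0.35, 0.20, 0.05};
static const double ybar[N_BANDS] = {0.05, 0.25, 0.40, 0.25, 0.05};
static const double zbar[N_BANDS] = {0.45, 0.30, 0.15, 0.05, 0.01};

/* Integrate reflectance * illuminant * color-matching function. */
static void reflectance_to_xyz(const double refl[N_BANDS],
                               double *X, double *Y, double *Z)
{
    double k = 0.0;
    *X = *Y = *Z = 0.0;
    for (int i = 0; i < N_BANDS; i++) {
        *X += refl[i] * illuminant[i] * xbar[i];
        *Y += refl[i] * illuminant[i] * ybar[i];
        *Z += refl[i] * illuminant[i] * zbar[i];
        k  += illuminant[i] * ybar[i];      /* normalisation constant */
    }
    *X *= 100.0 / k; *Y *= 100.0 / k; *Z *= 100.0 / k;
}

int main(void)
{
    const double refl[N_BANDS] = {0.2, 0.4, 0.6, 0.5, 0.3};
    double X, Y, Z;
    reflectance_to_xyz(refl, &X, &Y, &Z);
    printf("XYZ = %.2f %.2f %.2f\n", X, Y, Z);
    return 0;
}
```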

Relevance:

10.00%

Publisher:

Abstract:

Commercial off-the-shelf microprocessors are the core of low-cost embedded systems due to their programmability and cost-effectiveness. Recent advances in electronic technologies have allowed remarkable improvements in their performance. However, they have also made microprocessors more susceptible to transient faults induced by radiation. These non-destructive events (soft errors) may cause a microprocessor to produce a wrong computation result or lose control of a system, with catastrophic consequences. Therefore, soft error mitigation has become a compulsory requirement for an increasing number of applications, which operate from space down to ground level. In this context, this paper uses the concept of selective hardening, which aims to design reduced-overhead and flexible mitigation techniques. Following this concept, a novel flexible version of the software-based fault recovery technique known as SWIFT-R is proposed. Our approach makes it possible to protect, in software, different subsets of registers selected from the microprocessor register file. The design space is thus enriched with a wide spectrum of new partially protected versions, which offer more flexibility to designers and allow them to find the best trade-offs between performance, code size, and fault coverage. Three case studies have been developed to show the applicability and flexibility of the proposal.
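
The abstract does not include the compiler transformation itself, but the recovery principle behind SWIFT-R is triplication of selected values plus majority voting before use. The following C sketch mimics that idea at source level on a single local variable standing in for one protected register; the function names and the choice of protected value are illustrative assumptions, not the actual compiler-generated code.

```c
/* Hedged sketch of the idea behind SWIFT-R-style recovery: a value is kept
 * in three copies and a majority vote restores it before it is used. The
 * real technique is applied by the compiler at register level; this
 * source-level analogue protecting a single local variable is only an
 * illustration of selective hardening. */
#include <stdio.h>

/* Majority voter: returns the value held by at least two of the copies
 * and repairs the dissenting copy (error recovery, not just detection). */
static int vote(int *a, int *b, int *c)
{
    if (*a == *b) { *c = *a; return *a; }
    if (*a == *c) { *b = *a; return *a; }
    /* *b == *c, or all three disagree (uncorrectable): trust b/c. */
    *a = *b;
    return *b;
}

/* A protected accumulation: only 'sum' is hardened, mimicking the
 * selective protection of a chosen subset of the register file. */
static int protected_sum(const int *data, int n)
{
    int s0 = 0, s1 = 0, s2 = 0;          /* triplicated copies */
    for (int i = 0; i < n; i++) {
        int v = vote(&s0, &s1, &s2);     /* recover before use */
        s0 = v + data[i];
        s1 = v + data[i];
        s2 = v + data[i];
    }
    return vote(&s0, &s1, &s2);
}

int main(void)
{
    int data[] = {1, 2, 3, 4};
    printf("sum = %d\n", protected_sum(data, 4));
    return 0;
}
```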

Relevance:

10.00%

Publisher:

Abstract:

The design of fault tolerant systems is gaining importance in large domains of embedded applications where design constraints are as important as reliability. New software techniques, based on the selective application of redundancy, have shown remarkable fault coverage with reduced costs and overheads. However, the large number of different solutions provided by these techniques, and the costly process of assessing their reliability, make design space exploration a very difficult and time-consuming task. This paper proposes the integration of a multi-objective optimization tool with a software hardening environment to perform an automatic design space exploration in the search for the best trade-offs between reliability, cost, and performance. The first tool is driven by a genetic algorithm that can simultaneously fulfill many design goals thanks to the use of the NSGA-II multi-objective algorithm. The second is a compiler-based infrastructure that automatically produces selectively protected (hardened) versions of the software and generates accurate overhead reports and fault coverage estimations. The advantages of our proposal are illustrated by means of a complex and detailed case study involving a typical embedded application, the AES (Advanced Encryption Standard).
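
At the heart of this kind of exploration is the comparison of candidate hardened versions across competing objectives. The sketch below applies a plain Pareto-dominance filter to a few hypothetical partially protected versions described by execution-time overhead, code size and fault coverage; NSGA-II, as used in the paper, adds genetic search and crowding-distance sorting on top of this basic dominance test, and the numbers shown are invented for illustration.

```c
/* Hedged sketch: Pareto-dominance filtering over the three objectives
 * named in the abstract (overhead, code size, fault coverage). The
 * candidate hardened versions and their figures are hypothetical. */
#include <stdio.h>

struct version {
    const char *name;
    double overhead;   /* execution-time overhead, lower is better */
    double code_size;  /* normalised code size, lower is better    */
    double coverage;   /* fault coverage, higher is better         */
};

/* Returns 1 if a dominates b: no worse in every objective, better in one. */
static int dominates(const struct version *a, const struct version *b)
{
    int no_worse = a->overhead <= b->overhead &&
                   a->code_size <= b->code_size &&
                   a->coverage >= b->coverage;
    int better   = a->overhead <  b->overhead ||
                   a->code_size <  b->code_size ||
                   a->coverage >  b->coverage;
    return no_worse && better;
}

int main(void)
{
    /* Hypothetical partially hardened versions of an application. */
    struct version v[] = {
        {"unprotected",  1.00, 1.00, 0.55},
        {"regs r4-r7",   1.35, 1.40, 0.82},
        {"regs r4-r11",  1.70, 1.85, 0.94},
        {"full SWIFT-R", 2.10, 2.40, 0.97},
        {"regs r8-r11",  1.80, 1.90, 0.80},  /* dominated by r4-r11 */
    };
    int n = sizeof v / sizeof v[0];

    for (int i = 0; i < n; i++) {
        int dominated = 0;
        for (int j = 0; j < n; j++)
            if (j != i && dominates(&v[j], &v[i])) dominated = 1;
        if (!dominated)
            printf("Pareto-optimal: %s\n", v[i].name);
    }
    return 0;
}
```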

Relevance:

10.00%

Publisher:

Abstract:

The aim of this study was to develop an anthropometric profile of highly skilled male water polo players by specific playing position, and to identify significant relationships between these features and overhead throwing speed by specific playing position. Methods: A total of 94 male water polo players (24.5±5.3 yrs) who were playing in the Spanish King's Cup were studied. Subjects were grouped according to their specific playing positions: 15 goalkeepers, 45 offensive wings, 20 center backs and 14 center forwards. Anthropometric assessment was made following ISAK protocols. Hand grip and throwing speed in several situations were also assessed. A one-way analysis of variance (ANOVA) was used to determine whether significant differences existed among the four playing positions. Pearson product-moment correlation coefficients (r) were used to determine the relationships of all anthropometric measures with throwing speed and hand grip. The players' overall somatotype was endomorphic-mesomorphic (2.9-5.8-2.3). Center forwards exhibited important anthropometric differences compared with the other specific playing positions in elite male water polo players, but no differences were found in throwing speed by specific playing position in any of the throwing conditions. Moreover, a higher number of relationships between anthropometric measures and throwing speed was found in wings and also in center backs, but no relationships were found in center forwards. The data reflect the importance of muscle mass and the upper body in the throwing skill. Coaches can use this information to select players for the different specific positions.
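
For readers unfamiliar with the statistic used here to relate anthropometric measures to throwing speed, the following small C sketch computes a Pearson product-moment correlation coefficient on a hypothetical set of paired observations; the data are invented and are not taken from the study.

```c
/* Hedged sketch: Pearson product-moment correlation coefficient, the
 * statistic used in the study to relate anthropometric measures to
 * throwing speed. The paired observations below are invented placeholders. */
#include <stdio.h>
#include <math.h>

static double pearson_r(const double *x, const double *y, int n)
{
    double sx = 0, sy = 0, sxx = 0, syy = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];        sy  += y[i];
        sxx += x[i] * x[i]; syy += y[i] * y[i];
        sxy += x[i] * y[i];
    }
    double cov = sxy - sx * sy / n;   /* n * covariance */
    double vx  = sxx - sx * sx / n;   /* n * variance of x */
    double vy  = syy - sy * sy / n;   /* n * variance of y */
    return cov / sqrt(vx * vy);
}

int main(void)
{
    /* Hypothetical arm span (cm) and throwing speed (km/h) pairs. */
    const double arm_span[] = {185, 190, 192, 198, 201, 205};
    const double speed[]    = { 58,  60,  61,  64,  63,  67};
    int n = sizeof arm_span / sizeof arm_span[0];
    printf("r = %.3f\n", pearson_r(arm_span, speed, n));
    return 0;
}
```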

Relevance:

10.00%

Publisher:

Abstract:

Software-based techniques offer several advantages for increasing the reliability of processor-based systems at very low cost, but they cause performance degradation and an increase in code size. To meet constraints in performance and memory, we propose SETA, a new control-flow software-only technique that uses assertions to detect errors affecting the program flow. SETA is an independent technique, but it was conceived to work together with previously proposed data-flow techniques that aim at reducing performance and memory overheads. Thus, SETA is combined with such data-flow techniques and submitted to a fault injection campaign. Simulation and neutron-induced SEE tests show high fault coverage with performance and memory overheads lower than those of the state of the art.
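
SETA's exact signature encoding is not given in the abstract, but control-flow assertion techniques of this family generally assign a compile-time signature to each basic block, update a runtime signature on entry, and assert that it matches the expected value. The C sketch below illustrates that general mechanism on a trivial branch; the signature values and the recovery action are illustrative assumptions, not the actual SETA scheme.

```c
/* Hedged sketch of signature-based control-flow checking, the general idea
 * behind assertion techniques such as SETA. Each basic block is given a
 * compile-time signature; a runtime signature variable is XOR-updated on
 * entry and an assertion compares it with the expected value, so an illegal
 * jump between blocks is detected. This is a generic illustration, not the
 * exact SETA encoding. */
#include <stdio.h>
#include <stdlib.h>

static unsigned sig;                         /* runtime signature */

static void cfc_assert(unsigned expected, const char *block)
{
    if (sig != expected) {
        fprintf(stderr, "control-flow error detected in %s\n", block);
        exit(EXIT_FAILURE);                  /* or trigger recovery */
    }
}

int main(void)
{
    /* Block B0: signature 0x1 */
    sig = 0x1;
    cfc_assert(0x1, "B0");
    int x = 10;

    if (x > 5) {
        /* Block B1: reached from B0, expected signature 0x1 ^ 0x2 = 0x3 */
        sig ^= 0x2;
        cfc_assert(0x3, "B1");
        x += 1;
    } else {
        /* Block B2: reached from B0, expected signature 0x1 ^ 0x4 = 0x5 */
        sig ^= 0x4;
        cfc_assert(0x5, "B2");
        x -= 1;
    }

    printf("x = %d\n", x);
    return 0;
}
```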